
AI Chat: ChatGPT, AI News, Artificial Intelligence, OpenAI, Machine Learning
Teaching AI to Read Braille with Robyn Hughes
Feb 9, 2026
Robyn Hughes is a Braille instructor and UEB-certified consultant who researches teaching Braille to large language models. She describes her lifelong journey with visual impairment and how AI supports her daily independence. She also explains how she taught the Braille alphabet to LLMs, her tokenization experiments, her hands-on training methods, and the broader implications for reducing errors and improving accessibility.
Lifelong Braille Reader Shapes Research
- Robyn Hughes learned Braille and assistive technology from childhood, using early devices like the Braille 'n Speak.
- Her lived experience as a lifelong Braille reader shaped her later experiments teaching ChatGPT to read Braille.
LLMs Extend Sight Through Camera Workflows
- Robyn uses ChatGPT with a phone camera (and potentially AR glasses) to inspect physical objects such as roof shingles.
- When reliable, LLM-driven vision workflows can stand in for routine visual checks for people with low vision.
Use AI To Monitor Home Safety Regularly
- Use LLMs to inspect hard-to-see household features, such as basement pipes and food labels, before small issues become big problems.
- Pair a phone camera or AR glasses with an AI assistant to perform frequent safety and orientation checks.
