Welcome to 2026, readers. I hope that you are all enjoying the frigid weather and the dark nights!
This is the last in the series of articles we have been writing about low vision and assistive technology. Unlike the others, this one is more of a conceptual overview than a brief how-to guide.
The program we are talking about today is called VoiceOver. It allows a person who is completely blind to use an iPhone or an iPad. Like the rest of the apps highlighted in this series, VoiceOver comes built into Apple devices, so it is exclusive to Apple products.
Typically, devices like a computer or an iPhone rely on a person’s sight to navigate them: we use hand-eye coordination to look at the element we want to interact with and then click it with a mouse or finger. If you don’t have any vision, this process is impossible.
Enter VoiceOver, the solution to this barrier! VoiceOver is what is called a screen reader, a tool that turns what is on the screen into spoken words. It recognizes text, buttons, links, and other elements on the screen and reads them aloud for the user in a clear, organized way. In that light, the name VoiceOver makes sense!
One of the key parts of learning technology that speaks to you is interpreting the information you are receiving. When VoiceOver lands on an element, it typically conveys the information in this order:
- What type of element it is (such as a button or an application)
- Information related to the item (such as how many notifications an application has)
- Whether the item can be interacted with, and how (such as “double tap to open”)
Each piece of information that VoiceOver conveys is crucial. Because we are not using the device with our sight, we have to rely on our other senses to make up the difference.
One of the biggest hurdles in learning this software is being patient with a new way of receiving information. Going from using your eyes to using your ears and hands is a tough transition, but a doable one!
Now that we know what VoiceOver conveys when its focus is on an element, let’s use a real-life example.
If I turn on VoiceOver and its focus lands on the Messages app (the focused item is outlined by a box on the screen), it will say, “Messages, 2 unread messages, double tap to open.” That tells us everything we need to know and gives us all the information that someone with vision would get from navigating the home screen.
The whole screen is the input surface, so you can double tap with one finger anywhere to activate whatever the focus is on. Once the application opens, VoiceOver will announce where the focus has landed.
As for interacting with VoiceOver, the gestures are simple!
Swiping from left to right with one finger moves the focus to the right, and swiping from right to left moves it to the left. To activate an element, you double tap with one finger. There are also multi-finger gestures, so when I say one finger, I really mean it!
This technology is hard to learn on your own. As always, I am at Coastline and the New Bedford Public Library twice a month if you have any questions or need help.
Thank you for your time, energy, and engagement with this series. It has been a pleasure and an honor to write for the Senior Scope, and I look forward to working with some of you in the future!!
Charles Meyers is an assistive technology trainer for the Massachusetts Association for the Blind and Visually Impaired. He works with older adults in greater New Bedford to help them regain independence through technology and adaptive strategies. Call Coastline at 508-742-9160 or the New Bedford Public Library at 508-991-6275 to make an appointment with him, as available days and times vary.