Submitted by Lysette Chaproniere on Wednesday, June 19, 2019 - 07:17.

I flick around the screen, double-tapping when I want to select something. As I do so, VoiceOver's speech comes through my Aftershokz headset, which I wear most of the time. That makes me a cyborg, or so I'm told. Nothing is showing on my screen. "Is your phone on?" one of the students on my course at university once asked me as I called a taxi. "The screen is black." Or, as one train manager once remarked while I was searching for my e-ticket to show him, "That phone doesn't look very awake." Of course, once I had my ticket displayed in the app, I turned the screen on so he could see it.

It's a strange thing, the iPhone. On the one hand, there's nothing different about the iPhones blind people use. People are often surprised to hear that VoiceOver and the other accessibility tools come on every handset. The iPhone has given me access to more mainstream services than I had before: I buy books from stores that don't cater specifically to blind people, and I need no special equipment to read them. Even apps for the blind, such as Seeing AI, run on the same hardware every iPhone user has.

And yet, the way we use our iPhones is completely different to the way a sighted person would use them. We double-tap where the rest of the world taps once. We flick around the screen, and perform common actions by using a strange thing called the rotor. And as we use our specialist gestures, possibly with nothing showing on the screen, VoiceOver reports back to us. What VoiceOver reads to us isn't necessarily the same as what would normally appear on screen. VoiceOver says "Share button" or "Settings button," but the words "share" and "settings" don't appear on screen. Instead, those buttons are represented by icons, and VoiceOver doesn't tell us what those icons look like.
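For readers who are developers, the mechanism behind this mismatch is the accessibility label: the text a developer attaches to a control is what VoiceOver speaks, and it need not match, or even mention, the icon drawn on screen. Here is a minimal UIKit sketch; the class name and button are my own invention for illustration:

```swift
import UIKit

class ShareViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // An icon-only button: sighted users see the square-and-arrow
        // "share" glyph, but no text appears anywhere on screen.
        let shareButton = UIButton(type: .system)
        shareButton.setImage(UIImage(systemName: "square.and.arrow.up"),
                             for: .normal)

        // This string is what VoiceOver speaks ("Share, button").
        // Nothing in it describes what the icon looks like.
        shareButton.accessibilityLabel = "Share"

        view.addSubview(shareButton)
    }
}
```

If a developer leaves the label out, VoiceOver falls back on whatever it can infer, which may be an internal image name or nothing helpful at all, and that is one source of the confusion this post is about.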

All of this can make it rather difficult to communicate about our technology. What if you want to help a sighted friend with their iPhone? You might be an expert VoiceOver user, but if you don't know how the buttons are represented visually, you're going to find it difficult to explain what to do. "Tap the menu button." "What menu button?" And what if you're a sighted person wanting to help a blind friend? You won't be able to tell, from what's displayed on the screen, what VoiceOver calls each icon.

We use the same devices as everybody else, but the way we use them is so different that it can be hard for people to understand how VoiceOver works, or how a blind person could possibly use a touchscreen. And though I'm not a developer myself, I imagine it must be a challenge to make apps that work equally well for both kinds of users, and to write documentation that explains an app in a way that makes sense for both.

What, if anything, should we do about this? Perhaps the gap is inevitable, because the only way Apple could've made their touchscreen devices accessible was to invent a new way for blind people to use them. I wouldn't want Apple to make the experience of blind and sighted users more similar if it meant making devices less accessible. Still, I think it's worth trying to find a way to bridge the communication gap. I don't have a definite answer, but I'm writing this post with the aim of starting a discussion.

Is there anything Apple, or app developers, could do? It might help if VoiceOver users had access to visual descriptions of controls. It would probably be best not to add this information to the standard button labels because most of the time, you just want to know about a button's function, not its physical appearance. It would be better to have a gesture that would give this information. However, adding this information would probably take a lot of effort, and there might be easier ways to get the same result.

My second suggestion would be for a group of us to collaborate on writing a guide that comprehensively explains the differences between the two ways of using Apple devices. It would include basic information, such as the fact that a single tap with VoiceOver turned off is equivalent to a double-tap with VoiceOver turned on, and more specific information about which icons go with which VoiceOver labels. That would have to be done by a person or group of people knowledgeable about the use of Apple devices both with and without VoiceOver. Perhaps we can make a start on gathering this kind of information in the comments.

So, what do you wish your sighted friends knew about how VoiceOver works? What do you think it would be useful for VoiceOver users to know about how people use the iPhone without VoiceOver? Developers, have you found it difficult to create apps that are easy to use both with and without VoiceOver? What are the challenges of making an app work for both types of users, and do you think Apple could do anything to make it easier, or are they already doing a good job? Please also feel free to share your stories, positive or negative, funny, frustrating or enlightening, about communicating about your use of the iPhone and other Apple devices.