

A few years ago, backstage at a conference, I spotted a blind woman using her phone. The phone was speaking everything her finger touched on the screen, allowing her to tear through her apps. My jaw hit the floor. After years of practice, she had cranked the voice’s speed so high, I couldn’t understand a word it was saying.

And here’s the kicker: She could do all of this with the screen turned off. Her phone’s battery lasted forever.

Ever since that day, I’ve been like a kid at a magic show. I’ve wanted to know how it’s done. I’ve wanted an inside look at how the blind could navigate a phone that’s basically a slab of featureless glass.

This week, I got my chance. Joseph Danowsky offered to spend a morning with me, showing me the ropes.

Joe majored in economics at the University of Pennsylvania, got a law degree at Harvard, worked in the legal department at Bear Stearns, became head of solutions at Barclays Wealth, and is now a private-client banker at US Trust. He commutes to his office in Manhattan every morning from his home in New Jersey.

Joe was born with cone-rod dystrophy. He can see general shapes and colors, but no detail. (Only about 10 or 15 percent of visually impaired people see no light or color at all.) He can’t read a computer screen or printed materials, recognize faces, read street signs or building numbers, or drive. And he certainly can’t see what’s on his phone.

Yet Joe spends his entire day on his iPhone. In fact, he calls it “probably the number one assistive device for people who can’t see,” right up there with “a cane and a seeing eye dog.”

The key to all of this is an iPhone (AAPL) feature called VoiceOver. At its heart, it’s a screen reader—software that makes the phone speak everything you touch. (Android’s TalkBack feature is similar in concept, but blind users find it far less complete; for example, it doesn’t work in all apps.)

You turn on VoiceOver in Settings -> General -> Accessibility. Once it's on, a female voice begins reading the names of the controls she sees on the screen. (You can adjust the synthesized voice's Speaking Rate in the same settings panel.)

There’s a lot to learn in VoiceOver mode; people like Joe have its various gestures committed to muscle memory, so that they can operate with incredible speed and confidence.

But the short version is this: You touch anything on the screen—icons, words, even the status icons at the top—and as you go, the voice tells you what you're touching. "Messages." "Calendar." "Mail—14 new items." "45 percent battery power." You can tap the dots on the Home screen, and you'll hear, "Page 3 of 9."

You don’t even have to lift your finger; you can just slide it around, getting the lay of the land.

Once you’ve tapped a screen element, you can also flick your finger left or right—anywhere on the screen—to “walk” through everything on the screen, left to right, top to bottom.

Ordinarily, you tap something on the screen to open it. But since single-tapping now means "speak this," you need a new way to open everything. So: To open something you've just heard identified, you double-tap. (You don't have to wait for the voice to finish talking.) In fact, you can double-tap anywhere on the screen; since the phone already knows what's currently "highlighted," it's like pressing the Enter key.

There are all kinds of other special gestures in VoiceOver. You can make the voice stop speaking with a two-finger tap; read everything, in sequence, from the top of the screen with a two-finger upward flick; scroll one page at a time with a three-finger flick up or down; go to the next or previous screen (Home, Stocks, and so on) with a three-finger flick left or right; and more.

If you do a three-finger triple-tap, you turn on Screen Curtain, meaning that the screen goes black. You gain visual privacy as well as a heck of a battery boost. (Repeat to turn the screen back on.)

Joe, however, doesn’t see that battery boost, since he’s on the phone all day long. In fact, he’s equipped his phone with one of those backup-battery cases.

The Rotor

Joe also demonstrated for me the Rotor: a brilliant solution to a thorny problem. There are dozens of settings to control in a screen reader like VoiceOver: voice, gender, language, volume, speaking speed, verbosity, and so on. How do you make all of these options available in a concise form that you can call up from within any app—especially for people who can’t see controls on the screen?
