For visually impaired users, most smartphone camera apps leave something to be desired (Image: Victor Jori/Getty)

Blind and partially sighted people can now boost their photography skills, thanks to a smartphone camera app that bypasses the visual cues sighted people take for granted.

Dustin Adams and colleagues at the University of California, Santa Cruz, noted that people with impaired vision want to snap pictures and show them to friends – just like anybody else. But little research existed on what helps people with vision problems take better pictures.

So the researchers quizzed 54 people aged between 18 and 78 – some totally blind, some partially sighted and some with a degree of light perception – about what they find hardest about taking snaps. The results pretty much served as a specification for an app.


One survey respondent said that knowing how to frame a shot was one of the main obstacles: “If I am in a group, I usually have someone advise me on camera placement, even if I take the picture myself.”

The survey showed that although many smartphones already offer face detection and an accessibility feature that speaks the function of any on-screen button you touch, a camera app needs many more features to suit visually impaired photographers.

Swipe to snap

The researchers also built their own app, which dispenses with the “shutter” button because it can be hard for people with a visual impairment to locate. Instead, the app snaps a picture in response to a simple upward swipe gesture. It also merges face detection with the voice accessibility features, so the phone speaks aloud the number of faces detected, helping the user get everyone in shot. Audio cues help get the main subject of a shot in frame and in focus.
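The researchers have not published their code, but the logic described above – a swipe replacing the shutter button, and a spoken face count – can be sketched roughly as follows. All names here (`CameraSession`, `speak` and so on) are illustrative assumptions, not the team's actual implementation:

```python
# Illustrative sketch only: the UC Santa Cruz app's source is not published,
# so every name below is an assumption made for the sake of the example.

class CameraSession:
    """Maps accessible gestures to camera actions instead of a shutter button."""

    def __init__(self, detect_faces, capture, speak):
        self.detect_faces = detect_faces  # () -> int, faces in the viewfinder
        self.capture = capture            # () -> the captured photo
        self.speak = speak                # str -> None, text-to-speech output

    def on_frame(self):
        """Announce the face count so the user can get everyone in shot."""
        n = self.detect_faces()
        self.speak(f"{n} face{'s' if n != 1 else ''} detected")
        return n

    def on_swipe_up(self):
        """A simple upward swipe replaces the hard-to-find shutter button."""
        self.speak("Taking picture")
        return self.capture()
```

On a real phone, `detect_faces` and `capture` would be backed by the platform camera API and `speak` by the system screen reader; here they are plain callables so the control flow stands alone.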

As soon as the app’s camera mode is turned on, the phone also begins recording a 30-second audio file, which can be restarted at any time with a double tap on the screen. This helps with organising and sharing photos, serving as an aide-memoire for who is in shot. The user can choose to save this sound file along with the time and date, plus GPS data translated into audio giving the name of the neighbourhood, district or city where the shot was taken.

The UC Santa Cruz team will reveal their full survey results and detail the app’s capabilities at the Pervasive Technologies Related to Assistive Environments conference in Rhodes, Greece, later this month.