Microsoft has developed a smartphone app called GazeSpeak, which makes it easier for people with motor neurone disease to communicate using only their eyes.

The app uses video recorded in real time by the smartphone’s camera to convert gaze into speech. A sticker on the back of the phone carries a grid of letters; the speaker looks towards groups of letters on the grid to indicate their selections and spell out a message.

The app uses artificial intelligence, similar to predictive texting, to work out which of the letters in each group of the grid the speaker is selecting. The top four word predictions are shown onscreen, and the top one is read aloud.

“For example, to say the word ‘task’ they first look down to select the group containing ‘t’, then up to select the group containing ‘a’, and so on,” says Xiaoyi Zhang, who developed GazeSpeak whilst he was an intern at Microsoft.
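The decoding step works much like T9 predictive texting: each gaze direction selects a whole group of letters, and the software narrows a dictionary down to words that match the sequence of groups. The sketch below is a minimal illustration of that idea; the group layout and the tiny word list are assumptions for demonstration, not Microsoft's actual design.

```python
# Hypothetical sketch of GazeSpeak-style word prediction.
# The four letter groups (one per gaze direction) and the word list
# are illustrative assumptions, not the real GazeSpeak layout.

GROUPS = {
    "up": set("abcdef"),
    "left": set("ghijklm"),
    "right": set("nopqr"),
    "down": set("stuvwxyz"),
}

WORDS = ["task", "tusk", "sand", "this", "apple"]  # stand-in dictionary


def letter_group(ch):
    """Return the gaze direction whose group contains this letter."""
    for direction, letters in GROUPS.items():
        if ch in letters:
            return direction
    return None


def predict(gestures, words=WORDS):
    """Return dictionary words consistent with a sequence of gaze gestures."""
    return [
        w
        for w in words
        if len(w) == len(gestures)
        and all(letter_group(c) == g for c, g in zip(w, gestures))
    ]


# Spelling "task": look down (group with 't'), up ('a'), down ('s'), left ('k').
print(predict(["down", "up", "down", "left"]))  # → ['task']
```

In a real system the candidate list would be ranked by a language model, which is how the app surfaces its top four predictions.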

“We’re using computer vision to recognise the eye gestures, and AI to do the word prediction,” says Meredith Morris at Microsoft Research in Redmond, Washington.

People with motor neurone disease (also known as ALS) often retain movement only in their eyes, leaving gaze as their sole means of expressing themselves.

“People can become really frustrated when trying to communicate, so if this app can make things easier that’s a really good thing,” says Matthew Hollis from the Motor Neurone Disease Association.

The smartphone app is a good alternative to the older practice of low-tech letter boards, which rely on specially trained interpreters manually tracking the speaker's gaze and are difficult and less effective. In tests, GazeSpeak took 78 seconds on average to complete a sentence, versus 123 seconds using the boards.

“I love the phone technology; I just think that would be so slick,” said one of the interpreters.

Microsoft will present the app at the Conference on Human Factors in Computing Systems in Colorado in May. The app, which is iOS only, will be available in the App Store, and the source code will also be freely available.