The participants viewed a series of houses and faces that flashed on a screen for 400 milliseconds each, and were told to watch for an upside-down house. An algorithm tracked the brain signals from their temporal lobes, the region that processes sensory input. By the end of each session, the program could pinpoint, in real time and with roughly 96 percent accuracy, which image a patient was looking at. It knew whether the patient was seeing a house, a face or a gray screen within 20 milliseconds of actual perception.
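The article doesn't describe the team's actual decoding method, but the general idea of real-time neural decoding can be sketched as a classifier applied to short windows of recorded signal. The toy example below is purely illustrative: it uses synthetic data in place of brain recordings and a simple nearest-centroid classifier, which is an assumption for demonstration, not the study's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for neural recordings: each stimulus class (house, face,
# gray screen) produces feature vectors clustered around its own template.
# Real decoders would extract features (e.g. band power) from short time
# windows of electrode data; everything here is synthetic.
LABELS = ["house", "face", "gray"]
templates = {label: rng.normal(size=16) for label in LABELS}

def simulate_window(label):
    """One short 'window' of signal: the class template plus noise."""
    return templates[label] + rng.normal(scale=0.5, size=16)

# "Train" the decoder: average many labelled windows into one centroid
# per class.
centroids = {
    label: np.mean([simulate_window(label) for _ in range(50)], axis=0)
    for label in LABELS
}

def decode(window):
    """Classify a window by its nearest training centroid."""
    return min(centroids, key=lambda lb: np.linalg.norm(window - centroids[lb]))

# Evaluate on fresh windows, mimicking classification as stimuli arrive.
trials = [(label, simulate_window(label)) for label in LABELS for _ in range(100)]
accuracy = np.mean([decode(w) == label for label, w in trials])
print(f"accuracy: {accuracy:.2f}")
```

Because each new window is classified independently as it arrives, a pipeline of this shape can in principle run continuously, which is what "real time" means in this context.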

"Clinically, you could think of our result as a proof of concept toward building a communication mechanism for patients who are paralyzed or have had a stroke and are completely locked-in," UW computational neuroscientist Rajesh Rao said.

[Image credit: Kai Miller and Brian Donohue]