Kids with autism can get a boost in their ability to recognize emotions in people's faces with help from technology. The system uses machine learning and Google's augmented reality headset, Google Glass.

"The actual term 'Superpower Glasses' came from the children who we've worked with throughout the programming and development of this intervention," Dennis Wall, who led the research, told Bob McDonald in and interview with Quirks & Quarks. Wall is an assistant professor of pediatrics, psychiatry and biomedical data sciences at Stanford University in California,

One of the hallmarks of Autism Spectrum Disorder is that people who have it often have trouble "reading" other people. Emotions most of us can easily read in other faces — anger, surprise or happiness, for example — are difficult for those with autism to figure out.

The system helps train kids with autism to recognize facial emotional cues by watching faces along with them and decoding the emotions for them.

"If other people are feeling different things, I could tell what emotion they're feeling, which is kind of fantastic," said Ethan, an eight year old from California who tested the glasses, in conversation with Mark Hanlon from Stanford Medical School in a short video documentary the university produced.

How it works

The lens-free glasses sit on the child's face and have an outward-facing camera that watches the world around the child. The glasses have a display in the child's peripheral vision. When the system recognizes a face, it analyzes the emotion the face is expressing and signals that emotion to the child by displaying an appropriate emoji.
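The loop described above — detect a face, classify its emotion, show a matching emoji — can be sketched in a few lines. This is an illustrative mock-up, not the actual Stanford software: the function names, emotion labels and the stubbed-in classifier are all assumptions for the sake of the sketch.

```python
# Hypothetical sketch of the glasses' real-time loop. The real system
# runs a trained model on camera frames; here `classify_emotion` is a
# stand-in passed by the caller.

# Mapping from a classified emotion to the emoji shown on the display.
EMOJI_FOR_EMOTION = {
    "happy": "😀",
    "angry": "😠",
    "surprised": "😲",
    "sad": "😢",
}

def display_cue(frame, classify_emotion):
    """Return the emoji cue for one camera frame, or None if no face.

    `classify_emotion` stands in for the trained model: it takes a
    frame and returns an emotion label, or None when no face is found.
    """
    emotion = classify_emotion(frame)
    if emotion is None:
        return None  # no face in view: show nothing
    return EMOJI_FOR_EMOTION.get(emotion, "❓")  # unknown-label fallback

# Example with a stubbed classifier that always reports a happy face:
cue = display_cue(frame="<camera frame>", classify_emotion=lambda f: "happy")
print(cue)  # 😀
```

Keeping the classifier as a parameter mirrors the division of labour the article describes: the glasses capture and display, while the emotion model runs elsewhere.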

Wall's team created a phone app to accompany and power the glasses. The app can also record the interactions the child has while getting help decoding faces, so parents can later review and reinforce that learning with the child.

"Essentially, in this context, the glass units act as a message passer from the world to the phone and back in real time," added Wall.

How it decodes emotions on people's faces

The app detects emotion in the faces it sees using an algorithm generated by a machine learning system. This AI system was trained on large datasets of faces to decode the emotions from facial expressions.

Our faces reveal a lot about us. When we feel an emotion, our facial muscles pull and tug at our features, displaying that emotion outwardly.

There are macro-expressions like big smiles or frowns, but also micro-expressions that are very subtle, involuntary, shorter in duration, and nearly impossible to fake.

"We call them fiduciary points in the face — essentially the amount of eyebrow raising or lip curl," said Wall. "We track these as landmarks that then get converted into numerical vectors. Those numbers then get submitted to a classifier, which is trained to understand what those vectors mean and in turn spits out a classification."

How well it works

Wall tested the glasses in a randomized clinical trial of 71 children with autism spectrum disorder. The results were published this week in the journal JAMA Pediatrics.

"In this randomised control trial ... we compare treatment with Glass to treatment as usual — the 'standard of care' therapy," said Wall.

The standard behavioural intervention involves using flashcards to help the child recognize emotions in the face.

"We see a change in the individuals who received the Glass in comparison to the individuals who did not receive the Glass — while (...) both groups, the control and treatment group, are still receiving the standard of care therapy."

"Right now, I think we can say that there's quite a bit of promise for this particular system to be used as an augmentation to standard of care therapy," said Wall.

Their training protocol involves the kids using the glasses for six weeks. Wall said he's gotten a lot of positive feedback from parents and teachers about the impacts they've seen in the kids who trained with the glasses. He said not only do the kids become more interested in faces, but their ability to read emotions also improves.