That bowed head says it all (Image: Pierre Baelen/Plainpicture)

WHETHER striding ahead with pride or slouching sullenly, we all broadcast our emotions through body language. Now a computer has learned to interpret those unspoken cues as well as you or I.

Antonio Camurri of the University of Genoa in Italy and colleagues have built a system that uses the depth-sensing, motion-capture camera in Microsoft’s Kinect to determine the emotion conveyed by a person’s body movements. Computers have been used to capture emotions before, but previous efforts typically focused on facial analysis or voice recordings. Reading someone’s emotional state from the way they walk across a room, or their posture as they sit at a desk, means they don’t have to speak or look into a camera.

“It’s a nice achievement,” says Frank Pollick, professor of psychology at the University of Glasgow, UK. “Being able to use the Kinect for this is really useful.”

The system uses the Kinect camera to build a stick figure representation of a person that includes information on how their head, torso, hands and shoulders are moving. Software looks for body positions and movements widely recognised in psychology as indicative of certain emotional states. For example, if a person’s head is bowed and their shoulders are drooping, that might indicate sadness or fear. Adding in the speed of movement – slow indicates sadness, while fast indicates fear – allows the software to determine how someone is feeling. In tests, the system correctly identified emotions in the stick figures 61.3 per cent of the time, compared with a 61.9 per cent success rate for 60 human volunteers (arXiv.org/1402.5047).
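The rule-of-thumb mapping described above – posture cues plus movement speed – can be sketched as a toy classifier. This is a hypothetical illustration only: the feature names, thresholds and rules here are invented for clarity and are not taken from the team’s actual software.

```python
def classify_emotion(head_bowed: bool, shoulders_drooping: bool,
                     movement_speed: float) -> str:
    """Toy rule-based guess at an emotion from posture cues and speed.

    movement_speed is a notional figure in metres per second; the
    0.5 m/s cut-off separating "slow" from "fast" is an assumption.
    """
    if head_bowed and shoulders_drooping:
        # Same slumped posture, different tempo:
        # slow movement suggests sadness, fast movement suggests fear.
        return "sadness" if movement_speed < 0.5 else "fear"
    if movement_speed > 1.0:
        # Upright posture with energetic movement reads as joy.
        return "joy"
    return "neutral"

print(classify_emotion(True, True, 0.2))   # prints "sadness"
print(classify_emotion(True, True, 1.5))   # prints "fear"
```

A real system would of course learn such rules from labelled motion-capture data rather than hard-code them, but the principle – posture selects a family of emotions, speed disambiguates within it – is the one the article describes.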

Camurri is using the system to build games that teach children with autism to recognise and express emotions through full-body movements. Understanding how another person feels can be difficult for people with autism, and some emotions, such as fear, are harder to recognise than happiness.

“In one of the serious games we developed, a child is invited to look at a short video of an actor expressing an emotion,” Camurri says. “Then the child is invited to guess which emotion was expressed in the video.” He adds that you can also ask the child to express the same emotion using only their body; joy, for example, can be characterised by energetic, fluid movements and a tendency to raise the arms.

The team also plans to use the system to figure out how “in tune” a group of people is with their leader, looking for signals like how people’s heads move when someone is speaking.
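One plausible way to quantify that kind of “in tune”-ness is to correlate each listener’s head movement with the speaker’s over time. The sketch below is an assumption about how such a signal might be measured – the feature (per-frame head height) and the use of Pearson correlation are illustrative choices, not the team’s published method.

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Notional per-frame head heights (arbitrary units) from Kinect tracking.
speaker = [0.0, 0.2, 0.4, 0.2, 0.0, -0.2]
listener_synced = [0.1, 0.3, 0.5, 0.3, 0.1, -0.1]      # nods along
listener_distracted = [0.3, -0.2, 0.1, 0.4, -0.3, 0.0]  # unrelated motion

print(round(pearson(speaker, listener_synced), 2))      # prints 1.0
print(round(pearson(speaker, listener_distracted), 2))  # low correlation
```

A listener whose head movements track the speaker’s scores near 1, while uncorrelated movement scores near 0, giving a crude per-person measure of engagement with the leader.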

Pollick says it could be useful as an automatic way to classify emotion – as part of a CCTV system to infer intent, or to help shops understand customers.

This article appeared in print under the headline “I know just how you feel”