CAN you detect someone’s emotional state just by looking at his face?

It sure seems like it. In everyday life, you can often “read” what someone is feeling with the quickest of glances. Hundreds of scientific studies support the idea that the face is a kind of emotional beacon, clearly and universally signaling the full array of human sentiments, from fear and anger to joy and surprise.

Increasingly, companies like Apple and government agencies like the Transportation Security Administration are banking on this transparency, developing software to identify consumers’ moods or training programs to gauge the intent of airline passengers. The same assumption is at work in the field of mental health, where conditions like autism and schizophrenia are often treated in part by training patients to distinguish emotions by facial expression.

But this assumption is wrong. Several recent and forthcoming research papers from the Interdisciplinary Affective Science Laboratory, which I direct, suggest that human facial expressions, viewed on their own, are not universally understood.

The pioneering work in the field of “emotion recognition” was conducted in the 1960s by a team of scientists led by the psychologist Paul Ekman. Research subjects were asked to look at photographs of facial expressions (smiling, scowling and so on) and match them to a limited set of emotion words (happiness, anger and so on) or to stories with phrases like “Her husband recently died.” Most subjects, even those from faraway cultures with little contact with Western civilization, were extremely good at this task, successfully matching the photos most of the time. Over the following decades, this method of studying emotion recognition was replicated by other scientists hundreds of times.