How do we know what someone else is feeling? Clues about the emotions of others can come in various forms. Facial expression can be a dead giveaway, but we can also make inferences from body posture, or even from seeing or reading about the situation that caused the emotion. An interesting problem in neuroscience is how these very different cues about the emotions of others can all lead to the same ultimate realization: She’s happy; he is sad.

In a paper published this week in the Journal of Neuroscience, Amy Skerry and Rebecca Saxe sought to find the region of the brain that is responsible for these empathetic realizations, regardless of the origin. They did this by showing people different types of media that relayed emotion: a short video clip from a movie, or an animated clip that showed a geometric figure experiencing prosocial or antisocial action from its fellow geometric shapes. For instance, in the figure below, a woman makes a sad face, and then a red circle is excluded from a group of purple triangles, squares, and pentagons (so sad! poor circle).

The authors then trained a computer program to look at the fMRI brain scans of people during each emotional media presentation and guess which emotion was being conveyed. Importantly, the program was trained to discriminate the emotional states based on one type of media (facial expressions, say) and then tested on data from the other type of media (animated situations). The scientists were looking for brain regions whose neural response to the emotional state was so distinct that the computer program could recognize it no matter which media type the person had seen to make the inference. To qualify, the program had to perform significantly better than chance on data from that region.
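To make the logic of this cross-decoding test concrete, here is a minimal sketch of the general approach (an illustration with simulated data, not the authors' actual pipeline or analysis code). We simulate voxel patterns in which a shared "emotion" signal is embedded in two stimulus types, train a classifier on one type, and test it on the other:

```python
# Minimal sketch of cross-stimulus emotion decoding on SIMULATED data.
# All names and parameters here are illustrative assumptions, not the
# study's actual methods.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_voxels = 100, 50

# A shared emotion code: each emotion (happy vs. sad) shifts a fixed
# pattern of voxels, regardless of which stimulus type evoked it.
emotion_pattern = rng.normal(size=n_voxels)

def simulate(n):
    labels = rng.integers(0, 2, size=n)            # 0 = sad, 1 = happy
    noise = rng.normal(scale=2.0, size=(n, n_voxels))
    # Happy trials add the pattern, sad trials subtract it.
    data = noise + np.outer(labels * 2 - 1, emotion_pattern)
    return data, labels

faces_X, faces_y = simulate(n_trials)   # "facial expression" trials
anim_X, anim_y = simulate(n_trials)     # "animated situation" trials

# Train on one stimulus type, test on the other. Above-chance accuracy
# would indicate a stimulus-invariant emotion code in these voxels.
clf = LogisticRegression().fit(faces_X, faces_y)
accuracy = clf.score(anim_X, anim_y)
print(f"cross-decoding accuracy: {accuracy:.2f}")  # chance level is 0.50
```

Because the simulated emotion signal is identical across the two stimulus types, the classifier generalizes and scores well above the 0.50 chance level; if each stimulus type carried its own idiosyncratic code instead, cross-decoding would hover near chance.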

Guided by data from previous studies, the authors homed in on the prefrontal cortex, or PFC. This is not surprising, as the prefrontal cortex is a particularly “thinky” part of the brain, responsible for, among other things, future planning and impulse control. But the PFC is large (it’s basically everything in your forehead region) and has many functions. Specifically, it seemed to be the medial part of this structure, or MPFC, that held the key to invariant recognition of emotional states, regardless of how they were communicated. Further subdividing, the authors found that data from both the dorsal (upper) MPFC and the middle MPFC reliably allowed the computer program to perform above chance.

Skerry and Saxe then asked another question. Would these same brain regions represent emotions the same way when it was the self experiencing the emotion, rather than another? To find out, the participants in the study were told that they were either winning money (happy 🙂 ) or losing it (sad 😦 ). The authors then had the computer program guess, based on the neural response, which emotion had been induced in each individual. Here, the middle MPFC still held reliable information, whereas the dorsal MPFC no longer did.

This study succeeded in identifying a region of the brain that has a particular response to particular emotions, regardless of whether the emotion was perceived visually or merely inferred from a situation, and regardless, even, of whether it was the self or someone else experiencing it. While the current study dealt only in binaries (good or bad, happy or sad), it remains an open question whether these findings hold for more complex emotions like greed, jealousy, or gratitude.

Reference: Skerry AE, Saxe R (2014). A Common Neural Code for Perceived and Inferred Emotion. Journal of Neuroscience, 34(48): 15997–16008.

Intro image by Dietmar Temps, all other images adapted from above.