New research shows that when we hear stories, brain patterns emerge that transcend culture and language, suggesting that a universal code may underlie how we make sense of narratives.

Telling and listening to stories is a pastime that spans all cultures. From crime novels to bedtime stories, and from ancient legends to spicy romances, humanity loves a good book. We are all used to the idea of stories, but the processes at work in the brain are more complex than they seem. Following a narrative and understanding the story's meaning and themes, as well as the interplay of causes and effects across time, involves challenging cognitive gymnastics. Of course, our brains make it seem effortless.

Neuroscience has made headway in identifying which brain regions help us understand smaller chunks of language, that is, words and sentences, but we still have much to learn about how the brain understands a narrative. Following a story involves a steady accumulation of meaning.

Storytelling and the brain

Recently, a group of researchers from the University of Southern California (USC) in Los Angeles designed a study to investigate the brain networks involved in understanding stories. Their findings are published in the journal Human Brain Mapping.

More specifically, they wanted to learn whether the same story, told in different languages, would activate similar brain regions in native speakers of those languages. They also planned to see whether they could work out which specific story a participant was reading from their brain activity alone, which is no mean feat. The team was led by Morteza Dehghani of the Brain and Creativity Institute at USC.

Using software developed by the USC Institute for Creative Technologies, the team sifted through 20 million blog posts containing personal stories. They narrowed this wealth of stories down to just 40, all of which covered personal topics such as going through a divorce or telling a lie. Each story was then condensed to a paragraph of around 150 words. Next, the English stories were translated into Mandarin Chinese and Farsi.

In total, 90 participants of American, Chinese, and Iranian descent read the stories while their brains were scanned using functional MRI. The USC team used cutting-edge machine learning and text-analysis techniques, including an analysis involving 44 billion classifications, to "reverse engineer" data from the scans. In this way, they were able to determine which story an individual was reading, in any of the three languages, purely from the brain activity that they were measuring. In other words, the researchers were reading the participants' minds as they read the stories.
“Even given these fundamental differences in language, which can be read in a different direction or contain a completely different alphabet altogether, there is something universal about what occurs in the brain at the point when we are processing narratives.”

Morteza Dehghani