Really? Really. How Our Brains Figure Out What Words Mean Based On How They're Said

Lizzie Roberts/Ikon Images/Getty Images

It's not just what you say that matters. It's how you say it.

Take the phrase, "Here's Johnny." When Ed McMahon used it to introduce Johnny Carson on The Tonight Show, the words were an enthusiastic greeting. But in The Shining, Jack Nicholson used the same two words to convey murderous intent.

Now scientists are reporting in the journal Science that they have identified specialized brain cells that help us understand what a speaker really means. These cells do this by keeping track of changes in the pitch of the voice.

"We found that there were groups of neurons that were specialized and dedicated just for the processing of pitch," says Dr. Eddie Chang, a professor of neurological surgery at the University of California, San Francisco.

Chang says these neurons allow the brain to detect "the melody of speech," or intonation, while other specialized brain cells identify vowels and consonants.

"Intonation is about how we say things," Chang says. "It's important because we can change the meaning, even — without actually changing the words themselves."

For example, by raising the pitch of our voice at the end of a sentence, a statement can become a question.

The identification of neurons that detect changes in pitch was largely the work of Claire Tang, a graduate student in Chang's lab and the Science paper's lead author.

Tang and a team of researchers studied the brains of 10 epilepsy patients awaiting surgery. The patients had electrodes placed temporarily on the surface of their brains to help surgeons identify the source of their seizures.

This allowed the team to monitor the activity of cells in each patient's brain as they listened to a series of sentences spoken by a computer.

"What we did was change where the intonation contour — the pitch changes — were happening in each of those sentences," Chang says.

So the volunteers would hear different versions of a sentence like, "Reindeer are a visual animal." Sometimes the computer voice started high and ended low, making the sentence a statement. Other times it started low and ended high, making the sentence a question.

The cells that track pitch didn't care whether they heard a high female voice or a low male voice, Chang says. It was the pattern of pitch changes that mattered.

"To people like musicians this is not a surprise," Chang says, "because you can take a melody and shift all of its notes higher or lower, but it's still recognizable."

The identification of specialized cells that track intonation shows just how much importance the human brain assigns to hearing, says Nina Kraus, a neurobiologist who runs the Auditory Neuroscience Laboratory at Northwestern University.

"Processing sound is one of the most complex jobs that we ask our brain to do," Kraus says. And it's a skill that some brains learn better than others, she says.

Kraus found that out in a study that asked whether musicians were better than nonmusicians at recognizing the subtle tonal changes found in Mandarin Chinese.

"The English-speaking musicians were able to process with high precision those contours," she says, "and the nonmusicians didn't."

On the other hand, recognizing intonation is a skill that's often impaired in people with autism, Kraus says.

"A typically developing child will process those pitch contours very precisely," Kraus says. "But some kids on the autism spectrum don't. They understand the words you are saying, but they are not understanding how you mean it."

The new study suggests that may be because the brain cells that usually keep track of pitch aren't working the way they should.