Distinguishing between different people's voices may seem like a trivial task. However, if those people are speaking a language you don't understand, it becomes much harder. That's because you rely on individual differences in pronunciation to help identify a speaker; if you don't understand the words being said, you don't pick up on those differences.



That ability to process the relationship between sounds and their meanings, also known as phonology, is believed to be impaired in people with dyslexia. Therefore, neuroscientists at MIT theorized that people with dyslexia would find it much more difficult to identify speakers of their native language than non-dyslexic people.



In a study appearing in Science on July 29, the researchers found just that. People with dyslexia had a much harder time recognizing voices than non-dyslexics. In fact, they fared just as poorly as they (and non-dyslexics) did when listening to speakers of a foreign language.



The finding bolsters the theory that impaired phonology processing is a critical aspect of dyslexia, and sheds light on how human voice recognition differs from that of other animals, says John Gabrieli, MIT's Grover Hermann Professor of Health Sciences and Technology and Cognitive Neuroscience and senior author of the Science paper.



"Recognizing one person from another, in humans, seems to be very dependent on human language capability," says Gabrieli, who is part of MIT's Department of Brain and Cognitive Sciences and also a principal investigator at the McGovern Institute for Brain Research.



Verbal cues



The lead author of the study, MIT graduate student Tyler Perrachione, earned his undergraduate and master's degrees at Northwestern University, where he was involved in studies showing that it is easier to recognize voices of people speaking your own language.



"Everybody's speech is a little bit different, and that's a big cue to who you are," he says. "When you're listening to somebody talk, it's not just properties of their vocal cords or how sound resonates in their oral cavity that distinguishes them, but also the way they pronounce the words."



After Perrachione arrived at MIT, he and Gabrieli decided to try to link this research with evidence showing that phonological processing is impaired in people with dyslexia. They tested subjects on identifying speakers of their native language (English), and then of an unfamiliar one (Chinese).



When listening to English, the non-dyslexic subjects were correct nearly 70 percent of the time, but performed at only 50 percent when trying to distinguish Chinese speakers. Dyslexic individuals performed at 50 percent for both English and Chinese speakers.



"It's a beautiful study, in the sense that it's so simple," says Shirley Fecteau, a visiting assistant professor at Harvard Medical School and research chair in cognitive neuroplasticity at Laval University in Quebec. "It really seems like a very clear effect on voice recognition in people with dyslexia."



The finding suggests that people with dyslexia may have even more trouble following a speaker than they realize, Gabrieli says. This adds to the growing evidence that dyslexia is not simply a visual disorder.



"There was a big shift in the 1980s from understanding dyslexia as a visual problem to understanding it as a language problem," Gabrieli says. "Dyslexia may not be one thing. It may be a variety of ways in which you end up struggling to learn to read. But the single best understood one is a weakness in the processing of language sounds."



Friend versus foe



Recognizing other members of one's species by their voices is critical for humans and other social animals. "You want to know who is a friend and who is a foe, you want to know who your partner is," Perrachione says. "If you're cooperating with someone for food, you want to know who that person is."



However, it appears that humans and other animals perform that task in different ways. Animals can identify other members of their own species by the sounds they make, but that ability is innate and based on the sounds themselves, rather than on the meaning of those sounds.



"We notice individual differences in this learned feature of our communication, which is the words that we use, and that's what really distinguishes human communication from animal communication," Perrachione says.



The researchers believe their work may also offer insight into the performance of computerized voice-recognition systems. Voice-recognition programs with access to dictionary meanings of words might do a better job of understanding different speakers than systems that only identify sounds, Perrachione says.



The researchers are now using functional magnetic resonance imaging (fMRI) to determine which parts of the brain are most active in dyslexics and non-dyslexics as they try to identify voices.
