Babies are easy to underestimate. This is understandable; after all, when most of us interact with an infant, we see a clumsy, messy creature — one more adept at stringing together strange gurgling noises than distinct consonants and vowels.


And yet, evidence continues to accumulate that just because an infant can't speak doesn't mean it can't grasp what's being said in its presence. In fact, new research suggests that babies may be capable of understanding many common nouns months earlier than we once thought possible.

There is a significant distinction between understanding the elements of sound that comprise a language, and comprehending the meaning of a word itself.


"It is widely accepted that infants begin learning their native language not by learning words, but by discovering features of the speech signal: consonants, vowels, and combinations of these sounds," explain psychologists Elika Bergelson and Daniel Swingley in the latest issue of Proceedings of the National Academy of Sciences.

"Learning to understand words, as opposed to just perceiving their sounds, is said to come later, between 9 and 15 months of age, when infants develop a capacity for interpreting others' goals and intentions."

To test the validity of this assumption, Swingley and Bergelson rounded up 33 infants between the ages of 6 and 9 months, and 50 between the ages of 10 and 20 months, and ran them through two complementary attention tasks.


In the first task, infants were presented with a screen featuring images of a food item and a body part (a nose and an apple, for instance), and verbally encouraged by their caregivers to look at one item or the other ("Look at the apple" or "Where's the hand?").

In the second task, the children were again instructed to direct their attention to a food item or body part, only this time the images were placed in a more natural context (i.e. no more disembodied noses — this schnoz actually appeared attached to a human figure).


For both tests, the researchers used an eye-tracking device to monitor where on the screen the infants were looking. In both tests, Bergelson and Swingley found that the babies between 6 and 9 months of age tended to spend more time looking at the named item than the other image (or images) on the screen. These findings, the researchers claim, suggest that the infants actually understood that some words were associated with specific objects.

Interestingly, the researchers noticed little improvement in task performance until the infants were around 14 months old, at which point word recognition spiked drastically.


"Maybe what is going on with the 14-month-olds is they understand the nature of the task as a kind of game and they're playing it," Swingley said in a release issued by the University of Pennsylvania. "Or the dramatic increase in performance at 14 months may be due to aspects of language development we did not measure specifically, including better categorization of the speech signal, or better understanding of syntax."

Either way, the complexity of thought going on in the minds of these pudgy poop machines — even at just six months of age — is downright impressive.


"I think this study presents a great message to parents: You can talk to your babies and they're going to understand a bit of what you're saying," Swingley said. "They're not going to give us back witty repartee, but they understand some of it. And the more they know, the more they can build on what they know."

The researchers' findings are published in the latest issue of PNAS.
