With the help of human instructors, a robot has learned to talk like a human infant, picking up the names of simple shapes and colors.

“Our work focuses on early stages analogous to some characteristics of a human child of about 6 to 14 months, the transition from babbling to first word forms,” wrote computer scientists led by Caroline Lyon of the University of Hertfordshire in a June 13 Public Library of Science One study.

Named DeeChee, the robot is an iCub, a three-foot-tall open source humanoid machine designed to resemble a baby. The similarity isn’t merely aesthetic, but has functional purpose: Many researchers think certain cognitive processes are shaped by the bodies in which they occur. A brain in a vat would think and learn very differently than a brain in a body.

This field of study is called embodied cognition, and in DeeChee’s case it applies to learning the building blocks of language, a process that in humans is shaped by an exquisite sensitivity to the frequency of sounds.


“Learning needs interaction with a human, and robot embodiment evokes appropriate reactions in a human teacher, which disembodied software does not,” said Lyon.

Using DeeChee also allowed the researchers to quantify the transition from babble to recognizable word forms in detail, drawing statistical links between sound frequencies and the robot’s performance that might eventually inform research on human learning.

To be sure, DeeChee doesn’t yet think like a human baby. It doesn’t have the software. Asked if this process of learning might be combined with higher-level cognitive programs to produce something like consciousness, Lyon demurred.

“First we have to ask, ‘What is consciousness?’” she said.

Video: Like a child learning to talk, an iCub robot named DeeChee interacts with a human teacher to learn the names of shapes and colors. (Lyon et al./PLoS One)