A robot has taught itself to smile, frown, and make other human facial expressions using machine learning.

To get the incredibly realistic Einstein robot to make facial expressions, researchers used to have to program each of its 31 artificial muscles individually through trial and error. Now, computer scientists from the Machine Perception Laboratory at the University of California, San Diego have used machine learning to enable the robot to learn expressions on its own.

"The robotic approach is the ultimate in helping us understand learning and development," said social development expert Daniel Messinger at the University of Miami, who was not involved with the Einstein research but collaborates with the group on another project. "There’s so much we can learn by actually trying to make it happen instead of just watching kids try to move their faces — it’s like having a baby as opposed to just watching a baby."

According to the researchers, who presented the project last month at the 2009 IEEE 8th International Conference on Development and Learning, this is the first time anyone has used machine learning to teach a robot to make realistic facial expressions.

To begin teaching the robot, the researchers stuck Einstein in front of a mirror and instructed the robot to "body babble" by contorting its face into random positions. A video camera connected to facial recognition software gave the robot feedback: When it made a movement that resembled a "real" expression, it received a reward signal.

"It's an iterative process," said facial recognition expert Marian Bartlett, a co-author of the study. "It starts out completely random and then gets feedback. Next time the robot picks an expression, there’s a bias towards putting the motors in the right configuration."

After the robot figured out the relationship between different muscle movements and known facial expressions, it started experimenting with new expressions, such as eyebrow narrowing.

The robot's expressions are still a bit awkward, but the researchers say they're working on ways to make them more realistic, as well as experimenting with strategies besides "body babbling" that might speed up the learning process. The group says its studious robot may even improve our understanding of how infants and children learn to make facial expressions.

"The idea is to try to understand some of the computational principles behind learning," Bartlett said. "Here the computational principle is reinforcement learning and active exploration, which may also be behind learning motor movements in an infant."

The next step is to get the Einstein robot to start socializing. Once the robot can mimic facial expressions in a social context, the researchers plan to use him in an "automatic tutoring" experiment.

"We're putting facial expressions onto the robot so that he can engage with a pupil in a non-verbal manner and approximate one-on-one human tutoring as much as possible," Bartlett said. "Studies have shown that human one-on-one tutoring improves learning by as much as two standard deviations — we want to know how can you try to approximate that with robotic tutoring."

Image: UC San Diego/Erik Jepsen