The robot that reads your mind to train itself

By Lakshmi Sandhana, technology journalist

Published 25 October 2010

Rajesh Rao is a man who believes that the best type of robotic helper is one that can read your mind.

In fact, he's more than just an advocate of mind-controlled robots; he believes in training them through the power of thought alone.

His team at the Neural Systems Laboratory, University of Washington, hopes to take brain-computer interface (BCI) technology to the next level by attempting to teach robots new skills directly via brain signals.

Robotic surrogates that offer paralysed people the freedom to explore their environment, manipulate objects or simply fetch things have long been a holy grail of BCI research.

Dr Rao's team began by programming a humanoid robot with simple behaviours which users could then select with a wearable electroencephalogram (EEG) cap that picked up their brain activity.

The brain involuntarily generates what is known as a P300, or P3, signal each time it recognises an object. The signal is produced by millions of neurons firing together in a synchronised fashion.

This has been used by many researchers worldwide to create BCI-based applications that allow users to spell a word, identify images, select buttons in a virtual environment and, more recently, even play in an orchestra or send a Twitter message.
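The selection step can be sketched in code. In a typical P300 application, each candidate option is flashed repeatedly, the EEG epochs that follow each flash are averaged to suppress noise, and the option whose averaged response around 300 milliseconds is strongest is taken as the user's choice. The sketch below uses synthetic data and illustrative sample rates and window sizes; it is not the processing pipeline of Rao's system, just a minimal demonstration of the principle.

```python
# Illustrative P300-style selection: average the EEG epochs that follow
# each flash of an option, then pick the option with the strongest mean
# response around 300 ms. Epochs here are synthetic lists of samples; a
# real system reads them from an EEG amplifier and faces far more noise.

def average_epochs(epochs):
    """Element-wise mean of several equal-length epochs (lists of samples)."""
    n = len(epochs)
    return [sum(samples) / n for samples in zip(*epochs)]

def p300_score(epoch, window=(25, 40)):
    """Mean amplitude in the samples covering roughly 250-400 ms after the
    flash, assuming an illustrative 100 Hz sampling rate (sample 30 = 300 ms)."""
    lo, hi = window
    return sum(epoch[lo:hi]) / (hi - lo)

def select_option(epochs_per_option):
    """Return the option whose averaged post-flash response scores highest."""
    scores = {opt: p300_score(average_epochs(eps))
              for opt, eps in epochs_per_option.items()}
    return max(scores, key=scores.get)

if __name__ == "__main__":
    # Synthetic data: only the "fetch" option carries a bump near sample 30,
    # mimicking the P300 evoked when the user attends to that option.
    flat = [[0.1] * 60 for _ in range(5)]
    bump = [[0.1] * 60 for _ in range(5)]
    for ep in bump:
        for i in range(25, 40):
            ep[i] += 1.0
    trials = {"walk": flat, "fetch": bump, "stop": [list(e) for e in flat]}
    print(select_option(trials))  # prints "fetch"
```

In practice the averaging is what makes P300 interfaces slow: many flashes per option are needed before the signal stands out from the noise, which is one reason BCI selection lags behind a mouse or keyboard.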

Skill set

The team's initial goal was for the user to send a command that the robot would translate into a movement.

However, this requires programming the robot with a predefined set of very basic behaviours, an approach which Dr Rao ultimately found to be very limiting.

The team reasoned that giving the robot the ability to learn might just be the trick to allow a greater range of movements and responses.

"What if the user wants the robot to do something new?" Dr Rao asked.

The answer, he said, was to tap into the brain's "hierarchical" system used to control the body.

"The brain is organised into multiple levels of control including the spinal cord at the low level to the neocortex at the high level," he said.

"The low level circuits take care of behaviours such as walking while the higher level allows you to perform other behaviours.

"For example, a behaviour such as driving a car is first learned but later becomes an almost autonomous lower level behaviour, freeing you to recognize and wave to a friend on the street while driving."

To emulate this kind of behaviour - albeit in a more simplistic fashion - Dr Rao and his team are developing a hierarchical brain-computer interface for controlling the robot.

"A behaviour initially taught by the user is translated into a higher-level command. When invoked later, the details of the behaviour are handled by the robot," he said.
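The hierarchical idea can be sketched as a command table: the robot starts with low-level primitives, and each behaviour the user teaches becomes a named high-level entry composed of existing commands, which can themselves be taught behaviours. The command names and structure below are illustrative assumptions, not details of Rao's interface.

```python
# Sketch of a hierarchical command interface: low-level primitives the
# robot already knows, plus user-taught behaviours stored as sequences
# of known commands. Invoking a command expands it recursively into the
# primitive actions the robot executes. All names are illustrative.

class CommandInterface:
    def __init__(self):
        # Low-level behaviours the robot ships with; each expands to itself.
        self.commands = {
            "step_forward": lambda: ["step_forward"],
            "turn_left": lambda: ["turn_left"],
            "grasp": lambda: ["grasp"],
        }

    def teach(self, name, sequence):
        """Store a new high-level command as a sequence of known commands."""
        unknown = [c for c in sequence if c not in self.commands]
        if unknown:
            raise ValueError(f"unknown sub-commands: {unknown}")
        # Expanding the sequence flattens nested commands down to primitives.
        self.commands[name] = lambda: [
            action for cmd in sequence for action in self.commands[cmd]()
        ]

    def invoke(self, name):
        """Expand a command into the primitive actions to execute."""
        return self.commands[name]()

bci = CommandInterface()
bci.teach("approach_table", ["step_forward", "step_forward", "turn_left"])
bci.teach("fetch_cup", ["approach_table", "grasp"])  # built on a taught command
print(bci.invoke("fetch_cup"))
# ['step_forward', 'step_forward', 'turn_left', 'grasp']
```

The payoff mirrors Rao's driving analogy: once "fetch_cup" exists as a single entry, the user selects one item from the menu instead of steering every low-level step.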

A number of groups worldwide are attempting to create thought-controlled robots for various applications.

Early last year Honda demonstrated how its robot Asimo could lift an arm or a leg in response to signals sent wirelessly from a system operated by a user wearing an EEG cap.

Scientists at the University of Zaragoza in Spain are working on creating robotic wheelchairs that can be manipulated by thought.

On-the-job training

Designing a truly adaptive brain-robot interface that allows paralysed patients to directly teach a robot to do something could be immensely helpful, liberating them from the need to use a mouse and keyboard or touchscreen, designed for more capable users.

Using BCIs can also be a time-consuming and clumsy process, since it takes a while for the system to accurately identify the brain signals.

"It does make good sense to teach the robot a growing set of higher-level tasks and then be able to call upon them without having to describe them in detail every time - especially because the interfaces I have seen using... brain input are generally slower and more awkward than the mouse or keyboard interfaces that users without disabilities typically use," says Robert Jacob, professor of computer science at Tufts University.

Rao's latest robot prototype is "Mitra" - meaning "friend". It's a two-foot-tall humanoid that can walk, look for familiar objects and pick up or drop off objects. The team is building a BCI that can be used to train Mitra to walk to different locations within a room.

Once a person puts on the EEG cap they can choose to either teach the robot a new skill or execute a known command through a menu.

In the "teaching" mode, machine learning algorithms are used to map the sensor readings the robot gets to appropriate commands.

If the robot is successful in learning the new behaviour then the user can ask the system to store it as a new high-level command that will appear on the list of available choices the next time.

"The resulting system is both adaptive and hierarchical - adaptive because it learns from the user and hierarchical because new commands can be composed as sequences of previously learned commands," Dr Rao says.
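The article does not say which machine learning algorithm maps sensor readings to commands, but the adaptive loop can be illustrated with one of the simplest: a nearest-centroid classifier that averages the sensor vectors seen with each command during teaching, then maps a new reading to the command with the closest average. The feature names and training data below are invented for the example.

```python
# Illustrative "teaching mode": learn a mapping from the robot's sensor
# readings to commands using a nearest-centroid classifier. Rao's actual
# algorithm is not described in the article; features and labels here
# (distance to goal, heading error) are invented for demonstration.

from math import dist  # Euclidean distance, Python 3.8+

def train(examples):
    """examples: list of (sensor_vector, command). Return per-command centroids."""
    sums, counts = {}, {}
    for vec, cmd in examples:
        acc = sums.setdefault(cmd, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[cmd] = counts.get(cmd, 0) + 1
    return {cmd: [v / counts[cmd] for v in acc] for cmd, acc in sums.items()}

def predict(centroids, vec):
    """Map a new sensor reading to the nearest learned command."""
    return min(centroids, key=lambda cmd: dist(centroids[cmd], vec))

# Toy teaching session: (distance_to_goal, heading_error) -> command issued.
data = [
    ([2.0, 0.1], "walk_forward"),
    ([1.8, -0.1], "walk_forward"),
    ([0.5, 1.2], "turn"),
    ([0.4, 1.0], "turn"),
]
model = train(data)
print(predict(model, [1.9, 0.0]))  # prints "walk_forward"
```

Once a behaviour trained this way works reliably, storing it under a name is what turns it into the kind of reusable high-level command Rao describes.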

The major challenge at the moment is getting the system to be accurate given how noisy EEG signals can be.

"While EEG can be used to teach the robot simple skills such as navigating to a new location, we do not expect to be able to teach the robot complex skills that involve fine manipulation, such as opening a medicine bottle or tying shoelaces," says Rao.

It may be possible to attain a finer degree of control either by utilising an invasive BCI or by allowing the user to select from videos of useful human actions that the robot could attempt to learn.

A parallel effort in the same laboratory is working on imitation-based learning algorithms that would allow a robot to imitate complex actions such as kicking a ball or lifting objects by watching a human do the task.

Dr Rao believes that there are very interesting times ahead as researchers explore whether the human brain can truly break out of the evolutionary confines of the human body to directly exert control over non-biological robotic devices.

"In some ways, our brains have already overcome some of the limitations of the human body by employing cars and airplanes to travel faster than by foot, cell phones to communicate further than by immediate speech, books and the internet to store more information than can fit in one brain," says Rao.

"Being able to exert direct control on the physical environment rather than through the hands and legs might represent the next step in this progression, if the ethical issues involved are adequately addressed."