In April of this year I wrote about Honda’s press release announcing that they had made some breakthroughs in the area of mental control of robots. On close inspection it seemed that Honda had not accomplished anything that other researchers weren’t already doing, so they were essentially announcing that they were in the game. Well, now another Japanese car manufacturer is in the game also – Toyota.

They announced that they have developed a system that allows a person to control a wheelchair with thought alone, commanding the chair to go right, left, or forward. Curiously, stopping the chair still requires a traditional puffer control, in which the operator puffs into a tube.

This is nothing new – devising a system that can distinguish among three brain states is fairly crude. Apparently they could not make it distinguish a fourth state to command the chair to stop. But Toyota needs a distinguishing feature – something to put in the press release to make their technology seem like a breakthrough. So, they tell us, theirs is the fastest system yet. While other systems take several seconds to process the thought commands, their system accomplishes this task in 125 milliseconds (thousandths of a second). That is fast enough to give the feel of instantaneous control. In fact the response time of the brain itself, from thought to action, is about 100 milliseconds.
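Toyota has not published any details of how their system works, but as a rough illustration of what "distinguishing among three brain states" typically involves, here is a minimal, hypothetical sketch: imagined commands are reduced to feature vectors (for example, EEG band-power features recorded during a calibration session), and an incoming sample is assigned to whichever command's calibration centroid it lies closest to. All the numbers and feature dimensions below are invented for illustration.

```python
import numpy as np

# Hypothetical calibration centroids: one EEG feature vector per imagined
# command, averaged from a training session. Real systems use many more
# feature dimensions; three are used here only to keep the sketch readable.
centroids = {
    "left":    np.array([1.0, 0.0, 0.0]),
    "right":   np.array([0.0, 1.0, 0.0]),
    "forward": np.array([0.0, 0.0, 1.0]),
}

def classify(features):
    """Nearest-centroid decoder: return the command whose calibration
    centroid is closest (Euclidean distance) to the incoming features."""
    return min(centroids, key=lambda c: np.linalg.norm(features - centroids[c]))

# A noisy sample near the "forward" centroid still decodes correctly.
rng = np.random.default_rng(0)
sample = centroids["forward"] + rng.normal(0.0, 0.1, size=3)
print(classify(sample))  # forward
```

The point of such a sketch is that the classification itself is cheap; the engineering challenge Toyota is claiming to have solved is doing the signal extraction and decoding quickly enough (125 ms) to feel instantaneous.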

While this is an incremental advance at best, it does sound like a legitimate one. More interesting is that we now have two large corporations investing in research into mental control of machines. They join the research groups at Duke University and in New York that have also made headlines with their research.

The University of Pittsburgh has a suite of labs working on the various aspects of robotic mental control. At the Motorlab their research is focused on correlating patterns of cortical activation with motor control in three-dimensional space. At the NTE lab, they are researching the interface between neural tissue and “smart biomaterials,” as well as neural tissue engineering. So they are working on the interface between brain and machine.
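To give a flavor of what "correlating cortical activation with motor control in three-dimensional space" can mean in practice, here is a simplified sketch of the classic population-vector idea (not necessarily the Motorlab's actual method): each recorded neuron fires most strongly for movements toward its own "preferred direction," and summing the preferred directions weighted by firing rate recovers an estimate of the intended movement. The tuning parameters and neuron count below are invented.

```python
import numpy as np

# Simulate a population of neurons, each with a random preferred
# direction (a unit vector in 3-D space).
rng = np.random.default_rng(1)
n_neurons = 200
preferred = rng.normal(size=(n_neurons, 3))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

def firing_rates(direction):
    """Cosine tuning: each neuron's rate rises with the cosine of the
    angle between the movement and its preferred direction."""
    return 10.0 + 8.0 * preferred @ direction  # baseline + gain * cos

def decode(rates):
    """Population vector: rate-weighted sum of preferred directions,
    normalized to a unit vector estimating the intended direction."""
    weighted = (rates - rates.mean())[:, None] * preferred
    estimate = weighted.sum(axis=0)
    return estimate / np.linalg.norm(estimate)

intended = np.array([0.0, 0.0, 1.0])  # imagine reaching straight "up"
estimate = decode(firing_rates(intended))
print(np.round(estimate, 2))  # close to [0, 0, 1]
```

With a few hundred tuned neurons the decoded direction tracks the intended one closely, which is why this family of decoders has worked well for driving robotic arms from motor-cortex recordings.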

At the Rehab Neural Engineering Lab they are working on decoding the sensory feedback sent to the brain from limbs during movement. This research could close the loop – allowing users to feel their prosthetic limb, which would greatly enhance control.

Although not mentioned on their site, there is also straight neuroscience research discovering the neural activity that correlates with the brain’s sense of ownership over a body part – the sense that a body part belongs to you. So it is very plausible that in the future a brain-machine interface would allow a user not only to control a robotic limb, but also to feel it, receive information from the limb regarding its position and the tension on its “muscles,” and feel as if it is part of their body. This is Six Million Dollar Man territory.

This is all very exciting. As the Neuro Rehab website says:

There is a rapidly growing brain-computer-interface (BCI) community working to develop methods of extracting and processing neural signals for directly controlling devices such as computers or prosthetic systems.

Now we also have large corporations like Honda and Toyota in the game – which is a sign that a technology is poised to move out of the lab and into the market. These companies say they have no plans to market any particular technology; they are just doing technology research. This is clear from the crude devices they are demonstrating. But it means that we are moving beyond basic university-level research and into the development of technological applications.

We must also remind ourselves that while this technology is steadily progressing, and has already made interesting advances, it is probably years away from the simplest applications and decades away from any mature applications, like fully functional neuro-prosthetic limbs.

That is the one downside to tracking new and exciting technology in its early phase – you have to wait years to decades before the promised applications emerge. It’s like seeing an awesome movie preview a decade before the release date.