Photo: Duke Center for Neuroengineering

In clinical trials last year, scientists showed that a human can control a single prosthetic arm using thought alone. But many daily tasks, from typing to washing the dishes, require two hands.

Researchers led by Miguel Nicolelis at Duke University Medical Center have created a system that allows monkeys to simultaneously control both arms of a monkey avatar on a computer screen using only their minds. The team reports the work today in the journal Science Translational Medicine.

"Nicolelis has taken the first step to bimanual, while others have attempted a single-limb control," says Nitish Thakor, professor of biomedical engineering at Johns Hopkins University and director of the Singapore Institute for Neurotechnology, who was not involved in the work. "This is a big step, as two hemispheres of brain are involved in their work."

In the human body, signals from the brain direct the motion of our limbs, along with everything else we do. Brain–machine interfaces use a computer algorithm to translate brain signals into the motion of an external object, which could be a cursor on a computer screen, a prosthetic limb, or in this case the arms of a computerized avatar. Nicolelis and his fellow researchers implanted electrodes to record data from about 500 neurons in the brains of two monkeys, called Monkey C and Monkey M, and trained the animals to control the movement of an onscreen avatar with their thoughts.
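
For readers curious about the mechanics, here is a minimal sketch of that translation step in Python. The linear model, array sizes, and variable names are illustrative assumptions; the study's actual decoder is more sophisticated.

```python
import numpy as np

# Illustrative sketch of a brain-machine interface's decode step: a linear
# map from one time bin of spike counts to movement commands for both arms.
# The weights W would be learned during calibration (see the next sketch);
# the random values here are placeholders.

N_NEURONS = 500   # roughly the number of units recorded in the study
N_OUTPUTS = 4     # x/y velocity for the left arm and the right arm

rng = np.random.default_rng(0)
W = 0.01 * rng.normal(size=(N_OUTPUTS, N_NEURONS))  # placeholder decoder weights

def decode(spike_counts: np.ndarray) -> np.ndarray:
    """Translate one time bin of spike counts into avatar arm velocities."""
    return W @ spike_counts  # [vx_left, vy_left, vx_right, vy_right]

# Every time bin (say, 100 ms), read the latest spike counts and move the arms.
spike_counts = rng.poisson(lam=3.0, size=N_NEURONS)
velocities = decode(spike_counts)
print(velocities)
```

In a real system this loop runs continuously, with each new bin of neural activity nudging the avatar's arms.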

How, exactly, do you teach a monkey to do this? The first step is to have the monkeys think about the desired action—in this case, making the avatar's hands grab two objects on the screen—and determine which neurons are firing and in what patterns when the monkeys think about that action. At first, Monkey C was allowed to control the avatar using joysticks, with one joystick for each arm. The computer recorded the neuronal firing patterns as they related to the motion of the avatar's arms onscreen. To give the monkey a goal to work toward, the animal was rewarded with juice when it made the two arms touch a pair of white circles on the screen simultaneously.
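
That recording-and-correlating step can be pictured as a regression problem: pair each time bin of spike counts with the avatar velocities the joysticks produced, then fit decoder weights to predict one from the other. The least-squares fit below is a common calibration recipe and a hedged stand-in, not necessarily the paper's exact method.

```python
import numpy as np

# Illustrative calibration from the joystick phase: fit decoder weights by
# least squares so that spike counts predict the joystick-driven velocities.
# The toy data below stands in for a real recording session.

def fit_decoder(spikes: np.ndarray, kinematics: np.ndarray) -> np.ndarray:
    """
    spikes:     (n_bins, n_neurons) spike counts logged during joystick use
    kinematics: (n_bins, n_outputs) avatar velocities driven by the joysticks
    Returns decoder weights of shape (n_outputs, n_neurons).
    """
    # Solve kinematics ~= spikes @ W.T in the least-squares sense.
    W_T, *_ = np.linalg.lstsq(spikes, kinematics, rcond=None)
    return W_T.T

rng = np.random.default_rng(1)
spikes = rng.poisson(lam=3.0, size=(5000, 500)).astype(float)  # toy neural data
kinematics = rng.normal(size=(5000, 4))                        # toy joystick output
W = fit_decoder(spikes, kinematics)                            # (4, 500) weights
```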

In the next phase of training, Monkey C was again encouraged to make the avatar touch the targets and was again allowed to move the joysticks. This time, though, the joysticks weren't controlling the avatar at all: The monkey's neurons, as decoded by the algorithm, were moving the objects on screen.

In phase three, the monkey's arms were lightly restrained during the same task so that it would learn that its thoughts, not the motion of the joystick, were controlling the avatar.

Of course, a paraplegic person couldn't move a joystick to teach the computer how to interpret his or her thoughts. To simulate how such a system could work for humans, the researchers restrained the arms of the other monkey, Monkey M, throughout the experiment. During the first step, instead of moving a joystick, the monkey simply watched the avatar moving on the screen, and the computer correlated the firing patterns of its neurons with the motion of the avatar. In the next step, Monkey M began controlling the avatar with its brain.
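
The only thing that changes in this hands-off variant, under the same hedged assumptions as the sketches above, is where the regression targets come from: the motion of the computer-driven avatar the monkey is watching, rather than joystick output.

```python
import numpy as np

# Illustrative observation-based calibration: the monkey's arms are restrained,
# so the regression targets are the on-screen motion of the computer-driven
# avatar rather than joystick output. The fit itself is the same least-squares
# step as in the joystick-based sketch; the data here are again stand-ins.

rng = np.random.default_rng(2)
spikes_while_watching = rng.poisson(lam=3.0, size=(5000, 500)).astype(float)
displayed_velocities = rng.normal(size=(5000, 4))  # motion of the watched avatar

W_T, *_ = np.linalg.lstsq(spikes_while_watching, displayed_velocities, rcond=None)
W_passive = W_T.T  # (4 outputs) x (500 neurons) decoder weights
```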

By the end of their training, the monkeys were able to touch their targets more than 60 percent of the time without moving their own arms.
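
Scoring a session like this is straightforward: a trial counts as a success only if both hands are on their targets at the same moment. The outcomes below are made up for illustration.

```python
import numpy as np

# Illustrative session scoring: success requires both avatar hands to be on
# their targets simultaneously; accuracy is the fraction of successful trials.

left_on_target  = np.array([1, 1, 0, 1, 1, 0, 1, 1, 1, 0], dtype=bool)
right_on_target = np.array([1, 0, 0, 1, 1, 1, 1, 1, 0, 1], dtype=bool)

success = left_on_target & right_on_target  # simultaneous touch required
print(f"session accuracy: {success.mean():.0%}")  # 50% for this toy session
```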

Experiments allowing human subjects to control single prosthetic arms via brain–machine interfaces have already succeeded. In December 2012, a paralyzed woman fed herself chocolate using a prosthetic arm. In May of that year, another paralyzed patient brought a bottle of coffee to her lips using a brain–machine interface system called BrainGate2.

This new research differs from those efforts in more ways than the addition of a second arm. The earlier studies, which were clinical trials, used a 96-electrode array smaller than a penny to measure neuron firing in a defined area of the motor cortex. The new experiment recorded nearly 500 neurons across the cortex, in both hemispheres of the brain, which was needed to capture the brain signals that control two-handed movements: moving both arms is more complicated than the sum of the signals for moving each arm individually.

The success impressed Cynthia Chestek, assistant professor of biomedical engineering at the University of Michigan, who was not involved in the work. "The Nicolelis lab does amazing surgeries and are indeed setting the record for number of wires implanted in a primate brain." She noted that the monkeys were tested up to three years after the electrodes were implanted, which shows that the electrodes can last long-term. "Altogether this is building up a consensus that these arrays already last for clinically relevant time periods. People usually list that as the biggest barrier to clinical translation."

Chestek was not overly impressed by the decoder's accuracy. Referring to a video produced by 60 Minutes about work from the University of Pittsburgh, she commented: "This level of control would not have been good enough, for example, to do the prosthetic arm experiment in the 60 Minutes piece."

But Nicolelis says that translating this system from an onscreen avatar to a prosthetic device would be "totally smooth, because what we are doing is controlling the movements, the kinematics of the avatar, and that's exactly what the subject would do with our robotic exoskeleton." Thakor agrees that the translation from avatar to prosthesis is "feasible," but also requires "a much more cautious approach, not just to overcome the technical barriers but also to show safety and efficacy, and long-term effectiveness."

The next step, Nicolelis says, is to apply the results of this work to the Walk Again Project at Duke, an endeavor that aims to enable disabled patients to regain mobility through wearable robots, or exoskeletons, controlled through brain–machine interfaces.
