Last month, DARPA announced the latest breakthrough in brain-controlled artificial limbs: a robotic arm that a user can actually feel. It works via neural implants that connect to a computer through an interface on top of the user's skull, which sends signals from there to the arm telling it how to move.

This probably isn't the first time you've heard of such a thing. Over the past several years, teams around the world have devised brain-machine interfaces that make the seemingly impossible a reality. Back in 2012, the program that funded this breakthrough, Revolutionizing Prosthetics, made news when a quadriplegic with neural implants manipulated a robotic arm just by thinking about it, and an amputee felt the relative resilience of various objects with the help of electrical stimulation of his peripheral nerves. Last year, a paraplegic kicked off the FIFA World Cup in São Paulo with the help of an exoskeleton controlled by a computer reading signals from an electrode-studded cap. Even DIYers are getting in on the brain-controlled robot action through the open-source brain-computer interface (OpenBCI) project, which launched last year via a Kickstarter campaign.

With these spectacular successes, it seems only a matter of time before brain-controlled artificial limbs that move like natural ones and feed sensations to the brain are as commonplace as today's unconnected prosthetics. So how far are we, really, from a world of Skywalker-esque robot hands?


There are two major challenges for the commercialization of these sci-fi prosthetics. The first, unsurprisingly, is the cost. Michael McLoughlin is head of the DARPA effort at Johns Hopkins University's Applied Physics Lab (APL), where the work on Revolutionizing Prosthetics has been under way since 2006. McLoughlin says that engineers are well on their way to solving the technical challenges of building such a thing. Instead, he singles out cost as the main barrier to commercialization.

"Right now we make these arms one at a time, which makes them very expensive," he says. No kidding: Each custom-built prototype costs about $400,000 to build.

One path to mass production and a lower cost per unit, he says, would be to remove some of the motors and sensors that are in the prototypes. This would render the production arms less dexterous and less sensitive, he says, but it could drop the price per arm down to $100,000 with production runs of a few hundred prosthetics—a lot closer to feasibility. True, that means that each arm is still about the price of a fully loaded Tesla Model S. But what's true for cars could be just as true for amazing brain-controlled mechanical arms: Once the technology is proven in luxury models, more plentiful, affordable artificial limbs could follow.

"It's not plug and play like an iPod or something."

The other issue is simply the neuroscience, says NYU neuroscientist and brain-machine interface researcher Robert Froemke, who is not affiliated with DARPA. "We don't understand how the brain generates movement signals. We understand that a little bit, particularly in work from animal models, but there's limited data in humans." In other words, more testing needs to be done with brave research volunteers like those participating in Revolutionizing Prosthetics.

For now, it's up to sophisticated machine learning algorithms. Working with human test subjects putting in hours of practice, the algorithms learn to correctly interpret a user's intentions for basic tasks like picking up an object based on the limited information picked up by a few electrodes. More complicated tasks require better systems. Froemke gives the example of drinking coffee. When you do this with your natural limbs, you don't think about how complex the motion is. But a brain-controlled arm must correctly understand the intention and execute a series of moves: pick up the cup, bring it to the user's lips, tilt it toward the mouth without spilling it. To make it happen, you need higher-bandwidth communications between brain and interface. "How many channels of information can you have?" asks Froemke. The more the better, obviously, but that adds complexity to surgeries and requires more processing power. "It's a really, really hard problem."
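To give a rough sense of what "learning to interpret a user's intentions" means in practice, here is a minimal, purely illustrative sketch of the general idea: a linear decoder fit to noisy multichannel recordings, mapping them back to an intended movement. This is not the algorithm used by Revolutionizing Prosthetics (real decoders are far more sophisticated); the channel count, noise level, and ridge-regression approach are all assumptions chosen to keep the example small.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 16 electrode channels recorded over 1,000 time steps.
# Each channel is assumed to mix the user's intended 2-D hand velocity
# (vx, vy) with measurement noise.
n_channels, n_samples = 16, 1000
true_velocity = rng.standard_normal((n_samples, 2))   # intended (vx, vy)
mixing = rng.standard_normal((2, n_channels))         # unknown neural "encoding"
signals = true_velocity @ mixing + 0.5 * rng.standard_normal((n_samples, n_channels))

# Calibration phase (the "hours of practice"): fit a linear decoder by
# ridge regression on recorded practice data where the intent is known.
lam = 1.0  # regularization keeps the fit stable despite noisy channels
W = np.linalg.solve(signals.T @ signals + lam * np.eye(n_channels),
                    signals.T @ true_velocity)

# Decoding phase: translate neural activity into a movement command.
decoded = signals @ W
corr = np.corrcoef(decoded[:, 0], true_velocity[:, 0])[0, 1]
print(f"decoded-vs-intended correlation: {corr:.2f}")
```

The toy model also illustrates Froemke's bandwidth point: with only a handful of channels, the decoder has far less information to work with, and the recovered intent degrades accordingly.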

The payoff to solving this problem will be artificial limbs that approach, if not match, the agility of natural ones. More channels, for example, will be able to drive more sensors, which in turn could enable proprioception—the ability to sense the position of a limb in space without looking at it. "That's a very important part of our ability to use our limbs," says McLoughlin. That's still out of reach of artificial limbs, but it's a problem McLoughlin and his team think they can solve.


If you imagine a world with abundant brain-controlled prosthetics, McLoughlin says, the first to see the light of day outside the lab could be in-home devices that wouldn't need to be fully portable to be useful. For example, a quadriplegic might use a house-bound arm to feed herself without having to rely on a human helper.

Before these amazing limbs go mobile, there's a lot of work to be done on miniaturizing the brain-machine interface. That's because the arm itself must be self-contained. All the motors, sensors, and processors required for movement and sensation can fit within the shape and nine-pound weight of a typical adult male arm. A lithium ion battery pack can keep the arm going for up to four hours between charges, depending on usage. The problem is the computing power. The interface needs an external processor to interpret signals coming from the brain through a pair of terminals on top of the skull, and to translate them into instructions for the arm itself.

To have a futuristic arm that's truly self-contained and portable, you'd need to miniaturize the computers required, McLoughlin says. Same goes for a wireless link from the wearer's brain to the arm, so they don't have to have a bunch of wires running between the two.

And then there's the software. "Algorithms for machine learning these days are amazing," Froemke says. "You can take signals that are very noisy that are generated by the brain, and with a lot of practice with specific users, get some functionality out of them." But, he says, "I think for the foreseeable future, a lot of this is going to be very customized. By that I mean the person really has to work with the system and with the algorithms. Every prosthetic is probably going to be a special case, adapted for that patient, perhaps over a matter of weeks to months of training. It's not plug-and-play like an iPod or something."


In other words, both the patient and the software for that patient's artificial limb will have to adapt to each other over time. Even a more complete understanding of how the brain generates motor signals won't be enough to make an artificial arm or leg that you can just buy off the shelf, Froemke says.

"Those arms and legs are still being driven by different kinds of things than the spinal cord is driven by," he says, "and so, simply understanding the neural code is only going to get us so far." Once the basic software gets good enough, through more human trials, brain-machine training will have to be added to the physical therapy regimen that every amputee goes through.

These might sound like monumental hurdles, but the team at APL at Johns Hopkins is undeterred. The scientists have already started talks with potential partners for commercialization. The effort is aided by the arm's ability to function with different kinds of interfaces besides a direct neural link, opening more possibilities for use without the admittedly drastic step of brain surgery. An amputee outfitted with a different type of interface, for example, could move the arm via signals picked up from residual arm nerves. Or a quadriplegic could control the arm with eye movements.

It will be up to future commercial partners to find the right balance between extreme tech and affordability. "We're an academic organization," says McLoughlin. "We're not going to go off and manufacture it, so we're hoping to find people who are interested in it."

Michael Belfiore is the author of The Department of Mad Scientists: How DARPA Is Remaking Our World, from the Internet to Artificial Limbs (Harper, 2010). Find him online at michaelbelfiore.com.
