Burkhart says that having the ability to move objects was “fantastic,” but he was limited without a sense of touch. Without this feedback, grabbing objects required his full attention. Unless he was looking at an object, he couldn’t tell whether he was holding it. “That’s really challenging, especially if I want to grab something that’s behind me or in a bag,” Burkhart says. Even when he could see the object, the firmness of his grip was out of his control, which made handling delicate objects difficult.

Adding a sense of touch to the system proved more difficult. Neuroscientists have successfully reproduced the sensation of touch in quadriplegic people by relaying data from sensors in a robotic prosthetic hand to a chip in the user’s brain. The problem was that Burkhart’s BCI wasn’t designed for that kind of input. It wasn’t even located in the right place. Touch is registered in the somatosensory cortex, which sits behind the motor cortex, where the chip was installed. But Ganzer says the somatosensory cortex can be a “noisy neighbor,” and some of its signals were picked up by the chip. It was just a matter of finding out what they meant.

To tease out the unique signals corresponding to touch, Ganzer and his colleagues applied targeted stimulation to Burkhart’s thumb and forearm, parts of his limb where he still had a very weak sense of touch. By observing how Burkhart’s brain signals changed when pressure was applied to his fingers and hand, they were able to identify the weak touch signals against a background of much stronger movement signals. This meant a computer program could split the signals coming from Burkhart’s BCI, so that motion signals went to the electrodes around his forearm and touch signals went to an armband on his upper bicep.
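The routing step described above can be sketched as a simple split of the decoded signals. This is an illustrative toy, not Battelle’s actual software: the function name, the gain, and the threshold are all hypothetical, and it assumes the decoder has already separated the strong motion features from the weak touch features.

```python
# Hypothetical sketch of the signal-routing step: decoded BCI features are
# split into a motion channel (driving the forearm stimulation electrodes)
# and a touch channel (driving the upper-arm vibration band).
# All names and constants here are illustrative, not from the actual system.

TOUCH_GAIN = 10.0       # boost the weak somatosensory signal
TOUCH_THRESHOLD = 0.05  # ignore sub-threshold noise

def route_signals(decoded):
    """Split decoded features into per-device commands.

    decoded: dict with 'motion' and 'touch' feature magnitudes (floats).
    Returns a dict mapping each output device to a command level in [0, 1].
    """
    commands = {"forearm_electrodes": 0.0, "touch_band": 0.0}
    # Movement signals are strong; pass them straight to the stimulation sleeve.
    commands["forearm_electrodes"] = decoded["motion"]
    # Touch signals are weak; amplify them, and gate out faint background noise.
    if decoded["touch"] > TOUCH_THRESHOLD:
        commands["touch_band"] = min(1.0, decoded["touch"] * TOUCH_GAIN)
    return commands
```

The key design point the article describes is captured by the gain and threshold: the touch signal is roughly an order of magnitude weaker than the movement signal, so it must be boosted and separated from background activity before it can drive a device.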


Burkhart’s upper arm was also one of the few parts of his body that still had sensation after the accident. This meant that the weak pressure signals relayed from his hand to his brain could be converted into vibrations that would let him know he was touching an object. During tests with the armband, Burkhart could tell when he was touching an object with nearly perfect accuracy, even if he couldn’t see it.

At first, the Battelle touch band was a simple on-off vibration device. But Ganzer and his colleagues refined it so that its vibration varies with how firmly Burkhart grips an object. It’s similar to how videogame controllers and cell phones provide feedback to users, but Burkhart says it took some getting used to: “It’s definitely strange. It’s still not normal, but it’s definitely much better than not having any sensory information going back to my body.”
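The difference between the two generations of the band can be illustrated with a pair of toy mappings from grip pressure to vibration amplitude. These functions and their parameters are assumptions for the sake of illustration, not the device’s actual firmware.

```python
# Illustrative contrast between the first and refined touch bands.
# Pressure and vibration amplitude are both normalized to [0, 1];
# the threshold and max_pressure values are hypothetical.

def binary_feedback(pressure, threshold=0.1):
    """First-generation band: vibrate at full strength on any contact."""
    return 1.0 if pressure > threshold else 0.0

def graded_feedback(pressure, max_pressure=1.0):
    """Refined band: vibration amplitude scales with grip force."""
    return max(0.0, min(pressure / max_pressure, 1.0))
```

With the binary version, a light touch and a crushing grip feel identical; the graded version lets a user modulate grip force on delicate objects by feel, which is the refinement the article describes.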

Robert Gaunt, a biomedical engineer at the University of Pittsburgh’s Rehab Neural Engineering Labs, contrasted Battelle’s system to the approach being developed in his own lab, where a BCI controls a robotic limb and sensors on that limb return signals that stimulate the brain to artificially recreate a sense of touch in a person’s hand. “What they’re doing is a little more like sensory substitution, rather than restoring touch to his own hand,” Gaunt says. “We all have the goal of developing devices that improve the lives of people with spinal cord injuries, but the most effective way to do that is totally unclear at this point.”

Now that Ganzer and his colleagues have demonstrated the technology in the lab, he says the next step is to improve the system for everyday use. The team has already shrunk the system’s electronics to a box the size of a VHS tape that can be mounted on Burkhart’s wheelchair. The bulky array of electrodes has also been reduced to a sleeve that is relatively easy to put on and take off. Recently, Burkhart used the system for the first time at home, controlling it through a tablet.