There have been plenty of other efforts to create extra limbs that you can wear, and in fact this isn’t Saraiji’s first time making robotic limbs meant to attach to a human: he and most of the other Fusion researchers previously built a wearable set of arms and hands called MetaLimbs that a wearer controlled with their feet.

Having the limbs controlled by someone else—someone who can be in another room or another country, and in VR to boot—is a little different, however. Saraiji says he wanted to see what would happen if someone else could, in a sense, dive into your body and take control.

The backpack includes a PC that streams data wirelessly between the robotic arm-wearer and the person controlling the limbs in VR. The PC also connects to a microcontroller, telling it how to position the robotic arms and hands and how much torque to apply at each joint.
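The article doesn't describe the wire protocol, but the data flow it outlines (operator's pose streamed to the backpack PC, which forwards joint targets and torque limits to a microcontroller) can be sketched roughly like this. The frame layout, field names, and two-arms-of-seven-joints packing below are illustrative assumptions, not the Fusion system's actual format:

```python
import struct

# Hypothetical wire format for one control frame: 7 joint angles (radians)
# and 7 torque limits per arm, for two arms. Layout is illustrative only.
JOINTS_PER_ARM = 7
ARMS = 2
FRAME_FORMAT = "<" + "f" * (JOINTS_PER_ARM * ARMS * 2)  # little-endian float32s

def pack_frame(angles, torques):
    """Flatten per-arm joint angles and torque limits into one payload."""
    flat = [a for arm in angles for a in arm] + [t for arm in torques for t in arm]
    return struct.pack(FRAME_FORMAT, *flat)

def unpack_frame(payload):
    """Inverse of pack_frame: recover (angles, torques) on the receiving end."""
    flat = struct.unpack(FRAME_FORMAT, payload)
    n = JOINTS_PER_ARM * ARMS
    angles = [list(flat[i * JOINTS_PER_ARM:(i + 1) * JOINTS_PER_ARM])
              for i in range(ARMS)]
    torques = [list(flat[n + i * JOINTS_PER_ARM:n + (i + 1) * JOINTS_PER_ARM])
               for i in range(ARMS)]
    return angles, torques
```

In a real system a frame like this would be sent over UDP or a serial link many times per second, with the microcontroller applying the targets in its servo loop.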

The robotic arms, each with seven joints, jut out of the backpack, along with a connected head, of sorts. The head has two cameras that show the remote operator, in VR, a live feed of everything the backpack-wearer is seeing. When the operator moves their head in VR, sensors track that motion and cause the robotic head to move in response (it can turn left or right, tilt up and down, and pivot from side to side, Saraiji says).
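The head-tracking behavior described above amounts to mapping the operator's tracked yaw (turn), pitch (tilt), and roll (pivot) onto three actuated axes, clamped to whatever range of motion the robotic head has. A minimal sketch, with made-up joint limits (the article gives none):

```python
import math

# Placeholder range-of-motion limits for the robotic head's three axes.
# The article only says the head can turn, tilt, and pivot; these numbers
# are assumptions for illustration.
HEAD_LIMITS = {
    "yaw": math.radians(90),    # turn left/right
    "pitch": math.radians(45),  # tilt up/down
    "roll": math.radians(30),   # pivot side to side
}

def head_servo_targets(yaw, pitch, roll):
    """Clamp each tracked headset axis to its servo's range of motion."""
    def clamp(value, limit):
        return max(-limit, min(limit, value))
    return {
        "yaw": clamp(yaw, HEAD_LIMITS["yaw"]),
        "pitch": clamp(pitch, HEAD_LIMITS["pitch"]),
        "roll": clamp(roll, HEAD_LIMITS["roll"]),
    }
```

So if the operator cranes their neck past what the hardware allows, the robotic head simply holds at its limit rather than trying to follow.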

The wearable system is powered by a battery that lasts about an hour and a half. It’s pretty heavy, weighing in at nearly 21 pounds (about 9.5 kilograms).

“Of course, it’s still a prototype,” Saraiji points out.

While I’m talking to him, Saraiji puts on the backpack and enlists a graduate student to wear the VR headset and help demonstrate how it works. I call out a few commands, such as asking the robot-limb operator to pick something up. At first, he fumbles with a squeaky yellow toy with cartoon eyes, then manages to grab it and hand it to Saraiji; one of the robot hands then takes the toy back and passes it over once more. At one point, Saraiji walks behind the student operating the arms in VR, so the operator can tap himself on the shoulder with one of the robot’s fingers and give himself an abbreviated neck rub.

Different buttons on the Oculus Rift controllers enable different finger functions: the operator can move the pinky, ring, and middle finger of each robotic hand simultaneously with a single button, while the thumb and index finger each have their own controls.
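The finger grouping described above, with one button driving three fingers together and the thumb and index finger on their own controls, can be sketched as a simple lookup table. The button names below follow common Oculus Touch conventions (grip, trigger, thumbstick press); the actual bindings Fusion uses are an assumption:

```python
# Hypothetical mapping from controller buttons to the robotic fingers
# they close. One button moves the middle, ring, and pinky together,
# per the grouping the article describes.
FINGER_GROUPS = {
    "grip": ("middle", "ring", "pinky"),
    "trigger": ("index",),
    "thumbstick_press": ("thumb",),
}

def finger_commands(pressed_buttons):
    """Return the set of robotic fingers to close for the pressed buttons."""
    fingers = set()
    for button in pressed_buttons:
        fingers.update(FINGER_GROUPS.get(button, ()))
    return fingers
```

Grouping the three outer fingers onto one button trades dexterity for simplicity: most grasps close those fingers together anyway, and it keeps the control scheme within what two handheld controllers can express.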

Hermano Igo Krebs, a principal research scientist at MIT who has spent decades studying rehabilitation robotics, doesn’t think the project would be practical for rehab. But he can imagine it being helpful in a lot of different situations—to assist an astronaut in outer space, for instance, or a paramedic with an unfamiliar medical procedure.

Saraiji says that he’d like to turn the project into an actual product, and he and his collaborators are in the process of pitching it to a Tokyo-based startup accelerator.