Not Like Us is Aviva Rutkin's monthly column exploring the minds of intelligent machines – and how we live with them

“What is it like to be a bat?” the philosopher Thomas Nagel wondered in 1974. You’d flap around, echolocating, eating bugs, hanging out upside-down in someone’s attic. But something essential about the experience was off limits to his imagination. “I am restricted to the resources of my own mind, and those resources are inadequate to the task.”

Nagel’s famous essay considered a sticky problem: what is the relationship between our body and our mind? How could we ever comprehend a state of being that isn’t our own? The question of what it’s like to be someone – or something – else has continued to tantalise.

Now, research into telerobotics may offer a weird and cool possibility – that of beginning to understand, if only a little, the experience of entities that are not at all like us.

Telerobotics typically promises a future where you can accomplish far more in a day than you ever did before. After all, what would it be like with a few extra bodies lying around? Not around your house, mind you, but all over the world – perhaps stronger, better-looking, enhanced with robo-capabilities. Maybe one is in Cairo in case you feel like going for a tour of the streets in the morning; one is in London so you can meet a friend for lunch; one is in San Francisco so you can attend a class in the afternoon. Or you might suit up in one every morning to do a job thousands of miles away: troubleshooting problems at faraway factories, for example, or checking in on distant patients.

It is in the service of this vision that most of the research is being done. “Our long-term vision is to have docking stations all over the world,” writes Mel Slater, a computer scientist at the University of Barcelona, Spain, and his colleagues. “Anyone could connect to whichever robot they wanted, and more substantially, as many robots as they wanted, and ‘teleport’ there instantaneously.” These second bodies – or third, or fourth, or fifth – would translate the movements of your “real” one, relaying local sights and sensations across the distance.

Embody a robot

That’s also the direction the hardware is going in. You can’t yet fully embody an existing robot – many of the devices look more like Skype on wheels – but the nascent technology has already opened up some unusual possibilities. It has allowed Edward Snowden to roam freely in the US while his human body is barred from its soil; permitted an enterprising Australian to wait in line for a new iPhone; and enabled a disability activist to meet Barack Obama at the White House.

A lot of research is being done on systems that let people control these other selves ever more dextrously from afar. In one recent experiment, three paralysed volunteers in Italy controlled the movements of a robot in Japan, sending commands across 10,000 kilometres via EEG. “When the robot was stationary the feeling of embodiment was low, but the moment I gave the first command or changed direction, there was this feeling of control and increased embodiment,” one of the participants told New Scientist.


There is even some work on giving people tactile feedback, using increasingly sophisticated techniques that range from haptics to nerve reinnervation.

But the most powerful tool in the arsenal may be the human brain. We started to figure that out thanks to a curious phenomenon called “the body ownership illusion” – and a famous experiment involving a rubber hand.

Nearly two decades ago, scientists invited 10 people into their lab. Each sat at a small table with a screen that hid their left arm from view. A rubber hand lay in front of them, in plain sight. On one side of the screen, the scientist stroked the rubber hand with a small paintbrush; on the other, they mimicked those exact movements on the subject’s hidden left hand. The subjects suddenly felt as if the rubber hand were their own, as if they could feel what it felt. “I found myself looking at the dummy hand thinking it was actually my own,” said one. (You can try this illusion for yourself with our guide here.)

Rubber hand illusion

The rubber hand illusion suggests that our brain is remarkably willing to accept, at least for a little while, an alien body as its own. Later studies showed that the experience activates parts of the brain involved in movement and sensation – and that even threatening the fake hand with a knife incites anxiety, as if your own hand were under threat.

“In your whole life, whenever you’ve looked down, you’ve seen your body. Whenever you’ve moved your arm, it’s your arm that moves,” says Slater. “In virtual reality or with a robot, the simplest hypothesis for the brain to adopt is, ‘OK, it’s my body.’ It doesn’t mean you believe it, but you have this strong illusion.”

The kind of body you have may change the way you see the world, says Slater. Virtual reality offers a much more adaptable way to put people into different bodies. In previous experiments, Slater and his colleagues have tried placing people into virtual bodies that don’t match their own. When adults switched to a child-sized body, for example, they started to overestimate the size of objects and identify more closely with childish attributes. In another experiment, a group of white people spent about 10 minutes in a virtual body with darker skin. Afterwards, their implicit bias against other races seemed to go down (though this conclusion still needs to be replicated). “The type of a body makes a difference to how people respond,” he says.

A few games have already started to play with this idea, pushing people to reconsider their notions of themselves: like VR game Girl Mirror Look, which lets people swap bodies with someone of the opposite gender for a little while.

Robothespian: for your cultural needs (Image: Matt Cardy/Getty)

In a new study, Slater’s team pushes the envelope a bit further, seeing if the brain will accept being fragmented across three different bodies. Forty-one people suited up to beam into not one but three remote bodies. In one room, elsewhere in the university, they controlled a life-sized Robothespian robot, giving a talk to a room of humans. In another, they became a Nao robot and chatted with someone nearby. And in a third, virtual destination, they were a human again, helping another virtual person perform an arm exercise. They switched between the three destinations, letting proxy software take over each body whenever they left it for another.

Overall, the participants seemed happy beaming between their three new bodies, and commented that they really did feel like they were in those locations, with the people who were there. “I felt transported,” said one.

There’s a long way to go before technology like this can quickly and easily relay movements and sensations between human and machine. And we have no idea what the effects would be of living this way for long stretches of time. There are boundaries, too, to what our brain will accept: the rubber hand illusion, for example, doesn’t work when you swap out the fake hand for a wooden block.

But experiments like Slater’s suggest that we might be okay with something as unfamiliar as a mechanical body, even multiple mechanical bodies of different shapes and sizes. If such technology ever becomes commonplace, it may be interesting to watch how that changes our relationships with our robot friends. Will it change us? Will it help us to empathise a little better? Will we be a little closer to understanding what it means to move through the world as a nonhuman entity? Bot or bat, the potential is exciting.

Journal reference: Frontiers in Robotics and AI, DOI: 10.3389/frobt.2016.00065