Ryerson University

In early August 2015 hitchBOT, a hitchhiking robot designed to roam the world relying on the kindness of strangers, was brutally vandalised. Social media platforms were filled with anger and dismay at the news, with people tweeting heartfelt missives such as: "hitchBOT is dead because people are awful".

HitchBOT's demise didn't come as a shock to everyone, though -- least of all Kate Darling, an expert in robot ethics at MIT Media Lab. "I honestly was a little bit surprised that it took this long for something this bad to happen to hitchBOT," she tells the audience at The Conference in Malmö.


Darling watched people's reactions to the news with fascination. "There was an outpouring of emotions and support," she says. "Of course people are upset -- this was an act of vandalism. No matter what the target is -- even if it's a car -- we don't like this behaviour."

The messages that caught her attention in particular were those that offered fierce apologies to hitchBOT. There were, she says, "all these informed adults sympathising directly with hitchBOT". The reason for this behaviour lies in our natural inclination towards anthropomorphism and the desire to emotionally relate to things.



Robots aren't anything new, but because they are starting to appear in new areas of our lives, our relationship with them and our ability to anthropomorphise them is evolving. "The new thing we're seeing about this with robots is that this effect seems to be more intense," explains Darling.


Physicality is a big factor in this, she adds. "Robots move and we're biologically programmed to respond to anything that moves." This is why people get attached to their Roomba vacuum robots. There are, of course, far more extreme examples, such as soldiers in the US military becoming so attached to the robots they work with that, when the machines can't be repaired, they hold funerals for them. "There are soldiers actually risking their lives to save the robots they're working with."

Military robots are primarily tools, but Darling points out that there is a whole new category of robots that are specifically designed to make us respond automatically and subconsciously using sounds and movements. "If you work in social robotics this is awesome," says Darling. Many of these robots have been designed specifically to assist with health conditions like autism and with education. "Who wouldn't want to learn languages from a fluffy dragon rather than an adult?"

But these robots are not just designed for children -- in fact many are intended for the elderly. The most famous example is PARO, the seal pup robot designed to alleviate stress in dementia patients. "People will treat certain robots more like an animal than a machine or a device," says Darling -- but there is a dark side to this too. "There are some questions of human autonomy and human dignity when you are deceiving people into thinking something is alive when it isn't." It's important that robots do not replace human care, but supplement it, she says.

Privacy will become increasingly important as we start to invite robots into our homes, she adds. "I do not see the companies working in this technology really caring enough about privacy and data security at the moment," says Darling.


Flickr CC: Pot au Feu

A video released last year by Boston Dynamics, now owned by Google, showed a person kicking one of its dog-like robots in order to demonstrate its self-stabilising ability. Animal rights charity PETA received so many complaints about the video that it was forced to issue a statement. Darling has undertaken many experiments in which she encourages people to form attachments with robots -- in particular cutesy toys like Pleo dinosaurs and Hexbugs -- and what frequently comes through is "people's natural tendency to empathy".

Our attitudes and ethical behaviours towards robots are just one element of Darling's study, however. The question she is tackling now focuses less on how our interactions with robots reflect our psychology and more on how robots can be used to effect change in humans. "Can we change people's empathy with robots -- might we be able to use robots to make people more empathic?" she asks. This, she says, is "at the core of what I view as ethics. I don't think robot ethics is about robots, it is about humans."