Not Like Us is Aviva Rutkin’s monthly column exploring the minds of intelligent machines – and how we live with them

Helpful or harmful? Artur Debat/Getty

Early in the Forbidden Research event, Ethan Zuckerman issued a warning. “If we make it through today without you feeling uncomfortable, then we’ve done something wrong,” he said.

Zuckerman was addressing a packed room at the Massachusetts Institute of Technology Media Lab, assembled for a day devoted to “restricted scientific and cultural topics”. Forbidden Research promised academic danger zones often considered too hot or too risky to touch: covert government surveillance, genetically engineered human beings, Islam and women’s rights, to name a few.

For my money, the most excruciating topic on the bill came in a panel just after lunch: “Sexual Deviance: Can Technology Protect Our Children?” Over the next hour, two researchers would discuss a hypothetical near future in which technologies like robotics and virtual reality collide with child pornography.


This isn’t as far off as you might think. A Japanese company named Trottla already ships child-sized sex dolls globally. Earlier this year, a Canadian man went on trial after being arrested in 2013 for ordering such a doll. Under Canadian law, the dolls are considered child pornography, and he could face up to seven years in jail.

But what if dolls like these could help rather than hurt? Ron Arkin, a robotics engineer at the Georgia Institute of Technology and one of the panellists, argued that people should not only be legally permitted to have such dolls, but that some should perhaps be handed prescriptions for them. In his opinion, VR and sex robots might function as an outlet for such urges, redirecting dark desires towards machines and away from real children. If it works, it could help past offenders reintegrate harmlessly into society, as well as prevent those who have never offended from doing so.

Tough to measure

At the moment, there is a hidden population of people who have these urges but are desperate not to act on them. At best, they are ignored by the legal and medical systems that could help them. At worst, mandated reporting laws mean that admitting those urges – even to a mental health professional – can trigger an official report and a cascade of social and legal consequences. Policies like these are why we don’t know how prevalent paedophilia is. Estimates suggest it occurs in 0.5 to 1 per cent of the population, but these figures are hard to confirm: because of the risks, people are reluctant to admit to such feelings, leaving only those whose actions have confirmed their diagnosis.

Patrice Renaud, a psychologist at the University of Montreal, Canada, has first-hand experience with the challenges of studying sexual urges, and the role technology plays. Every year, his team sees individuals referred to them by the court or specialised clinics and must assess whether they pose a danger to others. Subjects are hooked up to an eye-tracker, an EEG brain monitor and a device that measures blood flow to the genitals, and then exposed to a sexual stimulus. In the case of paedophiles, these stimuli used to be real photographs of children obtained during police raids. After that practice was banned in Canada, Renaud’s lab turned to audio recordings describing different sexual scenarios.

But both pose problems, says Renaud. Real photographs raise public concerns about the minors who are portrayed, along with problems for the scientific method: the haphazard collections aren’t standardised. Audio recordings, he says, aren’t sufficiently immersive to trigger a response.

So a few years ago, Renaud started to wonder whether virtual reality pornography could help determine someone’s sexual preferences in a more accurate and less morally problematic way than existing methods. In a series of experiments, Renaud and his colleagues exposed both non-deviant men and sexual offenders to computer-generated pornography and measured how their bodies responded. Individuals’ patterns of response matched their stated sexual preference, suggesting that VR can create a sense of what Renaud calls “sexual presence”.

Unknown effect

At the moment, Renaud’s lab focuses on how VR can help assess paedophiles, but he would also like to explore synthetic pornography for treatment. There’s plenty of precedent: VR is already being used to treat phobias, post-traumatic stress disorder and schizophrenia.

There aren’t a lot of other options. “Paedophilia is something that’s very difficult to treat,” he says. “You cannot change this sexual preference in itself as you can change a bad habit like smoking.” Those who do find their way to treatment can try cognitive behavioural therapy; in many countries, the more drastic option of chemical castration is available – or can even be imposed by law. Online, anonymous groups such as Virtuous Pedophiles have convened, offering support to people who don’t want to act on their desires, but who also don’t want to risk speaking to a therapist.

Perhaps, Renaud suggests, VR – coupled for example, with cognitive behavioural therapy – can help people learn to cope with and understand their desires. One project he’s working on will offer a walk through a computer-simulated park filled with “criminal opportunities”. He suggests another future therapy might combine virtual reality with neurofeedback to parts of the brain associated with empathy, to help paedophiles who have committed offences better grasp what their victims experience, in the hopes that this will prevent them from reoffending. In a controlled lab setting, a sex robot might help make the simulations seem even more realistic, adding touch and texture to these experiences.

Some researchers are cautiously optimistic about the idea. “It is possible that virtual child pornography content or other simulations such as child sex dolls or robots might be a safer outlet for at least some individuals who are sexually attracted to children,” says Michael C. Seto, director of the Forensic Research Unit at the Royal Ottawa Health Care Group in Canada.

But both he and Renaud pause at the notion of open access to child-sized sex robots – not when we don’t know what effect they might have on paedophiles. “I wouldn’t take any chance with that kind of use of robotics,” says Renaud. “Maybe some very intelligent and controlled individual could have such contact only with dolls, but for [others], I think that would only lead to the need to go further and to cross the line with real victims.”

Putting science before fear

In such a scenario, a bot would normalise deviant behaviour, putting children in even greater danger. “We just don’t know the answer,” said Kate Darling, a human-robot interaction researcher at MIT. “We have no idea what direction this goes in and we can’t research it.” Funding is scarce, and it isn’t easy to find a group of paedophiles willing to participate in research. Such a line of inquiry would also be likely to provoke objections from many corners – such as the Campaign Against Sex Robots, which argued in a paper last year that technological sexual substitutes haven’t been shown to reduce demand for prostitutes.

It’s difficult to do objective research on paedophilia, not least because of the moral and visceral revulsion it often provokes. But it may be time to wrestle with these fears. “It’s very important to understand this because we need to do more to prevent child sexual abuse and exploitation,” says Seto. And it’s only a matter of time before dolls like the ones sold by Trottla are souped up with artificial intelligence. How lifelike can they get? Will more realistic technologies help reduce the problem, or make it worse? We need to start figuring out what the impact will be. As Arkin told the panel, “The cost if we don’t explore it is intolerably high.”