Humans should be "very careful" about developing intimate relationships with robots, experts have warned.

The Human Choice and Computers Conference, being held on Wednesday, will examine the question of technology and intimacy, including whether humans could ever fall in love with a robot.

Dr Kathleen Richardson, a senior research fellow in the ethics of robotics at De Montfort University, told Sky News: "The biggest problem with it is the idea that human needs, complex needs, can be met by inanimate objects. By things, basically.

"One of the first impacts of something like sex robots would be to increase human isolation because once you try to tell people that they don't need other human beings any more, one of the consequences of that is more isolation."

Dr Richardson runs the Campaign Against Sex Robots.

She said: "I created the campaign because I want people to really think about how we develop our technologies ethically.

"And I do want to live in a world where we think about how we can develop robots - robots that can help us with the hard jobs, the hard toils we have as human beings.

"But machines, inanimate objects, can't do relations. You can't manufacture human intimate relations, and that's what we're all about."

Others disagree, arguing that machines may one day be capable of relationships.

Ghislaine Boddington, who researches how humans interact with robots at the University of Greenwich, told Sky News: "I think that will occur in terms that we're developing through AI empathy - the empathy side of robotics and other areas of technology.

"This is a side that's being pioneered at the moment.

"A synthetic emotion is when we're having some kind of relationship to an inanimate object, or a non-human human. It could be a robot, it could be an avatar. But also we need to think a lot more carefully about the future."

Professor Charles Ess, from the University of Oslo, told Sky News: "In maybe 20 years, it will not be uncommon to have a sex bot around.

"There could be any number of scenarios in which this would be perfectly fine - beneficial and therapeutic. I don't see a moral objection to it.

"There are good reasons for a middle ground while recognising there are serious concerns."