A future in which androids look and feel so much like humans that they start to believe they are actually alive - as depicted in the film Blade Runner - may soon be reality.

Scientists in Japan have invented a robot that can ‘feel’ pain and is programmed to visibly wince when an electric charge is applied to its synthetic skin.

The team from Osaka University is hoping that coding pain sensors into machines will help them develop empathy for human suffering, so they can act as more compassionate companions.

For lead researcher Prof Minoru Asada, who is also President of the Robotics Society of Japan, the question of whether robots could one day seem human is almost irrelevant.

“In Japan we believe all inanimate objects have a soul, so a metal robot is no different from a human in that respect; there are fewer boundaries between humans and objects,” he said.

In the 1982 film Blade Runner, which was based on the novel ‘Do Androids Dream of Electric Sheep?’ by Philip K. Dick, androids became so lifelike it was impossible to tell them apart from humans.

Asked if such a future was possible, Prof Asada said: “I think we are not far away from that technically, but obviously ethically that is another matter.

“We are embedding a touch and pain nervous system into the robot to make the robot feel pain so that it can understand the touch and pain in others. And if this is possible, we want to see if empathy and morality can emerge.

“We are aiming to construct a symbiotic society with artificially intelligent robots, and a robot that can feel pain is a key component of that society.

“Japan is a very high ageing society and many senior people are living alone, so these kinds of robots could provide physical and emotional assistance.”

The artificial pain system has been built into an eerily lifelike robotic child’s head called ‘Affetto’, which was unveiled by Osaka engineers in 2018.

By mapping 116 different points on the face, scientists are able to create nuanced expressions such as smiling and frowning, and now wincing, as seen above.

Affetto has soft tactile sensors which can detect both a gentle touch and a painful blow, and induce a range of facial expressions to demonstrate the level of discomfort.

Although a thump currently produces only a synthesised reaction, it is hoped that in future the robot will be able to understand that being hit is harming it, and experience a real kind of pain.

Dr Hisashi Ishihara, who helped design the robot, said such empathy was crucial if robots and humans were to live side by side.