Who should do the questioning?

WOULD a child open up to a robot? A team at Mississippi State University is suggesting using robots to question children in investigations of child abuse. But not everyone is convinced.

Children’s accounts are often vital evidence in cases of abuse. But even specially trained police interviewers can find it tough to stay neutral when talking to children. This can result in leading questions and unreliable evidence, because children can be highly suggestible and may say what they think someone wants to hear.

The stakes are high: poorly conducted interviews can lead to someone being convicted of a crime they didn’t commit, or a child being returned to an abusive environment.


Cindy Bethel and Zachary Henkel at Mississippi State University say robots could reduce bias and lead to more reliable outcomes.

Best-practice guidelines for police interviewers in child abuse cases include asking open-ended questions and maintaining neutral body language, facial expressions and vocal tone. Such procedures improve the quality of information obtained, but can be hard to follow. A 2014 report into child sexual abuse cases in the UK described police compliance with guidelines as “poor”.

“The techniques are not perfect, because humans are not perfect,” says Bethel. She and Henkel suggest that an interviewer could remotely control a robot that asks questions. That way, the interviewer can focus on asking the right questions, without worrying about their delivery. More advanced future robots might be able to conduct the whole conversation. “Robots will always follow the procedure, no matter the situation,” says Bethel.

“Interviewers find it difficult to talk to children who have been abused. Robots don’t”

Robots could also monitor a child in ways a human interviewer can’t, using sensors to track body movements that may signal the child is upset or uncomfortable.

And there is evidence that children will open up to a robot. In one study, children were as willing to share a secret with a robot as they were with a human interviewer. In another, children were more willing to share details about bullying with a robot.

This may not always be a good thing. “There is a risk of children being tricked into disclosing information that they do not wish to disclose,” says Henkel. Testimonies acquired through deception would be inadmissible as evidence, so it would be important for children to understand that their conversations with a robot will be shared with authorities.

One of the biggest hurdles could be if robots inadvertently encourage creative storytelling. “Interview rooms are normally very plain, because when they are not, people embellish their stories more often,” says Henkel. We don’t know if a robot could have the same effect. “Children might really want to continue talking with the robot, so could say things that aren’t true to continue doing so.”

Bethel and Henkel presented their work at the Conference on Human-Robot Interaction in Vienna, Austria, this month.

Michael Lamb at the University of Cambridge isn’t convinced that robots would be better than adults at interviewing children. His research focuses on getting high-quality information from children by creating a caring but non-suggestive relationship during interviews. “I am doubtful that this will be easily achieved [with robots],” he says.

But Marilena Kyriakidou at Coventry University, UK, who trains police interviewers in Cyprus, says robots could bring huge benefits, with more research. “Interviewers say that it’s difficult to talk face to face with children who have been abused. Robots won’t have that problem,” she says.

This article appeared in print under the headline “Robots could help police interview children”