When US troops return home from a tour of duty, each person finds their own way to resume daily life. But every one of them also completes a written survey called the Post-Deployment Health Assessment. It’s designed to evaluate service members’ psychiatric health and ferret out symptoms of conditions, like depression and post-traumatic stress, that are so common among veterans.

But the survey, designed to give the military insight into the mental health of its personnel, can wind up distorting it. Thing is, the PDHA isn’t anonymous, and the results go on service members’ records—which can deter them from opening up. Anonymous, paper-based surveys could help, but you can’t establish a good rapport with a series of yes/no exam questions. Veterans need somebody who can help. Somebody who can carry their secrets confidentially, and without judgment. Somebody they can trust.

Or, perhaps, something.

"People are very open to feeling connected to things that aren't people," says Gale Lucas, a psychologist at USC's Institute for Creative Technologies and first author of a new, Darpa-funded study that finds soldiers are more likely to divulge symptoms of PTSD to a virtual interviewer—an artificially intelligent avatar, rendered in 3-D on a television screen—than in existing post-deployment health surveys. The findings, which appear in the latest issue of the journal Frontiers in Robotics and AI, suggest that virtual interviewers could prove to be even better than human therapists at helping soldiers open up about their mental health.

“Most people would assume these things are in conflict with each other—that you can’t have anonymity and rapport at the same time,” Lucas says. But a virtual interviewer can offer both. A few years ago, Lucas and her colleagues paired hundreds of test subjects with Ellie, an embodied AI designed to engage test subjects in verbal interviews. Participants sat alone in a room with the virtual therapist, who appeared and communicated via a television screen. Ellie would begin with general questions like “Where are you from?” to build rapport; gradually proceed to more sensitive, clinical queries, like “How easy is it for you to get a good night’s sleep?”; and finish with mood-boosting questions, like “What are you most proud of?”

“With a virtual interviewer, you don’t have to ruin your career to begin seeking help,” says USC psychologist Gale Lucas.

But Ellie is no brainless bot. Unlike, say, Eliza, the 1960s computer program designed to respond to users with non-directive questions, Ellie uses machine vision to interpret test subjects’ verbal and facial cues and respond supportively. For example, Ellie not only knows how to perform sympathetic gestures, like nodding, smiling, or quietly uttering “mhm” when listening to a sensitive story—she knows when to perform them. Psychologists call these kinds of sounds and gestures backchannels. When interspersed appropriately throughout an interaction, they can help build rapport and elicit sharing.

Ellie's capacity for subtle and supportive engagement reveals fascinating things about humans, and how we choose to guard our secrets. Lucas and her colleagues told half their test subjects they’d be interacting anonymously with a virtual therapist. The other half were deceived into thinking there was a person pulling Ellie's strings. In the end, the participants who thought they were talking with the virtual therapist alone were significantly more likely to open up. For civilians, at least, just removing the idea of human presence led to more fruitful clinical sessions.

To see if Ellie could help soldiers reveal their PTSD symptoms, Lucas and her colleagues recruited soldiers recently returned from Afghanistan. As in the previous study, Ellie began each interview with rapport-building questions and ended with positive, mood-boosting ones. But this time, Ellie’s clinical questions were geared toward symptoms of PTSD, specifically. Questions like: