The war in Syria is horrifying.

According to the UN, over 3 million Syrian refugees are now in neighboring Turkey, Lebanon, Jordan and Iraq, with millions more displaced within Syria.

To help with this crisis, artificial intelligence startup X2AI is in the middle of a two-week stay in Beirut, Lebanon, where it's piloting the use of artificial intelligence as a psychotherapy treatment for refugees.

Partnering with Singularity University and the Field Innovation Team, X2AI is pitching the psychotherapy bot (named Karim) to aid workers and refugee communities.

X2AI founder and CTO Eugene Bann looks on as a student from the Jusoor school has a conversation with Karim in Arabic, his first interaction with an AI. X2AI

Karim helps therapists remotely monitor and care for patients, and can administer therapy itself.

X2AI founders Eugene Bann and Michiel Rauws tell Tech Insider that the goal is to help aid workers deliver psychological support to refugees, and to offer that same support to the aid workers themselves. Users don't have to download anything to interact with Karim; the bot is accessible by text or instant message.

Karim is a version of Tess, an artificial intelligence bot that Bann and Rauws have deployed with a healthcare chain in the Netherlands. Tess is currently in a two-month pilot with fewer than 100 patients, with the potential to scale up afterward.

The bot helps fill in the gaps when therapists aren't available.

"When Tess is there, she’s there 24/7," says Rauws, who's originally from the Netherlands. "When they’re really feeling bad, right at that moment they could discuss it with Tess, and record how they’re feeling."

Bann and Rauws have been working on Tess since 2014, when they met and realized they both had deep interest in building algorithms that understand emotions.

Bann and Rauws say that with Tess, a therapist could serve 100 clients in a day instead of 10. If a patient mentions suicide or self-harm, or says they want to speak with a person, there's an automatic handover from the AI to the patient's therapist or another therapist at the same clinic or hospital.
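X2AI hasn't published how its handover trigger works; as a rough illustration only, a keyword-based escalation check along the lines described above might look like this (the phrase list and function names are hypothetical):

```python
# Hypothetical sketch of an escalation check. X2AI's actual
# trigger logic and phrase list are not public.
ESCALATION_PHRASES = {
    "suicide",
    "kill myself",
    "hurt myself",
    "self harm",
    "talk to a person",
    "speak with a person",
}

def needs_human_handover(message: str) -> bool:
    """Return True if the message should be routed to a therapist."""
    text = message.lower()
    return any(phrase in text for phrase in ESCALATION_PHRASES)

def route(message: str) -> str:
    # In a real deployment this would page the patient's own
    # therapist, or another clinician at the same clinic.
    if needs_human_handover(message):
        return "handover_to_therapist"
    return "continue_with_bot"
```

A production system would need far more than substring matching (negation, misspellings, context), but the shape is the same: certain content bypasses the bot entirely and reaches a human.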

According to Bann and Rauws, Tess and Karim lead other conversational bots in "sentiment analysis," the way a bot assesses and responds to the emotional content of words. With help from the 10,000 emotional states and 5,000 medical terms categorized in its database, Tess generally knows what you're talking about (similar to how Google's AlphaGo knows the right moves to make in a board game).
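At its simplest, sentiment analysis of this kind maps cue words to emotion categories. The toy lexicon below is purely illustrative; the real system reportedly draws on roughly 10,000 categorized emotional states, and its actual labels and method are not public:

```python
# Toy lexicon-based emotion tagger, for illustration only.
EMOTION_LEXICON = {
    "depressed": "sadness",
    "hopeless": "sadness",
    "anxious": "fear",
    "worried": "fear",
    "angry": "anger",
    "grateful": "joy",
}

def detect_emotions(message: str) -> set:
    """Return the set of emotion labels whose cue words appear."""
    words = message.lower().split()
    return {EMOTION_LEXICON[w] for w in words if w in EMOTION_LEXICON}
```

The detected emotions would then steer which therapeutic response the bot selects, as in the depression exchange described below.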

In a demo shown to Tech Insider, a user told Tess that they were feeling depressed. Tess replied, saying that mental health is like physical health.

"We all get sick sometimes, in different ways and for different amounts of time," Tess said. "You can and will overcome depression, just like you can heal from a broken arm."

Tess then asked if the user had done anything about the depression yet, acknowledged that depression can make people feel hopeless, and suggested that "a moment of self compassion" could be a start, complete with a link to a five-minute exercise in doing so.

The bot (or a live therapist working through Tess) assigns homework like that, then automatically checks in later on whether the user has been practicing it.

Like a good friend (or counselor), Tess remembers you.

"Tess is not merely like the bots that you're seeing Google make," says Bann. With competitors, "you say something and you get a response that seems sensible and logical, but they're not holding a conversation with you."

If Tess asked you how you were doing and you said you were really looking forward to seeing a space rocket launch, Tess would remember that. Then, in a later conversation, if you said you were visiting NASA, Tess would recognize that as a good thing, since you said you loved rockets before.
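The rocket example above comes down to persisting a sentiment tied to a topic across sessions. A minimal sketch, with entirely hypothetical names and nothing resembling X2AI's actual architecture:

```python
# Illustrative per-user memory store, not X2AI's design.
from collections import defaultdict

# user -> {topic: sentiment} remembered across sessions
user_memory = defaultdict(dict)

def remember(user, topic, sentiment):
    """Store how a user feels about a topic."""
    user_memory[user][topic] = sentiment

def recall(user, topic):
    """Return the remembered sentiment, or None if unknown."""
    return user_memory[user].get(topic)

# Session 1: the user says they love rocket launches.
remember("user123", "rockets", "positive")

# Session 2: the user mentions visiting NASA; the bot links it
# to the remembered positive association with rockets.
if recall("user123", "rockets") == "positive":
    reply = "Visiting NASA sounds exciting, you mentioned you love rockets!"
```

A real system would persist this to a database rather than an in-memory dict, but the principle, carrying topic-level sentiment from one conversation into the next, is what distinguishes this from a stateless bot.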

A Syrian refugee camp in Lebanon. Getty

Bann and Rauws say Karim is Tess's "little brother," designed to be deployed in the Middle East.

The founders say that in Middle Eastern culture, it takes more time to earn the trust of users, so Karim isn't as direct with asking users about their problems and immediately delivering therapy. Instead, it first asks users more generally about their families. Instead of directly telling users to do an exercise — like the self-compassion meditation mentioned above — Karim will ask patients to imagine a friend doing the same.

One difficulty that came up during a demo of Karim in Lebanon: convincing patients it's a bot, not a human.