Racism, sexism, and homophobia from abusive players can ruin an online gaming session. So Intel and London-based firm Spirit AI are prototyping technology that uses artificial intelligence to flag toxic behavior on voice chat and live streams.

Spirit AI created Ally, a tool that gaming companies can use to detect harassment and abuse. It's currently limited to processing text-based chat, but Intel has been working with Spirit AI in recent months to allow Ally to process human speech, too.

"A lot of games these days have voice chat, and that's where a lot of the more hostile behavior occurs," said Kim Pallister, Intel's chief technology officer for virtual and augmented reality.

Ally goes beyond picking out swear words and other offensive terms in online gaming chats. It's been designed to process entire conversations and their context, including whether the players are talking to friends, members of their own gaming "guild," or strangers.
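To make that distinction concrete, here is a minimal sketch in Python of the difference between plain keyword filtering and the kind of context-aware scoring described above. All the names, fields, and weights are illustrative assumptions; Spirit AI has not published Ally's internals.

```python
from dataclasses import dataclass

# Toy illustration only; the schema and weights are assumptions,
# not Spirit AI's actual design.
@dataclass
class ChatMessage:
    speaker: str
    target: str
    text: str
    same_guild: bool   # do speaker and target share a guild?
    are_friends: bool  # do they have an established friendship?

OFFENSIVE_TERMS = {"trash", "loser"}  # toy word list

def keyword_filter(msg: ChatMessage) -> bool:
    """The naive approach: flag any message containing a listed term."""
    return any(term in msg.text.lower() for term in OFFENSIVE_TERMS)

def context_aware_score(msg: ChatMessage) -> float:
    """Weight the same words by relationship context: identical smack talk
    between guildmates or friends scores far lower than when it is aimed
    at a stranger."""
    score = 1.0 if keyword_filter(msg) else 0.0
    if msg.are_friends or msg.same_guild:
        score *= 0.2  # likely banter, not harassment
    return score

stranger = ChatMessage("p1", "p2", "you're trash", same_guild=False, are_friends=False)
friend = ChatMessage("p1", "p3", "you're trash", same_guild=True, are_friends=True)
print(context_aware_score(stranger), context_aware_score(friend))  # 1.0 0.2
```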

To make Ally understand voice, Intel integrated its speech-to-text technology with the tool, so it can transcribe spoken language into text, which the system then parses. "We're at a stage now where 'Hey, it's kind of working,'" Pallister said.
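In rough terms, the voice path simply chains a transcription stage into the existing text pipeline. The sketch below shows that shape; neither Intel's speech-to-text engine nor Ally's analyzer has a public API, so both stages here are toy stand-ins that exist only to make the pipeline runnable.

```python
# A minimal sketch of the described architecture, not Intel's or
# Spirit AI's actual code: speech-to-text feeds the text analyzer.

def transcribe(audio_chunk: bytes) -> str:
    """Stand-in for the speech-to-text engine. In this toy version the
    'audio' is just UTF-8 text so the example runs end to end."""
    return audio_chunk.decode("utf-8")

def score_toxicity(utterance: str) -> float:
    """Stand-in for Ally's text analysis: a trivial keyword score."""
    return 1.0 if "idiot" in utterance.lower() else 0.0

def process_voice_chat(audio_stream):
    """Transcribe each chunk of voice chat, then score the transcript."""
    for chunk in audio_stream:
        transcript = transcribe(chunk)                 # speech -> text
        yield transcript, score_toxicity(transcript)   # text -> score

for text, score in process_voice_chat([b"nice shot", b"you idiot"]):
    print(f"{score:.1f}  {text}")
```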

To be clear, Ally's goal isn't to stop teenagers from uttering curse words or to prevent friends from engaging in smack talk. "We're not interested in that," said Peter Alau, Spirit AI's director of business development. "We're focused on stopping things like Russian bots, scammers, and people who are damaging your online gaming community."

(Demo of the Ally tool's speech-to-text ability.)

Although toxic behavior can manifest itself in different ways, it also follows distinct patterns, which Ally has learned to recognize and distinguish from friendly chats. The intelligence is then handed off to a game's online community manager, who can decide how to respond, whether with a suspension or an outright ban.
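One plausible way to model that hand-off is sketched below: the detector files high-confidence incidents into a queue, and a human community manager chooses the action. The class names and threshold are assumptions for illustration, not Spirit AI's design.

```python
from dataclasses import dataclass, field

@dataclass
class Incident:
    player: str
    transcript: str
    score: float  # toxicity confidence from the detector, 0..1

@dataclass
class ModerationQueue:
    """The AI flags; a human decides. The threshold is an assumed value."""
    threshold: float = 0.8
    pending: list = field(default_factory=list)

    def flag(self, incident: Incident) -> None:
        # Only high-confidence detections are escalated to the manager.
        if incident.score >= self.threshold:
            self.pending.append(incident)

    def resolve(self, incident: Incident, action: str) -> str:
        # `action` is the community manager's call: "warn", "suspend", "ban".
        self.pending.remove(incident)
        return f"{incident.player} -> {action}"

queue = ModerationQueue()
queue.flag(Incident("p7", "repeated slurs at a stranger", 0.95))
print(queue.resolve(queue.pending[0], "suspend"))  # p7 -> suspend
```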

Alau declined to say which gaming companies have already been using Ally. "No one wants to admit the level of toxicity they have," he said. But Spirit AI does have many interested customers.

"All of our potential clients are asking, 'So do you support voice yet?'" Alau added.

Of course, Ally's voice-recognition capabilities also face challenges, like distinguishing human words from music and other noise. It also has to be smart enough to understand different human accents.

But even though the system's voice-recognition capabilities have only reached a proof-of-concept stage, Alau expects Spirit to begin testing the technology with actual clients over the next three months. "By the end of this year, I expect a couple of clients to be fully invested in it," he added.

The thought of an AI monitoring your online game session may creep you out. Others may worry about the tool paving the way for censorship. But it's also true that games are a business focused on attracting as many customers as possible, and those customers need to be protected from abuse.

"If I really had to describe Ally, it's a customer intelligence tool cleverly disguised as anti-toxicity chat tool," Alau said. "For the moment, most of our developers and licensees, are only focused on the worst elements because of the direct effect on their bottom line."

How Ally is used to help enforce a game's code of conduct is up to the client. But on the privacy front, Alau said Spirit AI is a GDPR-compliant company that anonymizes all the data Ally receives. "We don't know who is saying it, and in general, we don't care," he said. "We're looking for the worst of the worst."
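On that anonymization point, a common pattern (assumed here; Spirit AI has not described its actual mechanism) is to replace player identifiers with one-way pseudonyms before analysis, so repeat offenders can still be tracked without the analyzer ever knowing who they are.

```python
import hashlib
import hmac

# Assumed illustration of pre-analysis pseudonymization, not a confirmed
# Spirit AI method. A keyed hash maps the same player to the same opaque
# token (so patterns remain trackable) while staying one-way.
SECRET_KEY = b"rotate-me-regularly"  # held by the game operator, not Ally

def pseudonymize(player_id: str) -> str:
    digest = hmac.new(SECRET_KEY, player_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

print(pseudonymize("gamer_tag_42"))  # same input -> same opaque token
```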

"There are plenty of places on the internet where the worst of the worst can go, and say terrible things. But it doesn't have to be in my game," Alau added.
