Google, one of the world’s largest technology firms, and FACEIT, a renowned independent platform for professional gaming competitions, have come together to develop an AI system that can identify and punish toxic players.

Toxicity, simply put, means rude or abusive behavior exhibited by players in online multiplayer games, and it is an issue that has plagued the gaming world for years. Dota 2 and Counter-Strike: Global Offensive are two major titles suffering from this problem, and they are often said to have the most toxic online communities of any online game. FACEIT, one of the biggest platforms for competitive matchmaking in games like Dota 2, CS:GO, League of Legends, and many more, has partnered with Alphabet’s Jigsaw to curb toxicity in online gaming by identifying toxic players on a server and penalizing them for bad behavior.

How will they do it?

Jigsaw is a subsidiary of Alphabet Inc. (Google’s parent company) that acts as a technology incubator, applying technological solutions to global challenges. FACEIT’s partnership with Google has produced an admin artificial intelligence system called ‘Minerva’, the first of its kind to make its way into esports. The system uses machine learning to scan through thousands of chat messages and quickly flag toxic texts. It then analyzes the breach and notifies the abuser within seconds, making them aware of their wrongdoing. Minerva can also ban repeat offenders: players who continue to exhibit toxic behavior after being warned several times in a short span of time. The system reportedly needs no human moderation and functions entirely on its own.
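The detect → warn → ban flow described above can be sketched in a few lines of Python. Note that this is only an illustrative sketch: the class and function names (`ChatModerator`, `is_toxic`), the keyword-based detector, and the warning threshold are all assumptions for demonstration, not Minerva’s actual model or rules.

```python
# Hypothetical sketch of an escalating-penalty chat moderation loop.
# All names and thresholds here are illustrative assumptions, not
# Minerva's real implementation (which is not public).

from collections import defaultdict

# Stand-in detector: a real system would use a trained ML classifier.
TOXIC_TERMS = {"noob", "trash", "uninstall"}

def is_toxic(message: str) -> bool:
    """Naive keyword check standing in for a machine-learning model."""
    return any(term in message.lower() for term in TOXIC_TERMS)

class ChatModerator:
    def __init__(self, warnings_before_ban: int = 3):
        self.warnings_before_ban = warnings_before_ban
        self.warnings = defaultdict(int)  # player -> warning count
        self.banned = set()

    def handle(self, player: str, message: str) -> str:
        """Return the action taken for a single chat message."""
        if player in self.banned:
            return "blocked"              # banned players cannot chat
        if not is_toxic(message):
            return "ok"
        self.warnings[player] += 1        # notify the offender immediately
        if self.warnings[player] >= self.warnings_before_ban:
            self.banned.add(player)       # repeat offender: ban
            return "banned"
        return "warned"

mod = ChatModerator()
print(mod.handle("p1", "gg wp"))          # -> ok
print(mod.handle("p1", "you are trash"))  # -> warned
```

The key design point mirrored here is escalation: a first offense triggers an instant warning, and only repeated offenses within the tracked window lead to a ban.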

Does it work?

The testing phase commenced in August on FACEIT’s servers, and so far Minerva has analyzed 200 million chat messages and marked over 7 million as toxic. The system has issued over 90,000 warnings and 20,000 bans for verbal abuse and spam messaging. As an additional measure, FACEIT enforces phone number verification for accounts flagged for toxic behavior, smurfing, and spamming. Since the system went live, FACEIT has seen a 20% decline in toxic behavior, owing to the threat of instant punishment. So yes, it definitely works.

Measures against cheating, toxic behavior, smurfing, spamming, and account boosting have become customary, and they give us hope that someday we will be able to enjoy these multiplayer games the way they are meant to be enjoyed.

Stay tuned for more news and updates!