Keep your kids far, far away from this horrific WhatsApp game.

Authorities in multiple countries, including the U.S., have issued warnings to parents about a disturbing challenge called Momo, conducted via WhatsApp.

(Image: Momo uses this terrifying statue, which is not affiliated with the game, as its avatar. Credit: CEN)

Police in Argentina are currently investigating the game's potential link to the death of a 12-year-old girl near Buenos Aires. Investigators, who have unlocked the girl's phone, claim to have evidence that she filmed a video of herself shortly before she died for the purposes of the Momo game, the Mirror reports.

Here's how it works. Your kid adds a mysterious phone number to their WhatsApp contacts. The number then sends them violent images and issues grotesque demands, often ordering them to post images and videos of self-harm or suicide.

The game "controller" claims to know personal information about the player, and threatens them if they don't follow orders.

Argentinian officials are currently investigating the identity of the mysterious "controller," and have reported that the number is associated with an 18-year-old.

This isn't the first social media game that has encouraged children to commit suicide. Since 2016, a social media game called the Blue Whale challenge has been suspected of links to at least 130 deaths, according to The Sun, though authorities have not yet found it directly responsible for any. Instagram flagged the #bluewhalechallenge tag with a warning that the posts were known to lead to self-harm, but did not remove the posts.

Unlike Momo, Blue Whale originated in a Facebook group, where an administrator assigned players daily tasks. The earlier tasks included watching horror movies and waking up at unusual hours, but they escalated to self-harm and suicide.

“WhatsApp cares deeply about the safety of our users," a WhatsApp spokesperson said in a statement to Fox News. "It’s easy to block any phone number and we encourage users to report problematic messages to us so we can take action.”