Facebook says it is working on ways to limit the reach of misinformation on the social network, including using its algorithms to de-emphasize what it calls “low quality” sources in the News Feed. But there appears to be very little it can do about what for many is an even larger problem: hoaxes and conspiracy theories spreading via WhatsApp, the text-messaging service Facebook acquired in 2014 for around $20 billion.

According to a number of reports from India on Monday, this kind of weaponized fake news has led to another two deaths at the hands of a mob outraged about alleged kidnappings. According to the BBC:

Two men became the latest victims of hysteria over WhatsApp rumors of child kidnappers. The men had stopped to ask directions in north-eastern Assam state when they were beaten to death by a large mob. Rumors of child kidnappings are spreading across India over WhatsApp, and have already led to the deaths of seven other people in the past month.

Police say several of these attacks on strangers have been fueled by a video circulating on WhatsApp, which appears to show a young child being abducted by two men on a motorcycle. But the video is not of an actual abduction, and it’s not from India at all—it is a clip from a child safety video produced in Pakistan, edited to remove the segment explaining its origins. To make matters worse, local media outlets have added to the rumors’ credibility by reporting on them.

Last month, a 55-year-old woman was lynched by a mob after she handed out candy to children, and a transgender woman was hanged by a crowd that suspected her of involvement in the rumored kidnappings spreading via WhatsApp. Four men have also been killed for similar reasons, in most cases because they were believed to be acting suspiciously and were not from the local area. One man was tied up and beaten to death with cricket bats. WhatsApp has also been implicated in the spread of other hoaxes that have led to violence.

Police in many communities have been monitoring social media to try to stop the spread of the messages. In one city, officers even marched through town with megaphones, asking residents not to believe the rumors, and in at least one state authorities arrested people who were spreading the video.

Unfortunately, there is very little the police—or anyone else for that matter, including Facebook—can do to stop these kinds of rumors from spreading. While regular posts on Facebook are technically public (with certain restrictions set by users), WhatsApp messages typically go from person to person, or to a very small group of friends. Worse, the app uses end-to-end encryption, so even Facebook can’t see the content of the messages themselves.


This makes WhatsApp appealing for users who simply want to message their friends and family the way they would with a regular text-messaging app, without anyone listening in. But it also makes the app appealing for anyone who wants to spread misinformation, for whatever reason, because the messages are almost impossible to track and even harder to get rid of.


Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.