The accusation leveled against Facebook is that the company plans to embed content moderation and blacklist filtering algorithms directly into users’ mobile devices, scanning Messenger and WhatsApp messages before they are encrypted and after they are decrypted. The post by Leetaru points to potential future scenarios in which the vast majority of phones would include this type of scanning, rendering encryption meaningless.

“Facebook’s model entirely bypasses the encryption debate by globalizing the current practice of compromising devices by building those encryption bypasses directly into the communication clients themselves and deploying what amounts to machine-based wiretaps to billions of users at once,” Leetaru writes, adding that this would “create a framework for governments to outsource their mass surveillance directly to social media companies.”

How did the rumor start? It has to do with the blogging platform itself and an unrelated presentation detailing potential ways to automate content moderation efforts on social platforms.

Forbes does not typically review blog posts by its contributors, who are not staff writers for the publication. The company did not immediately respond to a request for comment about this. (Disclosure: I was a Forbes contributor myself from July 2015 to January 2017.)

Though Leetaru originally stated in his post that a Facebook spokesperson declined to comment, Facebook tells OneZero this was not the case and that it gave Leetaru “background” information: context meant to inform an article without being quoted directly.

Reached via email, Leetaru stated that Facebook “did not dispute the characterization and pointed to [Facebook CEO Mark] Zuckerberg’s March blog post calling for precisely such filtering.” The post in question, “A Privacy-Focused Vision for Social Networking,” lists a plan for making Facebook more private by focusing on encrypted and ephemeral communication. Although the post states that Facebook might detect “patterns of activity or through other means, even when we can’t see the content of the messages” across apps, it does not specifically refer to client-side filtering of WhatsApp messages or private messaging. In other words, there’s no suggestion from Zuckerberg’s writing that a system is being developed to read user messages.

Following the references in Leetaru’s post leads to another post of his about an alleged WhatsApp backdoor. That post linked to a video of a technical talk on Facebook’s developer site about the use of artificial intelligence to keep content that violates Facebook’s policies, such as hate speech, nudity, and pornography, off the network.

The moderation would be performed by content classifiers: machine learning models trained on labeled examples until they can reliably predict whether or not an image depicts, say, violent content. A Facebook spokesperson said there’s no connection between this type of moderation and private messaging encryption.
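To make the idea concrete, here is a minimal sketch of how a content classifier might gate an upload on-device. This is purely illustrative: the function names, the toy scoring model, and the threshold are all hypothetical stand-ins, not Facebook’s actual system, which would use a trained neural network rather than a weighted sum.

```python
# Hypothetical sketch of client-side content classification.
# score_image() stands in for a trained model; POLICY_THRESHOLD is invented.

POLICY_THRESHOLD = 0.9  # confidence above which content is blocked


def score_image(features):
    """Stand-in for a trained classifier: returns a violation
    probability in [0, 1]. A real model would be far more complex."""
    weights = [0.5, 0.3, 0.2]
    raw = sum(w * f for w, f in zip(weights, features))
    return max(0.0, min(1.0, raw))


def moderate_upload(features):
    """Decide on the device whether an image may be uploaded at all."""
    probability = score_image(features)
    return {"allowed": probability < POLICY_THRESHOLD,
            "score": probability}


print(moderate_upload([1.0, 1.0, 1.0]))  # high score: upload blocked
print(moderate_upload([0.1, 0.2, 0.1]))  # low score: upload allowed
```

The key property, as the talk described it, is that the decision happens before anything is posted: a blocked image is simply never uploaded to the News Feed.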

“The article is completely off base,” said Weis of the Forbes post. The video being discussed was about filtering content before it’s posted to Facebook in the first place — the app could, for example, detect that an image is pornographic and simply prevent a user from uploading it to the News Feed. “It was never talking about WhatsApp.”

Granted, a user wishing to post whatever they’d like on social media might take issue with this kind of automated moderation on the client side. (Technically, moderation like this already occurs on Facebook’s servers once content is uploaded.) But the important distinction is that it does not represent a backdoor into your conversations on WhatsApp.

Further, Weis says that moderating content on people’s phones is actually a privacy win, if you’re concerned about material being stored on the social network’s servers. “Today if you post a picture that gets sent to Facebook, and then they run their content filtering, and it gets rejected, it gets taken down, but they still have it. In this case, your content will get filtered locally, before it ever gets sent over. So it reduces the amount of information that will be sent to Facebook in the first place.”
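Weis’s privacy argument can be sketched in a few lines: with server-side moderation, everything, including rejected content, reaches Facebook’s servers first; with client-side moderation, rejected content never leaves the device. The classifier and post data below are hypothetical placeholders.

```python
# Toy comparison of server-side vs. client-side moderation, illustrating
# the point that local filtering reduces what is ever transmitted.
# is_violating() is a trivial stand-in for a real content classifier.

def is_violating(post):
    return "forbidden" in post


def server_side(posts):
    """Everything is sent; the server filters after receipt."""
    sent = list(posts)  # the server receives all content, even rejects
    published = [p for p in sent if not is_violating(p)]
    return sent, published


def client_side(posts):
    """The device filters first; rejects are never transmitted."""
    sent = [p for p in posts if not is_violating(p)]
    return sent, sent


posts = ["cat photo", "forbidden image", "vacation pic"]
print(len(server_side(posts)[0]))  # 3: server receives everything
print(len(client_side(posts)[0]))  # 2: rejected item never sent
```

Both approaches publish the same content; the difference is only in how much data Facebook receives along the way.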

Although the Forbes piece raises concern that plaintext copies of moderated messages would be sent to Facebook, “that’s completely filling in the blanks,” Weis said. “Nobody is talking about doing this for WhatsApp, and even if they did, nobody is talking about sending the plaintext to the server.”