Social media clears posts even after the Guardian reveals plot to control at least 21 far-right pages and spread disinformation

This article is more than 9 months old


Facebook is telling users that Islamophobic posts distributed through a clandestine network of far-right pages meet its “community standards”, despite revelations they are being used as part of a coordinated scheme profiting from hate and disinformation.

The Guardian revealed on Friday that an Israel-based group had gained access to at least 21 far-right Facebook pages with vast followings across the western world.

The 21 pages were used to coordinate the distribution of more than a thousand “news” posts each week to more than 1m followers, spreading disinformation and hate targeting Muslims, promoting far-right politicians and vilifying prominent Muslim politicians.

The motive for the operation appears to be financial. The Facebook posts funnelled users to a cluster of ad-heavy websites, all controlled by a single entity.


After being presented with the Guardian’s findings, Facebook launched its own investigation and pulled down pages and accounts it says were spamming content for financial gain. Facebook said it did not tolerate hate speech on its platform.

But the Guardian has learned that, since the original story was published on Friday, Facebook has been telling users that dozens of the posts distributed through the network meet its community standards.

The posts cleared by Facebook include one on Australian Facebook page “Assimilate or Migrate” that falsely associates the German chancellor, Angela Merkel, with the quote “The killing of Jews by Hezbollah is not terrorism”. Another uses an altered image to depict Merkel with blood splattered on her hands and face alongside a story about Germany’s support for “pro-Hamas” resolutions at the United Nations.

A third post distorts news about child brides in Turkey to attack Muslims.

The posts were distributed across the network in a coordinated way and drove users back to the cluster of websites, milking the traffic for money.

One of Facebook’s core community standards is “authenticity”, which incorporates restrictions on “spam”, “inauthentic behaviour”, “false news” and “misrepresentation”.

Despite this, Facebook told users the posts were acceptable. Asked why, a spokesman said Facebook’s investigations into the Guardian’s revelations were ongoing.

“We’ve taken action on a number of pages and accounts, some of which were shared by The Guardian Australia, and we’ll continue to take action if we find any further violations,” he said.

The network has been partially dismantled since public reports about its existence. Facebook took some pages and accounts offline on Friday.

The coordinated content that was spread through the 21 pages has largely ceased. Many of the websites that received traffic from the Facebook posts have been taken down, fully or partially.

That has not stopped calls on Facebook to explain why it did not detect the network sooner. Facebook has repeatedly promised to do more to shut down inauthentic coordinated behaviour, particularly in the wake of the Cambridge Analytica and Russian interference scandals.

The network uncovered by the Guardian had been operating for two years with relative impunity, and grew significantly in the months after Mark Zuckerberg apologised to the US Congress for the company’s failings.

Now, the social media giant is being urged to give evidence to a new Senate inquiry into foreign interference on social media. The inquiry was established by the Australian parliament on Thursday to consider the “use of social media for purposes that undermine Australia’s democracy and values, including the spread of misinformation”.

Its chair, Labor senator Jenny McAllister, said Australians expected “social media platforms to do more”.

“I don’t think we’ve got this right yet, I don’t think the social media platforms have got this right yet and part of the job of this committee is actually to get all of those stakeholders in the room and create a forum where we can have a really good discussion about what are the boundaries, about what is and isn’t acceptable on these type of matters,” she said.

A separate Guardian US investigation in November revealed a number of white nationalist pages were operating openly on Facebook eight months after a promised ban on such content came into effect. Two of the pages named in the report were subsequently banned.