Social network says tool will let users see if they have liked or followed accounts created by an organisation that carries out misinformation operations

Facebook has promised to tell users whether they liked or followed a member of Russia’s notorious “troll army”, accused of trying to influence elections in the United States and the United Kingdom.

The social network says it will create a tool allowing users to see whether they interacted with a Facebook page or Instagram account created by the Internet Research Agency (IRA), a state-backed organisation based in St Petersburg that carries out online misinformation operations.

“It is important that people understand how foreign actors tried to sow division and mistrust using Facebook before and after the 2016 US election,” the company said in a statement. “That’s why as we have discovered information, we have continually come forward to share it publicly and have provided it to congressional investigators. And it’s also why we’re building the tool we are announcing today.”

The tool will not be able to warn everyone who may have seen content created by the IRA, however. The company estimates that more than 140 million people, across both Facebook and Instagram, may have seen a story or page initially created or shared by one of those Russian-run accounts, in addition to the 10 million people who saw adverts bought by Russian state-sponsored actors.


The majority of those people will not have liked or followed a Russian-backed account; instead they will have seen the content when it was shared by friends or promoted into their news feed by some other facet of Facebook’s curation algorithm.

Facebook will not tell those users about their exposure to misinformation, although the company has not said whether it is unable, or simply unwilling, to provide that information. A source close to the company described it as “challenging” to reliably identify and notify everyone who had been incidentally exposed to foreign propaganda.

Facebook has also declined calls to be proactive about informing users of their interaction with foreign propaganda. A mock-up of the tool provided by the company shows it in Facebook’s help centre, rather than in a more prominent position on the news feed. A Facebook spokesperson told the Guardian: “In the coming weeks, we will take significant steps to make users aware of this new tool.” When the tool is available, the company says, it will be easily accessible at Facebook.com/actionplan.

Both Facebook and Twitter have steadily been making public the results of their investigations into Russian influence operations on the 2016 US election.

In October, Twitter released to the US Congress a list of 2,752 accounts it believes were created by Russian actors in an attempt to sway the election.

Damian Collins MP, the Conservative chair of the digital, culture, media and sport committee, welcomed the new tool, but said there was more to be done by the company. “Although the Internet Research Agency is the most prolific Russian-backed disseminator of disinformation that has been discovered so far, I believe that it is just the tip of an iceberg.

“Facebook need to be developing tools that allow it to uncover fake news and fake accounts across its platform, no matter where they are geographically located.”

Both companies have yet to release equivalent information about an influence campaign which is believed to have occurred during the British referendum over EU membership. “It is to be welcomed that Facebook has now decided to provide transparency, as should all social media platforms,” said the Liberal Democrat MP and Brexit spokesperson, Tom Brake.

“However, that is little consolation to the 73% of young voters who wanted to remain in the EU, yet who now face the prospect of their futures being snatched away from them partly as a result of Russian meddling in the EU referendum. We now need a full and independent inquiry to establish the extent to which interference by foreign powers may have influenced the result of one of the most crucial British votes since the war.

“I will demand this in the cross-party three-hour debate on the subject of Russian interference in UK politics I have secured on 21 December in the House of Commons.”

Yin Yin Lu, a researcher at the Oxford Internet Institute, agreed with the need for a separate investigation into EU-specific interference. “What we have so far is a subset of the fake US accounts that happened to cross-post about Brexit as well, given the salience of the topic (which is why many of them were especially active on the day after the referendum).

“What we don’t know – and very much need if we hope to provide any substantial evidence about Russian interference in Brexit on social media – is if there is an equivalent list of fake UK accounts.”

In October Collins asked Facebook to search its own records for evidence that Russia-linked accounts had been used to interfere in the EU referendum, and later made a similar request of Twitter. Collins gave the company a deadline of the end of November.