
Facebook will share with congressional investigators more than 3,000 ads bought by a Russian entity to interfere in U.S. politics and the 2016 presidential election, tomorrow morning at 8am Pacific, a Facebook spokesperson confirmed to TechCrunch. Facebook’s disclosure to the House and Senate Intelligence Committees and the Senate Judiciary Committee will include information on the ads’ content and targeting, as well as the accounts that paid approximately $100,000 for them to run in the U.S. between 2015 and 2017. Facebook previously announced these ads were tied to 470 accounts and Pages “associated with a Russian entity known as the Internet Research Agency.”

Facebook believes that congressional investigators for the three committees are best placed to review the ads and make determinations on them based on their access to classified intelligence and information from all relevant companies and industries, beyond Facebook’s own internal investigation, according to a spokesperson. Facebook does not plan to release the ad data publicly.

Congressional investigators could combine Facebook’s data with that which Twitter has pledged to provide. This includes data on 201 accounts suspected of having engaged in misinformation campaigns on Twitter, and $274,100 in spend on U.S. ads in 2016 by the Russian government-linked news outlet Russia Today.

Google, Facebook, and Twitter have been invited to testify before the Senate Intelligence Committee on November 1st about the Russia probe. However, Senator Mark Warner said Twitter’s initial briefing to the Senate Intelligence Committee was “inadequate.” A Facebook spokesperson confirmed Facebook had received the invitation but didn’t have more to share on whether Facebook will in fact testify.

Facebook already shared this information with special counsel Robert Mueller around September 15th, the Wall Street Journal first reported, likely because it received a search warrant, but it initially withheld the ads from Congress to avoid violating federal privacy laws or disrupting the Mueller probe. On September 21st Facebook announced that it planned to share these ads with Congress, and is now confirming that tomorrow morning is when that will happen. Facebook also plans a long list of changes to its political ad buying systems, which could be seen as sufficient self-regulation to deter the need for official government regulation.

Facebook says the goal is to provide the relevant information necessary for congressional investigators to understand and contextualize the ads. These ads are believed to have been a purposeful attempt by Russian operatives to influence the U.S. presidential election.

Who Is The Internet Research Agency?

The Internet Research Agency has been described by the New York Times as “an army of well-paid ‘trolls'” based in St. Petersburg, Russia that has tried to spread misinformation through coordinated Internet campaigns to interfere with U.S. foreign policy and boost Russian president Vladimir Putin.

The company was tied to the 2014 #ColumbianChemicals hoax, in which trolls used a falsified CNN screenshot, YouTube video, and Wikipedia page to convince people that a fictional chemical processing plant in Louisiana had been blown up by ISIS, with accounts urging the U.S. to bomb Iraq in response. Russian newspapers had at one point reported that the Internet Research Agency had 400 employees and a $400,000 per month budget. They had also written that the Internet Research Agency is funded by Evgeny Prigozhin, a restaurant owner and oligarch with big Russian government contracts and close ties to Putin.

The investigation of Facebook’s data could strengthen the known connections between the Internet Research Agency, political misinformation campaigns, and the Kremlin.

Facebook’s Ongoing Plan To Thwart Election Interference

Two weeks ago Facebook CEO Mark Zuckerberg announced a 9-point plan to curb election interference, help people and investigators understand what happened, increase ad transparency, and improve election integrity. Tomorrow we may learn more about how Facebook will implement those plans, which are:

1. Providing Russian-bought ads to Congress
2. Continuing Facebook’s own investigation
3. Enhancing political ad transparency
4. Implementing stronger political ad reviews
5. Hiring 250 more election integrity workers
6. Expanding partnerships with election commissions
7. Collaborating with other tech companies
8. Protecting political discourse from intimidation
9. Monitoring the German election

The first item will be fulfilled tomorrow. The last was carried out in an election that saw German chancellor Angela Merkel secure a fourth term, but with reduced authority, as the radical right-wing party AfD entered parliament as the third-largest party.

Facebook took actions including deleting tens of thousands of suspicious accounts, fighting fake news and clickbait, providing ways for politicians to share their stances, and working with the German election commission. Facebook’s VP of Public Policy for EMEA Richard Allan wrote that “These actions did not eliminate misinformation entirely in this election – but they did make it harder to spread, and less likely to appear in people’s News Feeds.”

Facebook’s forthcoming changes could further reduce the potential for misuse of its platform to sway elections, and provide tactical assistance to other tech giants trying to do the same. Facebook may need to make it harder for organizations to buy election ads, which could reduce the revenue it receives from this business.

But that’s a small price Facebook seems willing to pay to prevent the voice it gives everyone from being misappropriated to spread lies and propaganda. I’ve called on Facebook to more proactively anticipate the worst-case scenarios for how its platform can be employed, rather than assuming people will use it only in good faith or that problems are just low-scale edge cases. With enough technology, human moderation, and cooperation with governments and peers, it could put safeguards in place to steeply reduce the chances that this foul play ever reaches the News Feed again.