Last October, in an attempt to fend off additional regulation, advertising trade organizations and major Internet platform providers—including Google, Facebook, and Twitter—signed off on a voluntary code of conduct aimed at reducing the threat posed by fraudulently purchased political advertisements and the posting of "fake news" articles. But a report released by the European Commission today called the social media platforms to task for not living up to those voluntary measures to help protect upcoming elections across Europe in the next few months—and particularly the European Parliament elections in May.

In a joint statement issued by the European Commission, Vice President for the Digital Single Market Andrus Ansip; Commissioner for Justice, Consumers, and Gender Equality Věra Jourová; Commissioner for the Security Union Julian King; and Commissioner for the Digital Economy and Society Mariya Gabriel wrote:

[W]e need to see more progress on the commitments made by online platforms to fight disinformation. Platforms have not provided enough details showing that new policies and tools are being deployed in a timely manner and with sufficient resources across all EU Member States. The reports provide too little information on the actual results of the measures already taken. Finally, the platforms have failed to identify specific benchmarks that would enable the tracking and measurement of progress in the EU. The quality of the information provided varies from one signatory of the Code to another depending on the commitment areas covered by each report. This clearly shows that there is room for improvement for all signatories... We urge Facebook, Google and Twitter to do more across all Member States to help ensure the integrity of the European Parliament elections in May 2019. We also encourage platforms to strengthen their cooperation with fact-checkers and academic researchers to detect disinformation campaigns and make fact-checked content more visible and widespread.

The EC report called out each of the major social media platforms for specific failures. Facebook was cited for not providing details of its efforts to scrutinize political advertisement placement, which the company said it began in January. Facebook had also promised a Europe-wide archive for political and issue advertising, to be available by March 2019. And while Facebook’s reports to the EC thus far have given details on “cases of interference from third countries in EU Member States,” the commissioners said, the company has not provided the number of fake accounts removed due to “malicious activities targeting specifically the European Union.”

Google’s reporting to the EC detailed actions the company had taken to improve its oversight of advertisements targeting citizens of EU countries, but the commissioners found fault with how Google measured its results. “The metrics supplied are not specific enough and do not clarify the extent to which the actions were taken to address disinformation or for other reasons (e.g. misleading advertising),” according to a summary of the report published by the EC. And while Google issued a new election advertising policy in January that includes the creation of a transparency report, the company provided no evidence in its January report to the EC that it had taken any concrete steps to implement its “integrity of services” policies.

For its part, Twitter didn’t even provide a report to the EC in January—the company announced the expansion of its political advertising transparency report to Europe on February 19. Twitter did release five new data sets of what it called state-backed information campaigns, including publicly downloadable collections of posts from accounts connected to campaigns by Russia, Iran, Bangladesh, and Venezuela. But Twitter did not give any details on how it was measuring progress on spotting such activity.

The commissioners plan to push for full compliance with the voluntary rules before the May 2019 European Parliament elections and to continuously monitor the results of the platform vendors’ actions through the end of the year. “By the end of 2019, the Commission will carry out a comprehensive assessment of the Code's initial 12-month period,” the commissioners’ statement noted. “Should the results prove unsatisfactory, the Commission may propose further actions, including of a regulatory nature.”