The year was 2009, and social media platforms were desperate. “Content moderation” was no longer as straightforward as policing for nudity and graphic violence. Each deletion or suspension carried far-reaching political consequences that Silicon Valley was unprepared to handle. “Please, take this power away from us!” one senior executive at an unnamed tech company pleaded in a meeting with a U.S. government counterpart.

So recalled Nathaniel Gleicher, Head of Cybersecurity Policy at Facebook and former Cybersecurity Director for the U.S. National Security Council, speaking at the Digital Forensic Research Lab’s 360/OS event, held June 20–21, 2019, in London, United Kingdom. Nowadays, as Gleicher explained, the dynamic is very different: content moderation and disinformation, with their multitude of political considerations, now represent a central focus for Facebook and other large social media platforms.

Nathaniel Gleicher, Facebook’s Head of Cybersecurity Policy, speaks at the DFRLab’s 360/OS event in London on June 21, 2019. (Source: YouTube)

Gleicher’s remarks offered insight into how these issues will evolve in the years ahead:

1. Privacy Emphasis and Continuing Disruption of the Open-Source Community. As the DFRLab and the Atlantic Council’s Adrienne Arsht Latin America Center noted in a joint March 2019 report, the proliferation of WhatsApp and other end-to-end encrypted services made it very difficult to conduct open-source monitoring of the 2018 elections in Mexico, Colombia, and Brazil. While good for the privacy of users, the closed nature of these platforms makes it impossible to contextualize the reach and influence of disinformation trends. With Mark Zuckerberg’s stated intention of shifting Facebook and Instagram to a similar “privacy-focused” model, these challenges will only grow steeper.

One consequence of these shifting priorities can be seen in the June 11 termination of Facebook Graph Search, which had long been an invaluable tool for the open-source research community. Gleicher offered some of the first on-the-record comments regarding the decision, explaining that the flexibility of Graph Search had left it prone to abuse. As an example, he described an adversarial data-scraping operation that had sought to infer the hidden sexual identity of Facebook users, leaving them open to intimidation and harassment.

2. The “Professionalization” of Information Operations. Since 2016 and the explosion of interest in the disinformation and open-source research community, it has been fairly straightforward to distinguish amateurish digital influence operations from the more sophisticated efforts of nation-states. As Gleicher observed, however, this is quickly changing.

A prime example can be seen in the DFRLab’s recent analysis of the privately owned Archimedes Group, an Israel-based marketing firm. On behalf of its paid clients, the firm operated at least 265 inauthentic assets across Facebook and Twitter, working to manipulate elections in Nigeria and slander media outlets in Tunisia. The Archimedes Group’s sophisticated tactics were essentially identical to those previously attributed to Russia’s Internet Research Agency. It is a sign of things to come.