Collective euphoria turned to harsh reality for social media in 2017. Propaganda campaigns came to light. Bot armies continued to bully, intimidate, and harass. Warehouses of trolls pushed political agendas. Public-comment processes were polluted.

Manipulating attention has never been easier.

The key weakness of social media — an inability to ensure the authenticity of communication and interaction — will continue to be exploited in 2018. And it’s going to get a lot worse.

Artificial neural networks are advancing rapidly in their ability to synthesize content — including images, videos, and text — that is increasingly indistinguishable from the real thing. Just look at the results of state-of-the-art face synthesis. Phony Yelp reviews that read as legitimate opinion can already be algorithmically generated at scale. These technologies enable believable social posts and profiles to be synthesized out of whole cloth, essentially “imagined” by neural networks, and they threaten to overrun legitimate speech online. How will we have the debates, dialogues, and dialectic we need to run a democracy?

In the arms race to secure the authenticity of online media, platforms will need to step up their internal protocols both for purging inauthentic accounts and for identifying influence campaigns. They should be as transparent as possible about this work without undermining it. They should also recognize that the issue is too important to take on alone.

Journalists and other actors in civil society can help hold platforms accountable for the authenticity of the communication processes through which the public is informed. But they need far more access to platform data to be effective. The platforms should enable this access, recognizing that observation by trusted parties will help identify how the system is being manipulated. The sheer scale of the problem means journalists also need powerful computational tools that can trace information flows. And technically robust, adaptable media-forensics tools will be essential so journalists can assess the authenticity of potentially synthesized media.

Appropriate data and tooling in the hands of computational journalists would enable a new beat covering social influence campaigns. An “online weather report” would show which way the bot and troll winds were blowing and which topics or issues were being manipulated that day. By grappling with vast amounts of data using computational tools, journalists could produce these reports (or even forecasts), illuminating the flows of information online and fortifying the public against disingenuous and subversive media.