Democracies Can’t Blame Putin for Their Disinformation Problem

With a looming election and an ongoing pandemic, concerns about Russian disinformation are never far from U.S. policymakers’ minds. So it was no surprise when former U.S. President Barack Obama shared an article on the subject last week, exhorting Americans to become more aware of foreign informational threats.

The problem is that the article, which focused on Russian COVID-19 disinformation efforts, had been roundly criticized by experts. They pointed out its factual inaccuracies, its amplification of obscure troll accounts, and its unwillingness to consider the domestic sources of disinformation that have plagued the United States since long before the pandemic.

The small episode highlights a recurring problem: Focusing on Russian President Vladimir Putin’s minions not only misses the big picture but can undermine the search for solutions. Shifting the blame onto foreign autocrats, though politically palatable in Washington, creates a neat distinction in which an innocent democracy is being subverted by shadowy outside forces. And once you accept this framing, the problems of the U.S. informational sphere become a foreign disease—one that requires inoculation and informational distancing but little in the way of looking after your own health.

Yet as research has increasingly shown, homegrown disinformation is making democracy sicker than any foreign efforts can. The U.S. media landscape has seen a proliferation of fake local news, homegrown conspiracy theories, and a steady stream of questionable information and selectively edited videos from President Donald Trump and top government officials. The presidential reelection campaign, in concert with a number of private efforts, is poised to spend over a billion dollars on what the Atlantic called the “most extensive disinformation campaign in U.S. history.”

This isn’t just a problem of the contemporary United States. There are immense incentives for disinformation built into democratic institutions themselves. Treating disinformation as an alien disease ignores the fact that it is perfectly compatible with democratic norms and thrives inside democratic states. A recent report by the Oxford Internet Institute, for example, found that politicians inside 45 democracies have used social media for “amassing fake followers or spreading manipulated media to garner voter support.”

The same factors that promote healthy democracies also promote the spread of disinformation. Democratic deliberation requires free flows of information and multiple competing narratives. It demands a variety of media and political actors trying to persuade their audiences in a marketplace of ideas. It tolerates and encourages opposing and even outlandish viewpoints. All these democratic advantages, unfortunately, also create massive opportunities for disinformation.

Framing the problem as democracy versus disinformation thus ignores the ways in which disinformation is woven into the very fabric of the system. Traditional methods of digital autocracy—blocking and censoring—rely on suppressing free flows of information and don’t live easily inside democracies. Disinformation, however, relies not on controlling information but on spreading it, even if the information is false, distracting, or otherwise worthless. (This is why Russian observers, who have become deeply familiar with the method, call it “info-noise.”) Russia’s relatively free media landscape has made Chinese-style brute-force censorship less feasible, forcing the government to pioneer new techniques of narrative control.

Disinformation assumes that censorship will not always succeed and in fact may be counterproductive. Unlike traditional methods of control, it lacks the clear stamp of autocracy. Instead, the focus is on deluging people with a variety of provocative, distracting, and even blatantly false information. As a result, it lives quite comfortably inside democracies, is perfectly compatible with democratic norms, and is easily generated by democratic actors. After all, the “FUD” strategy of disinformation, in which fear, uncertainty, and doubt are used to influence both sales and politicians, was pioneered in U.S. lobbies and boardrooms. “We can’t blame Russia for all our troubles,” Alex Stamos, Facebook’s former chief security officer, told the New York Times recently. “The future of disinformation is going to be domestic.”

The goal of disinformation is not to dominate the informational space but to dilute it. “Flood the zone with shit,” was Steve Bannon’s infamous advice for dealing with the media. The ultimate goal of such a strategy is cynicism, overwhelming distrust of all news sources, and the fragmentation of a shared social reality. The opposite of internet freedom, therefore, is not censorship but a mix of control, co-option, and strategic distraction. The result, a fog of half-truths, leaves potential voters “numb and disoriented, struggling to discern what is real in a sea of slant, fake and fact.”

As a result, disinformation may be a bigger problem for democracies than for other regimes. Democracies require a degree of social consensus to function, one created by a shared belief in a common reality. This consensus does not expect people to agree, but it does expect shared knowledge of what they disagree about. (To borrow from the late U.S. Sen. Daniel Patrick Moynihan, “Everyone is entitled to his own opinion but not his own facts.”)

Autocratic rulers, on the other hand, benefit from the distrust, cynicism, and social atomization produced by disinformation, precisely because it inhibits political engagement and paralyzes organized social movements. Social media echo chambers, deepfakes, and automated surveillance have all served to lower the costs of autocracy while undermining the benefits of open democratic deliberation. Far from being overwhelmed by free information, dictators are increasingly learning to take advantage of it.

Russia will undoubtedly continue its attempts to influence U.S. politics using these methods (which is why it’s better to see Russian interference as a process, not an event). But trolling, strategic distraction, and disinformation are not primarily things done to the United States by external actors. Instead, they stem primarily from domestic actors inside the country and benefit from the free discourse crucial to a functioning democracy. (If anything, it’s the United States that has increasingly served as a model of disinformation for other leaders.)

Scholars of autocracies have sometimes invoked the idea of the “dictator’s dilemma” to explain the choices rulers face in the informational sphere. Allowing increased availability of information, goes the argument, is crucial for regimes that want economic development but can also threaten their existence by opening up public discourse. What we see emerging now is the “democrat’s dilemma”—controlling information is inimical to democracy, but allowing it to spread unchecked creates disinformation that can undermine democratic discourse. Increasingly, this trade-off appears hardwired into modern democratic regimes. Regardless of Russian or Chinese efforts, the same institutional advantages that allow democracy to function also make disinformation pervasive and inevitable. An emphasis on foreign trolls, while politically convenient, creates an unhelpful distinction that obscures the root of the problem.

Russian disinformation is not an imaginary threat. But focusing on Russian teens posting on Facebook even as Trump, large media companies, and international organizations pour out a steady stream of false and misleading information is the equivalent of treating a cancer patient for a cold. The problem is more intractable than “Putin did it,” and the sooner policymakers accept this, the sooner they can start thinking about actual solutions.