Alex Jones has drawn millions of people to his Infowars brand by promoting conspiracy theories and noxious bigotry; he monetizes his following by selling survivalist gear and dietary supplements such as the testosterone booster Alpha Power. (“Infowarriors and Patriots of the world know that it takes real vitality to push back in the fight against the globalist agenda.”) Like President Trump, Infowars has updated for the digital era the timeworn populist tactic of manufacturing controversy to attain influence. Last week, after Apple, Facebook, and Google removed some of Infowars’ content from their platforms, Jones attracted even more attention; he described himself as the victim of “a war on free speech.” He urged his fans to back him by buying more of his stuff. (Downloads of the Infowars app soared.)

The action by the Silicon Valley companies was, in one respect, bold: congenitally reluctant to alienate any customer, they decided to absorb the predictable outcry from the far right. In another respect, however, their decisions were routine. Social-media platforms continually censor content that they judge to be offensive or illegal. Facebook employs or contracts with thousands of “moderators” in countries such as India, Ireland, and the Philippines. With the aid of algorithms, they review more than a million pieces of flagged content daily. The moderators—censors, really—take down postings and may ban users if they detect a violation of their corporation’s rules. Such censorship is not unconstitutional. The First Amendment protects us against governmental intrusions; it does not (yet) protect speech on privately owned platforms. Still, the Internet and social media increasingly function as a “modern public square,” as Justice Anthony Kennedy put it in a 2017 Supreme Court opinion. This has created new dilemmas concerning free expression.

The forums of Google and Facebook seem quasi-public in part because of their extraordinary reach. Facebook’s two hundred million monthly users in the U.S. constitute about three-fifths of the American population. Its algorithms and its censors’ judgments, though they inevitably affect commerce and political competition, are based upon rules that aren’t all published. When moderators at Facebook, Google, and Twitter review the appropriateness of posted content, they generally follow First Amendment-inspired principles, according to Kate Klonick, a legal scholar who analyzed the practices of the three companies in the Harvard Law Review last year. Some of the platforms’ standards are unsurprising, such as their bans on pornography and terrorist incitement. Other rules require moderators to block “hate speech,” an ambiguous term that, despite Facebook’s efforts at delineation, can be politicized. Still other censorship reflects sensitivities that arise from operating in dozens of countries, including some run by dictators. In 2012, Gawker obtained a Facebook contractor’s bizarrely eclectic list of topics requiring careful scrutiny. These included the poaching of endangered animals, Holocaust denial (a crime in Germany, among other countries), maps of Kurdistan, and the defamation of Atatürk.

Despite this surveillance, extremist activists and propagandists managed to create huge numbers of fake accounts and distribute millions of pieces of made-up or incendiary content on Facebook during the 2016 election. (They also mounted propaganda campaigns on YouTube and Twitter.) Some abusers sought merely to make money; others tried to inflame and mislead voters. Russia’s state-directed interference was intended to help elect Donald Trump, according to American intelligence agencies. It took Facebook a year to discover and disclose the scale of the problem. That epic fail has altered the environment in which the company and its competitors now make decisions about content like that of Infowars.

Facebook and YouTube have long positioned themselves as neutral platforms, akin to eBay, open to all who are willing to abide by community standards. They’ve resisted the argument that they are in fact publishers—that their human moderators and algorithms function like magazine editors who select stories and photos. But Facebook’s stance has seemed to shift recently. In April, its founder, Mark Zuckerberg, told Congress, “When people ask us whether we’re a media company or a publisher, what they’re getting at is: do we feel responsible for the content on our platform? I think the answer is clearly yes.” This is a be-careful-what-you-wish-for intersection; none of us will be happy if Silicon Valley engineers or offshore moderators start editing our ideas.

Donald Trump and his far-right fellow-travellers have vigorously exploited the neutrality of social-media platforms. The Administration and its allies may occasionally bash Silicon Valley or pressure platform companies to remain open to alt-right channels such as Breitbart or Infowars. But Trump is unlikely to try to delegitimize the social-media giants in the way that he has sought to discredit professional journalism. The President forged his election victory on social platforms; it is by now difficult to imagine a Twitter-less Trump (and a Trump-less Twitter is an increasingly distant memory). Nor is Trump likely to endorse the tough legal steps taken recently in European countries such as Germany, where Facebook, Google, and other platforms risk fines of as much as fifty million euros if they don’t quickly remove prohibited content when notified about it.

The First Amendment would likely preclude German-style regulation here, in any event; for better or worse, America is a nation forged from raucous speech. Yet it’s one thing to defend openness and another to tolerate malign interference in election campaigns. The challenge is to combat external propaganda and bot-farmed lies without allowing Facebook, Google, and the like to become even more powerful arbiters of news or public debate. The companies themselves must strike this balance. One way would be to do much more to affirmatively promote fact-based journalism. As for Alex Jones and his fevered legions, assuming they were subjected last week to the same rules that all other users of Apple, Google, and Facebook must comply with, they can now adapt or go elsewhere. Still, we should be wary of celebrating any instance of censorship, especially by opaque corporations. There are other ways to challenge hatemongers—at the voting booth, for example. Practices that marginalize the unconventional right will also marginalize the unconventional left. In these unsettled times, the country could use more new voices, not fewer. From its origins, the American experiment has shown that it is sometimes necessary to defend the rights of awful speakers, for the sake of principles that may help a free and diverse society renew itself. ♦