The ongoing controversy over Twitter’s decision not to permanently ban Infowars extremist Alex Jones following similar moves by companies such as Apple and Facebook landed Twitter CEO Jack Dorsey in front of NBC’s Lester Holt last week. Dorsey attempted to explain himself on a variety of thorny questions involving free speech and what he refers to as “sensationalist” and “de-humanizing” use of his platform, on which misinformation and harassment continue to be serious problems. One particular exchange between Holt and Dorsey highlighted a significant flaw in the logic of social media executives confronting misinformation: the notion that debunking falsehoods is the job of journalists, not the responsibility of the platforms themselves.

In tweets responding to the Infowars controversy, Dorsey stated that “accounts like Jones' can often sensationalize issues and spread unsubstantiated rumors, so it’s critical journalists document, validate, and refute such information directly so people can form their own opinions. This is what serves the public conversation best.” Holt questioned Dorsey on this point: “You put the onus on journalists to police some of the misinformation that’s put out there, is that fair?” In reply, Dorsey admitted to Holt: “No I don’t believe it’s completely fair. I went a little too far in that push.”

While Dorsey appeared chastened, the idea that it is someone else’s job to confront the problem is core to the way social media companies look at the ongoing challenges of misinformation and online toxicity. Dorsey’s initial statement betrayed a deeper truth: Powerful companies like Facebook, YouTube and Twitter increasingly rely on the efforts of others to address the spread of misinformation on their platforms.

And it’s not just journalists they hope to conscript into the battle. Consider YouTube’s announcement earlier this year that it would rely on Wikipedia to address conspiracy theories (the company failed to inform Wikipedia before it made this announcement), or Facebook’s seeming reliance on a loose network of volunteers to spot fake accounts and debunk misinformation its own teams miss.

Issues of misinformation and online toxicity are challenging nations across the globe, and the reality is that these problems have grown too big for the companies to address on their own. Evidence of the real-world consequences is piling up. That’s why it’s time to think about radical solutions. Here’s one: Governments should place levies on social media company profits to pay for measures that will address the externalities they produce.

This idea is not new. Carbon taxes offer a useful parallel. Governments across the globe have come to the conclusion that the emission of harmful greenhouse gases can be quantified, and that companies producing these harmful emissions should pay for their remediation. Why not treat the externalities created by social media companies as pollution? Cleaning up this pollution can and should be paid for by the companies that create it.

Such a solution has already been proposed in the United Kingdom, where a parliamentary committee looking at the problems of misinformation suggested levies as a mechanism to expand the function of Britain’s watchdog agency, the Information Commissioner’s Office, and to pay for digital literacy campaigns. Expanding the funding of necessary oversight and programs to help citizens of all ages cope with the overload of information and think critically about the health effects of social media consumption seems fair. After all, these companies have been allowed by democracies to achieve enormous market dominance and systemic importance.

What else might such levies fund? Perhaps the social media companies should be required to create trusts that fund a massive injection of capital into journalism. Google has voluntarily committed $300 million to such efforts to some good effect; what would a multiple of that number look like? It’s a rounding error at one level: Google’s parent company, Alphabet, saw revenues of $111 billion in 2017. Facebook saw net income of nearly $16 billion in 2017 on revenues of $40 billion.

Meanwhile, the combined advertising and circulation revenues of the entire newspaper industry in the United States in 2017 were $27.5 billion, a number that has shrunk precipitously even as the platforms continue to grow. “A strong news industry is also critical to building an informed community,” Mark Zuckerberg wrote last year. “Giving people a voice is not enough without having people dedicated to uncovering new information and analyzing it.” Sounds good — but perhaps democracies ought not rely on Zuckerberg’s limited generosity, and instead put some serious demands on his company.

“Addressing these diseconomies of scale — negative externalities borne by users and society as a result of the size of these platforms — represents a priority for technology policy in the 21st century,” Sen. Mark Warner wrote in a recent technology policy paper. These companies cannot address the problems they have created alone — now is the time to think about how to capitalize solutions at scale. Whether we are able to engage with our fellow citizens in a healthy information environment may well determine the fate of our democracy.

Justin Hendrix is Executive Director of NYC Media Lab, a consortium of universities and media companies exploring emerging media technology. NYC Media Lab’s membership includes NBCUniversal. Opinions expressed here are entirely his own. Follow him on Twitter (@justinhendrix).