After months of stubborn resistance, Facebook has finally caved to pressure from users and officials around the globe: it will begin experimenting with ways of filtering its news content for so-called “fake news.” These are the outlets that cynically push elaborate conspiracy theories like “Pizzagate,” and much more straightforward lies like fake Pope quotes, while also attempting to hide their nature by mimicking the style and naming schemes of traditional news outlets. Some of the sites’ own proprietors admit the content is made up — but even in fighting such a blatant problem, moderating political speech in its new, most relevant forum is a dangerous idea. Facebook could very well be starting down a path that actually fosters less truth, less freedom in the conversation, and an even sharper level of political polarization than we see today.

Ramparts Magazine (1962 – 1975) was a left-wing rag, a frequent mouthpiece for KGB misinformation, and a booster of rank conspiracy theories — and we would be much worse off today had it never existed. The extreme viewpoints of the Ramparts staff, and the associated willingness to work outside of traditional systems of power, made the magazine the perfect platform for some of the most necessary journalism of the time. Ramparts published the first-ever interview with an NSA leaker/whistleblower, providing the very first look at the incredible sophistication of global surveillance, and it helped expose the CIA’s use of aid organizations to provide cover for covert actions in Vietnam. Its founders have gone on to help create publications like Rolling Stone, Mother Jones, and TruthDig.

The point is not that Ramparts was denounced by the establishment as propaganda and, yes, fake news, but that Ramparts really was both of those things, from time to time. Back when ideas could only really be disseminated over government-regulated airwaves and newsstands on government-run streets, the constitutional principle of free speech protected outlets like Ramparts. More extreme political magazines and even proto-trolling publications like the National Enquirer dropped the bar even lower, kept in check by little more than plagiarism and libel laws. Back then, the question of what to do about fake news was largely academic — with the exception of anti-socialists in the Cold War, few wanted the government to crack down on speech. Since the government was the only body that could crack down, the whole situation ultimately came to nothing (except for the socialists).

Now, of course, the situation is very different. In most places, the conventional airwaves host a smaller and smaller proportion of the most important content, and the weakness of the modern media business keeps well-meaning new entrants from being able to compete in physical space. In the internet age, the most important point of distribution for news and information is social media — but since privately held companies like Facebook aren’t subject to the same restrictions as the government, they have far more leeway in how to take advantage of that position of power.

This means it’s worth asking what we might have lost, had Ramparts’ genuinely incorrect assertions about, say, the Kennedy assassination, been able to trigger a freeze on the majority of the magazine’s distribution. Personally, I think we gained more through the dissemination of the magazine’s true ideas than we lost through the dissemination of its false ones, and I think its more ideological approach to journalism is directly responsible for its ability to dig up or attract those few vital stories it reported that more traditional outlets could not. Radical political newsletters have always been a vital part of challenging power and keeping the government in check, but they’ve also been a constant source of offensive or outright false information.

Though they might not even see it this way themselves, from an outside perspective the overall strategy of these publications could be generously described as lowering the standards of evidence so they can catch those truths that just so happen to be very difficult to verify to a newspaper editor’s satisfaction, but which might nonetheless be true and important. This approach can obviously serve as cover for less principled ends; the aforementioned National Enquirer has published some interesting exposés, and recently managed to dig up a story that got it genuine consideration for a Pulitzer Prize, but the tabloid’s net effect overall has still been to lower the amount of true and important knowledge in the world.

In this way, fake news of the modern sort is not so different from the various political rags of old, and especially similar if much of it turns out to be foreign counter-intelligence. So it’s tempting to think that the impact might not be much greater, and that all the current worrying is for nothing — but those radical old magazines were visibly different from newspapers, they were found in different places, and they came from different people. On Facebook, and the internet in general, there are fewer built-in sorting methods to let people naturally differentiate the status quo from the radical outsider, and adjust their thinking accordingly.

In the face of this new trend, and still reeling from an upset Trump victory that many partially credit to fake news stories, left-leaning users and politicians alike demanded that Facebook step up and save them from extreme gullibility — that is, the extreme gullibility of other people. But Zuckerberg doesn’t want to step up and enforce a blacklist of fake cash-grab news sites, because then he would have to take responsibility for that list being too harsh, not harsh enough, or more likely both at the same time. So, Facebook is offloading this problem on the same team of problem-solvers Silicon Valley has used so many times in the past: the crowd.

Facebook wants to begin using a more aggressive “flagging” system so users can alert it to potentially fake news. If the social media giant ends up rejecting the vast majority of these take-down mobs and removes only the most obvious and egregious examples of fake news, this could very well work. If it begins to take down what we might call genuine ignorance, rather than just cynical trickery, it will meaningfully harm democracy and society as a whole by stifling free (yes, even sometimes false) expression. This has the potential to become little more than users voting on the legitimacy of ideas — which would be pretty ironic, as part of a quest to reinforce the importance of objective truth.

In addition, this all has the potential to increase the echo-chamber aspect of social media. If flagging of fake — or potentially, someday, just offensive — news becomes widespread, the natural shift will be toward making sure your news postings only go out to a group that is less and less likely to flag them. The flag, implemented too eagerly, could end up causing a decline in omnivorous political reading just as easily as a decline in fake news — or an even more extreme split, in which a large portion of America moves away from Facebook entirely, and toward a more segregated and “friendly” environment.

Thankfully, Facebook seems to be extremely reluctant to get involved at all, as seen in its decision to pull humans out of trending news entirely. This might not just be up to Mark Zuckerberg, however, as legislators in the EU and elsewhere are looking to blow right past any sort of reasonable implementation and fine Facebook 500,000 Euro if it doesn’t delete a particular link within 24 hours of a government’s all-knowing demand to do so. Such a policy basically destroys the distinction between private and government censorship, and it’s an interesting idea to see coming out of a country like Germany, which is usually so touchy about anything remotely resembling fascism.

These are the tools necessary to do real censorship of real and important ideas. Whether Facebook intends to use them that way initially or not — and I’m sure it doesn’t — the simple fact is that there’s no other way a trend toward censorship could possibly start than this. Simply by setting a precedent that this can be done, by creating the software tools and acclimatizing the public to the idea that their view of current events has been pre-fact-checked by their peers and even less accountable forces, Facebook is stepping toward a precipice. And when it has significantly more users than China has citizens, that means it’s taking the whole world right along with it.

If it accepts the impossibility, and indeed the active harm, of trying to root out all falsehood from Facebook, this new approach to news doesn’t have to be a disaster. But with governments, users, and shifting corporate leaders all struggling to reform the conversation according to their own best interests, such a measured way forward seems unlikely, to say the least.