“This edit was VERY poor,” wrote James Heilman, an emergency room doctor in British Columbia, to a Wikipedia contributor who had made a couple of changes toward the end of the article on the new coronavirus outbreak. Those edits recommended a special type of mask for blocking the transmission of the virus from those who have it, and Heilman, a prominent figure in reviewing medical Wikipedia articles, wanted to inform the editor that this advice was too sweeping and based on insufficient evidence. More than that, he aimed to send a warning. “Please do not make edits like this again,” he wrote.

Wikipedia’s reputation is generally on the ascent. Just last month, no less a publication than WIRED deemed it “the last best place on the internet.” What was once considered the site’s greatest vulnerability—that anyone can edit it—has been revealed to be its greatest strength. In the place of experts, there are enthusiasts who are thrilled to share their knowledge of a little part of the world with all of humanity. As Richard Cooke, who wrote the WIRED essay, observed: “It’s assembled grain by grain, like a termite mound. The smallness of the grains, and of the workers carrying them, makes the project’s scale seem impossible. But it is exactly this incrementalism that puts immensity within reach.”


His point, and it’s really indisputable, is that this mammoth online project has developed a personality, a purpose, a soul. Now, as the new coronavirus outbreak plays out across its many pages, we can see that Wikipedia has also developed a conscience.

The coronavirus articles on English Wikipedia are part of WikiProject Medicine, a collection of some 35,000 articles that are watched over by nearly 150 editors with interest and expertise in medicine and public health. (A survey for a paper co-written by Heilman in 2015 concluded that roughly half of the core editors had an advanced degree.) Readers of Wikipedia wouldn’t know that an article is part of the project—the designation appears on a separate talk page and really serves as a heads-up to interested editors to look carefully at the entries.

Once an article has been flagged as relating to medicine, the editors scrutinize it with exceptional ferocity. While typically an article in The New York Times or The Wall Street Journal would be a reliable source for Wikipedia, the medical editors insist on peer-reviewed papers, textbooks, or reports from prominent centers and institutes. On these subjects, Wikipedia doesn’t seem like the encyclopedia anyone can edit, striving to be welcoming to newcomers; it certainly doesn’t profess a laid-back philosophy that articles improve over time and can start off a bit unevenly. The editor chastised by Heilman hasn’t returned to the article and instead is improving articles about sound-recording equipment.

By having these different standards within its pages, Wikipedia can be a guide to the big commercial platforms that have become way stations for fake cures, bogus comparisons to past outbreaks, and political spin. Twitter, Amazon, YouTube, and Facebook have all promised to cleanse their sites of this dangerous disinformation, but they are doing so in fits and starts and by relying in part on familiar, passive tools like acting when others flag dangerous content. Here is how Facebook's Mark Zuckerberg put it in a post on March 3: “It’s important that everyone has a place to share their experiences and talk about the outbreak, but as our community standards make clear, it’s not okay to share something that puts people in danger. So we’re removing false claims and conspiracy theories that have been flagged by leading global health organizations. We’re also blocking people from running ads that try to exploit the situation—for example, claiming that their product can cure the disease.”