As the Wuhan coronavirus spreads, so too does misinformation about it across YouTube, Facebook, and Twitter.

There are nearly 3,000 confirmed cases of the SARS-like respiratory virus in China, five in the U.S., and two in Canada. So far, 82 people have died. Fifteen cities in China have been placed under full or partial lockdown, Hong Kong’s chief executive Carrie Lam has declared the outbreak an emergency, and the U.S. State Department today issued an official advisory telling citizens to reconsider traveling to China.

Scientists believe the virus may have originated in snakes, but some denizens of the internet have instead latched on to conspiracy theories to explain its origins. According to a new report from the Washington Post, people on Facebook have spread claims that the U.S. government created the virus or bought a patent for it, and have suggested the virus is a form of population control (again, created/deployed by the government). Twitter users are sharing racist claims that Chinese dietary habits sparked the virus. YouTube videos making similar claims are popping up, with one reportedly reaching at least 430K views.

On top of all this, and perhaps more dangerously, users are beginning to hawk supposed cures or preventative measures, talking up things like oregano oil or colloidal silver (neither of which is effective against coronavirus). One YouTube video, which WaPo reports has more than 20K views, falsely claims the virus has already killed 180,000 people, and offers fake cures.

YouTube tells Tubefilter it combats the spread of false information by surfacing authoritative content like trustworthy news sources in its search results and Up Next panels. For breaking news topics like coronavirus, it adds short previews of text-based news articles interspersed between videos, and posts a reminder that breaking and developing news can rapidly change, the platform says.

When we searched for “coronavirus” on YouTube, we were served dozens of videos from outlets like Global News, CBC, Bloomberg, TIME, NBC, and ABC. We did not see text articles or a reminder.

YouTube also tells us that videos containing false information generally don’t violate its Community Guidelines unless they cross the line into hate speech, harassment, inciting violence, or propagating scams.

All three platforms tweak content so users are less likely to see misinformation

As for how Facebook and Twitter are handling the situation…Facebook has mobilized its third-party fact-checkers to mark as false any posts claiming coronavirus is fake or a government invention, as well as posts hawking fake cures. It also lowers those posts’ ranks, so they’re less likely to show up in users’ main feeds even if shared by friends. Additionally, organizations that work with Facebook are issuing statements of facts about coronavirus that directly contradict conspiracy theories being spread on the site, WaPo reports.

“This situation is fast-evolving and we will continue our outreach to global and regional health organizations to provide support and assistance,” Andy Stone, a Facebook spokesman, told the outlet.

Twitter, like YouTube, is pushing legit information in its search results. Spokesperson Katie Rosborough said it’s expanding a feature to the Asia-Pacific region so “when an individual searches a hashtag they’re immediately met with authoritative health info from the right sources up top.” (It’s not clear where else that feature is available, or if Twitter is using it to combat coronavirus misinformation in regions outside APAC.)

Tweets that spread misinformation are apparently being removed as violations of Twitter’s policy against coordinated efforts to mislead users.

All three platforms have faced criticism for how they keep misinformation or unwanted content like graphic videos from spreading across their sites, but YouTube is perhaps the most embattled. Most recently, a study alleged that some climate change-related videos recommended in its Up Next bar contain false information, such as outright denial that climate change is happening or claims that humans play no part in global warming. YouTube has called the study’s methodology into question, and issued a statement reiterating that it surfaces “authoritative voices” on topics often cluttered with misinformation.