YouTube just took all paid advertising away from videos that contain anti-vax messages, rightfully saying they contain dangerous and harmful content.

Before we celebrate that decision, it’s important to note that the move wasn’t entirely voluntary. In fact, the company only made the change after advertisers started to pull their own ads from the anti-vax videos… which only happened after BuzzFeed News asked them about it.

YouTube on Friday said it would prevent channels that promote anti-vax content from running advertising, saying explicitly that such videos fall under its policy prohibiting the monetization of videos with “dangerous and harmful” content. The move comes after advertisers on YouTube pulled their ads from these videos, following inquiries from BuzzFeed News. “We have strict policies that govern what videos we allow ads to appear on, and videos that promote anti-vaccination content are a violation of those policies. We enforce these policies vigorously, and if we find a video that violates them, we immediately take action and remove ads,” a YouTube spokesperson said in an email statement to BuzzFeed News. Earlier this week, BuzzFeed News found that while YouTube usually returns a top search result for queries like “are vaccines safe” from an authoritative source such as a children’s hospital, its Up Next algorithm frequently suggested follow-up recommendations for anti-vaccination videos.

Apparently, seven different advertisers said they didn’t know their ads were being shown on anti-vax videos that promote false beliefs.

This issue with YouTube’s algorithm has been a consistent problem. In fact, just last week we reported on a study that showed most Flat Earthers credit YouTube for their conversion. That’s in part because, once you watch one conspiracy video, YouTube will often recommend Flat Earth “documentaries.”

YouTube hasn’t addressed the Flat Earth issue just yet — no one’s dying from it — but because the advertisers’ retreat from anti-vax videos is now affecting the Google-owned company’s bottom line, it seems to be in crisis-management mode.

In addition to demonetizing anti-vax content, YouTube also introduced a new information panel pertaining to vaccines. Previously, information panels appeared only on anti-vax videos that explicitly mentioned the measles, mumps, and rubella (MMR) vaccine, and they merely described what the MMR vaccine is for and linked to its Wikipedia page. Now, a considerably larger number of anti-vax videos carry an information panel that links to the Wikipedia page for “vaccine hesitancy,” which the World Health Organization names one of the top ten global health threats of 2019.

This is a step in the right direction, but it’s undeniable that YouTube has become a breeding ground for false and even potentially dangerous conspiracy theories. That’s not to say the company itself endorses these ideas, but it provides a platform where those views can thrive, and it hasn’t done enough to put a stop to it. It’s a problem every social media outlet has to deal with. In the case of vaccinations, though, sowing doubt puts lives at risk. Removing the incentive for content creators to spread dangerous myths is one way to get in front of the problem.

(Image via Shutterstock)

