YouTube on Wednesday announced that it updated its policies to officially ban videos that promote extremist ideologies such as white supremacy or caste superiority, a move that will likely result in thousands of videos being removed.

In a blog post on Wednesday, YouTube said it will begin "prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status."


The Google-owned company said videos that "glorify Nazi ideology" would fall under that category.

It also said it would begin promoting more "authoritative content" in recommendations, an attempt to address the deluge of conspiracy theories on the platform and criticisms that users are often drawn into a "rabbit hole" of false or hateful content by YouTube's recommendation algorithm.

"For example, if a user is watching a video that comes close to violating our policies, our systems may include more videos from authoritative sources (like top news channels) in the "watch next" panel," YouTube wrote.

The policy changes come two years after YouTube announced that it would begin limiting the distribution of videos promoting "supremacist content."

Now, the company says it will bar creators who violate its hate speech policies from running ads on their channel or otherwise monetizing their content.

"It’s critical that our monetization systems reward trusted creators who add value to YouTube," the company wrote. "We have longstanding advertiser-friendly guideline that prohibit ads from running on videos that include hateful content and we enforce these rigorously."

"In the case of hate speech, we are strengthening enforcement of our existing YouTube Partner Program policies," it wrote. "Channels that repeatedly brush up against our hate speech policies will be suspended from the YouTube Partner program, meaning they can’t run ads on their channel or use other monetization features like SuperChat."

YouTube has faced staunch criticism, at times from lawmakers on Capitol Hill, for its handling of hate speech and harassment. Critics have accused the company of allowing white supremacists to gain enormous followings and monetize their hateful content without any penalties from YouTube, which has explicit policies against hate speech and harassment.

On Tuesday night, YouTube roiled its critics again when it announced that it would not be taking action against conservative commentator Steven Crowder, who has been making videos using racial and homophobic slurs against Vox Media journalist Carlos Maza for two years. Maza in a viral Twitter thread last week posted a montage of Crowder calling him an array of discriminatory names, including "lispy queer" and the "gay Latino from Vox," and accused YouTube of allowing bigots to profit off of hateful and discriminatory content.

YouTube in response said it had investigated the flagged videos and determined that while Crowder's words were "hateful," they did not violate its policies.

All of the top social media platforms, including YouTube, have faced intensifying scrutiny over how they deal with hateful and extremist speech. Facebook in March announced it would begin banning white nationalist and white separatist content on its platform, though reports have since shown neo-Nazis have still found a home there. And while Twitter has vowed to address the "health" of conversation on its platform, white nationalists have still gained significant followings on the website.

"We recognize some of this content has value to researchers and NGOs looking to understand hate in order to combat it, and we are exploring options to make it available to them in the future," YouTube said on Wednesday about the ban on supremacist content. "And as always, context matters, so some videos could remain up because they discuss topics like pending legislation, aim to condemn or expose hate, or provide analysis of current events."

The company said it will begin enforcing the new policy on Wednesday, but that it will likely take more time for its systems to "fully ramp up."