YouTube is changing its community guidelines to ban videos promoting the superiority of any group as a justification for discrimination against others based on their age, gender, race, caste, religion, sexual orientation, or veteran status, the company said today. The move, which bans videos promoting Nazism and other discriminatory ideologies, is expected to result in the removal of thousands of channels across YouTube.

“The openness of YouTube’s platform has helped creativity and access to information thrive,” the company said in a blog post. “It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence.”

The changes announced on Wednesday attempt to improve YouTube’s content moderation in three ways. First, the ban on supremacists will remove Nazis and other extremists who advocate segregation or exclusion based on age, gender, race, religion, sexual orientation, or veteran status. In addition to those categories, YouTube is adding caste, which has significant implications in India. The policy also covers denial of “well-documented violent events,” such as the Sandy Hook elementary school shooting and 9/11: users are no longer allowed to post videos saying those events did not happen, YouTube said.

Second, YouTube said it would expand efforts announced in January to reduce the spread of what it calls “borderline content and harmful misinformation.” The policy, which applies to videos that flirt with violating the community guidelines but ultimately fall short, aims to limit the promotion of those videos through recommendations. YouTube said the policy, which affects videos such as those promoting flat-earth theories or phony miracle cures, had already decreased the number of views that borderline videos receive by 50 percent. In the future, the company said, it will recommend videos from more authoritative sources, like top news channels, in its “next watch” panel.

Finally, YouTube said it would restrict channels from monetizing their videos if they are found to “repeatedly brush up against our hate speech policies.” Those channels will not be able to run ads or use Super Chat, which lets channel subscribers pay creators directly for extra chat features. The last change comes after BuzzFeed reported that the paid commenting system had been used to fund creators of videos featuring racism and hate speech.

In 2017, YouTube took a step toward reducing the visibility of extremists on the platform when it began placing warnings in front of some videos. But it has come under continued scrutiny for the way that it recruits followers for racists and bigots by promoting their work through recommendation algorithms and prominent placement in search results. In April, Bloomberg reported that videos made by far-right creators represented one of the most popular sections of YouTube, along with music, sports, and video games.

At the same time, YouTube and its parent company, Alphabet, are under growing political pressure to rein in the bad actors on the platform. The Christchurch attacks in March led to widespread criticism of YouTube and other platforms for failing to immediately identify and remove videos of the shooting, and several countries have proposed laws designed to force tech companies to act more quickly. Meanwhile, The New York Times found this week that YouTube algorithms were recommending videos featuring children in bathing suits to people who had previously watched sexually themed content — effectively generating playlists for pedophiles.

YouTube did not disclose the names of any channels that are expected to be affected by the change. The company declined to comment on a current controversy surrounding my Vox colleague Carlos Maza, who has repeatedly been harassed on the basis of his race and sexual orientation by prominent right-wing commentator Steven Crowder. (After I spoke with the company, it responded to Maza that it plans to take no action against Crowder’s channel.)

Still, the move is likely to trigger panic among right-wing YouTube channels. In the United States, conservatives have promoted the idea that YouTube and other platforms discriminate against them. Despite the fact that there is no evidence of systematic bias, Republicans have held several hearings over the past year on the subject. Today’s move from YouTube is likely to generate a fresh round of outrage, along with warnings that we are on the slippery slope toward totalitarianism.

Of course, as the Maza case has shown, YouTube doesn’t always enforce its own rules. It’s one thing to make a policy, and it’s another to ensure that a global workforce of underpaid contractors accurately understands and applies it. It will be fascinating to see how the new policy, which prohibits “videos alleging that a group is superior in order to justify ... segregation or exclusion,” will affect discussion of immigration on YouTube. The company says that political debates about the pros and cons of immigration are still allowed, but a video saying that “Muslims are diseased and shouldn’t be allowed to migrate to Europe” will be banned.

The changed policy goes into effect today, YouTube said, and enforcement will “ramp up” over the next several days.