SAN FRANCISCO (Reuters) - YouTube said on Wednesday it would remove videos that deny factual catastrophes such as the Holocaust ever happened and stop sharing ad revenue with channels that skirt too close to its rules, a major policy reversal as it fights criticism that it provides a platform for hate speech and harassment.

The streaming service, owned by Alphabet Inc’s Google, said it was taking aim at videos claiming school shootings and other “well-documented violent events” are hoaxes. It also will remove videos that glorify Nazi ideology or promote groups that claim superiority to justify discrimination.

In addition, video creators who repeatedly brush up against YouTube’s hate speech rules, even without violating them, will be removed from its advertising revenue-sharing program, YouTube spokesman Farshad Shadloo said.

YouTube applied the policy with immediate effect, announcing on Twitter that it suspended “monetization” for self-described conservative comedian Steven Crowder after days of complaints from journalists and gay rights advocates that his comments about a gay journalist amounted to bigotry.

Crowder responded in a Twitter video that the policy would hurt many creators.

“The ability for one to make a living online, particularly on YouTube, is about to change drastically,” he said.

Criticism of the policy swelled on Twitter as several users said they also received monetization suspensions. They said they were told they could change their content and re-apply to the program in 30 days.

“Taking steps impacting people’s speech should be done with care, with attention to context, with clarity and transparency, and a meaningful, timely opportunity to appeal,” said Katharine Trendacosta, manager of policy and activism at the online rights group Electronic Frontier Foundation. “What YouTube is doing does not appear to follow these ideas.”

YouTube for years has stood by its policy of allowing diverse commentary on history, race and other fraught issues, even if some of it was objectionable to many users.

Regulators, advertisers and users have said that free speech should have its limits online, where conspiracies and hate travel fast and can radicalize viewers. The threat of widespread regulation, and a few advertiser boycotts, appear to have spurred more focus on the issue from YouTube and researchers.

EXPERT VIEWS SOUGHT

In a blog post, the company did not explain why it changed its stance but said it consulted with dozens of experts in extremism and civil rights.

YouTube acknowledged the new policies could hurt those who seek out objectionable videos “to understand hate in order to combat it.” The policies will also frustrate free speech advocates.

“YouTube as a private company is well within its rights,” said Jennifer Granick, a speech and technology expert at the American Civil Liberties Union. But “YouTube will make mistakes, and over-censor.”

Jonathan Greenblatt, chief executive of the Anti-Defamation League, which researches anti-Semitism, said it had provided input to YouTube on the policy change.

“While this is an important step forward, this move alone is insufficient and must be followed by many more changes from YouTube and other tech companies to adequately counter the scourge of online hate and extremism,” he said in a statement.

Other types of videos to be removed under YouTube’s new rules include conspiracy theories about Jews running the world, calls to deny women civil rights on the grounds they are less intelligent than men, and some white nationalist content, Shadloo said.

YouTube said creators in the revenue-sharing program who are repeatedly found posting borderline hate content would be notified when they do it one too many times and could appeal their termination. The company did not respond to questions about what the limit on such postings would be.

The policy applied to Crowder’s comedy channel because its “pattern of egregious actions has harmed the broader community,” YouTube said.