New York (CNN Business) YouTube says it will ban supremacist content and remove videos that deny well-documented atrocities, like the Holocaust and the massacre at Sandy Hook Elementary School. The company says it will be removing hundreds of thousands of videos that hadn't previously been considered in violation of its rules.

The move comes as the video service, owned by Google, faces increasing scrutiny for hosting extreme and divisive content.

On Tuesday, however, YouTube said videos mocking Carlos Maza, a video producer for the website Vox, for his sexual orientation did not violate its policies. A series of videos posted by a right-wing commentator included calling Maza "an angry little queer."

In a tweet to Maza, YouTube said, "while we found language that was clearly hurtful, the videos as posted don't violate our policies."

YouTube's ban on supremacist content comes a few months after Facebook said it was banning white nationalist content from its platform. That ban came two weeks after the suspect in the terror attack at two New Zealand mosques streamed part of the massacre live on the platform. A manifesto allegedly written by the suspect revealed white nationalist views.

Lenny Pozner, the father of a Sandy Hook victim who established the HONR network, which monitors the spread of false information online, told CNN Business that "it is impossible to calculate the damage" done by videos denying the massacre took place. But he added, "We thank YouTube for taking the lead in enacting policy changes that we have been recommending for the past half-decade."

An attorney representing 10 of the families who lost relatives in the Sandy Hook massacre said on Wednesday that he welcomed YouTube's decision to remove videos denying the shooting. However, he said it was "too late to undo the harm" that has been caused to them by conspiracy theories circulating on the platform over the last several years.

YouTube has long faced criticism for allowing misinformation, conspiracy theories and extremist views to spread on its platform, and for recommending such content to users. Research has shown that people who visited the site to watch videos on innocuous subjects, or to see mainstream news, were served recommendations steering them toward extreme content.