Teachers are unable to upload historical footage of Nazi leader Adolf Hitler to YouTube after the video-hosting giant announced it would remove material that denies the Holocaust or glorifies Nazism, the Guardian reported Thursday.

British teacher Richard Jones-Nerzic told the newspaper that the policy was particularly problematic given that history curricula place a strong focus on World War II.

He told the newspaper that some clips he uploaded to his channel from old documentaries about the rise of Nazism now carry warnings that users may find the material offensive, while others were removed completely.


Jones-Nerzic said he was appealing the decision by YouTube, saying it was a “form of negationism or even Holocaust denial.”

Teachers reportedly find the move particularly problematic because they use the site to show clips to students in class.

Scott Allsop, who runs an educational history website and teaches at an international school in Romania, said his YouTube channel was deleted for breaching hate speech guidelines.

YouTube sent him an email saying his channel was removed for uploading “content that promotes hatred or violence against members of a protected group.” His account was later reinstated after an appeal but he told the Guardian he had heard from other teachers who had the same problem.

The move by YouTube to delete videos that deny the Holocaust or glorify Nazism was announced on the company’s official blog, and marks a new stage in the battle against hate speech, the company said.

In 2017, Google’s video-sharing platform took a tougher stance against supremacist content, limiting actions such as sharing, recommending and commenting on clips. That policy, according to the company, reduced views of those videos by an average of 80 percent.

This latest stage aims to prohibit videos alleging that a “particular group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.” The latter refers to the status of a person who has performed active duty in the military.

“This will include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory,” the company said.

“Finally, we will remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place.”

YouTube said it realized that content of this kind was valuable to researchers and NGOs trying to understand hate in order to combat it and that it was looking at ways to make it available to them in the future.

“And as always, context matters, so some videos could remain up because they discuss topics like pending legislation, aim to condemn or expose hate, or provide analysis of current events,” it said.

“We will begin enforcing this updated policy today; however, it will take time for our systems to fully ramp up and we’ll be gradually expanding coverage over the next several months.”

YouTube said that it piloted an update of its US systems in January to limit recommendations of “borderline content and harmful misinformation, such as videos promoting a phony miracle cure for a serious illness, or claiming the earth is flat.”

It said it hoped to bring the updated system to more countries by the end of the year.

“Thanks to this change, the number of views this type of content gets from recommendations has dropped by over 50% in the US.”