“People forget that Google owns the biggest search engine in the world—and it also owns the second-biggest search engine in the world, which is YouTube,” said Joshua Benton, the director of the Nieman Journalism Lab at Harvard.

“Any document has multiple meanings. It can be used to promote an ideology, or it can be used to study an ideology. When you have a video document like this that is clearly created to inform people about what is going on, it seems like a bad idea to lump that in with raw hate speech,” he added. “That this happened to a known publisher, without any notification initially, is pretty disappointing.”

The Spencer video did not feature only Nazi salutes and an apparent allusion to “Sieg Heil!” During the three minutes of footage, Spencer refers to “the mainstream media” as the “Lügenpresse,” a term used by Adolf Hitler and other members of his regime to discredit critics in the free press.

“America was until this past generation a white country designed for ourselves and our posterity,” Spencer says in the footage. “It is our creation, it is our inheritance, and it belongs to us.”

Ultimately, the video prompted apparent divisions in the American far-right, splintering it into a hyper-reactionary “alt-lite” and a Nazi-aligned “alt-right.”

Spencer later appeared at the “Unite the Right” rally in Charlottesville, Virginia, in August 2017, which included neo-Nazis, the Ku Klux Klan, and other ethno-nationalist groups. A 32-year-old woman, who was protesting the presence of white nationalists in Charlottesville, was killed at that rally after a car plowed into a crowd. A man with neo-Nazi ties has been charged with first-degree murder for her death.

Spencer led another torchlit march in Charlottesville less than two months later.

YouTube has struggled to deal with the presence of both white nationalists and Islamist extremists on its platform since last year. In a blog post in August 2017, the company announced that it would remove some features from “videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism.”

It also bragged that “better detection and faster removal driven by machine learning” were helping it identify extremist videos faster than ever before.

“YouTube has become our civilization’s collective visual memory. And bravo to them for making a great service and for being so successful, but there should be some responsibility that comes with that role,” Benton told me. He noted that when Google removes an entry from search results because of a copyright notice, it informs users on the results page.

“Taking down a legitimate video by a known publisher without informing either the user or the publisher doesn’t seem to recognize that responsibility,” he said.

“It is so much easier to complain about Nazis on a platform than it is to actually deal with them,” said Benton. The challenge of moderating YouTube, he added, is “not that far removed from having to edit all human speech.”