Want the best from VICE News in your inbox? Sign up here.

Updated: 2:05 p.m. ET

The live chat beside the official YouTube stream of a House Judiciary Committee hearing on hate crimes and white nationalism was so quickly overwhelmed by racist commenters that it had to be disabled within the first 20 minutes.

In a painfully ironic twist, the live chat tool that appeared beside the stream of Tuesday’s hearing on hate crimes and tech’s role in the rise of white nationalism was immediately flooded with hateful comments, coded alt-right terms, and trolls.

A spokesperson for the social media company said it made the decision to disable comments because “hate speech has no place on YouTube.”

“We’ve invested heavily in teams and technology dedicated to removing hateful comments and videos, and we take action on them when flagged by our users,” a YouTube spokesperson told VICE News. “Due to the presence of hateful comments, we disabled comments on the live stream of today’s House Judiciary Committee hearing.”

Representatives from Google, which owns YouTube, and Facebook joined civil rights experts and conservative activist Candace Owens to testify before Congress.

And as they spoke about the many problems ailing these platforms, trolls online helped prove their point.

For example, as Dr. Mohammad Abu-Salha — whose two daughters and son-in-law were murdered near the University of North Carolina at Chapel Hill in 2015 in an apparent hate crime — gave a heartfelt opening statement, viewers spewed hateful sentiments in the chat bar. “Muslim American is an oxymoron,” one commenter wrote. Another posted emojis showing a clown (which the alt-right has recently claimed as a symbol) and the “okay sign,” which is used to signify “white power.”

Another wrote in response to Abu-Salha’s testimony, “HER NAME WAS EBBA ÅKERLUND,” in reference to a young Swedish girl killed in a terrorist attack in 2017. “Ebba had dreams too,” wrote another commenter.

Åkerlund’s death has become symbolic among white supremacists; they use her name to peddle the conspiracy that whiteness is under attack. The manifesto published online by the alleged New Zealand mosque shooter contained numerous references to Åkerlund. Her name was also carved into one of his guns.

Another commenter wrote “screw the dead kebabs.” The New Zealand shooter had painted “kebab remover” on one of the guns he used to kill 50 people last month, and in his manifesto, he described himself as a “kebab removalist.” “Kebab removal” is a common meme on 4chan and 8chan that references the 1990s genocide of Bosnian Muslims, whom the perpetrators derogatorily called “Turks.”

YouTube’s snafu wasn’t lost on members of the committee. “This just illustrates the problem that we’re dealing with,” said House Committee Chairman Jerry Nadler, pointing to reports of the platform's woes during the hearing.

Things didn’t appear to get much better for the video streaming giant after the broadcast ended. YouTube’s algorithm automatically began playing a live streamed segment by Red Ice TV, a far-right outlet popular among white nationalists. The title of the segment was “House Judiciary committee Hearing on Criminalizing Nationalism for White people.” YouTube’s algorithm recommends content to users based on what they’ve previously watched (VICE News’ coverage of extremism, which involves monitoring far-right content online, may have played a role in the algorithm’s suggestion). YouTube did not immediately respond to VICE News’ request for comment.

The video platform has fended off mounting criticism that its algorithm plays a role in the online radicalization of its users. In a recent interview with The New York Times, YouTube chief product officer Neal Mohan denied that the company helps drive users toward extremist content.