Google CEO Sundar Pichai admitted today that YouTube needs to do better in dealing with conspiracy content on its site that can lead to real-world violence. During his testimony on Tuesday before the House Judiciary Committee, the exec was questioned on how YouTube handles extremist content that promotes conspiracy theories like Pizzagate and, more recently, a Hillary Clinton-focused conspiracy theory dubbed Frazzledrip.

According to an article in Monday’s Washington Post, Frazzledrip is a variation on Pizzagate that began spreading on YouTube this spring.

In a bizarre series of questions, Rep. Jamie Raskin (D-MD) asked Pichai if he knew what Frazzledrip was.

Pichai replied that he was “not aware of the specifics about it.”

Raskin went on to explain that YouTube’s recommendation engine has been suggesting videos claiming that politicians, celebrities and other leading figures were “sexually abusing and consuming the remains of children, often in satanic rituals.” He said these new conspiracist claims echoed the discredited Pizzagate conspiracy, which two years ago led a man to fire shots inside a Washington, D.C. pizzeria in search of children he believed were being held there as sex slaves by Democratic Party leaders.

Raskin then described the newer Frazzledrip theory in more detail, drawing on The Washington Post’s report about the hateful conspiracies still rampant on YouTube. This conspiracy claims that Hillary Clinton and longtime aide Huma Abedin sexually assaulted a girl and drank her blood.

The Post said some of the video clips, which first appeared in April, were removed after being debunked, but its review found dozens of videos where the claims were still being discussed. Combined, those videos had been viewed millions of times over the past eight months. The investigation also found that YouTube’s search box would surface these videos when people typed in terms like “HRC video” or “Frazzle.”

YouTube’s policy doesn’t prevent people from uploading falsehoods, the Post’s report noted.

Raskin asked Pichai about this type of extremist propaganda.

“What is your company policy on that? And are you trying to deal with it?” he asked.

Pichai admitted, essentially, that YouTube needed to do better.

“We are constantly undertaking efforts to deal with misinformation. We have clearly stated policies, and we have made lots of progress in many of these areas over the past year — so, for example, in areas like terrorism, child safety, and so on,” said Pichai. “We are looking to do more.”

As for the Frazzledrip theory specifically, he noted that it had emerged only recently.

“But I’m committed to following up on it and making sure we are evaluating these against our policies,” the CEO promised.

The issue with videos like Frazzledrip is that YouTube’s current policies don’t fully encompass how to handle extremist propaganda. Instead, as the Post also said, its policies focus on videos with hateful, graphic and violent content directed at minorities and other protected groups. Meanwhile, it seeks to allow freedom of speech to others who upload content to its site, despite the disinformation they may spread or their potential to lead to violence.

The balance between free speech and content policies is a delicate matter — and an important one, given YouTube’s power to influence dangerous individuals. In addition to the Pizzagate shooter, the mass shooter who killed 11 people at the Pittsburgh synagogue in October had been watching neo-Nazi propaganda on YouTube, the Post’s report pointed out, in another example.

Asked what YouTube was doing about all this, Pichai didn’t offer specifics.

The CEO instead admitted that YouTube struggles with evaluating videos individually because of the volume of content it sees.

“We do get around 400 hours of video every minute. But it’s our responsibility, I think, to make sure YouTube is a platform for freedom of expression, but it’s responsible and contributes positively to society,” Pichai said. He added that its policies allow it to take down videos that “incite harm or hatred or violence.” But conspiracy videos don’t always directly incite violence — they just radicalize individuals, who then sometimes act out violently as a result.

“It’s an area we acknowledge there’s more work to be done, and we’ll definitely continue doing that,” Pichai said. “But I want to acknowledge there is more work to be done. With our growth comes more responsibility. And we are committed to doing better as we invest more in this area,” he said.