YouTube deleted over eight million videos during the fourth quarter of 2017 for violating its content policies, according to numbers released by the company on Tuesday.

Over six million of these videos were deleted automatically before any viewers saw them. The company said that most of the deleted videos were spam or adult content.

The numbers in YouTube’s new Community Guidelines enforcement report are part of Google’s quarterly transparency report.

The disclosure comes as YouTube has faced criticism that it does not properly police content on its platform. Over the past two years, major companies have pulled ads from YouTube after their advertisements showed up alongside extremist and hateful content.

Recently, Procter & Gamble and Under Armour pulled ads from the video streaming service over those concerns.

Lawmakers have also scrutinized YouTube over extremist content on its platform.

The company said that it has ramped up its efforts to use algorithms to detect and remove such videos. But it also relied on humans to flag over 1.5 million videos that its automated systems missed.

Google CEO Sundar Pichai touted his company’s efforts in an earnings call on Monday, according to Variety.

“Even as we invest in new experiences, we stay very focused on making sure that YouTube remains a safe platform with great content,” Pichai said. “We are aggressively combating content that violates our strict policies through a combination of user and machine flags.”