YouTube took down 8.3 million videos in the last three months of 2017, responding to criticism it's slow to address inappropriate content on its site.

The majority of those videos were spam or attempts to upload adult content, the video-sharing site revealed Monday in its first quarterly moderation report. More than 80 percent of the videos removed were identified by machines rather than humans, highlighting the company's growing dependence on machine learning to cut down on content that violates its policies.

"Our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas -- like violent extremism -- and in high-volume areas, like spam," the company wrote in a blog post. "We've also hired full-time specialists with expertise in violent extremism, counter-terrorism and human rights, and we've expanded our regional expert teams."

YouTube CEO Susan Wojcicki previously said in a blog post that Google would increase the number of content moderators and other employees reviewing content and training algorithms to more than 10,000 in 2018.

The changes come in the wake of an advertiser boycott of the Google-owned video site over videos featuring children that were the target of sexually inappropriate comments. YouTube terminated hundreds of accounts, removed more than 150,000 videos from the platform and turned off comments on more than 625,000 videos targeted by alleged child predators.

YouTube said Monday that of the 6.7 million videos flagged by machines, 76 percent were removed before they received a single view.
