
YouTube announced on Thursday that it will disable comments on most videos featuring minors after a number of advertisers withdrew their ads.

"We recognize that comments are a core part of the YouTube experience and how you connect with and grow your audience," the company wrote in a blog post. "At the same time, the important steps we're sharing today are critical for keeping young people safe."


The company said that in the past week it has taken "a number of steps to better protect children and families" by suspending comments that violate its policies and deleting "tens of millions of videos," but it will now take its efforts a step further and begin suspending comments on "most" videos that feature minors.

YouTube CEO Susan Wojcicki (@SusanWojcicki) wrote on Twitter on Feb. 28, 2019: "Recently, there have been some deeply concerning incidents regarding child safety on YouTube. Nothing is more important to us than ensuring the safety of young people on the platform. More on the steps we're taking to better protect children & families: https://t.co/5ZYaMrMpsI"

The announcement comes after the video platform faced a backlash from advertisers, including AT&T, Nestlé and Epic Games, following a viral video that claimed videos of young children hosted on YouTube attracted comments from apparent pedophiles.

"Until Google can protect our brand from offensive content of any kind, we are removing all advertising from YouTube," an AT&T spokesperson said in an email to NBC News on Feb. 21.

YouTube said a small number of creators will be able to keep comments enabled on their videos, but those accounts will have to "actively monitor" their comment feeds and demonstrate "low predatory behavior."

In addition to suspending comments, the video platform said it will launch a comments "classifier" that will be able to identify and remove "predatory" comments much more quickly.

YouTube's announcement also comes a day after the Federal Trade Commission fined the popular music app TikTok $5.7 million over allegations that it illegally collected images, voice recordings and geolocation data of children, some younger than 13.