Congratulations, you're a child again! Or at least you will be one as far as YouTube is concerned any time you happen to see a video designated as being "for kids."

YouTube announced that change in a corporate blog post today, as the platform continues to try to thread a particularly tricky needle. YouTube (and the creators who use it) want to spread content as far and wide as possible, and they want to make as much money doing so as they can—but federal law limits what data companies can collect and use from the children who watch some of that content.

The changes stem from a $170 million settlement YouTube reached with the Federal Trade Commission last September over alleged violations of the Children's Online Privacy Protection Act (COPPA).

Sin no more

COPPA governs the collection and use of personal data associated with children younger than age 13. Websites, apps, and other digital platforms that knowingly collect data from children are required to post a privacy policy and obtain parental consent. Those requirements exist to give parents the option to opt out of having their children's information shared with third parties, to let parents review their children's data, and to make sites, apps, and platforms follow sound data storage and retention policies.

At the time, YouTube allegedly didn't distinguish between visitors who might be children watching explicitly kid-friendly content and visitors ages 13 and up. Instead, the online video giant hoovered up all data equally. According to the FTC complaint, while Google said internally that it had no "child-directed" content and didn't need to worry about COPPA, it was at the same time telling companies such as Mattel and Hasbro that "YouTube was unanimously voted as the favorite website for kids 2-12" and "93 percent of tweens visit YouTube to watch videos."

The settlement wasn't just a fine; it also included an agreement that going forward, basically, YouTube would do better. Since last fall, the company has been making a series of changes that theoretically will help it handle kids' data more carefully.

All content creators who upload video to their channels through YouTube Studio are now required to indicate whether the content is made for kids. If a video is marked as "for kids," then data captured from any viewer of that content will be treated as coming from a child under 13.

Not as easy as it looks?

YouTube is also making its own algorithmic content determinations for videos that don't have a specified "for kids" / "not for kids" designation, using factors flagged by the FTC. "Creators know their content best and should set the designation themselves," the company said, adding that it will use machine learning to identify content featuring children's characters or themes, toys, or games, among other things.

Content creators can "update a designation made by our systems if they believe it is incorrect," YouTube said. "We will only override a creator designation if abuse or error is detected."

YouTube content creators, who rely on the platform for their livelihoods, have for several weeks been raising the alarm about the new audience setting. Videos marked as "child directed" have comments, notifications, and personalized ads disabled, YouTube CEO Susan Wojcicki confirmed last year. Disabling those ways of engaging with a video could in turn lead YouTube's recommendation system to believe a video is simply not engaging and recommend it less to potential viewers. (A YouTube representative reached out after publication to clarify that videos marked for children would still be recommended alongside other videos marked for children.)

Individual video creators themselves are also now personally on the hook for penalties of up to $42,350 per video if they fail to explicitly mark their videos as for children, the FTC said in November. That prospect has drawn concern from creators who consider their content family-friendly but not necessarily child-directed.