The latest people fighting against children’s privacy are not slick lobbyists, but instead a group much closer to home—YouTubers with family and kids’ vlogs. Their current target is the Children’s Online Privacy Protection Act, a federal statute first passed in 1998 that aims to put parents in control of their kids’ data. The Federal Trade Commission reviews COPPA every 10 years, and in 2013 the agency amended its rules to include tracking technology that websites use to personalize advertising. Recently, the FTC alleged that Google violated the law by collecting children’s personal information through YouTube without their parents’ consent. The agency claimed that when Google used personalized ads on videos aimed at kids, the company illegally used the tracking technology covered under the 2013 revisions to the law. Although Google neither admitted nor denied these allegations, it settled for $170 million.

Now the agency is beginning its 2023 review of COPPA early in response to “rapid changes in technology,” and YouTubers are currently organizing a push for the FTC to weaken protections for children online. These content creators have flooded the agency with comments and created a petition with nearly 730,000 signatures asking the FTC to carve out an exception that would leave their livelihoods unchanged. YouTubers have released talking points for content creators and viewers to include in their comments to the FTC. Even PewDiePie chimed in with a video about COPPA for his 102 million subscribers. As a result of this call to action, the FTC has received more than 152,000 comments thus far. For comparison, after three rounds of requesting comments from the public during the 2013 review of COPPA, the FTC received a little more than 500 comments total.

YouTube is wildly popular among kids: More than three-quarters of 8- to 12-year-olds report using the site, and its fifth most popular video, with nearly 4 billion views, is of children and animated characters singing and dancing along to “Baby Shark.” Kids’ content is incredibly lucrative for YouTube, which keeps nearly half of the advertising revenue generated by content on the platform. Google even promoted the site to advertisers as a way to reach children. But because YouTube does not want to have to concern itself with children’s privacy protections, it continues to claim that the site is only for ages 13 and older. However, as Massachusetts Sen. Ed Markey recently noted in a letter to Google about protecting children online: “The access that children have to YouTube carries with it a corporate obligation to institute and enforce policies that protect the well being of these young users.” As part of its FTC settlement, Google created a new system allowing YouTubers to mark their content as “Made for Kids.” Once they do, Google disables some of the site’s features, such as personalized advertising, commenting, and notifications for these videos.

But YouTubers seem confused about what these changes mean in practice and why Google has implemented them. As a result, many are unintentionally spreading misinformation about children’s privacy law. Some claim that YouTube will have to ban certain types of content, such as videos about the popular game Roblox. Others, from food bloggers to a cappella artists, worry that YouTube will have to disable all personalized advertising on a video just because a child may watch it. One commenter even said that she will have to start “swearing like a sailor” and incorporate more “adult” conversations into her videos in order to avoid being covered by COPPA. Searching for “end of our YouTube channel” on the site reveals a number of content creators informing their loyal viewers that COPPA and the settlement may be the end of YouTube as they know it.

That’s not what COPPA does. Under this law, companies with content directed to children under 13 must inform parents what information they collect from kids and obtain parental permission before they collect it. So the content creators complaining that the law prohibits all personalized advertising are simply misinformed—and spreading that misinformation. If a company like Google really wanted to use personalized ads on videos for kids, it would just need to get parents’ permission first. But instead, Google is acting as if children on YouTube—and the protections they’re afforded—are relatively new phenomena, exacerbating content creators’ misunderstandings of the law.
But COPPA isn’t new. The law has been around longer than YouTube, and the fact that the company is just now starting to comply with it is no one’s fault but YouTube’s. Google could stop turning a blind eye to kids on YouTube, allow kids to create accounts, and then ask for parental permission, as it does with the Google Play Store. In fact, despite nearly two decades of industry claims that these rules would destroy the online market for children’s content (just as content creators are claiming now), plenty of companies comply with COPPA by allowing kids under 13 to create accounts and treating their data differently.

Instead, YouTube has been encouraging content creators to take up their concerns with the FTC, perhaps hoping to escape stronger children’s privacy rules in the future. We reached out to Google to get a better sense of the full extent of its efforts in encouraging YouTubers to advocate against COPPA, especially given the clear parallels between Google’s talking points and those of content creators. But Google never responded.

YouTubers have every right to be concerned and frustrated by the sudden shift in YouTube’s policies. But this frustration should be directed at YouTube for suddenly shifting the burden entirely to content creators now that it’s been dinged by the FTC for allegedly not following the law. Amid bipartisan calls for stronger federal privacy protections from advocates, industry, and regulators, now is the time to strengthen protections for the most vulnerable members of our community, not weaken them. This conversation shouldn’t be about privacy laws being an impediment—it should be about how content creators, parents, and regulators can come together to ensure that the privacy of children is not superseded by companies’ desire to pad their bottom lines.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.