YouTube says it’s not asking creators to moderate their comments or face ad restrictions, clarifying a message that has worried some of the site’s creators. Yesterday, amid controversy over child predators congregating on YouTube, the Team YouTube Twitter account stated that “inappropriate comments” could result in videos having limited or no advertising. A YouTube spokesperson, however, tells The Verge that the platform isn’t basing these limits on creators’ comment sections. Instead, YouTube moderators are evaluating videos that seem likely to attract predatory comments, then restricting advertising on them as a short-term fix.

These restrictions are part of a larger effort to address inappropriate videos and comments involving minors — one that includes removing channels, banning users, and disabling comments. Unlike those measures, the monetization limits don’t remove any content; they just reduce the risk that companies might see ads appear near objectionable comments.

YouTube referenced this policy in a help thread, saying that “videos that include minors and are at risk of predatory comments may receive limited or no ads.” Creators can appeal the decision, and the limits are supposed to be lifted soon, although YouTube hasn’t specified a timeline. The spokesperson says YouTube categorically isn’t asking creators to police their comment sections, and that the move isn’t supposed to signal any permanent policy change. YouTube didn’t confirm whether disabling a video’s comments would lift its ad restrictions.

“(2/2) With regard to the actions that we've taken, even if your video is suitable for advertisers, inappropriate comments could result in your video receiving limited or no ads (yellow icon). Let us know if you have any questions.” — TeamYouTube (@TeamYouTube), February 22, 2019

YouTube’s earlier statements caused widespread concern and confusion — as YouTube commentator Philip DeFranco said in a video, “there are a lot of YouTubers of all types right now freaking the hell out” over the tweets. The tweets suggested a policy that could have put YouTubers’ ad revenue at risk if they didn’t constantly monitor their videos, and that would have opened a door for harassers to get videos demonetized by leaving bad comments. Some users also believed the policy covered all comments that violated community standards, before Team YouTube said it only applied to videos with minors.

The tweets also echoed language found in a memo to advertisers, which promised to “increase creator accountability” and “hold monetizing channel owners to a higher standard with regards to moderating their comments.” It’s possible that future changes will make creators more accountable for what’s in their comment sections, but based on YouTube’s present statements, that’s not happening right now.

Instead, this restriction on monetizing videos sounds like a temporary way for YouTube to placate advertisers, which have been distancing themselves from the platform since a video revealed predatory comments under videos featuring children. YouTube has taken a number of measures to purge these comments from the site, including shutting down over 400 channels, deleting millions of comments, disabling comments on videos that could inadvertently attract predators, and changing the way its algorithm recommends videos. But content moderation isn’t a new problem for the platform, and it’s not clear what a long-term fix might look like.