YouTube isn’t running ads on videos about the recent Momo Challenge resurgence, even those coming from respected news organizations and popular creator commentators.

Multiple news organizations including CBS, ABC, CNN, Fox, and local affiliate channels have uploaded segments on Momo — a viral hoax about a creepy sculpture by Keisuke Aisawa that’s alleged to promote self-harm. These videos aren’t monetized, and some have warning windows alerting viewers to disturbing content.

YouTube confirmed to The Verge that any content featuring Momo is a violation of YouTube’s advertiser-friendly content guidelines, and therefore can’t receive ads — including news videos. Some videos, including one from CBS, also have warning screens that appear before the video plays referring to the content as “inappropriate or offensive.” This isn’t a new policy, but rather YouTube enforcing rules outlined in its advertiser guidelines.

None of the videos that The Verge played — ranging from authoritative news sources to popular YouTube commentators like Philip DeFranco — had ads placed on or before them. It’s something that DeFranco anticipated in his own video, joking about whether or not ads would run because of the content, and then later confirmed in a follow-up tweet.

Momo started off as an urban legend dreamt up in a creepypasta subreddit, but it became a worldwide phenomenon in 2018 after an Indonesian newspaper reported that a 12-year-old girl had committed suicide after participating in the “Momo Challenge.” The alleged challenge involved kids texting a strange number on WhatsApp and then receiving instructions to harm themselves. But multiple outlets have reported that the challenge appears to be an urban legend that keeps growing.

YouTube may restrict ads on “sensitive topics,” even in news reports

The hoax around the Momo Challenge “has been perpetuated by local news stations and scared parents around the world,” Taylor Lorenz writes in The Atlantic. Many news organizations, including some considered among the most authoritative in the world, have uploaded segments to YouTube that play into those fears, with The Guardian’s Jim Waterson calling mainstream coverage “some of the most irresponsible journalism in this country for ages.”

YouTube is in a tricky place right now. Advertisers are already wary of continuing their ad campaigns after a controversy surrounding predatory messages appearing in comments on videos that feature children. YouTube has taken aggressive action since the controversy first started a couple of weeks ago, including recently announcing that nearly all comment sections on videos that feature minors will be removed. Not running ads on videos about Momo — an already complicated story accused of misleading viewers and playing into parents’ worst fears — makes sense.

The latest resurgence of the Momo hoax is based on alleged YouTube videos that had images of Momo telling kids to hurt themselves. YouTube has publicly commented on the situation, tweeting that no one at the company has seen “recent evidence of videos promoting the Momo Challenge on YouTube,” and reiterating that “videos encouraging harmful and dangerous challenges are against our policies.”

YouTube also has even stricter guidelines on where advertising can appear, so as to shield brands from appearing beside more controversial content. Videos that specifically focus on “content that features or focuses on sensitive topics or events,” such as the Momo Challenge, aren’t considered to be advertiser friendly. YouTube says that even if this content is “presented for news or documentary purposes,” it may not be considered suitable. Given that the Momo Challenge touches on suicide, it may not be eligible for ads.

YouTube is trying to make its platform more family friendly and advertiser friendly in general, tackling an assortment of issues over the past few weeks. The company has removed tens of millions of predatory comments and terminated more than 400 channels that were found to be posting predatory messages. The company also terminated several channels, including FilthyFrankClips, for inserting disturbing content into children’s programming.

The company reiterated that it will continue to take action when creators violate policies in ways that harm the community-at-large.