The video, posted to YouTube in September, was innocuous enough. Part of a regularly updated series, it showed two young girls, seemingly about nine years old, practicing gymnastics. The 10-minute clip, filmed by an adult, was the kind of low-fidelity, homemade footage that YouTube was built on.

A commenter asked why the “sicko” comments weren’t deleted

But many of the comments below the video, which received more than 140,000 views, were disturbing: several chimed in to say the girls were “beautiful.” About a month ago, a commenter asked for likes if viewers had an erection — a comment that 19 people responded to with a thumbs-up. Other commenters described sex acts. Last week, a commenter writing in Spanish provided a WhatsApp number, saying they had videos to share. One commenter claimed the video had appeared on a child pornography site, and another asked why no one was deleting the “sicko” comments on the page. (The video description provided a P.O. box, as well as an email address for “business” inquiries, but a request for comment went unanswered.)

YouTube is still dealing with criticism after a series of bizarre videos apparently targeting child viewers were unearthed. The company said last week that it was working on a new policy that would help age-restrict those videos. But that problem is only one aspect of YouTube’s fraught relationship with kids.

Earlier this year, the company took heat over whether it was sufficiently policing predatory comments on videos of children. The controversy started with some prominent YouTube users themselves, who made videos on the subject that quickly edged into unproven conspiracy theories about a “ring” of pedophiles. The comments on videos of children, however, are provably real, and it’s not clear how equipped YouTube is to handle them.

It’s trivially easy to find sexual comments on videos of children

It’s trivially easy to find sexual comments on videos of children exercising, going for a day at the beach, or trying the ice bucket challenge. Often these videos are uploaded by the children themselves. Google requires that YouTube users be either 18 years old, or at least 13 with a parent or guardian’s permission. In practice, those age restrictions are not always enforced.

Most concerning are the instances where commenters reach out for direct contact. One, posted on another gymnastics video of a young girl, offered a phone number to text. The comment was posted about 10 months ago. Below the video were other predatory comments. Others chimed in to denounce those commenters, and cited the work of a YouTube personality who had made some of the original claims about predatory commenters on the platform. The video had almost 1.5 million views, and it’s not difficult to find other, similar videos with hundreds of thousands of views, in several languages.

When The Verge provided several videos with disturbing comments to YouTube, the company removed the videos entirely, saying they violated the company’s community guidelines, and also terminated a user who had generated a playlist of similar videos of children. But other videos from some of these same uploaders, featuring similarly troubling comments, remained available.

“YouTube strictly prohibits sexual content involving minors.”

“YouTube strictly prohibits sexual content involving minors and we have multiple systems in place to take swift action on this content,” the company said in a statement. “We actively work with NCMEC and others in the industry to prevent child sexual abuse imagery from ever being uploaded to YouTube and to report abuse to law enforcement. We have special flagging tools for NGOs to alert us to content and teams that work around the clock reviewing reported content 24 hours a day to quickly remove comments and videos that violate our policies.”

The videos with disturbing comments are not necessarily the kind that would neatly fall under the category of sexual abuse. A YouTube spokesperson said in the statement that, when it removed the videos The Verge cited, it sent a note to the uploaders asking them to be cautious when they post videos of minors, and to consider making those videos private. The company said it makes similar decisions in related instances involving minors.

YouTube has been dealing with problems like this for some time. A New Zealand publication noted last year that commenters were making unnerving requests on videos of kids performing viral “challenges,” and sometimes attempting to make contact. YouTube similarly deleted the videos when it was informed, but other videos remained.

More than 92 million videos were removed in 2015

Like other massive platforms, YouTube says it relies on users to flag posts that violate its guidelines. Last year, the company wrote in a blog post that more than 90 million people had flagged videos over the preceding decade; more than 92 million videos were removed in 2015. Still, the company has developed a reputation over the years for particularly heinous comments, and it’s not always able to catch offenders, even on unexpectedly popular videos.

In August, the BBC spoke to some Trusted Flaggers, part of a program in which YouTube gives special weight to users who accurately flag problems. The flaggers told the outlet that the company had a moderation backlog that may have prevented offending videos, including those involving children, from being swiftly removed.

The company says it uses automation to flag some videos, and it checks the platform for known videos of child exploitation. But it’s easy to see why these sorts of videos — of kids doing gymnastics, or playing — would pose a problem for automated moderation systems: they’re unobjectionable in themselves, but placed in a deeply disturbing new context by the comments beneath them.