YouTube has touted its YouTube Kids app as a safer way for preschoolers to watch videos, which are supposedly filtered for age-appropriateness.

But some clearly out-of-bounds videos have slipped through the cracks — from a bizarre subgenre on YouTube using children’s characters in freakish, violent or sexual situations. Examples include Nickelodeon characters dying in a fiery car crash and pole-dancing in a strip club, and popular British kids’ character Peppa Pig drinking bleach or enduring a horrific visit to the dentist.

Now YouTube has adopted a new policy that it says will do more to block such outré content from landing on YouTube Kids. The move comes after YouTube this summer said it would pull advertising from content that portrays family-entertainment characters (such as, say, Mickey Mouse or Ronald McDonald) engaging in “violent, sexual, vile, or otherwise inappropriate behavior.”

“We’re in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged. Age-restricted content is automatically not allowed in YouTube Kids,” YouTube director of policy Juniper Downs told the Verge, which first reported the change.


According to YouTube’s policy for age-restricted content, as part of determining if videos should be blocked from YouTube Kids, moderators will evaluate vulgar language, violence and disturbing imagery, nudity and sexually suggestive content, and the portrayal of harmful or dangerous activities.

In tandem with YouTube’s policy-review team, YouTube Kids relies in part on parents to identify content that shouldn’t be in front of children. “Our systems work hard to filter out more mature content from the app. But no system is perfect,” Balaji Srinivasan, YouTube Kids engineering director, wrote in a blog post last week. “If you find a video that you think should not be in the app, you can block it and flag it for review.”

Since YouTube Kids launched in February 2015, the algorithmically driven app has been criticized for lacking controls to restrict kid-unfriendly videos, as well as for allowing commercially oriented content targeted at kids.

In an update that started rolling out last week, YouTube Kids has a new setup process that gives parents and guardians more detailed information about the content they can allow in the app. Users also can create up to eight profiles for individual children, and the design of the YouTube Kids app changes based on a kid's age (for example, younger kids will see less text, while older kids will get more content on the home screens). The app also lets parents block access to individual videos, disable search, and set usage time limits.

The YouTube Kids app is currently available in 37 countries and has more than 11 million weekly active viewers, according to the Google-owned video platform. To date, YouTube Kids has generated more than 70 billion views.