Epic Games has stopped running Fortnite pre-roll ads on YouTube following the discovery that its ads were playing on videos predators used to exploit children.

“We have paused all pre-roll advertising,” an Epic Games spokesperson told The Verge. “Through our advertising agency, we have reached out to Google/YouTube to determine actions they’ll take to eliminate this type of content from their service.”

Pre-roll ads, which play before a video starts, from companies like Grammarly and Google Chromebooks appeared on some of the videos that creator Matt Watson flagged in the explainer that sparked the conversation. Other companies whose ads ran on videos starring children, with comments sections full of remarks from predators, have also asked YouTube to rectify the situation. A spokesperson for Peloton, the company behind the popular exercise bike, told Wired that the company "was working with its media buying agency to investigate why its adverts were being displayed against such videos."

Many of these videos came to light in a video by Watson, who demonstrated that searching for something like "bikini haul," in which women show various swimsuits they have purchased, can often lead to exploitative videos of children. The comments sections are often full of predators timestamping moments in a video that sexualize the child or children in the scene, although the videos themselves aren't pornographic in nature. Watson's video quickly circulated online, and a lengthy post on Reddit remained on the front page for quite some time.


“YouTube’s recommended algorithm is facilitating pedophiles’ ability to connect with each-other, trade contact info, and link to actual child pornography in the comments,” Watson wrote on Reddit. “I can consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than ten minutes, in sometimes less than five clicks.”

A YouTube spokesperson told The Verge, "We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments." Ensuring that advertisers' commercials don't run on videos with disturbing content is something that YouTube has worked on for some time.

“Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” the spokesperson said. “There’s more to be done, and we continue to work to improve and catch abuse more quickly.”

This has happened in the past, and the result is something the creator community refers to as the “adpocalypse.” Advertisers threatened to pull ads from the platform following a series of articles that highlighted terrorism and hateful content running on the platform. Concerns from advertisers grew after YouTube’s biggest creator, PewDiePie, garnered international coverage over a video that contained anti-Semitic imagery.

Over the last couple of years, YouTube has given advertisers more control over which genres of videos their ads run on, as well as the ability to exclude certain types of videos. For example, advertisers can tell YouTube they want to run ads on videos that fall under sports and music, but not video games, and YouTube will distribute ads on channels that fall under those categories. Hypothetically, if a channel features young children performing gymnastics or showcasing choreography, and it meets YouTube's AdSense requirements, an advertisement for a Google Chromebook could appear on a video that has also attracted predators in the comments section.

It’s unclear when Epic Games will bring Fortnite pre-roll ads back to YouTube.