Screenshot: YouTube

Advertising for The Nun, an upcoming horror film centered around the inexplicable suicide of a nun in 1950s Romania, is predictably terrifying. But a YouTube ad for the film has seemingly crossed the line, driving a flood of users to complain about it online and, ultimately, get it removed from the platform.

A tweet cautioning users about the ad had over 100,000 retweets as of Tuesday morning. Last night, YouTube responded to the message, stating that the ad “violates our shocking content policy and it’s no longer running as an ad.” Asked for comment, YouTube confirmed to Gizmodo the tweet was accurate and the ad had been removed.

The ad in question is just a few seconds long, and while horror is subjective, it’s easy to see why a lot of users would feel affronted by it. The ad shows an on-screen volume indicator turning all the way down followed by a jump scare of a nun. You can watch a re-upload of the ad here.

While the full trailer is longer and arguably more disturbing, it’s not served up to viewers who might just be browsing innocuous videos online only to be interrupted by a screaming nun. “IF I SEE THE NUN TRAILER POP UP AS ANOTHER AD WHEN IM WATCHING YOUTUBE IMMA THROW HANDS. IM TIRED OF GETTING SCARED IN THE MIDDLE OF A JAMES CHARLES VIDEO,” one user wrote on Twitter. “IM LITERALLY SHAKING AND TEARING UP RIGHT NOW I JUST WANTED TO PLAY EPIPHANY BUT WHY DID YOUTUBE GAVE ME A FUCKING THE NUN JUSMPSCARE AS AN AD AT 5 FUCKING AM,” wrote another.

Google, YouTube’s parent company, does have an advertising policy that prohibits shocking content. Ads featuring “violent language, gruesome or disgusting imagery, or graphic images or accounts of physical trauma” as well as promotions “that are likely to shock or scare” are not allowed. Google lists crime scene photos and execution videos as examples of the former, and ads suggesting that the viewer “may be in danger, be infected with a disease, or be the victim of a conspiracy” as examples of the latter. Of course, these are all real-life horrors, not marketing campaigns for a fictitious horror film. But it’s evident that content doesn’t have to be nonfiction to appall. It could just be a really fucked up ad.

This whole debacle is unsurprising, given YouTube’s less than celebrated history of content moderation. In June, three parents reached out to the Advertising Standards Authority (ASA), a non-statutory UK watchdog organization, after their kids saw ads for a horror movie before kid-aimed content like Frozen songs, a Lego tutorial, and Minecraft videos. One of the ads, for Insidious: The Last Key, allegedly showed a woman “lying on a floor immobile, bloodied and distressed while a humanoid creature crept towards her and then probed at her with claw-like fingers and pierced her skin,” according to the ASA. Ultimately, the ASA told Sony Pictures to ensure such ads weren’t targeted at children in the future.

And aside from horror films, YouTube has also spectacularly failed at moderating content related to suicide, porn, addiction, and gun modification tutorials, as well as disturbing content targeting children, just to name a few.


[Polygon]