Reddit on Friday banned two contentious but popular groups that regularly featured videos of human injury and death, following the widespread sharing of footage of the New Zealand terrorist attack.

The social network moved to eliminate the /r/watchpeopledie and /r/gore subreddits less than 24 hours after a shooter killed at least 49 people at two mosques in Christchurch and streamed the event live on Facebook. Internet users were able to capture the video before Facebook took it down, uploading clips of the incident on platforms including Twitter and YouTube. People then shared links to those videos in the now-banned Reddit groups, which critics and some Reddit users had questioned for years.

"We are very clear in our site terms of service that posting content that incites or glorifies violence will get users and communities banned from Reddit,” a company spokesperson said in a statement. “Subreddits that fail to adhere to those site-wide rules will be banned."

Reddit’s decision to remove /r/gore and the seven-year-old /r/watchpeopledie subreddits illustrates the tensions social networks face as they police rule-breaking content in real time. Since the attack, Facebook, YouTube, Reddit, and Twitter have all struggled to keep video of the massacre from spreading, playing a game of digital whack-a-mole as users uploaded copies that continued to evade detection by algorithms and content moderators.

On Thursday night, a Reddit spokesperson told BuzzFeed News that /r/watchpeopledie, where links led to videos of people being executed or hit by cars, was allowed on the site because it provided a service to members — some of whom the company said were medical professionals or first responders — to learn about or cope with death. By Friday morning, however, Reddit moved to ban /r/watchpeopledie, which had more than 300,000 subscribers, and /r/gore after members continually linked to videos of the New Zealand attack while moderators failed to act or even encouraged their posting.

“The video stays up until someone censors it,” one moderator on /r/watchpeopledie wrote Thursday night. “This video is being scrubbed from major social media platforms but hopefully Reddit believes in letting you decide for yourself whether or not you want to see unfiltered reality.”

The web has long offered places to share graphic imagery, from Rotten.com, a now-defunct website where users could post shocking photos of dead bodies, to mainstream platforms like Twitter, which allowed the proliferation of terrorist beheading videos on the grounds that they were newsworthy. Among popular internet sites, Reddit has gone further than most, allowing communities devoted to subjects ranging from pornography and incel culture to death. At times, Reddit has removed controversial groups, including one encouraging fat-shaming and another that promoted the Pizzagate child trafficking conspiracy, for violating various aspects of its terms of service.

These decisions have always led to debate among users and sparked arguments about free speech on the platform. Reddit, as a private company, has no legal obligation to protect users’ rights to free speech.

A Reddit spokesperson declined to say whether more subreddits would be banned for sharing details or content glorifying the New Zealand massacre, but noted the company would proactively track users and groups that broke its terms of service. Critics, however, said the company was acting only in response to the negative publicity surrounding the deadly shooting.

“The only thing that changed between yesterday and today was Reddit getting negative publicity about those subreddits existing,” a Twitter user named Colin Sullender wrote. “The platform has turned a blind eye towards the content there for years, much of it far worse than the NZ shooting video.”