About one year after the ad-pocalypse shook up YouTube, the company is handing down a punishment to another YouTube star over an offensive video. Logan Paul, a YouTube creator with 15 million subscribers, has been removed from the Google Preferred ad platform. YouTube also won't feature Paul in the fourth season of Foursome, a YouTube Red show, and his other Originals projects have been put on hold. This comes nearly two weeks after Paul posted a video of himself visiting Aokigahara in Japan, also known as the "suicide forest," that prominently featured a dead human body both in the footage and in the video's thumbnail.

YouTube's punishment comes after the company issued this initial statement about the incident:

Our hearts go out to the family of the person featured in the video. YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated. We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center.

Amid outrage from the YouTube community and some celebrities, Logan Paul removed the video from YouTube and issued two apologies before announcing he would take time off from the platform "to reflect."

How did a dead body end up on YouTube?

A little more than a week after appearing on Top Chef, Paul uploaded the video in question on December 31, 2017. It showed him and some friends entering Aokigahara and finding the hanging body of a man who had presumably committed suicide. The video's thumbnail also included an image of the body. The video racked up millions of views and sat on the YouTube Trending page before community members and celebrities, including Aaron Paul and Chrissy Teigen, called Paul out for his offensive and disrespectful content. Paul removed the video from his channel shortly thereafter.

YouTube issued its original statement above, and, a few days later, the company said it was looking into "further consequences" for Paul. The newly announced actions appear to be those consequences. The company also issued an open letter to its community on Twitter, acknowledging its "lack of communication" and slow response time.

Google Preferred is the company's premier advertising platform, and removing Paul from it means he won't make as much money from monetizing his YouTube videos as he previously did. Putting his YouTube Red and Originals projects "on hold" signals that YouTube isn't ready to have him star in any of the big productions made for its paid video platform.

However, it's unclear how Paul's video was allowed to stay up on YouTube as long as it did. YouTube's guidelines prohibit violent or gory content posted in a manner meant to be shocking, sensational, or disrespectful. While YouTube likely would have flagged the video eventually, its systems and moderators did not do so before it garnered millions of views and likes.

Reminiscent of past controversies

YouTube has chosen to police its creators more than ever before after last year's ad-pocalypse, which began as advertisers pulled their ads from YouTube after learning some ran in front of extremist and offensive content. Last year's Pewdiepie controversy also contributed to the ad-pocalypse: YouTube's most-subscribed creator, whose real name is Felix Kjellberg, was removed from Google Preferred, had his YouTube Red show canceled, and was dropped by Disney's Maker Studios after a compilation of clips of him making anti-Semitic jokes went viral.

In the first half of 2017, YouTube introduced a raft of new rules for creators, as well as tools that allow advertisers to better control where their ads appear on the site. Many see the new guidelines and rules surrounding offensive and distasteful content as unclear, as YouTubers big and small have had their content demonetized or flagged (sometimes incorrectly) by YouTube's systems.

The system is clearly still a work in progress, as shown by YouTube's slow response to the Logan Paul situation. Community members were some of the loudest voices calling upon YouTube to act, but noticeably absent from the outraged parties were individual advertisers. So far, there have been no reports of advertisers pulling ads from YouTube in response to the Logan Paul situation. It's possible that, after YouTube's response to the ad-pocalypse, advertisers feel more comfortable with the effectiveness of YouTube's new tools and the online video company's ability to police creators.

While community members and some of the public have called for Paul's channel to be terminated, it doesn't appear that's going to happen any time soon. Paul would have to accrue three account strikes within three months for YouTube to terminate his account (YouTube reportedly hit Paul's account with one strike for this incident).

Now, there are two ways YouTube could proceed after handing down Paul's punishment: the company could consider the situation handled and do nothing further, or it could institute new community guidelines or rules prompted by this incident. Considering that the situation hasn't damaged YouTube monetarily (at least not immediately), we may not see any additional rules from YouTube. The company handled the situation with the rules it already had in place, albeit slowly. But creators large and small (mostly small) will likely feel the repercussions of the Logan Paul incident as YouTube cracks down even harder to prevent content like this from being posted in the future.