Whenever YouTube adopts a tougher moderation stance, a familiar debate over censorship emerges, especially from notable conservative voices.

Questions over YouTube’s moderators and the power they hold were raised this week after notable conservative pundits, gun advocates, conspiracy channels and other right-wing voices received community strikes or were locked out of their channels. Creators who are affected by lockouts, strikes and suspensions are referring to it as the “YouTube Purge,” claiming that YouTube is purging all right-wing or pro-gun content. The move follows the company’s attempt to clamp down on dangerous content following the Parkland shooting.

A YouTube representative acknowledged that new moderators, hired as part of the company’s plan to employ 10,000 people to help oversee content and respond to flags, may have been a little overzealous.

“As we work to hire rapidly and ramp up our policy enforcement teams throughout 2018, newer members may misapply some of our policies resulting in mistaken removals,” a statement from YouTube given to Bloomberg reads.

Channels have been reinstated and strikes removed, but the episode has sparked a new movement among conservative voices concerned about censorship by YouTube moderators. Sargon of Akkad, whose real name is Carl Benjamin, is a well-known right-wing YouTuber who came to prominence during GamerGate, a hate-fueled movement and harassment campaign that began in 2014. Benjamin’s YouTube channel became a talking point this week when he announced that Google had locked him out of his account, saying “it was being used in a way that violated Google’s policies.”

“Without warning, Google have suspended my account, which prevents me from logging into my YouTube accounts,” he wrote on Facebook. “My YouTube channels had zero strikes. The purge is here.”

Benjamin was granted access to his account a day later, and in a new video stated his belief that, in a “rather Draconian fashion, YouTube are applying a strike to my account from an event that happened over a year ago.” Benjamin said that he “doesn’t want to point any fingers,” but considers the timing of the lockout suspicious.

“The timing of this coming during the YouTube purge, in which dozens of right-wing and often gun channels were just outright banned, or channels about conspiracy theories, whether for or against were just outright banned,” Benjamin said. “I don’t actually know what the results of all of this is going to be because this is just a working theory, so I don’t know if my channel is going to disappear or not.”

Benjamin’s channel is just one of many supposedly affected by the so-called purge. Tim Harmsen, who runs the popular Military Arms Channel on YouTube, posted a video on Feb. 24 claiming that his channel received a strike because three of his videos related to guns contained content that moderators said “encourages illegal activities or incites users to violate YouTube’s guidelines.”

YouTube has been trying to cut down on dangerous content for quite some time. A blog post from YouTube’s team published on Aug. 1, 2017 outlined new moderation guidelines for potentially harmful content. The post came just a couple of weeks after the company faced scrutiny over extremist content on YouTube that allegedly played a role in radicalizing one of the attackers in the London Bridge terrorist attack in June 2017.

“We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism,” the blog post reads. “If we find that these videos don’t violate our policies, but contain controversial religious or supremacist content, they will have some features removed.”

The events between August 2017 and today have led YouTubers to declare that the company is waging an attack on right-wing voices, many of whose channels were treated as carrying content with “potential violations of our policies on hate speech and violent extremism.”

Many of these voices claim YouTube is lumping right-wing and conservative political channels in with the conspiracy videos inundating the site.

Conspiracy videos are a genre on the rise

YouTube is currently trying to battle dangerous content and hateful ideologies on its platform.

After the Parkland shooting, and after conspiracy videos on YouTube gained critical attention from press around the world, YouTube’s moderators seemed to go into overdrive. While conservative and right-wing voices see it as a purge, pointing to the strikes applied to Alex Jones’ Infowars channel as proof that YouTube is trying to censor people like them, YouTube sees it as overreach by moderators trying to fix the platform’s growing problem.

Jonathan Albright, the research director at the Tow Center for Digital Journalism, compiled data showing how dangerous YouTube’s related and recommended videos, which stem from many different channels, can be. Albright discovered that typing in the term “crisis actor,” a phrase used to attack outspoken Parkland shooting survivors advocating for gun control in America, returned results stemming from Alex Jones’ channel and related content that led to disturbing, hateful videos. That crossover between conspiracy videos on the platform and Jones’ content could help explain moderators’ efforts to clamp down on harmful content.

Albright argued that although Jones is just one piece of the puzzle, the immense growth of conspiracy videos, combined with recommendation suggestions that loop in other notable creators with similar philosophies, is dangerous.

“From my experience, in the disinformation space, all roads seem to eventually lead to YouTube,” Albright said. “This exacerbates all of the other problems, because it allows content creators to monetize potentially harmful material while benefiting from the visibility provided by what’s arguably the best recommendation system in the world.

“While they don’t need to outright censor it, there must — at the very least — be policies put in place that include optional filters and human moderators to help protect children and other vulnerable people from this material.”

YouTube is reworking its algorithm, changing how recommendations work and which videos are served to people. Unfortunately, that means YouTube occasionally has to play whack-a-mole, and channels get hit as moderators try to tackle content they think could encourage dangerous activity.

Anthony Fantano, a popular YouTuber who often provides commentary on YouTube trends, told Polygon that it’s up to YouTube to figure out what is a conspiracy video or channel and what is political commentary. Being able to decipher between the two, Fantano said, allows for safer content on the platform while enabling free speech.

“YouTube has not figured out a way to ensure that their algorithms should not be promoting these [conspiracy] videos,” Fantano said. “These videos should not be landing on the front page or the trending page. These videos should not be trending or the first thing that you see on the platform when you’re looking up information about school shootings.

“Where YouTube is really failing on this forefront ... you know, I’m a free speech purist, I believe that if someone has something to say, and YouTube holds itself to those free speech ideals, then they should allow it. But if it’s not, then they should come out and say it.”

With more attention being paid to YouTube now than ever before, and the company trying to appease both creators and advertisers alike, the question is what’s next? How do political commentators and gun advocates co-exist?

They probably don’t

Robert Kyncl, YouTube’s chief business officer who oversees creators on the platform, told popular YouTuber Casey Neistat a couple of weeks ago that YouTube operates under four ideologies: freedom of speech, freedom of information, freedom of opportunity and freedom to belong.

“People creating communities on YouTube with people where they can share a little bit more,” Kyncl cited as an important aspect of YouTube’s beliefs in the interview.

As YouTube struggles to define what type of content it allows on its platform, and what type of creators it doesn’t, it’s exacerbating a growing schism between a user base that, in part, thinks of YouTube as a public space where almost anyone can upload videos and a company eager to rein in advertiser-unfriendly embarrassments, from hate speech to disinformation. It’s easy to agree that hate speech, and those who cater to hateful ideologies under the guise of political discourse, should not be allowed on the platform. But others, who claim they’re being forced off the platform over content they view as benign, remain anxious about the “purge.”

Fantano said it’s only a matter of time before people start leaving YouTube for platforms and websites that host content with little oversight. Steemit’s DTube, a video platform with little moderation, and Minds, a website designed for people who tend to face scrutiny on more popular social and video platforms, are two examples of this mentality in action.

“I think that ultimately it will be a net positive,” Fantano said. “It’s bothered me personally for a while that YouTube ... the competition has sort of become stagnant.”

Fantano understands that, as a private company, YouTube executives are allowed to do whatever they want. In the face of concerns over purges, growing conspiracy problems and a general feeling of unease among its creator base, Fantano is asking that YouTube just talk to people about what kind of platform it wants to be.

“YouTube is free to welcome or not welcome whatever type of content they want on the platform, I just wish they’d be a little bit more clear about it. I feel like if you don’t want to have conspiracy-based content on the platform because you feel like there’s a moral conundrum there with having a platform that is spreading this misinformation by way of being able to host it, I wish they would come out and say it.

“I just wish YouTube was a little bit more transparent, even going forward, with what they do and don’t want on the site.”