SteemIt CEO Ned Scott doesn’t believe in censorship, a stance that has won him and his platform fans in recent months.

The appeal of the video platform DTube, which runs on the Steem blockchain database, is almost directly tied to what many creators allege has been happening on YouTube for more than a year: the “YouTube Purge,” an alleged crackdown on right-wing political channels, pro-gun advocates and conspiracy theorists that has led to claims of censorship on Google’s video platform.

As YouTube attempts to crack down on content it deems hateful, bullying or promoting dangerous conspiracy theories, people are looking for alternatives. DTube is a decentralized video platform with little to no moderation that uses cryptocurrency and blockchain technology to pay its users. BitChute is similar, but whereas DTube takes much of its design inspiration from YouTube, BitChute looks like an older version of LiveLeak. The creators of BitChute describe themselves as a “small team making a stand against Internet censorship because we believe it is the right thing to do.”

BitChute and DTube don’t rely on advertising revenue. Instead, users can send peer-to-peer payments.

It’s a tantalizing prospect for YouTube users who feel like they’ve been pushed off the platform, even if the company feels otherwise. The question is whether an alternative platform can actually compete with YouTube and take some of YouTube’s biggest creators.

BitChute and DTube appeal to a very specific audience

The front page of BitChute greets visitors with videos on very specific topics: Donald Trump, Hillary Clinton, censorship and conspiracy theories like “PizzaGate.” Conspiracy videos capitalize on recent tragedies, alleging that survivors of the Parkland high school shooting are crisis actors. DTube isn’t much different.

It’s the type of content that advertisers wouldn’t want their ads placed alongside if it were on YouTube. YouTube has filters for some of its biggest advertisers to help ensure their ads don’t appear on videos they don’t feel comfortable with. Those filters include “Tragedy and Conflict;” “Sensitive Social Issues;” “Sexually Suggestive Content;” “Sensational & Shocking;” and “Profanity & Rough Language,” according to CNN. YouTube isn’t taking these videos down. Type “PizzaGate” into YouTube’s search bar and you’ll find more than 205,000 results, but the chances of these videos being monetized are much slimmer.

SteemIt CEO Ned Scott told Polygon that because YouTube is so reliant on advertisers, the company has to weigh those concerns when deciding how best to run its platform. SteemIt takes a different approach, one that DTube, which runs on Steem, seems to echo.

“If someone reports a video for infringing on copyright, it’s our legal responsibility to take the video down and investigate, which we’ll do,” Scott said. “But we aren’t policing content.”

Thanks to their laissez-faire moderation, DTube and BitChute are becoming home to controversial and disturbing topics. And some of DTube and BitChute’s biggest proponents are notable voices speaking out against the purge on YouTube.

“But we aren’t policing content”

Dave Cullen is an Irish YouTuber better known as Computing Forever. He gained prominence on YouTube for his ultra-nationalistic, xenophobic views, speaking out about immigration in Ireland. In a recent video, “The Storm is Coming #YouTubePurge,” he explored the idea of finding a new home at YouTube alternatives. Most of these creators are still on YouTube to some extent, but there are ongoing conversations about what comes next.

“It’s really down to us,” Cullen said. “I hope you’ll support the people who have been affected in the way that they have. It’s just inevitable, whatever happens. I would encourage you to follow everyone you can, myself included, on the alternative platforms and please make that extended effort.

“Because before too long, that’s going to be home. I have a feeling.”

It’s important to acknowledge who some of the biggest proponents of platforms like BitChute and DTube are. They have the support of prominent alt-right voices, like Cullen; Stefan Molyneux, who is best known for his stance on eugenics and white supremacy; Mike Cernovich, one of the founding leaders of the alt-right; Jack Posobiec, a DeploraBall inauguration party organizer and a pro-Trump figure who headed multiple misinformation campaigns; Ethan Ralph, best known for helping to spearhead the hateful GamerGate movement; and conspiracy theorist Paul Joseph Watson.

“Social platforms are a natural place to test some of these theories”

Companies like Gab, which has been described as the go-to social platform for the alt-right, have publicly shown their support for BitChute. BitChute itself plays into the concept of YouTube censoring content.

The heart of the issue is still how people perceive YouTube and how it polices content: many users see YouTube as a public forum, but the fact remains that YouTube is a private company.

YouTube can do whatever it wants

First Amendment activists are quick to cry foul when platforms like YouTube, Twitter and Facebook moderate content in any way, but that’s the company’s prerogative. YouTube isn’t a government body; it’s a business that can moderate its content as it sees fit.

There’s a legal principle that’s often used when discussing this matter: promissory estoppel. Promissory estoppel essentially refers to an informal promise that a company has made and that its users have relied on. When Robert Kyncl, YouTube’s head of business, told YouTuber Casey Neistat that the company’s four core beliefs are freedom of speech, freedom of information, freedom of opportunity and freedom to belong, people took that to mean any type of speech was allowed.

Woodrow Hartzog, a professor of law and computer science at Northeastern University, told Wired that the issue with the principle is that it’s too broad.

“Social platforms are a natural place to test some of these theories, because of the power that they have and the importance of free speech in our democracy,” Hartzog said.

“What YouTube does restrict, and apply community guideline strikes for, is hateful content”

YouTube is one of the world’s biggest social platforms, and it’s trying to crack down on dangerous content. That’s why conspiracy videos are being removed and why, the company says, moderators may have been a little too aggressive with flagging content and handing out strikes. Still, the company isn’t trying to shut down channels en masse, nor is it trying to restrict content.

PragerU, a right-wing “university” that was designed to exploit YouTube and Google’s algorithm, recently noticed that its videos were restricted. The channel, which has racked up close to a billion views, accused YouTube of censorship. YouTube told The Guardian those accusations were meritless, adding that the videos “weren’t excluded from Restricted Mode [a mode that only showcases certain content] because of politics or ideology.”

YouTube’s hateful content problem is growing

What YouTube does restrict, and apply community guideline strikes for, is hateful content. Some of Infowars’ Alex Jones’ videos were taken down recently because they violated the company’s rules on cyberbullying and harassment. One of those videos referred to David Hogg, a survivor of the Parkland shooting, as a crisis actor. That action against Infowars helped kickstart mainstream discussion about the YouTube Purge.

Even though there are logical, clearly communicated reasons for why certain videos were taken down or removed, it hasn’t stopped cries of censorship. Anthony Fantano, a popular music critic on YouTube who recently started uploading to DTube, told Polygon that he believes YouTube has a right to do what it wants. But he wants YouTube to be clear about its approach.

“I feel like if you don’t want to have conspiracy-based content on the platform because you feel like there’s a moral conundrum there with having a platform that is spreading this misinformation by way of being able to host it, I wish they would come out and say it,” Fantano said. “I just wish YouTube was a little bit more transparent, even going forward, with what they do and don’t want on the site.”

Fantano also said that having more competition will be a “net positive” for the creator community, noting that it bothered him personally that “the competition has sort of become stagnant.”

“It allows content creators to monetize potentially harmful material”

DTube and BitChute offer a very specific kind of competition right now: conspiracy videos and right-wing talking heads. Conspiracy videos created under the false pretense of political observation, which many researchers and academics view as dangerous, are something that YouTube is trying to crack down on.

Jonathan Albright, research director at the Tow Center for Digital Journalism, gathered data to prove just how big YouTube’s conspiracy video problem is.

“From my experience, in the disinformation space, all roads seem to eventually lead to YouTube,” Albright said. “This exacerbates all of the other problems, because it allows content creators to monetize potentially harmful material while benefiting from the visibility provided by what’s arguably the best recommendation system in the world.”

Fantano agrees, to an extent. He calls himself a “free speech purist,” but told Polygon that YouTube needs to come out and say what kind of content belongs on its platform.

“Where YouTube is really failing on this forefront ... I believe that if someone has something to say, and YouTube holds itself to those free speech ideals, then they should allow it,” Fantano said. “But if it’s not, then they should come out and say it.”

There is a difference between free speech and content that spreads hateful or harmful ideologies. It could be argued that videos appearing on DTube and BitChute, for the most part, fall under the latter category. YouTube isn’t purging its creators or waging war on conservative voices, but it is trying to tackle conspiracy videos and make its platform friendly to all. YouTube isn’t trying to stop people from leaving and going to alternative platforms, but it’s certainly not going to let content that it believes promotes bullying or hateful ideologies just live on its platform.

Update: SteemIt CEO Ned Scott sent Polygon a statement after this story was published, saying, “Steemit ensures the website is compliant, and that there are systems in place that allow things like hate speech to be flagged and removed.”