On Wednesday, the website Reddit, known as the “front page of the internet,” banned 11 communities that advocated or glorified racism, ethno-nationalism, and violence. The communities kicked off the site included r/NationalSocialism, r/EuropeanNationalism, r/Nazi, and r/DylannRoofInnocent, as well as a variety of smaller subreddits.

“We will take action against any content that encourages, glorifies, incites or calls for violence or physical harm against an individual or a group of people,” the website said in a statement.

Reddit’s new policy follows similar steps taken by Twitter, which recently outlined more aggressive rules against hateful imagery and hate speech on its site. The announcement also came hot on the heels of a Daily Beast story that reported that Lane Davis, a 33-year-old Redditor known as Seattle4Truth, had stabbed his father to death after his father accused him of being a Nazi and a racist. Davis was a prolific poster on r/The_Donald, one of the main online forums for Trump supporters, and had previously interned for Milo Yiannopoulos.

Naturally, Reddit’s decision to ban certain communities drew the ire of some users, who asked why left-leaning subreddits like r/Socialism and r/Communism hadn’t been banned as well, even though those communities do not specifically and regularly glorify violence against minorities, as outlined in Reddit’s policy change. Users on the social media site Gab, a platform popular with white nationalists, also decried the move, saying it was evidence that white nationalists needed to move away from traditional social media platforms or post more selectively, to avoid tipping off other users to their white nationalist sentiments.

The new bans on Reddit, which is used by more than 172 million people per month, are another example of tech companies finally abandoning their hands-off approach in favor of a more combative stance against hate speech. There is empirical evidence that such bans work: when Reddit banned toxic subreddits like r/coontown and r/fatpeoplehate in 2015, researchers found that hate speech decreased by as much as 80 to 90 percent among some affected users.

However, simply banning users from popular websites and social media platforms won’t erase white nationalists’ ability to meet and coordinate online. While media attention often focuses on the likes of Facebook, Twitter, and YouTube, white nationalists often gather first on less-popular websites like Gab and the ever-anarchic 4chan, which was a key site in facilitating the rise of online white nationalism. Prior to the violent white supremacist rally in Charlottesville, Virginia earlier this year, the chat app Discord was also extremely popular among white nationalists. Today, a quick visit to Gab or 4chan shows that racist sentiments still thrive on those platforms.

A report from earlier this week by the Institute for Strategic Dialogue highlighted the online savvy of racist groups, noting how white nationalists’ online growth enabled divided and ideologically disparate groups to connect and coordinate their efforts. As an example, the report cites the recent German elections, where white nationalist groups used social media and memes to help mobilize voters for the far-right AfD party.

“They’re not all knuckleheads,” author Jacob Davey told VICE. “The big thing here is that they are increasingly becoming more and more sophisticated… they’re constantly learning from each other.”

The report also found that current attempts to counter white nationalist speech lag severely behind. “Counter-speech measures must go beyond popular social media platforms,” it read. “They must penetrate alternative platforms and burst extreme-right bubbles with campaigns that build on a thorough understanding of internet culture and counter-culture.”