Editors' Note: This story includes references to hate speech and other language that readers may find offensive.

In September, a group of black women penned an impassioned letter to the people who run Reddit titled: "We have a racist user problem and reddit won't take action."

Posted by pro_creator, a moderator of the subreddit /r/blackladies, it was cosigned by the moderators of more than 60 other subreddits.

"Since this community was created, individuals have been invading this space to post hateful, racist messages and links to racist content, which are visible until a moderator individually removes the content and manually bans the user account," the message said.

"reddit admins have explained to us that as long as users are not breaking sitewide rules, they will take no action," the letter added.

Therein lies the issue. Reddit has a hate speech problem, but more than that, Reddit has a Reddit problem.

A persistent, organized and particularly hateful strain of racism has emerged on the site. Enabled by Reddit's system and permitted thanks to its fervent stance against any censorship, it has proven capable of overwhelming the site's volunteer moderators and rendering entire subreddits unusable.

Moderators have pleaded with Reddit for help, but little has come. As the letter from /r/blackladies mentions, the bulk of what racists perpetrate on the site falls within Reddit's few rules. And the site's CEO has made clear, even amid criticism surrounding high-profile events like the celebrity nude leak, that those rules are not going to change.

This has put the front page of the Internet in a tenuous position. Having just completed a funding round, the site is poised to begin monetizing. That will mean convincing advertisers to put ads next to its user-generated content.

It is a situation in which an unstoppable force meets an immovable object. Hate speech on Reddit is proving uncontainable while Reddit refuses to change.

The situation has left moderators — essential cogs in the site's operation — as the site's last line of defense against some of the darkest parts of the Internet.

It is a battle they are losing.

Down with the upvotes

Image: Bob Al-Greene, Mashable

It's just not that hard to manipulate Reddit.

Motivated racists have proven capable of affecting everyone from smaller groups like /r/blackladies to huge subreddits like /r/news, which has more than 3.9 million subscribers.

Reddit relies on a democratic “upvote” and “downvote” system that surfaces or buries content and comments. It’s a system that can be gamed by motivated groups.

Allied redditors can vote en masse to push content and comments to the top of subreddits, a move known as "brigading." This is frowned upon — but it’s not technically against the rules. The site also allows users to quickly create anonymous accounts.

Bands of anonymous, racist users can completely overrun smaller subreddits, which is what happened to /r/blackgirls, a predecessor to /r/blackladies.

“Our sub was created after a previous sub we'd frequented was overrun [by] hate groups,” pro_creator said in an email to Mashable. The user requested anonymity out of fear of “doxxing,” or the public disclosure of personal information online. The abuse “would come in waves as they grew upset with being rejected and banned.”

Racist redditors had previously congregated at /r/n*ggers, a subreddit that was eventually banned for its open attempts to brigade other subreddits, including /r/blackgirls. (Note: A Reddit spokesperson told Mashable after the original publication of this article that this subreddit was banned for allowing posts inciting violence).

A year and a half later, /r/blackladies, which bills itself as "designed specifically to be a safe space for black ladies on Reddit," is dealing with the same problem. Moderators are growing weary.

In addition to the upvote and downvote system, moderators, known as “mods,” are a key part of Reddit. These unpaid volunteers regulate each subreddit, some of which have millions of subscribers. They have the power to block comments and ban users from their particular parts of the site.

In the face of the types of organized attacks that hate groups have mounted on subreddits large and small, those tools are woefully inadequate, moderators say.

Tyler Lawrence, a moderator of a variety of subreddits including /r/news, said that consistent and coordinated attacks have caused him to consider drastic action.

"This has become such a huge issue in /r/news alone that I've at multiple points considered outright closing comment sections to prevent hateful brigading from racist communities within Reddit," Lawrence told Mashable in an email.

Moderators’ pleas have almost entirely fallen on deaf ears. Reddit’s commitment to remaining as open as possible is well documented. Most recently, Reddit CEO Yishan Wong penned a defense of the site’s lack of action concerning its role in disseminating leaked celebrity photos.

Moderators who spoke with Mashable are fatalistic about the site’s future. If Reddit was built in part by the darker corners of the site, why would it change now?

“There's no desire to address the various -isms that have grown to dominate the site, so it doesn't seem like it will be resolved any time soon. Which is unfortunate, because the attitudes displayed by a good number of Reddit's target demographic are firmly on the wrong side of history,” pro_creator wrote. “The site is positioning itself as a playground for racists and misogynists. And if racism and sexism are paying the bills, why would they move against it?”

Reddit declined to respond to questions on this topic.

The beginning

Image: Bob Al-Greene, Mashable

Reddit at its core is a group of communities.

The site's structure and format — relying on the voting system to elevate or bury content and comments — made it the ideal place for users with any number of interests to connect. Reddit now hosts thousands of sections, known as subreddits, and served more than 170 million unique users last month.

Censorship is the site's mortal sin, even when applied to the most odious content. This laissez-faire ideology is an ingrained part of the platform, lending it a certain legitimacy. All are welcome and governed by the same rules.

This led to the site playing host to a certain amount of racism and hate speech. Racism on the Internet preceded Reddit, and it will persist even if the site ever goes away. But there was a relative peace among the various groups, which operated under something of an unspoken detente: You stay in your corner, we stay in ours.

That is until the 2012 shooting of Trayvon Martin by George Zimmerman.

"The Zimmerman trial really stands out in my mind. It served as a rally point for racists everywhere," said Logan Hanks, a former Reddit programmer, in an email to Mashable.

"This manifested on Reddit as a lot of new racist memes popping up here and there, drama around racists squatting on the 'TrayvonMartin' subreddit to mock the African-American community, and an uptick in bullying directed at minority subreddits," he said. "It became a prime opportunity to mock and harass minorities on Reddit."

Since then, a battle has raged between Reddit's corps of volunteer moderators and racist activists.

"After the Zimmerman trial, they were briefly dispersed, but never entirely gone, and this year they've returned as strong and bold as ever," Hanks said.

Battlefield Reddit

Image: Bob Al-Greene, Mashable

Racism is nothing new to the Internet, but rarely has it been so organized and on a platform that can quickly put it in front of millions of users.

Numerous moderators who spoke with Mashable for this story say that hate groups are coordinating to disrupt large, mainstream sections of the site and occupy others.

Moderators can delete posts and comments that violate subreddit rules. Some have taken screenshots of attacks in hopes of providing evidence to admins — Reddit employees that help run the site — and spurring them to take action. Examples can be found here, here and here. There's also evidence of plans to take these efforts to Twitter.

Lawrence, the moderator, sent the following screenshot as an example of the type of action that he has had to deal with on a near-daily basis.

Image: Tyler Lawrence/Reddit Screenshot

Many subreddits have their own rules, enforced by moderators. It is up to them to regulate content and comments with limited tools. They can block users and delete comments, but these efforts are sometimes not enough. While Reddit has 65 employees, it relies on thousands of unpaid mods.

"The tools available to mods mainly offer limited reactive approaches, so they have to monitor submissions 24 hours a day to remove slurs and ban each new account created specifically to bully them," said Hanks, who was known to be a particularly active admin during his time at Reddit.

"Whenever they were hit by a particularly hard deluge they would escalate to us, and sometimes we were able to stem the tide briefly," Hanks said. "If things get too bad, they have to close their subreddit until the bullies and trolls forget about them and move on."

It's also a strain on the mods. Ryan Perkins, a moderator of several subreddits, said in an email that he had lost count of the number of racist commenters he has had to ban.

"This makes moderating any reasonably large subreddit with an eye towards being inclusive actually quite a lot of very emotionally and mentally taxing work," he said.

Racism is only one type of hate speech on Reddit. The site has seen similar battles surrounding misogyny and more recently the GamerGate fiasco.

Reddit has taken some action against organized hate groups. Banning the original hub for anti-black hate speech was a significant step, but one with little lasting impact. Banning a subreddit or a user is among the most aggressive moves Reddit administrators can make, yet it barely changes anything: new subreddits are easily formed, and new usernames easily created.

The moderators Mashable spoke with pointed to “the Chimpire,” a group of subreddits that had become the new hub for hate speech on Reddit.

Two moderators associated with the Chimpire told Mashable through Reddit’s messaging system that brigading was forbidden in their subreddits and denied organized attempts at vote manipulation.

No help in sight

Other platforms that began with a spirit of openness learned to change quickly as they turned into businesses.

Facebook and Twitter each decided, whether for moral or business reasons, that freedom of expression on their platforms has limits. Tumblr cracked down on porn.

Reddit recently announced a $50 million round of funding. It has been eight years since Condé Nast parent company Advance Publications bought Reddit, and it’s no secret that the site is trying to figure out how to monetize.

The recent celebrity photo leak nearly coincided with news of the funding round, putting the site in an awkward position. In this case, Reddit took action: A subreddit called /r/TheFappening, created to host the leaked pictures, was eventually banned.

That move drew no shortage of criticism within Reddit for a perceived double standard.

"The core in this case is the same as the core in the celebrity hacking scandal, except in that instance, they only removed the subreddits once they received significant media coverage and legal pressure," said Lawrence, the /r/news moderator.

Reddit is walking a fine line. The site is trying to be tough on content that could harm its prospects while also catering to users who demand that it retain its anything-goes foundation.

In the calculus between Reddit’s ideals, its business, its users and its moderators, the site seems to have decided that it can most afford to lean on the moderators.

This has left them frustrated and angry, but still redditors for now.

“We are here, we do not want to be hidden,” the letter on /r/blackladies concluded, “and we do not want to be pushed away.”

UPDATE: A Reddit spokesperson who contacted Mashable after the original publication of this article said that brigading violates the site's rules on vote manipulation. Sharing links for the purpose of asking friends to vote is a violation of these rules, though like-minded people can still vote in groups.