Think being a Facebook moderator would be an easy job? You might think again after hearing about the company's exhaustive and sometimes contradictory content moderation guidelines.

The folks over at The Guardian got their hands on and posted snippets of Facebook's "internal rulebook on sex, terrorism, and violence." Facebook moderators use these guidelines — which cover everything from revenge porn to self-harm, hate speech, and racism — to determine what type of content to allow on the social network, and what to delete.

The guidelines, for instance, say that the comment "someone shoot Trump" is not permissible and should be deleted, according to the report. Other disturbing and violent comments, however, such as "fuck off and die" and — brace yourself for this next one — "to snap a bitch's neck, make sure to apply all your pressure to the middle of her throat" are accepted on the platform.

Many of the rules have caveats, The Guardian discovered, based on its analysis of more than 100 secret training manuals, spreadsheets, and flow charts. Videos of abortions are okay, as long as they don't show nudity, the report notes. Art showing nudity and/or sexual activity is permissible, but only if it's "handmade" and not created digitally. Images of children being physically abused or bullied are allowed, so long as the photos are not sexual, sadistic, or celebratory.

"Many moderators are said to have concerns about the inconsistency and peculiar nature of some of the policies," The Guardian reported. "Those on sexual content, for example, are said to be the most complex and confusing."

Monika Bickert, Facebook's head of global policy management, said in a statement to PCMag that keeping people safe on the platform is "the most important thing we do."

"We work hard to make Facebook as safe as possible while enabling free speech," Bickert wrote. "This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously."

The leak comes after Facebook last month pledged to "do better" after a man in Cleveland shot and killed an elderly individual and posted a video of the murder on the social network. Following that incident, Facebook CEO Mark Zuckerberg said the company will be adding 3,000 people to its community operations team to "review the millions of reports we get every week, and improve the process for doing it quickly." That's on top of the 4,500 reviewers Facebook currently employs.

The Cleveland murder followed several disturbing incidents captured on Facebook Live, from shootings to sexual assault. The social network has also had to grapple with teens and tweens live streaming their own suicides; Facebook has since integrated its suicide prevention tools into Live.


The Guardian's so-called "Facebook files" reveal that the company sometimes purposefully allows controversial content — such as videos of violent deaths and photos of animal abuse — to help raise awareness of certain issues. The company advises its moderators to mark all videos of violent deaths as "disturbing." That tag is only used for the most "extremely upsetting" images of animal abuse, however.

Other permissible comments on the social network, according to the report, include: "kick a person with red hair," "let's beat up fat kids," "little girl needs to keep to herself before daddy breaks her face," "you assholes better pray to God that I keep my mind intact because if I lose I will literally kill HUNDREDS of you," and "unless you stop bitching I'll have to cut your tongue out."

Meanwhile, in addition to its human reviewers, Facebook uses automated systems to assist enforcement of these rules. The company also relies on community members to report questionable content, and has promised to make it easier for people to do that.

"We're going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help," Bickert wrote.
