Facebook had to assess nearly 54,000 potential cases of revenge pornography and “sextortion” on the site in a single month, according to a leaked document.

Figures shared with staff reveal that in January Facebook had to disable more than 14,000 accounts related to these types of sexual abuse – and 33 of the cases reviewed involved children.

The company relies on users to report most abusive content, meaning the real scale of the problem could be much greater.

But the Guardian has been told that moderators find Facebook’s policies on sexual content the hardest to follow. “Sexual policy is the one where moderators make most mistakes,” said a source. “It is very complex.”



Facebook admitted this was a high priority area and that it was using “image-matching” software to stop explicit content getting on to the site. It also acknowledged it was difficult to draw a line between acceptable and unacceptable sexual content.

A slide showing the number of revenge porn cases Facebook dealt with. Photograph: Guardian

“We constantly review and improve our policies,” said Monika Bickert, ‎ head of global policy management at Facebook. “These are complex areas but we are determined to get it right.”

The company declined to comment on the figures in the document. “We receive millions of reports each week but we do not release individual figures,” it said.

The use of Facebook for the proliferation of pornography, as well as the rise of revenge porn and sextortion, has become one of the biggest challenges for social media groups. They are coming under huge political pressure to do more to keep abusive and illegal content off their platforms or face substantial fines.

Documents seen by the Guardian, which form part of the Facebook Files, show for the first time the detailed rules applied by the company to police sexual content published on the site – as well as the scale of the challenge faced by moderators tasked with keeping Facebook clean.



One slide showed that in January moderators alerted senior managers to 51,300 potential cases of revenge pornography, which it defines as attempts to use intimate imagery to shame, humiliate or gain revenge against an individual.

In addition, Facebook escalated 2,450 cases of potential sextortion – which it defines as attempts to extort money, or other imagery, from an individual. This led to a total of 14,130 accounts being disabled. Sixteen cases were taken on by Facebook’s internal investigations teams.

One 53-slide document explains Facebook has introduced two “hotkeys” for moderators to help them quickly identify potential cases of sextortion and revenge porn, which it refers to as “non-consensual intimate imagery”.

Besides these two areas, which Facebook ranks alongside child exploitation and terrorism in importance, the Facebook Files set out various issues facing the service when it comes to sexual content.

They explain that the social media site allows “moderate displays of sexuality, open-mouthed kissing, clothed simulated sex and pixelated sexual activity” involving adults. The documents and flowcharts then set out what is permitted on Facebook in detailed sub-categories called “arousal”, “handy-work”, “mouth work”, “penetration”, “fetish” and “groping”.

The use of sexualised language is also addressed. Facebook decides whether to allow or ban remarks based on the level of detail they contain.

One Facebook document, titled Sexual Activity, explains it is permitted for someone to say: “I’m gonna fuck you.” But if the post adds any extra detail – for instance, where this might happen or how – it should be deleted if reported.

According to this 65-slide manual, other general phrases allowed on Facebook include: “I’m gonna eat that pussy”; and “Hello ladies, wanna suck my cock?”


Facebook also allows sexual references that have a “humorous context”. The example it uses to illustrate the point involves a joke about a little boy interrupting his parents having sex. Facebook said some of these examples “appear to be out of date”, but it declined to say which ones or when the policy had changed.

Until recently Facebook had allowed comments such as “I’d like to poke that bitch in the pussy” and “How about I fuck you in the ass girl?” Asked specifically about these comments, Facebook said it would now remove them if they were reported.

“Not all disagreeable or disturbing content violates our community standards,” said Facebook. “For this reason we offer people who use Facebook the ability to customise and control what they see by unfollowing, blocking or hiding posts, people, pages and applications they don’t want to see.

“We allow general expressions of desire but we don’t allow sexually explicit detail.”

The files also show Facebook is constantly updating certain policies – reacting to criticism that it has been too slow to delete some sexually graphic content, while simultaneously being too strict about other material.



Last September, Facebook was condemned for removing the Pulitzer-prize-winning “Napalm girl” photograph from the Vietnam war because it showed a naked child. After a row over censorship, Facebook relented.

The files, seen by the Guardian, reveal that Facebook has tried to avoid similar situations arising again by issuing fresh rules. One document explains that under Facebook’s new “terror of war” guidelines, there are “newsworthiness exceptions”.

Though the documents do not define newsworthiness, they say Facebook now allows, among other things, “photographs of naked babies so young they clearly cannot stand unless the photo closes in on the baby’s genitals … [and] images of adult nudity in the context of the Holocaust”.

However, Facebook says images from the Holocaust depicting naked children should be removed if users complain.

The Guardian has been told moderators are struggling to make sense of other guidelines on sexual imagery. Under these rules, Facebook says it will “allow all handmade and digital nudity … [and] allow handmade sexual activity”. But moderators are told to “remove digital sexual activity” if reported.

However, the accompanying slides make clear it is sometimes difficult to draw a distinction between the two. One allowed artwork shows a topless woman riding on a giant, erect penis.


The document explains: “We allow nudity when it is depicted in art like paintings, sculptures, and drawings. We do not allow digitally created nudity or sexual activity. We drew this line so that we could remove a lot of very sexual digital nudity, but it also covers an increasing amount of non-sexual digitally made art. The current line is also difficult to enforce because it is hard to differentiate between handmade art and digitally made depictions.”

In an earlier document, Facebook moderators had been warned to delete images of Giambologna’s 16th-century statue the Kidnapping of the Sabine Women in the Loggia dei Lanzi in Florence if reported.

They were also told to delete, if reported, images of the Rape of Europa – paintings that depict the mythological story of the abduction of Europa by Zeus. The updates seen by the Guardian do not make clear whether these images are allowed or not.

Facebook has also developed detailed policies around “sexual solicitation” on the site. According to its rules, providing contact information is allowed, and solicitation using acronyms is also permissible.

But if the post includes any extra information – such as mentioning sexual acts “in a non medical/ scientific/ educational context” – then the post should be deleted if it is flagged up.

Facebook said it was “building better tools to keep our community safe”, adding: “We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”