Leaked guidelines on cruel and abusive posts also show how company judges who ‘deserves our protection’ and who doesn’t

Facebook has only recently banned users from posting photos and images mocking people for having illnesses and other serious health conditions, the Guardian can reveal.

The company said the policy had changed in recent months but declined to give details.

Facebook responded after the Guardian presented it with examples from leaked internal documents advising moderators to ignore certain images mocking people with disabilities. The manuals included photos of people with Down’s syndrome.

The company said these images were “not allowed” and would now be taken down.

Under the new guidelines, any photograph of a person with a disability and a mocking caption will be removed if reported by users. But mocking remarks on their own do not have to be deleted.

“We ask people who discuss these topics in an insensitive way to do so transparently so that others may engage with or challenge them,” Facebook said. “If you make these jokes on Facebook you cannot do so anonymously. We force you to publish your name next to it, otherwise we unpublish the page.”

Documents also show the site allows the “sharing of footage of physical bullying” of children under seven, as long as there is no caption.

The social media group has ruled that anyone with more than 100,000 followers on a social media platform is a public figure, with “no exceptions for minors”.


The details appear in documents that show how Facebook attempts to deal with cruel, insensitive and abusive posts on the site.

The training manuals for moderators say Facebook regards bullying as “an attack on private persons with the intent to upset or silence them”. But they add that you are only “a ‘private person’ if you are not a public figure”.

According to the documents, public figures include politicians, journalists, people “with 100,000 fans or followers on one of their social media accounts”, or people “who are mentioned [by name or title] in the title or subtitle of five or more news articles or media pieces within the last two years”.

Under the headline “People excluded from protection”, one document adds: “We want to exclude certain people who are famous or controversial in their own right and don’t deserve our protection.”

The types of groups and individuals excluded from protection include Jesus, the mass murderer Charles Manson, Osama bin Laden, rapists and domestic abusers, any political and religious leaders before 1900 and people who violate hate speech rules.

The documents say stars such as the singer Rihanna can be protected if the posts about them include their photo, and a caption that matches a “cruelty topic”.

The slides explain: “Rihanna is famous in her own right for being a singer. She was also a victim of domestic violence. You can mock her for her singing, but not for being a victim of domestic violence.”

However, the documents state moderators do not automatically have to delete a flagged post that says: “Rihanna, why are you working with Chris Brown again? Beats me” – as long as the image used in the post is not of the singer.

Monika Bickert, head of global policy management at Facebook, said: “We allow more robust speech around public figures, but we still remove speech about public figures that crosses the line into hate speech, threats, or harassment. There are a number of criteria we use to determine who we consider a public figure.”

The documents tell moderators to automatically remove reported posts only when they combine a photo of an individual with an abusive caption about them.

For instance, a picture of an unidentified woman, taken from behind, who has blood on the back of her trousers, can be left on the site because, the slide says, “menstruation and no additional context = ignore”.


Another document describes what is allowed around physical bullying. It says “sharing footage of physical bullying where no further commentary” is made is allowed. Moderators should also ignore images of physical bullying of a child under seven even if they include unkind commentary – such as laughter and name-calling.

One slide shows a picture of a well-known American wrestler angrily tearing off his shirt, with the caption: “When you find out your daughter likes black guys.” Facebook says this can be ignored – because the wrestler is not black.

Another photo shows parents and children fleeing a school shooting in the US, with a caption that reads: “Hope the parents kept their Christmas receipts.” Facebook says this does not have to be deleted because the children in the photograph “are the survivors not the victims. If the people depicted in the image were the victims the right action would have been to delete”, if users complained.

Facebook said: “We recognise that … [some] jokes might be considered to be cruel and insensitive and the line between satire, humour, and inappropriate content can be grey.”


Facebook also insisted it does not allow the mocking of people with disabilities.

That is a U-turn on policy set out in documents that were given to moderators within the last 12 months.

Previously moderators were told: “We don’t delete photos where there’s a PDITI [person depicted in the image] that’s being mocked for having a serious disease or disability.”

The document included pictures of people with Down’s syndrome accompanied by mocking captions; moderators were told they did not have to remove these posts if reported.

Asked about this, Facebook said the policy no longer applied and had been “changed in recent months”. A spokesman said: “These images are not allowed.”