Damian Collins, chairman of the culture committee, called Facebook's failure to moderate the content 'very disturbing' after an investigation



Facebook was in the dock last night over its refusal to remove dozens of disturbing sexualised images of children.

MPs vowed to make the social network’s bosses explain their ‘appalling’ failure after it said that a string of offensive pictures did not breach its community standards.

The site’s moderators approved numerous photographs of under-16s posing provocatively, as well as stolen images of children published on a Facebook page called ‘hot xxxx schoolgirls’.

One image appeared to be a freeze-frame from a child abuse video.

Many of the images were posted on private Facebook groups aimed at paedophiles eager to share content. They were often accompanied by obscene comments and – in one case – a specific request to share ‘child pornography’.

The BBC used Facebook’s reporting button to raise the alarm about 100 of these sexualised images of children – automatically sending them to the social network’s moderators for inspection.

But despite repeated pledges by Facebook to remove ‘any sexual content involving minors’, the moderators only agreed to remove 18 of them.

It said the other 82 could stay. The technology giant only agreed to remove them all when the whistleblowing journalist, Angus Crawford, contacted Facebook executives.

Facebook asked the reporter to send over the photographs he thought were problematic – then reported him to the police for doing so.

One MP said it was extraordinary that the BBC had been reported when it was trying to ‘help clean up the network’.

Yesterday, Facebook admitted its moderators made a mistake but repeatedly refused to explain what went wrong.

It simply issued a statement from its policy director Simon Milner confirming that the images had now gone.

‘We have carefully reviewed the content referred to us and have now removed all items that were illegal or against our standards. This content is no longer on our platform.

'We take this matter extremely seriously and we continue to improve our reporting and take-down measures.

'Facebook has been recognised as one of the best platforms on the internet for child safety,’ he said.

Reporters in the BBC probe highlighted 100 images that they thought breached Facebook's rules, but moderators initially agreed to remove only 18 of them. They included pictures that sexualised children

MPs and charities condemned Facebook’s failures as unacceptable and called for action against it.

They also urged the police to prosecute the social network over any illegal images on its website, and suggested a change in the law to make it harder for such companies to ignore dangerous content.

An NSPCC spokesman said: ‘Facebook’s failure to remove illegal content from its website is appalling and violates the agreements they have in place to protect children.

Leering over snaps of girls

Dozens of the images the BBC reported to Facebook were stolen pictures of ordinary teenagers circulated by paedophiles. They featured pubescent girls in 'highly sexualised poses', often partly naked or arranging themselves provocatively while wearing school uniform.

In one image, two teenage girls are seen hugging while almost naked. In another, a pubescent girl is pictured in an ill-fitting pink trainer bra.

The teenagers appear to have posed willingly for the pictures – but with no idea of how they would be used. Instead of being shared among their friends, they were hijacked by groups of predatory men who use Facebook groups to share their taste for under-age girls.

There they leave obscene comments about the teenagers, and some openly discussed sharing more serious child abuse material.

One of the most extreme pictures deemed acceptable by Facebook appeared to be a freeze-frame from a child abuse video. It was published alongside a request to share 'child pornography'.

'It also raises the question of what content they consider to be inappropriate and dangerous to children.’

Anne Longfield, the Children’s Commissioner for England, said Facebook’s behaviour was ‘deeply disappointing and deeply disturbing’.

‘I find it hard to believe that individuals at Facebook had seen these images and made a decision that they were okay and hadn’t breached their community rules,’ she said.

‘They were very explicit, they were very sexualised photos of children and some of them clearly had been taken without the children knowing.’

Labour MP Helen Goodman, a former shadow culture minister, said Facebook – and not the BBC journalist – should answer to the police.

‘It sounds as if these images are in breach of legislation and so I think it’s a law-enforcement issue,’ she said.

Tory MP Damian Collins vowed to haul Facebook bosses before the powerful culture, media and sport select committee, which he chairs.

‘We need to understand what has gone wrong,’ he said.

‘Facebook has a social responsibility to tackle this.’

He added that there could be grounds for ‘creating a new offence of knowingly failing to act’.

Labour’s Chris Matheson, another member of the culture committee, said: ‘Facebook’s response is unbelievable.

'There should be zero tolerance of any sexualised images of children, let alone images of sex abuse.'

Facebook also did not remove the images of five convicted paedophiles that were reported to it. And when the BBC sent Facebook the suspect images, the company reported the journalists to the police for sharing them

Mystery over how many staff the firm employs to check content

Facebook users are able to flag inappropriate content by clicking the 'report' button, accessible via the small downward arrow in the top right-hand corner of each Facebook post.

A multiple-choice questionnaire then appears, asking the user why the content they have flagged should not be on Facebook. It could be because it features 'nudity or pornography', because it 'humiliates' someone, because it is 'inappropriate, annoying or not funny', or because it is a photograph of themselves or their family that they do not want shared on the social network.

When the user clicks one of those boxes, it automatically generates a report and sends it to Facebook's 'thousands' of moderators around the world. One of the moderators then assesses the post against the social network's guidelines and decides whether it needs taking down. If it is allowed to remain, the user who complained is automatically sent a message stating that it did not breach 'community standards'.

According to the social network, the moderators are 'highly trained experts who provide 24/7 cover'. It would not disclose how many of them are in the UK, but some are based in Dublin, where Facebook has its European headquarters.

A spokesman for the technology giant said: 'We continue to refine the way we implement our policies to keep our community safe, especially for people that may be vulnerable or under attack.

'Facebook is constantly improving its reporting and reviewing system so we can give people free expression on Facebook, but so we also remain a safe community.'

The National Crime Agency, which runs the Child Exploitation and Online Protection Centre (CEOP), said it is ‘vital’ social media platforms report and remove indecent content.

The agency would not disclose whether it is investigating the BBC or the whistleblowing journalist.

Mr Milner said: ‘It is against the law for anyone to distribute images of child exploitation.

'When the BBC sent us such images we followed our industry’s standard practice and reported them to CEOP.

'We also reported the child exploitation images that had been shared on our own platform.'