Facebook was forced to report the BBC to the UK's National Crime Agency (NCA) after the broadcaster shared with the company screenshots of "sexualised images of children" that it had copied from the site, Ars understands.

On Tuesday, Facebook was bombarded with criticism after the BBC claimed that the free content ad network had failed to nix 82 images, even though they appeared to clearly break the firm's own "community standards" rules.

Ars has learned that Facebook had requested links to the offending material from the BBC—which reportedly included "pages explicitly for men with a sexual interest in children" and "an image that appeared to be a still from a video of child abuse, with a request below it to share 'child pornography'"—but instead the broadcaster provided screenshots taken from the site.

In its story, the BBC said: "When provided with examples of the images, Facebook reported the BBC journalists involved to the police and cancelled plans for an interview."

However, the BBC's investigation may have been undermined because it appears to have fudged the procedure for handling such atrocious and illegal material.

Ars has asked the BBC if it followed the Crown Prosecution Service (CPS) and Association of Chief Police Officers' (ACPO) guidelines on exemptions relating to individuals who "are acting to combat the creation and distribution of images of child abuse."

The BBC declined to respond directly to our questions, however, and instead issued the following statement:

In 2016 a BBC investigation found that paedophiles were using secret groups on Facebook to post and swap sexually suggestive images of children, which led to one man receiving a four-year prison sentence. One year later, in this follow-up investigation, we found similar images still on the site, which we reported using Facebook's own moderation procedures. Some of the images were then removed, but the majority remained on the site. When the BBC approached Facebook with its findings, Facebook agreed to an interview on the condition the BBC provided examples of the remaining material which had been deemed acceptable by Facebook's own moderation procedure. The BBC provided that evidence to Facebook.

According to the CPS, extreme care must be taken with illegal material that shows children being sexually abused. Its guidelines state: "Investigation should not involve making more images, or more copies of each image, than is needed in all the circumstances."

The BBC could have sought help from the NCA or ACPO on how to handle the material; the CPS says such an arrangement "will give additional certainty to individuals and organisations who are likely to need, frequently, to 'make' indecent photograph or pseudo-photograph and, provided the conditions were adhered to, such activities would not be subject to a criminal investigation as it would not be in the public interest to prosecute."

But it's unclear whether the corporation did this prior to taking the copied images to Facebook. Facebook's policy director Simon Milner said in a statement:

We have carefully reviewed the content referred to us and have now removed all items that were illegal or against our standards. This content is no longer on our platform. We take this matter extremely seriously and we continue to improve our reporting and take-down measures. It is against the law for anyone to distribute images of child exploitation. When the BBC sent us such images we followed our industry’s standard practice and reported them to CEOP [Child Exploitation & Online Protection Centre]. We also reported the child exploitation images that had been shared on our own platform. This matter is now in the hands of the authorities.

Ars asked the Internet Watch Foundation to comment on this story. We wanted to know whether social networks pose a challenge or barrier to the charity's efforts to remove illegal images of child sex abuse. It told us:

We act on material that is criminal and fails UK law. Typically, this will be of children aged 10 and under. Even the lowest level images will involve serious sexual abuse. We have 130 members who work closely with us and these include social media companies. We have been working with Facebook since 2009 and most recently they participated in a pilot programme to create and implement The IWF Image Hash List. This is a list of digital fingerprints of known images of child sexual abuse which they can deploy across their service to ensure that any duplicate images are not uploaded in the first place. In 2016 less than 1 percent of the criminal images and videos actioned by the IWF were on social media platforms.

Facebook has been downloading the IWF's hash list daily since August 2015. But the organisation's chief executive Susie Hargreaves warned: "We are never complacent, the reality is that new images and videos of children being sexually abused appear every day."
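The hash-list approach the IWF describes can be sketched roughly as follows. This is a minimal illustration using an exact cryptographic hash (SHA-256); the fingerprints, function names, and sample values here are hypothetical, and real deployments typically use perceptual image hashes so that resized or re-encoded copies of a known image still match, not just byte-identical duplicates.

```python
import hashlib

# Hypothetical set of known-bad fingerprints (hex SHA-256 digests).
# In a real deployment this would be the hash list supplied by the IWF.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_blocked(upload_bytes: bytes) -> bool:
    """Return True if an upload matches a fingerprint on the hash list,
    so the service can reject it before it is ever stored or shared."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# The digest above happens to be the SHA-256 of b"test", standing in
# for a known image; a non-matching upload passes through.
print(is_blocked(b"test"))      # True  -> upload rejected
print(is_blocked(b"harmless"))  # False -> upload allowed
```

The limitation of an exact-hash check is why the IWF calls its entries "digital fingerprints": a single pixel change defeats a cryptographic hash, whereas a perceptual fingerprint is designed to survive common transformations of the same image.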