New photo-matching technology that allows users to easily report intimate pictures posted without consent has been praised by campaigners

Facebook is launching a series of tools designed to crack down on the sharing of so-called revenge porn.

The new tools will allow users to easily report any intimate photos posted without consent that they see on the social network. Flagged pictures will be reviewed by “specially trained representatives” from the site’s community operations team, who will “review the image and remove it if it violates [Facebook’s] community standards”.

Facebook will also use “photo-matching technologies” for pictures the site is already aware are being shared non-consensually. This is likely to be similar to the PhotoDNA image hashing system, which is already used to identify child abuse imagery and terrorist material, and prevent further sharing.
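PhotoDNA’s actual algorithm is proprietary, but perceptual hashing systems of this kind share a general idea: reduce an image to a compact fingerprint that survives resizing and re-compression, so near-duplicates can be found by comparing fingerprints rather than pixels. A minimal illustrative sketch, using the simple “difference hash” (dHash) technique rather than PhotoDNA itself, and assuming the image has already been decoded and downscaled to a small 9×8 grayscale grid:

```python
def dhash(pixels):
    """Hash an 8-row x 9-column grayscale grid into a 64-bit fingerprint.

    Each bit records one horizontal gradient: is the pixel brighter than
    its right-hand neighbour? Gradients survive rescaling and mild
    re-compression, which is what makes the hash "perceptual".
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming_distance(h1, h2):
    """Number of differing bits; small distances mean similar images."""
    return bin(h1 ^ h2).count("1")
```

Two copies of the same picture hash to the same value (distance 0), while a lightly altered copy differs in only a few bits, so a small distance threshold can catch re-uploads that are not byte-identical.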

The site will also disable accounts for sharing such images “in many cases”.

Antigone Davis, Facebook’s head of global safety, said: “These tools, developed in partnership with safety experts, are one example of the potential technology has to help keep people safe. Facebook is in a unique position to prevent harm, one of our five areas of focus as we help build a global community.”

Davis cited CEO Mark Zuckerberg’s manifesto about the future of Facebook, in which the company’s chief executive wrote: “As we build a global community, this is a moment of truth. Our success isn’t just based on whether we can capture videos and share them with friends. It’s about whether we’re building a community that helps keep us safe — that prevents harm, helps during crises and rebuilds afterwards.”

In April 2015 it was made an offence in England and Wales to share private sexual images or video without the subject’s consent, and according to the most recent Violence Against Women and Girls report, 206 people were prosecuted for such offences in the law’s first year.

In a statement provided by Facebook, Laura Higgins, the founder of the Revenge Porn Helpline, UK, supported the changes. “We are delighted with the announcement made by Facebook today,” she said.

“This new process will provide reassurance for many victims of image-based sexual abuse, and dramatically reduce the amount of harmful content on the platform,” Higgins added. “We hope that this will inspire other social media companies to take similar action and that together we can make the online environment hostile to abuse.”

Legally, “revenge pornography” is treated very differently in different jurisdictions, with the key question often coming down to whether the image is a “selfie”, in which case a copyright claim may be brought, or was taken by the person who posted it. But some nations and states have updated their statutes to create a special class of offence: in California, for instance, revenge porn is defined as posting explicit images taken “under circumstances where the parties agree … that the image shall remain private”. The subject must also suffer “serious emotional distress”.

Last year, a Belfast 14-year-old launched a legal case against Facebook over the publication on the site of a naked image, which the girl’s parents say was extracted from her through blackmail. The trial, which Facebook attempted to have thrown out of court in early October, focused on the fact that the photograph was repeatedly removed from the “shame page” on which it was posted, only to be reposted each time, with Facebook doing nothing to permanently block the image, the girl’s lawyers say.

The new photo-matching technologies would have helped prevent the repeated reposting of an already-blocked image. But they can do little to pre-emptively block the first posting. Technologically, such systems rely on matching images to a pre-existing database of images intended to be blocked. To highlight a revenge porn image posted for the first time, a user still needs to flag it as such to Facebook.
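The database-matching step described above can be sketched as follows. This is an illustrative guess at the shape of such a system, not Facebook’s actual implementation: a newly uploaded image’s perceptual hash is compared against the hashes of images already flagged for removal, with a small bit-distance threshold to catch re-encoded or resized copies. The function name and threshold are assumptions for the sake of the example.

```python
def is_blocked(upload_hash, blocked_hashes, max_distance=5):
    """Return True if the upload's 64-bit perceptual hash is within
    max_distance bits of any hash in the known-blocked set."""
    return any(
        bin(upload_hash ^ known).count("1") <= max_distance
        for known in blocked_hashes
    )


# One previously flagged image; its hash is illustrative, not real data.
blocked = {0xF0F0F0F0F0F0F0F0}

is_blocked(0xF0F0F0F0F0F0F0F1, blocked)  # near-duplicate: one bit differs
is_blocked(0x0123456789ABCDEF, blocked)  # unrelated image: many bits differ
```

The limitation noted above falls straight out of the design: an image that is not yet in the blocked set matches nothing, so the very first posting can only be caught by a human report.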

The site also removes images containing consensually posted nudity, which is against its terms of service. It has faced accusations of applying the rule too broadly, with breastfeeding mothers and mastectomy survivors among those who have complained after photos were removed.