FACEBOOK wants to stop revenge porn before it happens — by getting people to send in their own naughty sex pictures.

The social network is testing a system which uses image recognition to identify revenge porn and automatically delete it, reports The Sun.

But this can only work if Facebook has copies of the dirty pictures in the first place.

Zuckerberg’s firm is running a pilot scheme in Australia and has teamed up with the Office of the eSafety Commissioner to trial its new idea.

Julie Inman Grant, eSafety Commissioner, said people who were worried about “image-based abuse” should give Facebook their sexy pics.

“It would be like sending yourself your image in email, but obviously this is a much safer, secure end-to-end way of sending the image without sending it through the ether,” she told ABC.

She reassured potential revenge porn victims that Facebook wouldn’t hold on to the pictures themselves.

“They’re not storing the image, they’re storing the link and using artificial intelligence and other photo-matching technologies,” Grant said.

“So if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded.”
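The idea Grant describes can be sketched in a few lines: store only a fingerprint (hash) of each reported image, then refuse any upload whose fingerprint matches. This is an illustrative toy using an exact cryptographic hash — the `fingerprint`, `report_image` and `allow_upload` names are hypothetical, and real photo-matching systems use perceptual hashes that still match after resizing or re-encoding, which a plain SHA-256 does not.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Exact-match fingerprint. Production photo-matching uses perceptual
    # hashing so near-duplicates (resized, re-compressed copies) still match.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical blocklist: only fingerprints are kept, not the images,
# mirroring the "storing the hash, not the image" claim in the article.
blocked_hashes: set[str] = set()

def report_image(image_bytes: bytes) -> None:
    # A victim reports an image; record its fingerprint and discard the bytes.
    blocked_hashes.add(fingerprint(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    # Block any upload whose fingerprint matches a reported image.
    return fingerprint(image_bytes) not in blocked_hashes
```

In this sketch, once an image has been reported, any byte-identical copy is rejected at upload time while every other image passes through untouched.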

In a statement, Antigone Davis, Facebook’s head of global safety, said “the safety and wellbeing of the Facebook community is our top priority”.

“As part of our continued efforts to better detect and remove content that violates our community standards, we’re using image matching technology to prevent non-consensual intimate images from being shared on Facebook,” she continued.

“These tools, developed in partnership with global safety experts, are one example of how we’re using new technology to keep people safe and prevent harm.”

This story first appeared on The Sun.