Revenge-Porn ‘Deepfakes’ Are Here To Spoil Humanity

AI-generated pornography – known as “deepfakes” – is becoming more convincing, seamless and real. People with rudimentary computing knowledge can now use artificial intelligence to swap the faces of actors in pornographic videos with those of people they know. Welcome to a new, terrifying era of revenge porn.

In January 2018, a new app was released that gave users the ability to swap out faces in a video with a different face obtained from another photo or video – similar to Snapchat’s “face swap” feature. It’s an everyday version of the kind of high-tech computer-generated imagery (CGI) we see in the movies.


You might recognise it from the cameo of a young Princess Leia in the 2016 Star Wars film Rogue One, which combined the body of another actor with footage from the first Star Wars film, created 39 years earlier.

A Reddit user created another version of young #PrincessLeia from #RogueOne in #FakeApp, which looks more realistic. This tech is crazy! https://t.co/FftqcepVYp pic.twitter.com/fUOad5k9ft — 80 LEVEL (@EightyLevel) January 30, 2018

Now, anyone with a high-powered computer, a graphics processing unit (GPU) and time on their hands can create realistic fake videos – known as “deepfakes” – using artificial intelligence (AI).

Sounds fun, right?

The problem is that these same tools are accessible to those who seek to create non-consensual pornography of friends, work colleagues, classmates, ex-partners and complete strangers – and post it online.

The evolution of deepfakes

In December 2017, Motherboard broke the story of a Reddit user known as “deepfakes”, who used AI to swap the faces of actors in pornographic videos with the faces of well-known celebrities. Another Reddit user then created a desktop application called FakeApp.

It allows anyone – even those without technical skills – to create their own fake videos using Google’s TensorFlow open source machine learning framework.

The technology uses an AI method known as “deep learning”, in which a computer is trained on large amounts of data and then uses what it has learned to make decisions. In the case of fake porn, the software assesses which facial images of a person will be most convincing when swapped into a pornographic video.

Known as “morph” porn, or “parasite porn”, fake sex videos or photographs are not a new phenomenon. But what makes deepfakes a new and concerning problem is that AI-generated pornography looks significantly more convincing and real.

Another form of image-based sexual abuse

Creating, distributing or threatening to distribute fake pornography without the consent of the person whose face appears in the video is a form of “image-based sexual abuse” (IBSA). Also known as “non-consensual pornography” or “revenge porn”, it is an invasion of privacy and a violation of the right to dignity, sexual autonomy and freedom of expression.

In one case of morph porn, an Australian woman’s photos were stolen from her social media accounts, superimposed onto pornographic images and then posted on multiple websites. She described the experience as causing her to feel:

physically sick, disgusted, angry, degraded, dehumanised

Yet responses to this kind of sexual abuse remain inconsistent, and regulation is lacking in Australia and elsewhere.

Recourse under Australian criminal law

South Australia, NSW, Victoria and the ACT have specific criminal offences for image-based sexual abuse, with penalties of up to four years’ imprisonment. South Australia, NSW and the ACT explicitly define an “intimate” or “invasive” image as including images that have been altered or manipulated.

Jurisdictions without specific criminal offences could rely on more general criminal laws. These include the federal telecommunications offence of “using a carriage service to menace, harass or cause offence”, and state and territory offences such as unlawful filming, indecency, stalking, voyeurism or blackmail.

But it is unclear whether such laws would apply to instances of “fake porn”, meaning that the criminal law currently provides inconsistent protection for victims of image-based sexual abuse across Australia.

Recourse under Australian civil law

Victims have little recourse under copyright law unless they can prove they own the image, and it is unclear whether that means the owner of the face photograph or the owner of the original video. They may have better luck under defamation law, where the plaintiff must prove that the defendant published false and disparaging material that identifies them.

Pursuing civil litigation, however, is time-consuming and costly, and it will do little to stop the spread of non-consensual nude or sexual images on the internet. Moreover, Australian civil and criminal laws will be ineffective if the perpetrator is located overseas or publishes content anonymously.