
The rise of fake, AI-generated porn was, perhaps, inevitable. A small community on Reddit has created and fine-tuned a desktop application that uses machine learning to morph non-sexual photos and transplant them seamlessly into pornographic videos.

The FakeApp, created by Reddit user DeepFakesApp, uses fairly rudimentary machine learning technology to graft a face onto still frames of a video and string a whole clip together. To date, most creations are short videos of high-profile female actors.


The app was first reported on by Motherboard. The technology has been toyed around with on Reddit since late last year – but the desktop app that lets anyone create their own falsified pornographic videos was only released earlier this week. At the time of writing, 29,000 people are subscribed to the subreddit.

"If you think about technological developments, it's not a surprise," says Clare McGlynn, a professor at the Durham Law School who specialises in pornography regulations and sexual abuse images. "Because of the availability of this technology it is probably happening a lot more than we are aware of."


The FakeApp software is currently hosted on Google Drive. A separate mirror of the download has also been uploaded to Mega. The person behind the Google Drive account responded to an initial request for an interview but had not answered any questions at the time of publication. Google had also not responded to questions about whether hosting the software on its servers violates its terms.

Updating the law

While it appears no videos containing non-celebrities have been created yet, there are plenty of examples of non-consensual sexual images created by doctoring static photos. "If people don't know it is Photoshopped, they are assuming that this is an image of yourself," McGlynn says. "The abuse can be the same, the harassment can be the same, the adverse impact on your family and employer. I don't think there's a basis on which to say the harm is less because it is Photoshopped." AI-generated fake pornographic videos will only worsen this issue.


Earlier this month, an Australian man was sentenced to 12 months in jail after Photoshopping images of his stepdaughter onto images of women engaged in sexual acts and bestiality. The man described them as artworks.

A North Jersey woman has also spoken out about how potential employers found forged nude images of her online. The pictures were initially taken from her MySpace account and the man accused of creating them faces trial. The incident only came to light after he was arrested for invading the privacy of another woman.

In a long list of examples, Indian actress Jyothi Krishna has also spoken out about her photo being used in a fake pornographic image. In 2015, two teenage boys in India were arrested for creating fake videos of other actresses.


In the UK, the law around faked sexual images and videos is murky. There isn't a specific offence that covers the creation of fake non-consensual images. But Max Campbell, a defamation, privacy and harassment solicitor at Brett Wilson LLP, says people creating such videos could face a number of criminal charges and civil proceedings.


"It may amount to harassment or a malicious communication," he explains. "Equally, the civil courts recognise a concept of ‘false privacy’, that is to say, information which is false, but which is nevertheless private in nature." There are also potential copyright issues when people re-use images and videos they did not create. A British city worker appeared in court on harassment charges after he was accused of editing a woman's face onto porn images and then posting them online.

The UK law that covers revenge 'porn' came into force in 2015. It mentions films but doesn't clarify whether faked images are covered. In Scotland the law states charges can be brought if a film or video is "altered".

"All these technologies are being created and we're all having to deal with it, so now is the time to talk about what are our ethical standards here," McGlynn says. "What are the things we should be thinking about before people use these apps, and one of them should be about the impact on the victim. The law should reflect that."

Creating a fake

Creating these fake videos isn't technically difficult. A tutorial explains how the system works and what hardware is needed, with most of the steps happening automatically.


Two types of content are needed to produce a video: the original footage a person wants to transplant a head onto and a database of images or videos of the person's head they want to transplant into the video. A machine learning algorithm is then trained to shape and position the face. When this is complete, it can then be added to the individual frames of video.
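Face-swap tools of this kind are widely reported to use an autoencoder with a single shared encoder and a separate decoder per identity. As a rough, hypothetical illustration (not the app's actual code), the toy NumPy sketch below trains linear stand-ins for those networks on synthetic vectors representing aligned face crops, then performs the swap by decoding person B's encoding with person A's decoder:

```python
import numpy as np

# Toy illustration of the shared-encoder / per-identity-decoder idea behind
# face-swap tools. Real systems use deep convolutional networks trained on
# thousands of aligned face crops; here random vectors stand in for faces
# and the "networks" are single linear layers.

rng = np.random.default_rng(0)
DIM, LATENT, N = 32, 8, 200    # "pixel" dims, latent dims, samples per person
LR, STEPS = 0.01, 300

faces_a = rng.normal(0.0, 1.0, (N, DIM))   # stand-in for person A's crops
faces_b = rng.normal(0.5, 1.0, (N, DIM))   # stand-in for person B's crops

enc = rng.normal(0, 0.1, (DIM, LATENT))    # shared encoder weights
dec_a = rng.normal(0, 0.1, (LATENT, DIM))  # decoder for person A
dec_b = rng.normal(0, 0.1, (LATENT, DIM))  # decoder for person B

def mse(x, y):
    return float(np.mean((x - y) ** 2))

err_before = mse((faces_a @ enc) @ dec_a, faces_a)

for _ in range(STEPS):
    # Each identity is reconstructed through the SAME encoder but its OWN decoder.
    for faces, dec in ((faces_a, dec_a), (faces_b, dec_b)):
        z = faces @ enc                            # encode into shared latent space
        diff = (z @ dec) - faces                   # reconstruction error
        grad_dec = (z.T @ diff) / N                # MSE gradient w.r.t. decoder
        grad_enc = (faces.T @ (diff @ dec.T)) / N  # MSE gradient w.r.t. encoder
        dec -= LR * grad_dec                       # in-place: updates dec_a or dec_b
        enc -= LR * grad_enc

err_after = mse((faces_a @ enc) @ dec_a, faces_a)

# The "swap": encode B's faces, then decode them with A's decoder.
swapped = (faces_b @ enc) @ dec_a
```

Because the encoder is shared, its latent code captures attributes such as pose and expression; swapping decoders then re-renders those attributes with the other person's appearance, which is then pasted frame by frame into the target footage.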

The Deepfakes subreddit itself mixes non-consensual fake videos of celebrities with questions about AI. One conversation asks about existing databases of images to use in the machine learning process, while another asks about the minimum GPU requirement. One thread is simply titled: "Blowjobs when?".

At present, the FakeApp is mostly being used to add the faces of female celebrities to existing porn videos. But the way the AI-powered system works means it could be used to add any face to any video. In the same way that non-consensual sex abuse images are put online – often known colloquially as 'revenge porn' – this AI system could easily be used to create fake pornographic videos as a means of harassment.


A separate tool, created by a pornographic website, can also be used to match the faces of friends or celebrities with porn stars – adding another tool to an increasingly concerning arsenal. MegaCams, the firm behind the face-matching tool, told The Memo: "The matter if this is ethical or not isn't our call, it’s up to the user using it."

One of the most popular threads on the Deepfakes subreddit claims the practice isn't necessarily bad. Reddit user Gravity_Horse writes: "What we do here isn't wholesome or honorable, it's derogatory, vulgar, and blindsiding to the women that deepfakes works on." Despite this, the user says it isn't done with "malicious intent". The Redditor continues: "I have never heard of a revenge porn photo created through photoshop that was taken seriously by peers, nor any celebrity fakes that were taken as real by the public."

One Reddit user has asked whether they would be able to buy a fake video featuring their girlfriend. "Like if I wanted to see my girlfriend fucking another guy for instance, is this a service that is offered?" Another asked: "Is it possible to edit your crush in a porn video?"