There’s a video of Gal Gadot having sex with her stepbrother on the internet. But it’s not really Gadot’s body, and it’s barely her own face. It’s an approximation, face-swapped to look like she’s performing in an existing incest-themed porn video.

The video was created with a machine learning algorithm, using easily accessible materials and open-source code that anyone with a working knowledge of deep learning algorithms could put together.

It's not going to fool anyone who looks closely. Sometimes the face doesn't track correctly and there's an uncanny valley effect at play, but at a glance it seems believable. It's especially striking considering that it's allegedly the work of one person—a Redditor who goes by the name 'deepfakes'—not a big special effects studio that can digitally recreate a young Princess Leia in Rogue One using CGI. Instead, deepfakes uses open-source machine learning tools like TensorFlow, which Google makes freely available to researchers, graduate students, and anyone with an interest in machine learning.

Artificial intelligence researcher Alex Champandard told me in an email that a decent, consumer-grade graphics card could process this effect in hours, but a CPU would work just as well, only more slowly, over days.

According to deepfakes—who declined to give his identity to me to avoid public scrutiny—the software is based on multiple open-source libraries, like Keras with TensorFlow backend. To compile the celebrities’ faces, deepfakes said he used Google image search, stock photos, and YouTube videos. Deep learning consists of networks of interconnected nodes that autonomously run computations on input data. In this case, he trained the algorithm on porn videos and Gal Gadot’s face. After enough of this “training,” the nodes arrange themselves to complete a particular task, like convincingly manipulating video on the fly.
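Deepfakes hasn't published his code, but the pieces he names are all public. A minimal sketch of this kind of face autoencoder in Keras with a TensorFlow backend might look like the following; the image size, layer sizes, and loss are assumptions for illustration, not his actual settings:

```python
import numpy as np
from tensorflow.keras import layers, Model

SIZE = 64  # assumed 64x64 RGB face crops; the real settings are unknown

# Convolutional autoencoder: compress a face crop to a small code,
# then reconstruct it. Trained only on the target celebrity's face.
inp = layers.Input(shape=(SIZE, SIZE, 3))
x = layers.Conv2D(32, 5, strides=2, padding="same", activation="relu")(inp)
x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(x)
x = layers.Flatten()(x)
code = layers.Dense(256, activation="relu")(x)  # compressed representation
x = layers.Dense(16 * 16 * 64, activation="relu")(code)
x = layers.Reshape((16, 16, 64))(x)
x = layers.Conv2DTranspose(32, 5, strides=2, padding="same", activation="relu")(x)
out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)

autoencoder = Model(inp, out)
autoencoder.compile(optimizer="adam", loss="mae")

# Training would pair distorted copies of the target's face with the
# originals, e.g.:
# autoencoder.fit(warped_faces, original_faces, epochs=..., batch_size=...)

# At inference, a frame of someone else's face goes in, and the network
# reconstructs it as best it can with the face it was trained on.
frame = np.random.rand(1, SIZE, SIZE, 3).astype("float32")
swapped = autoencoder.predict(frame, verbose=0)
```

The "nodes arranging themselves" in the description above corresponds to the weights of these layers settling, over many training passes, into values that reproduce the target face.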

Fake celebrity porn, in which images are photoshopped to look like famous people are posing nude, is a years-old genre with an ardent fan base, and this is its latest advancement. People commenting and voting in the subreddit where deepfakes posts are big fans of his work.

So far, deepfakes has posted hardcore porn videos featuring the faces of Scarlett Johansson, Maisie Williams, Taylor Swift, Aubrey Plaza, and Gal Gadot on Reddit. I’ve reached out to the management companies and/or publicists who represent each of these actors, informing them of the fake videos, and will update if I hear back.

Like the Adobe tool that can make people say anything, and the Face2Face algorithm that can manipulate a recorded video with real-time face tracking, this new type of fake porn shows that we're on the verge of living in a world where it's trivially easy to fabricate believable videos of people doing and saying things they never did. Even having sex.

“This is no longer rocket science,” Champandard said.

The ease with which someone could do this is frightening. Aside from the technical challenge, all someone would need is enough images of your face, and many of us are already creating sprawling databases of our own faces: People around the world uploaded 24 billion selfies to Google Photos in 2015-2016. It isn’t difficult to imagine an amateur programmer running their own algorithm to create a sex tape of someone they want to harass.


Deepfakes told me he’s not a professional researcher, just a programmer with an interest in machine learning.

“I just found a clever way to do face-swap,” he said of his algorithm. “With hundreds of face images, I can easily generate millions of distorted images to train the network. After that if I feed the network someone else's face, the network will think it's just another distorted image and try to make it look like the training face.”
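The "millions of distorted images" step is ordinary data augmentation: each source photo is randomly warped so the network sees endless imperfect variants of the same face. A toy sketch, using simple shifts and flips in NumPy as a stand-in for the richer affine or elastic warps face-swap code typically uses (the helper names here are illustrative, not from deepfakes' code):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_warp(face, max_shift=4):
    """Crudely distort a face crop with a random shift and optional mirror.

    `face` is an (H, W, 3) float array; real pipelines would use
    random rotations, scales, and elastic warps instead.
    """
    dy, dx = rng.integers(-max_shift, max_shift + 1, size=2)
    warped = np.roll(face, (dy, dx), axis=(0, 1))
    if rng.random() < 0.5:
        warped = warped[:, ::-1]  # horizontal flip
    return warped

def make_training_pairs(faces, n_pairs):
    """From a few hundred source faces, yield (distorted, original) pairs."""
    for _ in range(n_pairs):
        face = faces[rng.integers(len(faces))]
        yield random_warp(face), face

faces = [rng.random((64, 64, 3)) for _ in range(4)]  # toy stand-in images
pairs = list(make_training_pairs(faces, 8))
```

Because the network only ever learns to undo these distortions back to the training face, a completely different face at inference time gets "undone" the same way, toward the face it knows.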

In a comment thread on Reddit, deepfakes mentioned that he is using an algorithm similar to one developed by Nvidia researchers that uses deep learning to, for example, instantly turn a video of a summer scene into a winter one. The Nvidia researchers who developed the algorithm declined to comment on this possible application.