A troubling new video that appears to show Wonder Woman star Gal Gadot performing in a short adult film has shed startling light on what could happen when machine learning falls into the wrong hands.

The video, created by Reddit user deepfakes, features a woman who takes on the rough likeness of Gadot, with the actor’s face overlaid on another person’s head.

It was made by training a machine learning algorithm on stock photos, Google search images, and YouTube videos of the star – and experts warn the technique is ‘no longer rocket science.’

The unsettling video, spotted by Motherboard, might not fool anyone, but it is a stark reminder of growing concerns over how easily machine learning could be used to create fake porn starring a particular person without their consent, along with other malicious content.

And, it’s not the first.

Deepfakes has made similar videos of other stars, too, including Taylor Swift and Game of Thrones’ Maisie Williams, according to Motherboard, which says it has notified the management companies and publicists of those affected.

The Redditor relied on open-source machine learning tools to create the fake porn videos.

The algorithm was trained on real porn videos and images of Gal Gadot, allowing it to create an approximation of the actor’s face that can be applied to the moving figure in the video.

‘I just found a clever way to do face-swap,’ deepfakes told Motherboard.

‘With hundreds of face images, I can easily generate millions of distorted images to train the network.

‘After that if I feed the network someone else’s face, the network will think it’s just another distorted image and try to make it look like the training face.’
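The ‘distorted images’ deepfakes describes can be produced with simple random warps of a handful of source photos. A minimal sketch of that augmentation step, assuming grayscale face crops stored as 2-D NumPy arrays (the function name and parameters here are illustrative, not taken from deepfakes’ actual code):

```python
import numpy as np

def random_distort(img, rng, max_shift=0.1, max_scale=0.1, max_angle=0.2):
    """Apply a random rotation, scale, and shift to a (H, W) image,
    using nearest-neighbour sampling. Pixels mapped from outside the
    source image are filled with zeros."""
    h, w = img.shape
    angle = rng.uniform(-max_angle, max_angle)          # radians
    scale = 1.0 + rng.uniform(-max_scale, max_scale)
    dy = rng.uniform(-max_shift, max_shift) * h
    dx = rng.uniform(-max_shift, max_shift) * w
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0               # image centre

    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse-map each output pixel back to a source coordinate.
    y0, x0 = ys - cy - dy, xs - cx - dx
    cos_a, sin_a = np.cos(-angle), np.sin(-angle)
    sy = (cos_a * y0 - sin_a * x0) / scale + cy
    sx = (sin_a * y0 + cos_a * x0) / scale + cx
    sy, sx = np.rint(sy).astype(int), np.rint(sx).astype(int)
    valid = (sy >= 0) & (sy < h) & (sx >= 0) & (sx < w)
    out = np.zeros_like(img)
    out[valid] = img[sy[valid], sx[valid]]
    return out

rng = np.random.default_rng(0)
face = rng.random((64, 64))          # stand-in for one aligned face crop
variants = [random_distort(face, rng) for _ in range(5)]
```

Run over hundreds of photos, a loop like this yields an effectively unlimited stream of training pairs, which is what lets a network learn to map any distorted input back toward the target face.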

The amateur video has worrying implications, showing how freely available resources could be used to create fake films in just a matter of days or even hours.

And, as Motherboard notes, people today are constantly uploading photos of themselves to various social media platforms, meaning someone could use such a technique to harass someone they know.

‘Everyone needs to know just how easy it is to fake images and videos, to the point where we won’t be able to distinguish forgeries in a few months from now,’ AI researcher Alex Champandard told Motherboard.

‘Of course, this was possible for a long time but it would have taken a lot of resources and professionals in visual effects to pull this off.

‘Now it can be done by a single programmer with recent computer hardware.’