While media, politicians, and technologists panic over the risk of deepfakes impacting elections, a new study has found that the vast, vast majority of deepfakes are pornographic in nature. On top of that, to the surprise of absolutely no one, the pornographic deepfakes analyzed in the study exclusively targeted women.

The news acts as a reminder that although political actors may adopt deepfakes for disinformation in the future, at the moment their use remains squarely within their original, designed purpose: to target and harass women.

"[A] key trend we identified is the prominence of non-consensual deepfake pornography, which accounted for 96% of the total deepfake videos online," the study, titled The State of Deepfakes and authored by cybersecurity company Deeptrace, reads.

Do you know anything new about deepfakes? We'd love to hear from you. You can contact Joseph Cox securely on Signal on +44 20 8133 5190, Wickr on josephcox, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.

The company found a total of 14,678 deepfake videos online. According to Giorgio Patrini, CEO and chief scientist at Deeptrace, the company then examined the gender of targets in videos from five deepfake porn sites (7,144 videos) and 14 YouTube channels (under 500 videos). Videos on the four top dedicated deepfake pornography websites had racked up over 134 million views, and all but 1 percent of the subjects featured in deepfake pornography videos were female actors and musicians working in the entertainment sector, the report adds.

Motherboard first reported the existence of deepfakes in late 2017. In June, we highlighted that "more outlets started reporting on the phenomenon and panic ensued as media theorists considered the implications of video losing its inherent veracity, especially when it came to news and politics. But it all began with the sex—and a long legacy of toxic male culture and willful ignorance of consent that’s come to a glitchy, moaning, pixelated head."

The new report adds, "Deepfake pornography is a phenomenon that exclusively targets and harms women. In contrast, the non-pornographic deepfake videos we analyzed on YouTube contained a majority of male subjects."