
Last week, two videos started circulating on news sites with the promise of an unbelievable twist in UK politics. One showed Labour leader Jeremy Corbyn apparently backing incumbent prime minister Boris Johnson in the upcoming general election; another showed Johnson putting his weight behind Corbyn. The videos, of course, were deepfakes – videos manipulated to make it look, convincingly enough, as if the politicians had said something they had not.

The videos once again stoked concerns about deepfakes being used to undermine democracy by spreading fake news. These particular videos were not made with the intention of misleading the public; they were, like previous deepfakes depicting celebrities such as Mark Zuckerberg and Kim Kardashian, the work of artist Bill Posters in collaboration with think tank Future Advocacy, who say they made them to draw attention to the issue of deepfakes and online disinformation. The fear is that deepfakes could be used to falsify a politician’s words or actions in order to mislead voters and threaten the integrity of the democratic process.


The reality is, deepfakes are already undermining democracy – without the need for any faked political messages. The disproportionate focus on deepfakes potentially targeting male politicians ignores those who are actually being victimised by this technology today. A report released in September by Deeptrace, which develops tools to spot deepfakes, found a total of 14,678 deepfake videos online – 96 per cent of which were pornographic in nature. On the top five sites for deepfake pornography, 100 per cent of the videos – which usually work by superimposing a person’s face on a porn performer’s body so as to make it look as if they are engaging in a sexual act – featured women.

While this statistic is often acknowledged in discussions around deepfakes, the real threat the technology poses to women is frequently glossed over, or treated as separate from the perceived threat to politicians. Sure, deepfake pornography is bad, but hey, a fake-news deepfake of a politician could threaten the integrity of our democracy.


But here’s the thing: pornographic deepfakes of women do threaten the integrity of our democracy. Like so-called revenge porn (a rather salacious euphemism to describe the act of maliciously sharing a person’s intimate images without their consent), deepfake pornography is used as a tool to humiliate, demean and silence women. When women’s voices are silenced simply because they are women; when women are humiliated because they are women; when women are subjugated because they are women – these all pose a threat to democracy. A technology employed to these ends is a threat to democracy. The fact that we don’t immediately see it that way only further reflects how insidious this kind of misogyny can be.

We have already seen real-world examples of deepfake pornography being used explicitly to silence and discredit women’s voices and prevent them from taking part in politics. Indian investigative journalist Rana Ayyub, no stranger to online abuse, found herself the victim of a pornographic deepfake spread on WhatsApp, Twitter, Facebook and Instagram in a clear attempt to discredit her work and humiliate her into silence. Ayyub was harassed and doxxed off the back of the deepfake, and says that she has since been much more cautious in what she posts online.


In the US, politician Katie Hill recently found herself a victim of nonconsensual pornography when sexual photographs of her were published without her permission. As law professor Danielle Citron has pointed out, it’s not difficult to imagine a woman attempting to run for political office in the future and finding herself the target of a deepfake video intended to undermine her to her supporters, or shame her so much that she feels compelled to give up – even though the content is entirely fake.

In this way, deepfakes are used as yet another tool to control and oppress women – to keep them ‘in their place’. The same is true of deepfake pornography that features celebrities (the vast majority of deepfake videos target actresses and musicians); they deny these women agency and send a clear message: dare to aspire to any sort of power as a woman, and you open yourself up to the risk of being attacked in this way. This risk is not present for men, who are not subject to deepfake pornography, and who, even if they were, would not be judged in the same way as women, owing to society’s persistent double standards when it comes to sex.

The intimidation and silencing of women – through whatever means – is not compatible with a healthy democracy. While we should be wary of the potential for deepfakes to manipulate and falsify political messaging, we need to recognise the very real threat posed by deepfakes used to target and harass women. But perhaps it’s not that surprising that people seem so keen to set aside the 96 per cent of deepfake videos that are pornographic and focus on the rest. Although the non-pornographic deepfakes analysed by Deeptrace made up only four per cent of the total, 61 per cent of these featured men.

