Fake nudes have a long history. Before DeepNude, there was Photoshop; before the internet, there was Screw.

In the 1970s, the porn magazine Screw fired shots at feminist Gloria Steinem with a nude drawing of the Ms. founder, surrounding her image with penises and encouraging readers to play a game of “Pin the Cock on the Feminist.” Two decades later, Spy Magazine undressed then First Lady Hillary Rodham Clinton, presenting her in bondage gear on the cover of its February 1993 issue. Indeed, DeepNude’s status as merely the latest iteration in a longstanding trend has already been invoked by some of its defenders. “What you can do with DeepNude, you can do very well with Photoshop,” the program’s creator told Motherboard in an interview.

“Women don’t sit around alone in their bedrooms writing technology that objectifies men”

While technology may aid in the mission of men like DeepNude’s creator, who goes by the name Alberto, the app isn’t actually a project about the potential of machine learning and A.I. Alberto has cited gadgets like supposed X-Ray specs — which he saw ads for in the backs of magazines from the 1960s and 1970s — as the inspiration for DeepNude. The desire made manifest by the app’s code is one that we’ve seen over and over in pop culture, a desire for control over women’s bodies and sexuality, and a willingness to shame women who step out of line by offering their bodies up for public consumption.

Andi Zeisler, co-founder of Bitch Media, was equally affected by those X-Ray spec ads, though in a different way than Alberto. “The idea of the invasion of privacy was just wildly upsetting to me,” she says, noting that it wasn’t just ads in the backs of comic books that conditioned her to expect this treatment. Movies like M*A*S*H, Fame, and Revenge of the Nerds all feature scenes of men spying on women undressing or in the shower without their knowledge or consent. “That was such a trope,” Zeisler says — one that sent the message that women’s bodies are intended for male consumption. Those who resist this “natural” state of affairs are to have their bodies, and their image, wrested from their control by force.

“Women don’t sit around alone in their bedrooms writing technology that objectifies men,” says Adam MacLean, founder of PostShame and the #PostShame social media campaign, which encourages aspiring leaders to move past the shame associated with personal indiscretions and sexual behavior in order to focus on fixing the real issues facing our nation. And when men are publicly undressed, it rarely becomes the talking point that women’s naked bodies do. Jennifer Lawrence’s leaked nude photos defined her public image for years; Kanye West’s leaked nudes barely made a blip on the radar. Our society is set up to weaponize women’s sexuality against them — and that, more than A.I. or easy image manipulation, is what makes a program like DeepNude so dangerous.

The scariest thing about DeepNude may not be the software itself: It’s our unwillingness to see DeepNude, and other faked nude images, as the technological vanguard of a larger campaign of harassment and abuse of women. Forcibly undressing women or circulating nudes without consent are broadly recognized as abhorrent, even criminal acts, but the law is less evolved on this kind of emerging A.I. technology.

“The biggest difference with deepfakes [is] about how the laws deal with them,” says Carrie Goldberg, a Brooklyn-based lawyer whose firm specializes in providing support to the victims of revenge porn. “Most current revenge porn laws only criminalize the distribution of real images, not fake ones.” Laws that focus on copyright and ownership of one’s image provide limited support to people whose harassment comes courtesy of an entirely manufactured image.

So long as our discussion focuses on who owns the images being distributed, or how they were created, rather than on the harm they are being used to cause, the abuse and bullying of women will continue, with or without advanced tech and A.I.

Back in January, newly inaugurated Congress member Alexandria Ocasio-Cortez found herself mired in a nude photo scandal — not because someone had leaked a private picture, or spent time and energy manufacturing a fake nude, or even taken the two seconds to transform a clothed pic into a nude using machine learning and A.I. Ocasio-Cortez made the news because a naked photo of someone else entirely had been circulated with her name attached.

To her credit, Ocasio-Cortez refused to be cowed by the photo. Instead, she shifted the conversation from probing questions about her body and her sex life to the abusive nature of circulating naked images, both real and fake, without the subjects’ consent. We should all take inspiration from that response. Because as long as we continue to uphold our society’s toxic beliefs about women and sex, treating women as objects for public consumption whose sexuality simultaneously defines and dehumanizes them, we’ll continue to give power and ammunition to the misogynists eager to shame and humiliate women like Ocasio-Cortez.

Fully recognizing women as complex human beings who are entitled to both privacy and rich, complicated sexual lives — giving women the same sympathy, respect, and understanding we give men — is a far more effective tactic than calls to halt the progress of technology, or whack-a-mole-like attempts to get ahead of the next iteration of DeepNude. Unfortunately, it’s also a strategy that takes a lot more work.

“There has to be an absolute paradigm shift in how we understand women’s roles and how comfortable we are with… any sexuality that does not privilege men at the center,” says Zeisler. “It seems like a much more sci-fi prospect than any kind of technology.”