Reddit delivered a serious blow to creators of deepfakes with a ban on the primary subreddit for AI-assisted face-swapped videos and images — a community over 90,000 users strong at the time of its demise. In that regard, Reddit is late to the party. Platforms such as Discord, Gfycat, Pornhub, and even Twitter have already made their anti-face-swap porn policy clear. But while Reddit has finally removed the offending content in question, the damage is done. Pandora’s box is already open, and there’s no stuffing this technology back in.

The morality and the legality of deepfakes are murky issues. Swapping one person’s face onto another’s body is not inherently malicious, but the practice isn’t just used for creating digital stunt doubles. Most deepfakes are pornographic in nature, with users replacing the faces of porn stars with their favorite celebrities. But the tech also enables potential abuse for anyone who puts their face online, who could then end up appearing to star in porn against their will.

Despite having created a new genre of non-consensual fake porn, however, many deepfakes creators believe that their videos aren’t harmful to the people they portray. This curious cognitive dissonance runs rampant throughout the community. While some users on Reddit have vehemently disavowed the practice, likening the pornographic insertion of people’s faces to something like a digital assault, others drew the line at the use of popular political figures like Michelle Obama. Still others argued that celebrities are fair game, while smaller YouTubers and civilians are not. A post on the now-deleted subreddit that debated whether or not it’s okay to deepfake your crush was met with a lot of pushback; another post compared the technology to an episode of Black Mirror. “The general consensus in the comments was that they had some qualms with it, but let their sex drive take over their ethical questions,” the original poster later told The Verge over DM.

One user, a 32-year-old accountant from New York who asked to only be identified as Mike, says he doesn’t view the practice as harmful. “[It’s] taking the consensual nudity of an adult porn star and simply placing a different face on it,” he says. “In a world where people are hacking celebrities’ real personal nude photos and where paparazzi are paid very well to stalk celebrities to take actual real photographs of them in compromising positions, I think fake manipulations are the least of our worries.”
Mike — like many others The Verge spoke with from the subreddit — says the practice is no more harmful than a mental fantasy, in part because these videos are often marked as fakes with watermarks. “If a person featured in a deepfakes video said they didn’t consent to it, I don’t think it would change my opinion because I already know they didn’t consent to it,” he says. “It’s digital manipulation they never signed up for, much like any non-pornographic Photoshop/After Effects manipulations that have been done by people in the past.”

A post in the now-deleted subreddit from user Gravity_Horse, addressed to anyone who “opposes” deepfakes and the community at large, includes an acknowledgment that the practice at large is morally questionable. “To those who condemn the practices of this community, we sympathize with you,” Gravity_Horse wrote. “What we do here isn’t wholesome or honorable, it’s derogatory, vulgar, and blindsiding to the women that deepfakes works on.”

But in an email with The Verge, Gravity_Horse adds that the goal of most deepfakes creators is not to harm or defame anyone. “I’m sorry for those who are hurt or betrayed by these creations. But there’s something important to be said here,” Gravity_Horse says. “The technology is only becoming more and more advanced ... People are going to be scared. And I genuinely sympathize with them. But since the technology can’t be uninvented, we have to advance with it. I’m far more a proponent of the deepfakes algorithm itself and its potential rather than what it’s currently being used for. But then again, welcome to the internet.”

Many of these sentiments were echoed on the subreddit and directly to The Verge by dozens of users who have little concern about the moral issues at hand or the personal harm deepfakes can cause to their unwilling “stars.” If they don’t make these images and videos, they argue, someone else will. Some users, like Gravity_Horse, even contend that deepfakes could potentially eliminate revenge porn by effectively making any video questionable. “As it becomes more mainstream and near impossible to tell fantasy from reality, anything is subject to being fake,” Gravity_Horse says.

But the argument that fake porn is not harmful is a flawed one, and claiming that it’s benign because it’s “not real” is a glib self-justification. Sociologist Katherine Cross, who has contributed to The Verge previously, points to the community’s lack of consensus on what is fair to fake — Emma Watson and Michelle Obama, versus your crush down the street — as proof that the practice is more than just an exaggerated fantasy.

“[Deepfake users] understand, intuitively, that this is more real than they want to admit,” she tells The Verge via email. “If it’s all totally harmless and essentially unreal, they wouldn’t mind putting together deepfake porn of people they know. But of course, they do, and it’s because they understand the symbolism of all this. What are the semiotics of a woman, in a pornographic frame, on her knees [giving] a blowjob and why does it make you so uncomfortable to put your mother or sister in that role? Why, then, do it to a woman you don’t know?”

The problem for Cross isn’t with porn itself, but the consequences of forcing someone into such a scenario, authentic or not. “It’s a sort of cheat code for getting around your own morality,” she says. “[They say,] ‘I can hurt people because it’s just a game, and it’s not real hurt.’ But of course, at the same time, you really want to do the thing that’s hurting people because it’s real enough to give you pleasure. Pornographic deepfakes don’t merely exist as abstract art projects; they’re masturbatory aids, to put it bluntly. If they’re real enough for these men to get off, they’re real enough for the people they depict to lodge objections. They have, in a real way, been dragooned into something they didn’t want to do, or wouldn’t do.”
Defenders of deepfakes are correct, however, in saying the technology itself is not inherently bad, and it can have benign applications. Subreddits featuring non-pornographic fakes remain online, and their content includes a video by one user who swapped his wife’s face onto Anne Hathaway’s to give her a spot on The Tonight Show; others have found creative acting opportunities for Nicolas Cage.

In an email to The Verge, Reddit user and VFX artist Benjamin Van Den Broeck says he is interested in the tech “from a professional standpoint only.” (Van Den Broeck, whose credits include Robot Chicken and Moral Orel, says he has never made a deepfake nor wants to, but previously posted in the subreddit about its political applications.) “Before [deepfakes], you needed a team of artists working around the clock to do even a slightly convincing job of a face/body swap,” he says. “This algorithm is breaking the barriers of uncanny valley, providing a scarily accurate faceswap over a single gaming computer, possibly in as short as 24 hours. No team, no render farm, no money.”

He says that while this technology has the potential to improve film effects, it could have serious consequences in the hands of the masses. “Even an unconvincing face swap applied to a community unfamiliar with new tech (like the third world) will not be instantly disregarded as a [deepfake],” Van Den Broeck says. “Trump’s pee pee tape will, instead, be an example of the opposite. If a real tape were to surface, they could claim it as a deepfake.” He calls revenge porn just “the tip of the iceberg,” pointing to the eventual possibility of deepfakes done live.

Although mainstream web platforms are chasing pornographic deepfakes off the internet, that won’t stop them from being created. Instead, users will continue to push out to edge platforms that will have them. The separation of the deepfakes that are allowed to exist on platforms like Twitter or Reddit, versus the ones that aren’t, is in itself a moral judgment on what is right or wrong. That may dissuade some people from creating new pornographic deepfakes, but the community at large will just find somewhere new to go. As one user wrote in a now-deleted post, “I have some philosophical qualms with [deepfakes], but it doesn’t stop me from jerking it.”