The grosser parts of the internet have a new trick: Using machine learning and AI to swap celebrities’ faces onto porn performers’ bodies. The result? Fake celebrity porn seamless enough to be mistaken for the real thing. Early victims include Daisy Ridley, Gal Gadot, Scarlett Johansson, and Taylor Swift. Originally reported by Motherboard, this nasty trend has been brewing for months, acquiring its own subreddit. And now that someone has made an app—drastically lowering the technical threshold would-be creators have to clear—it’s presumably about to become much more prevalent.

For reasons that are eye-poppingly obvious, these videos—which their creators refer to as "deepfakes," after the redditor who created the process—are terrible. It’s a noxious smoothie made of some of today's worst internet problems. It’s a new frontier for nonconsensual pornography and fake news alike. (Doctored videos of political candidates saying outlandish things in 3, 2 ...) And worst of all? If you live in the United States and someone does this with your face, the law can’t really help you.

To many vulnerable people on the internet, especially women, this looks a whole lot like the end times. “I share your sense of doom,” says Mary Anne Franks, who teaches First Amendment and technology law at the University of Miami Law School and also serves as the tech and legislative policy advisor for the Cyber Civil Rights Initiative. “I think it is going to be that bad.”

She should know. Franks helped write much of the US’s existing legislation that criminalizes nonconsensual porn—and it's not going to help. It’s not that Franks and lawmakers weren’t thinking about the implications of manipulated images. It’s that the premise of any current legislation is that nonconsensual porn is a privacy violation. Face-swap porn may be deeply, personally humiliating for the people whose likeness is used, but it's technically not a privacy issue. That's because, unlike a nude photo filched from the cloud, this kind of material is bogus. You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.

And it's the very artifice involved in these videos that provides enormous legal cover for their creators. “It falls through the cracks because it’s all very betwixt and between,” says Danielle Citron, a law professor at the University of Maryland and the author of Hate Crimes in Cyberspace. “There are all sorts of First Amendment problems because it’s not their real body.” Since US privacy laws don’t apply, taking these videos down could be considered censorship—after all, this is “art” that redditors have crafted, even if it’s unseemly.

In case after case, the First Amendment has protected spoofs and caricatures and parodies and satire. (This is why porn has a long history of titles like Everybody Does Raymond and Buffy the Vampire Layer.) According to Citron, claiming that face-swap porn is parody isn't the strongest legal argument—it's clearly exploitative—but that’s not going to stop people from muddying the legal waters with it.

So What Now?

Does that mean that victims have zero hope of legal recourse? Not exactly. Celebrities will be able to sue for the misappropriation of their images. But that usually applies to commercial contexts—like, say, if someone took one of Gal Gadot’s social media photos and used it to promote a strip club without her consent—and commercial speech doesn’t receive nearly the protection that individual citizens’ speech does.

For the average citizen, the best hope is anti-defamation law. When Franks realized that revenge porn law wouldn't include language about false images, she recommended that lawmakers update their anti-defamation statutes to handle it—but in many cases, that hasn’t happened yet. And Franks thinks claimants will have difficulty proving that the creators intended to cause them emotional distress. So far, these videos seem to have been created for the pleasure of the creator rather than the humiliation of the object of their desire. “Inevitably, someone will point out how many young men had posters of Princess Leia in their bedrooms as a masturbation fantasy,” Franks says. “Is the harm just that you found out about it? Legally, we need to be able to articulate what the harm is, not just that it makes us feel icky.” And in a case as fringe as AI-enabled porn, that articulation hasn’t happened yet.