Scarlett Johansson said fighting against “deepfake” porn videos, in which someone’s face is superimposed onto a clip they didn’t actually appear in, has become a “lost cause.”

The “Avengers” star commented in a Sunday Washington Post story about how artificial intelligence software is being used to create increasingly realistic-looking fake porn videos — and how women are disproportionately targeted. One bogus video, falsely described as “leaked” footage of Johansson, has been viewed more than 1.5 million times on a popular porn site.

“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” Johansson told The Post. “The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause.”


She added that it’s “just a matter of time before any one person is targeted” by deepfake videos.

Compounding matters, the videos exist in a legal gray area with little recourse in the United States, as Wired pointed out earlier this year. Wired explained that the videos are safeguarded by their own phoniness: You can sue someone who grabs a nude photo of you from the cloud and shares it without your consent. But deepfake videos don’t violate privacy laws, because what they depict isn’t real.

Not all targets of deepfakes are celebrities. The Post led its story with a 40-something woman who felt “violated” after her face was used in a deepfake video. She said she was terrified it would ruin her marriage or career.


In the last year, a few major tech platforms have taken steps to combat deepfakes. Earlier this year, Reddit banned its deepfakes subreddit — which had more than 15,000 followers — for violating its policy against “involuntary pornography.” In September, Google banned “involuntary synthetic pornographic imagery” and began allowing people to request that it block any deepfake porn of themselves from search results.