The owner of NotJordanPeterson.com, a website for generating convincing clips of Jordan Peterson saying whatever you want using AI, shut down their creation this week after the real Peterson announced his displeasure and raised the possibility of legal action.

While the site was up, a 21-second recording greeted visitors, saying in Peterson's voice, "This is not Jordan Peterson. In fact, I'm a neural network designed to sound like Dr. Peterson." The clip implored the visitor to type some text into a box; that text would be fed into a neural network trained on hours of Peterson's actual speech and rendered as audio that sounded a lot like the real thing.

"The Deep Fake artists need to be stopped, using whatever legal means are necessary, as soon as possible."

Several media outlets tested the program and published the results, making Peterson's voice recite feminist texts and vulgarities. Outrageous content aside, the results were strikingly convincing.

It turns out that Peterson—a controversial Canadian professor known for his lectures defending the patriarchy and denying the existence of white privilege while decrying "postmodern neo-Marxists"—did not find NotJordanPeterson.com flattering.

"Something very strange and disturbing happened to me this week," Peterson wrote on his website. "If it was just relevant to me, it wouldn’t be that important (except perhaps to me), and I wouldn’t be writing this column about it. But it’s something that is likely more important and more ominous than we can even imagine."

He then goes on to spend over 1,300 words decrying deepfakes—algorithmically generated face-swap videos, which are distinct from fake audio but are sometimes paired with it—as a threat to politics, personal privacy, and the veracity of evidence, and ends with a vague allusion to making fake audio and video illegal. Or, possibly, suing creators.

"Wake up. The sanctity of your voice, and your image, is at serious risk," he wrote. "It’s hard to imagine a more serious challenge to the sense of shared, reliable reality that keeps us linked together in relative peace. The Deep Fake artists need to be stopped, using whatever legal means are necessary, as soon as possible."

After Peterson published this blog post, the NotJordanPeterson website shut down operations. "In light of Dr. Peterson's response to the technology demonstrated by this site...and out of respect for Dr. Peterson, the functionality of the site will be disabled for the time being," the site owner wrote.

The site owner told Motherboard that despite Peterson's hinting at legal action in his blog, Peterson isn't suing him, and that he took NotJordanPeterson down after seeing Peterson's negative reaction. At the time of publication, Peterson had not responded to Motherboard's request for comment.

It's interesting to see a public figure like Peterson address deepfakes so directly. Plenty of other celebrities have been subject to the algorithmic face-swap and fake-audio treatment, including podcast host Joe Rogan, Nicolas Cage, and Elon Musk.

The AI models that generate fake video or audio rely on a huge amount of existing data to analyze and "learn" from. As it happens, refusing to shut the fuck up—as so many powerful men are wont to do—provides ample material for training a realistic model of someone's voice.

Before Peterson, the closest any powerful man had come to commenting on deepfakes as a phenomenon was Mark Zuckerberg, after an artist created a deepfake of him saying some insidious things. The media coverage of that satirical art project forced Zuckerberg's platform, Facebook, to enact policies around handling fake video content.

But what Peterson is implying in this screed—that deepfakes, even as art, should be stopped, banned, and otherwise made illegal—is something legislators and AI ethicists have grappled with since the dawn of deepfakes two years ago. Many experts say that regulating deepfakes is a bad idea, because trying to do so could chill First Amendment rights and free speech online.

Peterson mentions Rep. Yvette Clarke's proposed DEEPFAKES Accountability Act as a potential solution to his embarrassment, and to what he sees as the dangers of deepfakes as a whole. The Electronic Frontier Foundation notes that in that bill, "while there is an exception for parodies, satires, and entertainment—so long as a reasonable person would not mistake the 'falsified material activity' as authentic—the bill fails to specify who has the burden of proof, which could lead to a chilling effect for creators."