But deep-fake technology takes deception a step further, exploiting our natural inclination to engage with things that make us angriest. As Jonathan Swift said: “The greatest liar hath his believers: and it often happens, that if a lie be believed only for an hour, it hath done its work, and there is no further occasion for it.”

Consider the image of Emma González, a survivor of the February shooting at Marjory Stoneman Douglas High School in Parkland, Fla., who has become a vocal activist. A manipulated photo of her tearing up the Constitution went viral on Twitter among gun-rights supporters and members of the alt-right. The image had been digitally altered from another photo appearing in Teen Vogue. That publication’s editor lamented: “The fact that we even have to clarify this is proof of how democracy continues to be fractured by people who manipulate and fabricate the truth.”

That fake was exposed — but did it really make a difference to the people who wanted to inhabit their own paranoid universe? How many people still believe, despite all evidence to the contrary, that Barack Obama is a Muslim, or that he was born in Kenya?

(The answer to that last question, by the way: two-thirds of Trump supporters believe Mr. Obama is a Muslim; 59 percent believe he was not born in America and — oh, yes — a quarter of them believe that Antonin Scalia was murdered.)

Now imagine the effect of deep fakes on a close election. Let’s say video is posted of Beto O’Rourke, a Democrat running for Senate in Texas, swearing that he wants to take away every last gun in Texas, or of Senator Susan Collins of Maine saying she’s changed her mind on Brett Kavanaugh. Before the fraud can be properly refuted, the polls open. The chaos that might ensue — well, let’s just say it’s everything Vladimir Putin ever dreamed of.

There’s more: The “liar’s dividend” will now apply even to people, like Mr. Trump, who actually did say something terrible. In the era of deep fakes, it will be simple enough for a guilty party to deny reality. Mr. Trump, in fact, has claimed that the infamous recording of him suggesting grabbing women by their nether parts is not really him. This, after apologizing for it.

If you want to learn more about the dangers posed by deep fakes, you can read the new report by Bobby Chesney and Danielle Keats Citron at the Social Science Research Network. It’s a remarkable piece of scholarship — though I wouldn’t dive in if your primary goal is to sleep better at night.