On October 10, Deputy Attorney General Rod Rosenstein gave a speech at the U.S. Naval Academy about encryption. I have a lot to say about his remarks, so this will be a long post. Much of Rosenstein’s speech recycled the same old chestnuts that law enforcement’s been repeating about crypto for years. I’m happy to roast those chestnuts. But his remarks went beyond the usual well-worn lines to a new level of inflammatory rhetoric that signals a change in American law enforcement’s approach to the crypto wars.

The “going dark” debate over encryption is largely a branding exercise. As UC Davis cryptography professor Phil Rogaway has pointed out, even the label “going dark” has a Lakoffian aspect to it, evoking our ancient fear of the dark. When we call this the “going dark” debate, we’re giving more power to that framing. Dictating the labels we use has been an important arrow in DOJ’s rhetorical quiver as it tries to persuade the American public that encryption is bad for us. What I would brand “strong encryption,” the DOJ likes to call “warrant-proof” encryption. We’re both referring to the same thing: encryption that does not provide a mechanism for law enforcement, or the provider of the encryption itself, to gain access to plaintext—even with a warrant.[1] Yet Rod Rosenstein and I use different rhetorical frames, because we have different answers to this question: Should there exist spaces in human society that cannot be policed?
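The distinction being fought over can be made concrete. In an end-to-end design, the provider only ever relays ciphertext; without the endpoints' shared key, it has nothing useful to hand over, warrant or no warrant. Here is a toy sketch of that structure in Python (stdlib only; the HMAC-counter stream cipher and every name in it are illustrative stand-ins, not a production design or any real product's protocol):

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream from the shared key: HMAC-SHA256 in counter mode."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in
               zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

# Alice and Bob share a key; the provider sees only (nonce, ciphertext).
shared_key = os.urandom(32)
nonce, ct = encrypt(shared_key, b"meet me at noon")
assert decrypt(shared_key, nonce, ct) == b"meet me at noon"
```

The point of the sketch is structural: nothing in the transmitted data is addressed to the provider, so a warrant served on the provider yields only ciphertext. That absence of a third-party decryption path is the entire dispute.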

This, too, is just another framing, but I believe it’s a fundamental question that almost never gets stated overtly. It’s clear what the DOJ’s answer is. “[T]here has never been a right to absolute privacy,” Rosenstein says, repeating a well-worn line. But the government has never been entitled to absolute surveillance, either. We can have quiet conversations face-to-face, or cast a letter into the fire. And some evidence can lie beyond the reach of the police, thanks to the Fifth Amendment and other privileges, which we have gradually expanded over time as a matter of sound public policy, informed by “reason and experience.” “Warrant-proof” is a cute term, but warrants are not magical talismans.

Rosenstein didn’t coin the phrase “warrant-proof” encryption, and it’s been called out before as just rhetoric. What’s new, at least to me, is the label Rosenstein used for its opposite: “responsible encryption.” Yet I’ve learned that even this term is not new: in 1996, the then-Director of the FBI was already referring to “socially-responsible encryption.” (Hat tip.) Now, in 2017, Rosenstein is reviving it. By “responsible,” he means “capable of granting law enforcement access to plaintext.” Given the DOJ’s answer to the question I posed above, it comes as no surprise that in this framing, by definition, end-to-end encryption of communications is irresponsible. Building a smartphone that’s encrypted by default, from which not even its manufacturer can extract plaintext data, is irresponsible.

Rosenstein’s rhetoric about “responsible encryption” encapsulates in two words a speech that repeatedly portrays encryption as a dangerous weapon used almost exclusively by wrongdoers. It portrays the tech companies that provide encrypted products and services as scofflaws[2] recklessly enabling those wrongdoers behind a fig-leaf of “absolute privacy.” (This is itself a rhetorical flourish, given that this is a credo almost none of these companies actually espouse, about which more later.)

But responsibility is transitive, not reflexive. Responsibility does not exist in a vacuum; to be responsible is to be answerable to someone. The phrase “responsible encryption” prompts the question, responsible to whom? To Rosenstein, tech companies must be answerable to law enforcement above all other masters, and it is irresponsible to do otherwise. We have seen what “responsible” encryption products look like: the Clipper chip, whose notorious security flaws helped to decide the crypto wars of the 1990s. The Clipper chip was responsible to the U.S. government. It was not responsible to its would-be users, who wanted to secure their phone conversations.
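The core move of a Clipper-style design can be sketched in a few lines: each message ships with a LEAF-like field that wraps the session key under a separate escrow key, so the escrow holder can decrypt without either user's cooperation. Everything below (the toy XOR cipher, the variable names) illustrates the escrow structure only, not Clipper's actual Skipjack/LEAF internals:

```python
import hashlib
import os

def toy_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher (SHA-256 chain keystream). Illustration only."""
    stream, block = b"", key
    while len(stream) < len(data):
        block = hashlib.sha256(block).digest()
        stream += block
    return bytes(d ^ s for d, s in zip(data, stream))

escrow_key = os.urandom(32)    # held by the escrow agent, not the users
session_key = os.urandom(32)   # per-conversation key the users negotiate
message = b"attack at dawn"

ciphertext = toy_cipher(session_key, message)
# The LEAF-like field: the session key, wrapped under the escrow key and
# transmitted alongside every ciphertext.
leaf = toy_cipher(escrow_key, session_key)

# The escrow agent recovers the plaintext without the users' key:
recovered = toy_cipher(escrow_key, leaf)
assert toy_cipher(recovered, ciphertext) == message
```

The wrapped key is a second way in: anyone who obtains the escrow key, whether by legal process, insider abuse, or breach, can read all the traffic. That extra attack surface is precisely the kind of weakness that helped sink Clipper.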

The notion of computer security as a core value is not totally lost on Rosenstein. He pays the usual lip service to encryption’s positive uses and makes the usual claim that the DOJ “understand[s] and encourage[s] strong cybersecurity.” Yet he refuses to acknowledge the agreement among computer security experts that the DOJ vision of “responsible” encryption necessarily means serious security shortcomings.

Instead, he invokes the lazy old excuse that the wizards of Silicon Valley, currently hard at work on “drones and fleets of driverless cars, a future of artificial intelligence and augmented reality,” just need to nerd harder. “Surely such companies could design consumer products that provide data security while permitting lawful access with court approval.” If they keep insisting that they can’t, it must be because they’re not being “responsible.” It’s easier to pretend that the objections to backdoors (a term he vehemently disavows) are not about technical realities, but rather, purely about policy choices—driven, Rosenstein insists, by greed.[3]

There is no room, in this worldview, for the notion of tech companies being responsible, answerable, to their legions of everyday users. There is designing to serve law enforcement, and there is designing to protect pedophiles and terrorists; that’s it. Rosenstein willfully ignores encryption’s use by millions of ordinary people for completely legitimate purposes. Tech companies don’t keep improving their encryption designs because they want to provide better security to their users (and thereby improve the security ecosystem overall). No, they are interested only in “selling products and making money.” Law enforcement, by contrast, is “in the business of preventing crime and saving lives.”

This is Rosenstein’s anti-crypto rhetoric at its most blatant, and its most insulting. Strong encryption does prevent crime, such as identity theft. That’s something “responsible” companies need to worry about at a time when massive data breaches regularly dominate the headlines. Strong encryption does save lives, such as by helping protect individuals from being stalked by abusive family members or intimate partners. That’s something a “responsible” law enforcement agency, charged with protecting and serving the public, should embrace. In a time when it’s open season on women, immigrants, Muslims, Black people, trans and gender nonconforming people, and anyone else who’s “other,” strong encryption helps ward off victimization—not just by private bad actors, but by the state too.

Perhaps anticipating critiques like those above, Rosenstein proactively paints himself as a victim under assault. In a speech that has already used the word “attack” 15 times, he mentions it one final time in the context of the claim that “Sounding the alarm about the dark side of technology is not popular. Everyone who speaks candidly about ‘going dark’ faces attacks by advocates of absolute privacy.” This man is the second highest ranking law enforcement official in the mightiest country the world has ever seen. Portraying himself as an inconvenient gadfly boldly stating unpopular truths, who is then “attacked,” like a Cassandra or Socrates, by the arrayed armies of… civil liberties advocates (who, he goes on to say, are only in it for the money), is frankly bizarre.

It is also a clever means of turning the focus away from the thoroughly discredited ideas he is rehashing, and onto those of us who have had to discredit them, over and over again, ever since law enforcement started “sounding the alarm” about encryption two decades ago. The “absolute privacy” stance is one which, as noted, few companies offering encrypted devices and services actually take—as any large tech company’s transparency report on government demands for user data will readily reveal. It’s also a rarity among privacy and civil liberties advocates. (To say nothing of the actual cryptographers and other information security professionals whose expert opinions Rosenstein does not even acknowledge, since their conclusions do not fit into the only two categories of motivation he can think of for espousing strong encryption: “profit” and “sincere concern” for privacy.)

Yet branding everyone who is against weakened crypto as “advocates of absolute privacy” is a sly way of reframing the debate. It forces civil liberties activists to respond to that framing and it channels those responses. For people who embrace Rosenstein’s label (and some do), whatever they say can safely be ignored, as privacy absolutists are not to be taken seriously. The other option is to push back against the hyperbole and deny that one is a privacy absolutist (and thereby allow the DOJ to pit those who are not against those who are, driving a wedge between people who are fundamentally on the same side in this debate).

If you fall into the latter camp, Rosenstein has gotten his foot in the door. He has put your commitment to privacy and security up for negotiation. Surely you are reasonable people, and you can be persuaded to move your preferred balance point between privacy and law enforcement—or security and security—to favor the law enforcement side a little more. There is little incentive to do that, given that there’s been a clear winner in the latest round of the crypto wars and it’s not the DOJ. But if you don’t budge, then law enforcement can claim you’re refusing to have “mature conversations” about encryption. That’s the line the government trots out every time its “efforts to engage” with tech companies do not “bear fruit,” in Rosenstein’s words.

But, he warns darkly, the time for talking is over. In October 2015, then-FBI Director Jim Comey had backed off a “legislative remedy” to the “going dark” issue, promising instead to “continue the conversations with industry”—that is, pressure tech companies to “voluntarily” change their encryption designs in closed-door meetings held outside of public view and accountability. Two years later, it appears the DOJ has given up on those conversations. Tech companies won’t knuckle under and design their encryption the way law enforcement wants them to unless a law passes that forces their hand. (Rosenstein euphemistically calls this companies’ “willing[ness] to make accommodations when required by the government.”)

Rosenstein even goes so far as to hold up oppressive governments as an instructive example. Tech companies, he says, have proved willing to compromise their products in order to do business in countries with “questionable human rights records,” for insalubrious purposes such as censorship. Therefore, he reasons, those companies should be willing to adopt “responsible” encryption that permits access by U.S. law enforcement. That is, if it’s OK for a company to accede to oppressive states’ demands, then it’s even more OK to do so for our own government, given its ostensibly greater respect for human rights and the rule of law. (Rosenstein invokes American respect for the rule of law repeatedly in his remarks, delivered, you’ll recall, to a room full of Navy servicemembers—the unsubtle implication being that their devotion to strong encryption makes the tech companies un- or anti-American.)

This remarkable line of reasoning inverts one of the common policy arguments for governments to embrace strong encryption. Advocates of strong encryption like to argue that if a Western democratic government adopts an anti-encryption national policy, then we have no moral leg to stand on when countries with dismal human rights records do the same. Call it a “slippery slope” argument. Rosenstein thinks American tech companies should fall up that slippery slope. He does not see, in their global market dominance, an opportunity to spread our values abroad. (Perhaps I should not be surprised, given America’s abdication since January 20 of its previous role—fraught though it was—as a champion of democracy, freedom, and human rights on the world stage.)

Here’s the thing: Rosenstein has a point. When a company submits its tech products to security reviews by China or source code audits by Russia, when it aids official censorship or helps a regime persecute journalists, it opens itself up to a chorus of “me toos” by those wanting a similar deal. It has indicated that its commitment to its users’ security, privacy, and human rights can be bought off; that it will follow unwise laws as the price of continuing to do business in a particular market. Rosenstein, as noted above, is unable to believe that tech companies might be answerable to their users, rather than to money or the state—and they have proved him right.

If that’s the case, Rosenstein seems to be saying, the U.S. may as well go ahead and pass its own unwise law, mandating that technology companies weaken their products’ security by kneecapping their encryption. The government won’t tell them how; it will merely require them to be able “to achieve the crucial end”: court-ordered law enforcement access to plaintext. If such a law (however ill-advised) passes, Rosenstein foresees that the companies currently vexing him will have no choice but to follow suit. And even if “less-used platforms” don’t comply, getting “only major providers” to weaken their crypto “would still be a major step forward.”

What Rosenstein is capitalizing on here is the shifting winds of public opinion, which lately have become more hostile to the giants of Silicon Valley. The whiff of regulation is in the air, and Rosenstein is cannily fanning it in the direction of the encryption debate. His speech was not truly directed at the Navy choir to which he was preaching: it was aimed at the big companies like Apple and Facebook (owner of WhatsApp) whose continuing efforts to better secure their users’ data have so infuriated Rosenstein and his colleagues in law enforcement. The message: “You wouldn’t do this the easy way, so now let’s try it the hard way.”

The gloves are off; the long knives are coming out. It’s a scary story just in time for Halloween, courtesy of the zombie encryption debate that refuses to die.