When the government argues that it’s “going dark” because of ubiquitous end-to-end encryption, it often stresses how encryption thwarts counterterrorism investigations. When FBI Director Christopher Wray testified before Congress late last year, the first encryption-related example he gave was of FBI “agents and analysts … increasingly finding that communications and contacts between groups like ISIS and potential recruits occur in encrypted private messaging platforms.”

But is terrorism really why law-enforcement agencies need access to encrypted data? According to Joshua Geltzer’s characteristically thoughtful post on Just Security, the problem isn’t so simple. Geltzer, former senior director for counterterrorism at the Obama National Security Council, argues that requiring encrypted services to provide third-party (i.e., government) access could inadvertently push “savvy” terrorists (and some of their less-savvy colleagues) toward encrypted alternatives. Geltzer gives the example of secure-messaging services like Telegram and Signal: Telegram uses a “corporate structure designed to maximize flexibility and minimize accountability to governments,” while Signal can be set to delete messages shortly after they’re sent. Geltzer’s point echoes the argument that government-access mandates will fail because bad actors will switch to the many easy-to-use, global alternatives that are outside the effective reach of U.S. law enforcement or its international partners. As Phil Zimmermann, creator of the landmark encryption software Pretty Good Privacy, argued, “When crypto is outlawed, only outlaws will have crypto.”

What this underscores, however, is not the ineffectiveness of third-party-access requirements, but rather that the discussion is focused on the wrong use case. The issue isn’t terrorism, it’s crime.

Before leaving the government, I worked briefly as a federal prosecutor. A vivid lesson I learned was just how un-savvy most criminals are, especially when it comes to their digital activities. Our defendants gave no thought to even basic information security. It wasn’t as if they used iPhones for secure data storage but Twitter when they wanted to make it easier for the cops to catch them; they used whatever tool was most convenient, and it was pure happenstance whether the tool would allow the police to recover the incriminating evidence stored on it. We were lucky if they used Twitter (which wasn’t encrypted); we were unlucky if they used iPhones. (It’s amazing how many people are shocked when they get caught tweeting gun and drug sales.)

My experience—which accords with what I’ve heard from many seasoned prosecutors—illustrates the critical importance of default settings. It’s been widely known for decades that only a sliver of users ever change the settings on their devices, or even know that the settings are there for the changing. And if users can’t be bothered to change easily accessible settings, they certainly won’t go to the trouble of switching smartphones or messaging apps just to frustrate law enforcement. But when WhatsApp decides to make end-to-end encryption a default setting on its already immensely popular messaging program, the communications of a billion people suddenly become warrant-proof. That’s the stuff of law-enforcement nightmares.

There’s no question that sophisticated bad actors—whether terrorists or spies—won’t just settle for the default setting. They’ll always find a way to encrypt their communications, whether by adopting products that don’t fall under national laws mandating third-party access or by taking technological countermeasures. (For instance, bad actors can sideload secure messaging apps that might otherwise be restricted from the Apple or Android app stores.)

But end-to-end encryption won’t cripple counterterrorism investigations. (If this were a serious concern, one would expect a former NSA director to lead the charge against end-to-end encryption, not support its wide deployment.) There aren’t that many would-be terrorists, and the ones who exist get ample attention from the FBI and U.S. intelligence agencies. At such a high ratio of good guys to bad guys, the government can generally get around encryption where it needs to, whether by paying millions of dollars for third-party hacking tools, exploiting software and hardware vulnerabilities to hack devices, or engaging in physical surveillance. (The same logic also applies to counterintelligence investigations.)

Sadly, this approach won’t work for ordinary crime. (By “ordinary” I don’t mean minor. Murder, rape, child exploitation, financial fraud—these are all serious, but nevertheless common, offenses.) Ordinary crimes occur at a scale that makes bespoke encryption workarounds impracticable. This is especially true for state and local law enforcement, which investigates and prosecutes the vast majority of crime. The FBI could afford (literally) to hack the San Bernardino terrorist’s iPhone, but, had the crime been “just” murder, the San Bernardino County Sheriff’s only option might have been to toss the locked iPhone into evidence—just like police departments across the country.

There’s a lesson here for the government: Don’t frame the encryption issue as a national security problem. It’s tempting for the government to do so, as the FBI did after the San Bernardino shooting or U.K. Prime Minister Theresa May did in the wake of a string of terrorist attacks; the public is most sympathetic to the government’s position right after high-profile security crises, and in today’s climate, terrorism is for the majority still the most salient threat.

But the government should resist the temptation; overusing national security justifications will cause skeptics to question whether the government is arguing in good faith. Perhaps more importantly, it diverts attention and effort from a badly needed next step in the encryption conversation: more and better data on how encryption is hampering criminal investigations across the country, at the federal, state, and local levels.