Terrorists are going dark

This is the opening shot in the whole terrible conversation. The FBI loves to talk about criminals and terrorists "going dark" — a scary way of saying "communicating in a manner not accessible by court order." If only Apple and Google would stop them from going so dark! The phrasing is important: "going dark" suggests they weren’t in the dark already. We used to be able to listen in, and now we can’t.

The problem is, that just isn’t true. Sometimes the "going dark" lie takes the form of a specific claim, as in the discredited reports that WhatsApp or the PlayStation network were used to plan the Paris attacks. But it’s also false in a general sense. There’s just no reason to think that the FBI is having a harder time tracking criminal activity than it did 15 years ago.

It's no harder to track criminal activity now than it was 15 years ago

The bureau is having more warrants come back empty, sure, but that’s because there are more warrants to serve. Fifteen years ago, it would have been unthinkable to order Microsoft to turn over a private file from a personal computer, or to ask Verizon for a transcript of an unflagged phone call from three months earlier. But the shift to mobile has made those records seem much more accessible. Files are all in the cloud anyway, and texts are a lot easier to store than audio. Most of what the FBI wants is already sitting on a server somewhere. The bureau feels entitled to all that data and gets angry when companies refuse. But without the technological shifts of the past 15 years — email, SMS, cloud storage, and so on — most of these warrants would never be written up in the first place.

Maybe you think the FBI should have access to all that data. Many principled people agree! If they have a warrant, it’s perfectly constitutional, which is more than you can say for the NSA. But the fact is, you’d be fighting for a massive expansion of surveillance power. Saying otherwise just starts the entire conversation out on a lie.

Tech companies aren’t cooperating with the government

This one is the lie both sides can agree on, as the FBI rushes to show how tech companies are dodging warrants and companies rush to show how far they’re willing to go to protect user privacy.

Apple is currently fighting a drug warrant that would require it to pull non-cloud messages from a user’s phone. At the same time, Microsoft is fighting a US court order for data held on servers in Ireland. They’re important cases, with US companies staring down their own government over privacy issues.

The vast majority of government requests are fulfilled

But as important as those cases are, they’re the exception to the rule. The move to the cloud really has made data more accessible, and for the most part the FBI has no trouble getting it. The right court order will still get police into your Gmail and iCloud accounts, which probably also includes your phone’s photos and chat logs. Facebook served more than 800 wiretap orders last year in the US alone. Despite all the high-profile legal pushback, the vast majority of government requests are fulfilled.

That doesn’t mean feds get everything they want. They’d like real-time PRISM-style access to everything on the network. Failing that, they’d like fewer legal challenges to court orders. You can’t always get what you want. But right now, feds are framing the debate as an all-or-nothing choice, which glosses over the huge amount of access they already have.

What the FBI wants is impossible to implement

This one comes from the other side, the groups pushing back against the FBI’s proposals. The most truthful version of this argument came in November, when some of the world’s most respected cryptographers wrote a paper in The Journal of Cybersecurity saying the FBI’s proposals were "unworkable in practice." The paper itself is generally right, but somehow that "unworkable" phrase has transformed into the belief that what Comey is proposing is genuinely impossible, incompatible with even the most basic forms of security on the web.

The misunderstanding is so deep that when cryptographer David Chaum came out with his preferred solution last week — a so-called "backdoor with nine different padlocks on it" — it was heralded in some corners as a genuine technical breakthrough. All those techies said it couldn’t be done!

But retaining all that data isn’t technically impossible; it just opens up a huge and unnecessary security hole. It means services can’t delete anything, and whatever database holds those records is going to become target number one for attackers. Whatever system you put in place to protect that database had better be absolutely flawless, because it will be the first system they try to break. Security is hard enough without painting a target on your back.

(Since I keep bringing up Gmail as an example of warrant-friendly crypto, it’s worth remembering that this is exactly how the NSA attacked it, breaking into Google’s private network to pull bulk email in unencrypted form. China probably gave it a shot, too!)

Sometimes the government implements horrible and destructive policies, and everyone just has to deal with it

Having said that, it’s all entirely possible. It would be a huge, sustained headache for anyone in the information security business, but no more intrusive than, say, emission regulations for cars. It would make it impossible to implement specific systems like end-to-end encryption and most forms of forward secrecy, but complementary tools like HTTPS would be relatively unaffected. It would also put US-based software at a long-term disadvantage, just like export restrictions on key length did in the '90s. The effect would certainly be weaker security and more breaches. But not only is all that possible, it’s completely in line with US tech policy of the past 20 years. Sometimes the government implements horrible and destructive policies, and everyone just has to deal with it. That’s why this whole conversation is so important.

Which brings us to lie #4…

It’s about encryption

Of course, we’re all calling it "the encryption debate" (including me, in the title of this very post), so this one’s on all of us. The name is useful for privacy groups too because it forces feds to come out as "against encryption," which sounds really silly to anyone who isn’t employed by the federal government.

But really the argument we’re having has nothing to do with encryption. It’s all about access.

The FBI is perfectly happy with encryption as long as all it’s doing is protecting your credit card number and making sure no one other than Google can see your email. What they don’t like is when encryption is used to lock them out — or worse, when the data they want isn’t retained at all. Put very simply, they don’t want you to be able to have a conversation on the internet that they can’t somehow monitor, given the right legal authorities. As long as you aren’t using encryption to do that, you’re just fine in the feds' eyes. By the same logic, the feds' biggest targets are protocols like Signal that don’t keep metadata logs at all.
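The access-versus-encryption distinction comes down to who holds the key. A deliberately simplified, non-secure toy sketch (the XOR construction and all the names here are illustrative, not any real service's design):

```python
# Toy illustration (NOT real cryptography): the fight is over who holds
# the key, not over whether encryption is used at all.
import hashlib
from itertools import count

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic byte stream from a key (toy construction)."""
    out = b""
    for i in count():
        out += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        if len(out) >= length:
            return out[:length]

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with a keyed stream; decryption is the same operation.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

toy_decrypt = toy_encrypt

# Provider-held key ("warrant-friendly"): the server encrypts data at rest,
# but since it holds the key, it can decrypt in response to a court order.
server_key = b"held-by-provider"
stored = toy_encrypt(server_key, b"meet at noon")
assert toy_decrypt(server_key, stored) == b"meet at noon"  # provider can comply

# End-to-end: only the clients hold the key; the server relays and stores
# ciphertext it has no way to open, warrant or no warrant.
client_key = b"held-by-users-only"
relayed = toy_encrypt(client_key, b"meet at noon")
assert toy_decrypt(server_key, relayed) != b"meet at noon"  # provider locked out
```

Both messages are "encrypted," but only the second model locks the provider (and therefore a warrant served on the provider) out — which is the property the FBI actually objects to.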

It’s a little tricky because, as we learned in 2013, the NSA is also attacking the fundamentals of cryptography, planting vulnerabilities in random number generators to be exploited later on. But that’s a necessarily secret campaign, and it’s hard to imagine warrants ever fitting into it. What the FBI and Congress want is different, and making it happen will be less a matter of espionage than political clout.
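The mechanics of a sabotaged random number generator are easy to sketch: if an attacker can predict the generator's state, every "secret" derived from it is reproducible. A toy illustration (Python's `random` module, which isn't cryptographic to begin with, standing in for a weakened generator; the names are made up):

```python
# Toy illustration: a random number generator whose internal state an
# attacker can predict yields keys the attacker can regenerate at will.
import random

def make_key(seed: int) -> bytes:
    """Derive a 16-byte 'secret key' from a generator seeded with `seed`."""
    rng = random.Random(seed)  # stands in for a sabotaged RNG whose state leaks
    return bytes(rng.randrange(256) for _ in range(16))

# The victim believes their key is unpredictable; an attacker who knows
# how to recover the generator's state (here, trivially, the seed) can
# simply re-derive the same bytes.
victim_key = make_key(1337)
attacker_key = make_key(1337)
assert victim_key == attacker_key  # same state in, same "secret" out
```

This is why a backdoored generator is so attractive for covert collection, and why it's hard to imagine a warrant process ever attaching to it: the compromise happens silently, before any key is ever used.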

Regulating tech companies will help us stop terrorist plots

This is the most powerful lie, the one we heard after Paris and again after San Bernardino. If only we could have found out where the terrorists were talking and listened in, the whole tragedy could have been averted. What if digging up a few crucial iMessages could have saved dozens of lives?

The problem is, there’s no evidence that that’s true. Hindsight investigations have found lots of tragically dropped leads in the run-up to recent attacks, but they’ve mostly been either available information that was ignored or pre-existing flags within the intelligence system. Both the Paris and San Bernardino plots seem to have been hatched in person, leaving as little online footprint as possible.

There’s little evidence of ISIS planning attacks from US-owned tech platforms

Even beyond specific attacks, there’s little evidence of ISIS and other terror groups planning attacks from US-owned tech platforms. The one private chat tool we know ISIS affiliates are using, Telegram, is based in Germany. Cracking open those channels would be significantly more complicated than passing a US law.

That doesn’t mean that putting a backdoor in iMessage wouldn’t help catch criminals — but they wouldn’t be terrorists. Based on the cases we’ve already seen, they’re most likely to be drug dealers, trade-secret thieves, or generals cheating on their wives. In short, people who don’t expect anyone to come looking for them. Maybe you think it’s worth mandating server access to solve those cases. It’s a worthwhile conversation to have. But instead, we’re talking about terrorism and then proposing systems that would be used on run-of-the-mill domestic felonies.

* * * * *

What would the conversation look like without these ideas? It’s hard to say. It would be less confused, and probably a lot less friendly to government interests, but I genuinely don’t know how the public would respond to the real ideas involved. Until they’ve heard them, it’s impossible to know.

There are real problems at the heart of this debate, fundamental questions of liberty and security and how technological progress can change that balance. There are questions about the deep state and how institutions like the FBI or NSA can be held accountable to the people they nominally serve. We have to come up with some sort of answer for these questions, and to do that, we need to be able to talk about what’s actually at stake. So far, we haven’t been able to.