In the constant battle to keep information secure, consumers have a powerful weapon on their side: strong encryption, which locks their data into unbreakably coded form, allowing people to transfer account information, personal data and messages without fear of being hacked. It also lets them store it safely—for example, on smartphones, which are effectively becoming wallets for our most sensitive information and thoughts.

But it’s not just law-abiding citizens who take advantage of newly ubiquitous encryption. It’s also criminals, who need to communicate without being overheard. Government agencies call it the “going dark” problem: An encrypted message essentially vanishes from their view. Law enforcement wants a federally mandated “back door,” a way to lawfully break encryption and read messages.

There lies one of the biggest emerging conflicts in the cyber realm. The shorthand is the “Crypto Wars,” and it drives much of the debate over cybersecurity policy. Should tech companies and the public be encouraged to encode their information as securely as possible to guard against theft? Or should the government be given tools to snoop, even if it severely weakens the protections of encryption?

By early this fall the most recent round of the encryption debate appeared to settle in favor of consumers and technologists: The White House announced it wouldn’t back any legislative proposal forcing companies to backdoor encryption. Officials also said the government wouldn’t pressure the tech industry to insert back doors into their products.

Then came the Nov. 13 Paris attacks, which reignited the debate. Though it’s not clear that the plotters used encryption to hide their tracks, the attacks highlighted the potential risks. “[T]echnology exists today that allows terrorists and criminals to communicate in the shadows, using encryption that makes it impossible for law enforcement or national security authorities to do everything they can to protect Americans,” Sen. Chuck Grassley (R-Iowa) asserted in the days afterward.

For perspective, POLITICO turned to Matt Blaze, a computer science professor at the University of Pennsylvania who—back in the early 1990s, when telephone encryption was the technology of the moment—discovered a serious programming flaw in the backdoor system that was supposed to let the National Security Agency listen in on Americans’ phone calls. His discovery effectively ended the program.

Since then Blaze has emerged as a leading researcher of cryptography and an important voice on encryption policy, and has come to believe that the entire debate misses something crucial: that today it’s impossible to build a back door that doesn’t also let in malicious hackers, so ultimately it’s time for law enforcement to broaden its perspective on encryption.

David Perera: Why has encryption become so central in the cybersecurity debate?

Matt Blaze: The first thing we need to talk about is that the security of computers and the Internet is a horrible and dangerous mess. Every week we hear about breaches of databases of Social Security numbers and financial information and health records, and about critical infrastructure being insecure.


Maybe I should be more reluctant to admit this than I am, but computer science doesn't know how to build complex systems that work reliably. This has been a well-understood problem since the very beginning of programmable computers. As we build systems that are more and more complex, we make more and more subtle but very high-impact mistakes. As we use computers for more things and as we build more complex systems, this problem of unreliability and insecurity is actually getting worse, with no real sign of abating anytime soon.

We basically have only two tried-and-true techniques that can help counter this. One of them is to make systems as simple as we can, and there are limits to that, because we can only simplify things so much.

The other is the use of encryption. What encryption lets us do is say, "Yes, the Internet is insecure." Bad guys are able to compromise computers everywhere, but we're able to tolerate that because if they do intercept our messages, they can't do any harm with them.
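That guarantee, that an intercepted ciphertext by itself does no harm, can be illustrated with a toy one-time pad in Python. This is a sketch for illustration only: real systems use vetted ciphers such as AES, and a one-time pad key must be truly random, as long as the message, and never reused.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR each plaintext byte with the corresponding key byte (a one-time pad).
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"wire $500 to account 42"
key = secrets.token_bytes(len(message))   # random key, as long as the message

ciphertext = encrypt(message, key)
# An eavesdropper who sees only the ciphertext learns nothing useful:
# without the key, every plaintext of this length is equally plausible.

# XOR is its own inverse, so the intended recipient recovers the message.
assert encrypt(ciphertext, key) == message
```

The asymmetry is the whole point: the legitimate parties hold a small secret (the key), while the attacker who captures the traffic holds nothing of value.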

DP: Is this just computers?

MB: Telephone handsets are particularly in need of built-in security. We have almost every aspect of our personal and work lives reflected on them and we lose them all the time. We leave them in taxis. We leave them on airplanes. The consequences of one of these devices falling into the wrong hands are very, very serious.

DP: How is it any less secure for individuals if Apple or Google hold onto a copy of the decrypting key, and when law enforcement serves a warrant, they decrypt the data?

MB: It's not quite that simple. In order for any smartphone manufacturer to decrypt the data on your phone, it has to hold onto a secret that lets it get that access. And that secret or that database of secrets becomes an extremely valuable and useful target for intelligence agencies.

So just as the local police department might want to decrypt a phone of a criminal suspect, so would the Chinese or the Russian or the Iranian intelligence agencies like to be able to do exactly the same thing.

If it were possible to hold onto this sort of database and really be assured that only good guys get access to it, we might have a different discussion than we're having. Unfortunately, we don't know how to build systems that work that way. We don't know how to do this without creating a big target and a big vulnerability.

DP: There are federal officials who say they believe a technological solution can be found—something that keeps our devices secure while allowing law enforcement to get access when they need it. You're saying there's absolutely none?

MB: I appreciate their faith in my field, but I don't share it. The people working in my field also are quite skeptical of our ability to do this. It ultimately boils down to the problem of building complex systems that are reliable and that work, and that problem has long predated the problem of access to encryption keys.

DP: If the encryption discussion is that straightforward, why is this still an issue?

MB: From a policymaker's point of view, [the back door] must look like a perfect solution. "We'll hold onto a separate copy of the keys, and we'll try to keep them really, really safe so that only in an emergency and if it's authorized by a court will we bring out those keys and use them." And, from a policy point of view, when you describe it that way, who could be against that?

It's only after you get down into the technical weeds—and they are admittedly rather weedy—that it becomes clear that this is much harder than it seems and not something we're going to be able to solve.

DP: Why does law enforcement—the FBI, U.S. attorneys, the New York County D.A.'s office—why do they care about this?

MB: It may be true that encryption makes certain investigations of crime more difficult. It can close down certain investigative techniques or make it harder to get access to certain kinds of electronic evidence.

But it also prevents crime, by making our computers, our infrastructure, our medical records and our financial records more robust against criminals. On balance, the use of encryption, just like the use of good locks on doors, has the net effect of preventing far more crime than it might help investigate.

The perspective that law enforcement is presenting seems to be a very narrow one that's focused very, very heavily on investigations of past crimes rather than on preventing future crimes. It's very important for policymakers to take that broader view because they're the ones who are trusted to look at the big picture.

DP: Is there anything about the “going dark” debate you think is dissimulation?

MB: There's been a certain amount of opportunism in the wake of the Paris attacks, when there was almost a reflexive assumption that, "Oh, if only we didn't have strong encryption out there, these attacks could have been prevented." But, as more evidence has come out—and we don't know all the facts yet—we're seeing very little to support the idea that the Paris attackers were making any kind of use of encryption.

DP: This is not your first rodeo on this subject.

MB: No. And I fear it might not be my last.

DP: Can you describe what you did in the early '90s with the Clipper Chip?

MB: So, in 1993, in what was probably the first salvo of the first Crypto War, there was concern coming from the National Security Agency and the FBI that encryption would soon be incorporated into lots of communications devices, and that that would cause wiretaps to go dark. There was not that much commercial use of encryption at that point. Encryption, particularly for communications traffic, was mostly something done by the government.

AT&T, which was, ironically, my employer at the time, had just introduced a product in 1992 called the TSD-3600. It was a fairly clunky and very expensive telephone encryption device that you could buy and plug in between your telephone handset and the base of your phone. You could push a little button and it would digitize and encrypt your conversation. In fact, it was very similar to a device used by the Defense Department, called the STU-III, for classified calls.

It was very expensive. I think they were something like $1,400 each, and you'd have to buy at least two of them for them to be useful. It had a fairly limited market, but I think, perhaps reasonably, the government understood that if this was successful, things like it would get smaller and cheaper.

So AT&T released this product, and the government kind of panicked. They very quickly got the National Security Agency to design a replacement for the encryption chip built into the device, called the "Clipper Chip." The Clipper Chip would perform very similar encryption to the original product but also send a copy of the key to the government. They persuaded AT&T to recall the phones that it had already sold and replace the product with one that incorporated the Clipper Chip.

This was pretty controversial. It was framed as a privacy versus national security debate, but I think for reasons that we've talked about, that wasn’t a complete framing of the issue.

I had just started working at Bell Labs at the time, and I got a hold of some Clipper Chip devices, and I decided to try to understand how they worked. I did a little bit of reverse engineering of the protocols it used and the interfaces on the chip, and I discovered that some of the obvious things a person might do to try to prevent the government's copy of the key from being transmitted actually worked.

DP: You found a way to defeat the Clipper Chip.

MB: Clipper took a relatively simple problem, encryption between two phones, and turned it into a much more complex one: encryption between two phones that can also be decrypted by the government under certain conditions. Making the problem that complicated made it very easy for subtle flaws to slip by unnoticed. I think it demonstrated that this is not just a tough public policy problem, but a tough technical problem as well.
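The subtle flaw in question involved Clipper's Law Enforcement Access Field (LEAF), the packet carrying the government's copy of the session key, which was protected only by a 16-bit checksum. A toy model in Python shows why 16 bits is too few. Note the checksum function here is a hypothetical stand-in (the real algorithm was classified); only the 16-bit length matters for the attack.

```python
import os
import hashlib

def checksum16(body: bytes) -> int:
    # Stand-in for Clipper's 16-bit LEAF checksum. The real algorithm was
    # classified, but any 16-bit check is forgeable the same way.
    return int.from_bytes(hashlib.sha256(body).digest()[:2], "big")

def leaf_is_valid(leaf: bytes) -> bool:
    # A receiving chip accepts a LEAF whose trailing 16-bit checksum
    # matches the checksum it computes over the LEAF body.
    body, chk = leaf[:-2], leaf[-2:]
    return checksum16(body) == int.from_bytes(chk, "big")

def forge_leaf() -> bytes:
    # Feed the chip random LEAFs until one happens to pass the 16-bit
    # check: about 65,536 tries on average. The forged LEAF satisfies the
    # protocol but carries no usable key copy for the government.
    while True:
        candidate = os.urandom(18)  # random 16-byte body + 2-byte checksum
        if leaf_is_valid(candidate):
            return candidate

forged = forge_leaf()
assert leaf_is_valid(forged)
```

A search of 2^16 candidates is trivial even on 1990s hardware, which is why the escrow mechanism could be bypassed while the encryption itself kept working.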

DP: Any predictions? Does this debate ever end?

MB: Well, I think it's interesting because the 1990s ended with the government pretty much giving up. There was a recognition that encryption was important. In 2000, the government considerably loosened the export controls on encryption technology and really went about actively encouraging the use of encryption rather than discouraging it.

When the September 11th attacks happened, only about a year later, the crypto community was holding its breath because here was a time when we just had an absolutely horrific terrorist attack on U.S. soil, and if the NSA and the FBI were unhappy with anything, Congress was ready to pass any law they wanted. The PATRIOT Act got pushed through very, very quickly with bipartisan support and very, very little debate, yet it didn't include anything about encryption. That’s an encouraging sign because, ultimately, cooler heads prevailed, and there was a recognition that this technology is really critical for national security and for the U.S. economy.

If we try to prohibit encryption or discourage it or make it more difficult to use, we're going to suffer the consequences that will be far reaching and very difficult to reverse, and we seem to have realized that in the wake of the September 11th attacks. To the extent there is any reason to be hopeful, perhaps that's where we'll end up here.


