(Updated below)

I was invited to give a talk on surveillance at the Information Security Systems Association (ISSA) Baltimore Chapter yesterday, and the keynote speaker was Dr. John Levine of the NSA. He works on the “information assurance” side of the agency (charged with securing communications rather than breaking them) and had some interesting things to say on the NSA’s work trying to make mobile devices more secure for the military and other government users who need to exchange classified information.

As Levine described it, the problem faced by the NSA is the need to provide security for the devices their clients actually want to use. As he said (I’m a fast typist and captured most of what he said here verbatim but parts of these quotes may be paraphrases):

I’m a former army communications officer. I used a lot of NSA devices. They were gray boxes… But we’re not in a secure, point-to-point environment any more. Everyone has devices with them. There’s a given level of functionality that the consumer has today, and our warfighters and high government officials want that same experience.

According to Levine, that dynamic goes right to the top:

The president is getting his daily intelligence briefing on an iPad. Ten years ago we wouldn’t have done that, but that’s what the president wants, so that’s what he gets. Now, that iPad is neutered—it has no connectivity. It gets plugged into a docking station. We can do that for the president, but we can’t scale that. So the question is, can we use commercial products that are secure?

Despite the command-and-control environment of the military, Levine described a situation in which the NSA feels forced to respond to popular demand and “give the people what they want.” Levine recalled a device called the Mobile Environment Portable Electronic Device, which was a specially designed, NSA-commissioned secure smartphone. The problem, he said, was that

it took us four years to build, and by the time it came out, it flopped. Warfighters won’t use them, they’ll use consumer devices. The specs were written four years earlier, and meanwhile the consumer experience had advanced. The iPhone came out during that time.

Levine concluded, “Using commercial devices to process classified phone calls, using commercial tablets to talk over wifi—that’s a major game-changer for NSA to put classified information over wifi networks, but that’s what we’re going to do.” One way that would be done, he said, was by buying capability from cell carriers that have networks of cell towers, in much the way small cell providers and companies like Onstar do.

Interestingly, Levine described an agency that is being forced to adopt a more realistic and practical attitude toward risk. “It used to be that the NSA squeezed all risk out of everything,” he said. Even lower levels of sensitivity were covered by Top Secret-level crypto. “We don’t do that now—it’s levels of risk. We say we can give you this, but can ensure only this level of risk.” Partly this came about, he suggested, because the military has an inherent understanding that nothing is without risk, and is used to seeing things in terms of tradeoffs: “With the military, everything is a risk decision. If this is the communications capability I need, I’ll have to take that risk.”

It would be nice to see that kind of sophisticated attitude toward risk reflected in the larger “war on terror,” where the official discourse fails to acknowledge that we must always trade off risks of one kind against another—with the result, all too often, being policies based on terrible tradeoffs (such as passenger screening, dragnet surveillance practices like the NSA’s own, and cybersecurity).

Levine observed that security is not a top priority for cell phone manufacturers or carriers. “We need to communicate with industry,” he said. “Dealing with industry is tough because if something costs more than a nickel, phone manufacturers are not interested in putting it in their device.”

One interesting question is whether this reflects a market failure of some kind, or whether it just reflects what is, for most people, a rational tradeoff between price and convenience on one hand and the level of security threat they actually face on the other. Much of what the NSA would like to see in consumer products probably falls into the category of wearing a bullet-proof vest to the office: there’s no doubt that it’s an effective security measure, but most people don’t find that the tradeoff makes sense (a point often made by Bruce Schneier).

Levine did say that the consumer companies were getting better at security, mentioning as an example building in the ability to disable a phone’s camera (we don’t know of any phones that have a hardware camera-off switch, but perhaps he means they’re in the works). If the NSA is pushing companies to take steps like that, which increase individuals’ control and can protect their privacy, that’s great. On the other hand, we do need to keep in mind that the agency is also motivated by its impulse toward secrecy—not a good thing given the current reality, in which the lack of effective countervailing checks on classification has resulted in runaway secrecy that dangerously deprives the American people of their ability to democratically oversee their own government.

Information assurance can be good for privacy but can also block the next whistleblower who is trying to expose things that will help our country but cause pain (in the short term) for a national security bureaucracy.

Finally, given recent comments by the NSA director, it’s always worrisome to hear that the national security state is putting pressure in any way whatsoever on private communications companies. We don’t want to see security and top-down control given priority over openness and individual control.

On the other hand, as Levine also pointed out about the smartphone industry, “the DOD space is such a drop in bucket compared to what they sell to consumers, they’re not that interested.”

Update (Sept. 14):

A colleague alerts me to this story I'd missed about an Apple patent on a system for allowing phones' cameras or other functionality to be remotely disabled from a central source. I'm not sure whether this is what Levine was alluding to, but it hardly fits within the category of "increasing individual users' control"—to the contrary, it appears to be all about imposing centralized, top-down control over devices, exactly the wrong direction to go in.