Another Reason The NSA Can't Prevent Terrorist Attacks: Protecting Its Methods Is More Important Than Protecting The Public

from the I-sincerely-hope-I'm-overstating-this-possibility dept

The NSA insists everything that's been exposed so far by Snowden's leaks is direly necessary to protect us from terrorists. It still has trouble pinpointing any instances where bulk records collections and widespread internet data harvesting have prevented attacks, but it continues to assure us of its need to continue building its haystacks unimpeded.



The NSA's fight against terror is being hampered by its own greed. Too much data has proven to be just as useless as too little. And that's only part of the problem. Its preventive efforts only go so far. Bruce Schneier's post on the delayed reaction to Syria's chemical weapons attack highlights the limitations inherent to intelligence agencies.

We recently learned that US intelligence agencies had at least three days' warning that Syrian President Bashar al-Assad was preparing to launch a chemical attack on his own people, but weren't able to stop it…



More interestingly, the US government did not choose to act on that knowledge (for example, launch a preemptive strike), which left some wondering why.

Rather than thinking of intelligence as a connect-the-dots picture, think of it as a million unnumbered pictures superimposed on top of each other. Which picture is the relevant one? We have no idea. Turning that data into actual information is an extraordinarily difficult problem, and one that the vast scope of our data-gathering programs makes even more difficult.

The third is that while we were sure of our information, we couldn't act because that would reveal "sources and methods." This is probably the most frustrating explanation. Imagine we are able to eavesdrop on al-Assad's most private conversations with his generals and aides, and are absolutely sure of his plans. If we act on them, we reveal that we are eavesdropping. As a result, he's likely to change how he communicates, costing us our ability to eavesdrop. It might sound perverse, but often the fact that we are able to successfully spy on someone is a bigger secret than the information we learn from that spying.

During the war, the British were able to break the German Enigma encryption machine and eavesdrop on German military communications. But while the Allies knew a lot, they would only act on information they learned when there was another plausible way they could have learned it. They even occasionally manufactured plausible explanations. It was just too risky to tip the Germans off that their encryption machines' code had been broken.


The first aspect is the sheer amount of data. As Schneier points out, connecting the dots is easy… in hindsight. In "realtime," it's impossible. Our intelligence agencies must realize this. But it seems the thirst for data is unquenchable. Gen. Alexander made it clear he wants to "collect it all." The usefulness of these collections relies on the agency's unshakable faith that a better algorithm is just around the corner -- the final bit of filtering that will make millions of overlaid pictures suddenly snap into focus. Take it all, sort it out later, and never mind the fact that the picture just gets more confusing with each additional collection.

The second aspect Schneier points out is a lack of confirmation -- not enough proof to act preemptively. A lack of solid proof can often paralyze government entities, from the White House all the way down to public schools. Rather than make a mistake and suffer the fallout, they refuse to move at all, hoping that some final bit of info will arrive, pristine and transparent, and make that tough decision for them. But nothing's that crystal clear, not when tough decisions need to be made. Anyone can make the easy call. Leaders make the tough calls, and not enough people qualify for that title.

But the third aspect is the most chilling. It performs a very dark and very troubling calculation that weighs human lives against continued secrecy. Schneier is discussing this in the context of the Syrian gas attack, but it also carries unsettling implications for the never-ending War on Terror. What if the NSA (or CIA or FBI) manages to uncover a terrorist plot via methods it considers too valuable to expose? Does it allow the attack to proceed rather than jeopardize a useful surveillance program? Would it do that, justifying its decision with the rationale that the protected program will save that many more lives in the future?

The decision isn't likely to be completely binary.
There are still options to pursue, as Schneier notes, citing an occasion when intelligence agencies deliberately hamstrung their own efforts in order to protect ongoing surveillance. The NSA, with the cooperation of other agencies, could (possibly quite easily) manufacture plausible explanations as to how it got ahold of this intelligence without sacrificing the surveillance method. Those other agencies certainly have had no trouble manufacturing cover stories, like the false paper trails they've used to hide illegal access to data.

But what if there wasn't time, or the cover story was too full of holes? What then? What if the attack wouldn't affect Americans? Would the NSA let that one go? More importantly, has the NSA earned the trust needed to believe it would sacrifice a valuable intel method in order to prevent an attack? At this point, the answer is no.

On the bright-ish side, several methods have already been at least partially exposed. Inference and extrapolation help round out the picture. If the NSA can do X, then it stands to reason it can do Y. There's less to protect, surveillance-wise, so cover stories will be easier to generate. The fact that the NSA couldn't prevent Snowden from doing what he did, and still doesn't seem to have any idea what he took, also works in the public's favor. This makes mercenary decisions like the one above less likely, simply because there's a very good chance such a decision would be swiftly exposed, and I don't believe the NSA is actually looking to coat its hands with more blood.

Filed Under: nsa, nsa surveillance, protection, sources and methods, surveillance, terrorism