On July 23, in a keynote address at the International Conference on Cyber Security at Fordham University, US Attorney General William Barr took up a banner that the Justice Department and Federal Bureau of Investigation have been waving for over a decade: the call for what former FBI director James Comey had referred to as a "golden key."

Citing the threat posed by violent criminals using encryption to hide their activities from law enforcement, Barr said that information security "should not come at the expense of making us more vulnerable in the real world." He claimed that this is what is happening today.

"Service providers, device manufacturers, and application developers are developing and deploying encryption that can only be decrypted by the end user or customer, and they are refusing to provide technology that allows for lawful access by law enforcement agencies in appropriate circumstances," Barr proclaimed.

And this, he said, was making it increasingly difficult for law enforcement to surveil criminal activity. This blind spot was also allowing criminals to make their information and communications "warrant proof... extinguishing the ability of law enforcement to obtain evidence essential to detecting and investigating crimes," and allowing "criminals to operate with impunity, hiding their activities under an impenetrable cloak of secrecy."

In other words, the lawful surveillance capabilities of the government are "going dark," according to AG Barr.


"The net effect is to reduce the overall security of society," he continued. "I am here today to tell you that, as we use encryption to improve cybersecurity, we must ensure that we retain society's ability to gain lawful access to data and communications when needed to respond to criminal activity." AG Barr closed by saying that US citizens should accept encryption backdoors because backdoors are essential to our security.

In response, Gen. Michael Hayden, former director of the National Security Agency, said, "Not really."

"Not really. And I was the director of national security agency." — Gen. Michael Hayden (@GenMhayden), July 23, 2019

Regardless of the accuracy of Barr's claims, encryption is certainly far more prevalent than it was even five years ago—back when freshly minted memoirist Edward Snowden gave the world a look at the workings of US intelligence agencies' digital surveillance capabilities. For better or worse, Snowden's data dump didn't just shake up the world's view of communications privacy—it upended the world's view of information security in general.

Snowden's impact on the law of mass surveillance was perhaps less than he would have hoped for. But his revelations had wide-ranging effects on the tech industry and on the development of Internet and security standards. While Snowden opened up a dialogue about intelligence policy, "some of the most significant reforms were technical, not legal."


That's according to Ben Wizner of the American Civil Liberties Union, who has acted as Snowden's attorney. "The proliferation of encryption was rapidly accelerated," he says. "And the Internet is more secure today than it was in 2013. Technology companies realized that they had been operating under the wrong threat model."

After Snowden, Internet and technology firms could no longer ignore the threat posed by state-sponsored actors to their customers, said Mark Rumold, senior staff attorney at the Electronic Frontier Foundation (EFF). He went on:

Companies recognized guarding against state surveillance is a bottom line issue for them... It is a question of financial interest to these companies to be able to convince their users that their data is secure with them, so we saw a lot of companies take steps to roll out encryption in various ways and I think that there's no question that this enhances security and privacy.

Just how much those steps have hindered legal surveillance and investigation—attempts by law enforcement and intelligence agencies operating under the authority of a court-approved warrant—is in dispute. As information security professional Robert Graham pointed out in a recent blog post, there is no evidence of a surge in crime corresponding to the use of encryption. Such claims, he says, are "based on emotional anecdotes rather than statistics."

Even allegedly hard data presented by the government has been routinely inflated. In December 2017, FBI Director Christopher Wray claimed in Congressional testimony that, in the 2017 fiscal year, the bureau "was unable to access the content of approximately 7,800 mobile devices" using available tools. Wray made this proclamation nearly two years after the government's highly public battle over encryption with Apple in the wake of the tragedy in San Bernardino, California. But that figure was vastly larger than the 880 devices the FBI had cited a year before, and a Washington Post investigation found that the number of inaccessible devices in 2017 was actually about 1,200 according to an FBI internal estimate.


So, is surveillance really "going dark"? Or is this, as Graham suggested, "a Golden Age of Surveillance," one in which, if anything, even more privacy protection is needed? Joseph Lorenzo Hall, Chief Technologist at the Center for Democracy and Technology (CDT), leans toward the latter.

"The FBI says they're 'going dark'," Hall told Ars. "Well yeah, because they've been staring at the sun."

Fixing overexposure

Much of the Internet has become more secure over the past five years. The Snowden revelations may not have directly caused the rise of secure Web protocols, but they sure helped motivate protocol development. While the threat of a "global observer" on the Internet had been theorized before Snowden, his evidence of that sort of capability immediately triggered a response from the technical community.

"The engineering community took the succession of Snowden revelations really seriously," Hall told Ars. Just 11 months after the first of the leaks, the Internet Engineering Task Force put out RFC 7258, "stating that pervasive monitoring is an attack," Hall noted.


To be fair, the Internet in 2014 had practically nowhere to go but up in terms of protecting privacy. Almost all of the fundamental building blocks of the Internet had been, in Hall's words, "almost completely insecure" since their creation. That's "because we were experimenting with them. And now we're retroactively having to go back and put security back on."

That shift in perception of the threat of mass surveillance was followed by significant improvements in securing Web traffic, including much more security-focused operations at major Internet service providers. Two changes in particular were accelerated by the Snowden revelations: the adoption of secure HTTP (HTTPS) and Transport Layer Security (TLS) encryption by major Internet services, and the development of TLS 1.3.

HTTPS has had the biggest effect so far, and the changes in TLS will further close the door on surveillance. In 2013, less than 30% of Web traffic was encrypted, and less than 10% of websites supported secure connections. By 2017, more than half of the Web supported HTTPS, and today over 70% of Web traffic is encrypted, based on data from Google and Let's Encrypt. As of April 2019, 91% of webpages visited by US users were secured. Internationally, about 85% of webpages visited were encrypted.


Adoption of encryption for email traffic—both between client and server and from provider to provider—also grew dramatically as a direct result of the Snowden revelations. In early 2014, only about a quarter of the email traffic between Google and other providers was encrypted. Now, it's over 75%.
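
Checking whether a given mail server offers this opportunistic encryption is simple enough to script. Below is a minimal sketch using Python's standard smtplib; the MX hostname is purely illustrative, and note that many networks block outbound port 25.

```python
import smtplib
import ssl

def supports_starttls(host: str, port: int = 25) -> bool:
    """Connect to a mail server and check whether it advertises STARTTLS."""
    with smtplib.SMTP(host, port, timeout=10) as smtp:
        smtp.ehlo()
        if not smtp.has_extn("starttls"):
            return False
        # Upgrade the plaintext session to TLS and verify the certificate.
        smtp.starttls(context=ssl.create_default_context())
        smtp.ehlo()
        return True

# Illustrative MX host; substitute the MX record of the domain being tested.
print(supports_starttls("gmail-smtp-in.l.google.com"))
```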

The adoption of encryption has had major implications for both the intelligence community and law enforcement, at least in terms of "traditional" Internet traffic. Much of the metadata we examined in our 2014 project with NPR that was usable for surveillance by the NSA's XKeyscore system has become much less accessible. We re-staged the tests recently, using ourselves as the targets. Many of the identifiers and much of the content we were able to pick out of passive traffic collection in 2014 are now dramatically harder to capture. That isn't to say that they're gone—they're just concealed within encrypted HTTPS and TLS traffic now, at least for standard Web and email traffic.

This practical consideration may be directly responsible for the NSA dropping "about" collection (searching the contents of traffic for communications that mention specific keywords or identifiers for persons of interest). But there are still other ways to gather surveillance data from Internet traffic that won't be going dark any time soon.



Gaps in the shade

Over the past few years, a number of attacks have been identified that can defeat SSL and TLS implementations, and various man-in-the-middle attacks remain a threat. While certificate and public key pinning—locking in the known-good keys for sites within the browser or application—can prevent many man-in-the-middle attacks, problems with pinning implementations have occasionally made browsers and mobile applications vulnerable.
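
The idea behind pinning is simple, even if deployments get it wrong. Here is a minimal Python sketch of one form of it, comparing the SHA-256 fingerprint of the server's certificate against a value shipped with the client; the all-zero hash is a placeholder for a real known-good fingerprint. The brittleness mentioned above is visible here as well: rotate the server's certificate without updating the pin, and every client locks itself out.

```python
import hashlib
import socket
import ssl

# Pinned SHA-256 fingerprint of the server's DER-encoded certificate.
# This all-zero value is a placeholder; a real client ships the known-good hash.
PINNED_SHA256 = "0" * 64

def connect_with_pin(host: str, port: int = 443) -> ssl.SSLSocket:
    context = ssl.create_default_context()
    sock = context.wrap_socket(socket.create_connection((host, port)),
                               server_hostname=host)
    der_cert = sock.getpeercert(binary_form=True)
    fingerprint = hashlib.sha256(der_cert).hexdigest()
    if fingerprint != PINNED_SHA256:
        sock.close()
        raise ssl.SSLError(f"certificate pin mismatch for {host}: {fingerprint}")
    return sock  # Certificate matches the pin; safe to use.
```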

TLS 1.3 is now in the hands of developers after a four-year standards process. The charter for the new version included making the protocol faster and more secure, and making it easier to analyze implementations of the standard for flaws before they're deployed. "We built forward secrecy into TLS 1.3," Hall said. "That means that you're much more hygienic with the types of keys used for encryption. Any kind of static keyed encryption system is old news, and no bueno."
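
For developers, opting into those properties can amount to a one-line policy. A minimal sketch using Python's standard ssl module (it assumes Python 3.7 or later built against OpenSSL 1.1.1, with example.com standing in for any TLS 1.3-capable server):

```python
import socket
import ssl

# Refuse anything older than TLS 1.3, which guarantees an ephemeral
# (forward-secret) key exchange for every connection.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_3

with context.wrap_socket(socket.create_connection(("example.com", 443)),
                         server_hostname="example.com") as tls:
    print(tls.version())  # "TLSv1.3" if the server supports it
    print(tls.cipher())   # e.g. ("TLS_AES_256_GCM_SHA384", "TLSv1.3", 256)
```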


And then there are the protocols that pass information about Internet destinations in plain text. DNS requests reveal where your device is connecting. TLS's Server Name Indication (SNI) extension likewise reveals which site an encrypted Web request is destined for when multiple sites are hosted at the same IP address (as is common with sites behind Cloudflare, Google, and other services).
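
The leak is easy to see in code. In the sketch below, the server_hostname value handed to Python's ssl module is exactly what travels unencrypted in the TLS ClientHello, so a passive observer learns the site's name even though everything after the handshake is protected; proposed encrypted-SNI extensions aim to close this gap.

```python
import socket
import ssl

context = ssl.create_default_context()
with context.wrap_socket(socket.create_connection(("example.com", 443)),
                         server_hostname="example.com") as tls:
    # This hostname was just sent in the clear in the ClientHello's SNI
    # extension, visible to anyone on the network path.
    print(tls.server_hostname)
```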

For example, while there has been work to ensure that the Internet's Domain Name System can't be tampered with (via DNSSEC) or used for surveillance (via DNS over HTTPS, DNS over TLS, and DNSCrypt), the new security protocols aren't yet widely used. Mozilla has been working to change this, implementing Cloudflare's DNS over HTTPS in recent builds of the Firefox Web browser. But that move has drawn protests. In the United Kingdom, the Internet Service Providers Association (ISPA) recently declared Mozilla a "villain of the Internet" for "their proposed approach to introduce DNS-over-HTTPS (DoH) in such a way as to bypass UK filtering obligations and parental controls, undermining internet safety standards in the UK."
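
Mechanically, DoH just wraps a DNS question in an ordinary HTTPS request. Here is a minimal sketch against Cloudflare's public JSON resolver (Google operates a similar endpoint); because the query rides inside TLS, an observer on the path sees only a connection to the resolver, not the name being looked up.

```python
import requests

def doh_lookup(name: str, record_type: str = "A") -> list:
    """Resolve a name over HTTPS via Cloudflare's public DoH JSON API."""
    resp = requests.get(
        "https://cloudflare-dns.com/dns-query",
        params={"name": name, "type": record_type},
        headers={"accept": "application/dns-json"},
        timeout=10,
    )
    resp.raise_for_status()
    return [answer["data"] for answer in resp.json().get("Answer", [])]

print(doh_lookup("example.com"))
```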

Even so, these new protocols don't necessarily make "secure" DNS invisible to surveillance—they may in fact make it easier to tap into DNS requests centrally by way of a FISA warrant or other legal demand, because they are centralized services by nature. So while DoH and other new standards may squelch DNS "spoofing" attacks by a man in the middle, they won't stop government agencies from gaining access through legal process.


Another problematic protocol is the Border Gateway Protocol (BGP), designed to help routers find the path with the fewest hops for network traffic. Weaknesses in BGP still allow traffic to be maliciously routed through routers under an attacker's control, though security extensions that validate route paths and the origins of route advertisements are under development.
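
One such extension is the Resource Public Key Infrastructure (RPKI), which lets networks cryptographically assert which autonomous systems may originate their prefixes. The sketch below checks such an assertion through RIPEstat's public rpki-validation API; the parameters and response fields shown are assumptions based on RIPEstat's documented Data API format and should be verified against its documentation.

```python
import requests

def rpki_origin_status(asn: str, prefix: str) -> str:
    """Ask RIPEstat whether an origin AS is RPKI-authorized for a prefix.

    Endpoint and field names are assumptions to verify against RIPEstat docs.
    """
    resp = requests.get(
        "https://stat.ripe.net/data/rpki-validation/data.json",
        params={"resource": asn, "prefix": prefix},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["data"]["status"]  # "valid", "invalid", or "unknown"

# AS3333 / 193.0.0.0/21 belong to the RIPE NCC and are common test values.
print(rpki_origin_status("AS3333", "193.0.0.0/21"))
```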

There are also issues in the mobile realm. Mobile device use has grown exponentially, putting the power of the Internet into more people's hands nearly everywhere they go—and raising a whole new level of privacy concern in the process. As the US and the wider world have shifted their Internet interactions to mobile devices and applications, they've also begun to expose even more sensitive data, often in unencrypted form.

Despite the efforts of mobile platform developers, many smartphone applications still transmit data unprotected by encryption—data that can be intercepted, aggregated, and used to surveil users. Passive network monitoring tests conducted by Ars found multiple mobile apps sending location data and other information unencrypted. While Apple and Google encourage the use of HTTPS in applications, and Apple at one point set a deadline for all applications to do so, both platforms still allow developers to use insecure HTTP connections to pass data.
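
Reproducing that kind of passive test takes only a few lines. The sketch below uses the scapy packet-capture library (it must run with root privileges) to watch plaintext HTTP traffic for location-like parameters; the lat= and lon= names are hypothetical, since real apps vary.

```python
from scapy.all import sniff, TCP, Raw  # requires root privileges to sniff

def check_packet(pkt):
    """Flag unencrypted HTTP requests that appear to carry location data."""
    if pkt.haslayer(TCP) and pkt.haslayer(Raw) and pkt[TCP].dport == 80:
        payload = bytes(pkt[Raw].load)
        # "lat="/"lon=" are illustrative parameter names, not a standard.
        if b"lat=" in payload or b"lon=" in payload:
            print("Plaintext location data:", payload[:120])

# Capture only port-80 (unencrypted HTTP) traffic; don't buffer packets.
sniff(filter="tcp port 80", prn=check_packet, store=False)
```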

Cloudy with a chance of PRISM

While mass surveillance now yields far less data on those who have fully embraced encryption, law enforcement still has plenty of ways to capture individuals' data. Even as Web and email traffic has become better secured over the past five years, changes in how we use the Internet and digital technology have fundamentally changed the surveillance equation. Facebook, Google, Microsoft, and other "platform" companies have become centralized sources of personal and tracking data, often gathering even more data than the NSA could have in its Snowden-era metadata collection.

Those platforms, especially when connected to mobile devices, gather increasingly granular information about their users' lives, including real-time location data and metadata about whom those users contact. And despite assurances, much of this data has been made available as a product to all sorts of government and private organizations—including for use as a warranted or warrantless surveillance tool. Until very recently, cellular providers themselves were selling location data to private companies. That tracking information is still available to governments.


The PRISM program, one of the most controversial elements of government surveillance unearthed by Snowden, was left unscathed by recent legislation and policy changes. For law enforcement, its equivalent is the All Writs Act. With an increasing volume of Internet usage being channeled through the major platform operators, it has become increasingly easy for intelligence and law enforcement agencies to use legal authority to gain access to the information associated with that activity.

The concentration of Internet usage around services such as Facebook in itself creates privacy concerns over how Facebook and its partners leverage that data. The Cambridge Analytica scandal showed the level of targeting made possible by use of Facebook's data, and the scope of the personal data the company collects was illustrated by its mining of call and SMS records from Android phones through its Messenger application.

The increased use of mobile devices has driven the usage not just of social media and messaging platforms but of cloud storage as well—both by the devices themselves and by the applications that run on them. Even the publicly privacy-minded Apple has, when served with a warrant, provided access to iCloud phone backups and other data from iPhone users.


Sometimes, such exchanges don't even take a warrant, since some mobile application providers fail to secure their own storage. Take, for example, the gay dating application that stored sensitive images of its users, along with EXIF data from the device, in an Amazon Web Services S3 storage "bucket" with no password protection, or the media company application that stored Facebook user data the same way. The same is true of cloud-based applications; in 2017, an RNC contractor left over 200 million voters' data exposed in an S3 bucket.
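
Finding that kind of exposure requires no exploit at all. The sketch below probes a bucket anonymously with boto3, AWS's own Python SDK; the bucket name is hypothetical, and a listing that succeeds without any credentials means anyone on the Internet can read the contents.

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config
from botocore.exceptions import ClientError

# Build an S3 client that sends no credentials at all.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

try:
    # "some-app-backups" is a hypothetical bucket name.
    listing = s3.list_objects_v2(Bucket="some-app-backups", MaxKeys=5)
    for obj in listing.get("Contents", []):
        print("World-readable object:", obj["Key"])
except ClientError as err:
    print("Access denied (as it should be):", err.response["Error"]["Code"])
```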

Knock, knock

The brunt of the FBI's arguments for "golden key" access usually focuses, as Director Wray's arguments in 2017 did, on access to mobile devices themselves. Often, the data law enforcement seeks is in the cloud, but occasionally it is not—as was the case with the iPhone 5c issued by San Bernardino County to Syed Rizwan Farook, the perpetrator of the late-2015 mass shooting in San Bernardino. In that case, the data could not be pulled from iCloud because of a mistake made by the FBI—agents asked a San Bernardino County administrator to reset the password to Farook's iCloud account instead of approaching Apple for access. Once the password was changed, the phone—which would have backed up to the cloud when connected to power—could no longer connect to iCloud without its passcode being entered on the device.

That led to the now-infamous showdown between Apple and the FBI over providing access to the phone with a specially crafted version of the iOS operating system, which the government sought to compel under the All Writs Act. But as it turned out, the FBI was able to hack into that iPhone 5c without Apple's help. And that is largely still the case despite additional security measures in more recent iOS devices—many current Apple devices remain vulnerable to physical-access attacks by forensic tools provided to law enforcement, such as Grayshift's GrayKey.


But intelligence and law enforcement agencies increasingly don't need physical access at all to get at the contents of a target's mobile device. Mobile application exploits have made targeted surveillance and data retrieval far more effective than physical seizure, because a compromised device can become a continuous source of intelligence or evidence.

In Australia, targeted exploitation has been explicitly written into law. Legislation passed in 2018 requires application providers to give the government a way to overcome application security when ordered to do so—for example, by sending a backdoored version of an app to a targeted individual, allowing remote access.

But other countries have turned to third parties (such as DarkMatter, Hacking Team, NSO Group, and FinFisher) to provide exploits and kits, so they don't have to build cybersurveillance capabilities from scratch or shop the gray and black markets for tools to spy on their own citizens. Much of the demand for this sort of capability was, ironically, triggered by the Snowden revelations, which showed what is possible with deep pockets and the right skills.

Broken keys

The demand for a "golden key" for government access to encrypted data, then, isn't so much about necessity as it is about expense and convenience. The problem is that no matter how clever such a skeleton-key system might be, it is exceptionally fragile and bound to be misused, exploited by an adversary, or both. Reform Government Surveillance—a coalition formed by Google, Apple, Microsoft, Dropbox, and other cloud platform operators—issued a statement last May warning about the consequences of such efforts:

"Recent reports have described new proposals to engineer vulnerabilities into devices and services, but they appear to suffer from the same technical and design concerns that security researchers have identified for years," the alliance wrote. "Weakening the security and privacy that encryption helps provide is not the answer."


Those concerns have been well established since the failure of the Clipper chip decades ago. Here is what cryptographer Whitfield Diffie told Congress in 1993 when testifying about Clipper's "key escrow" backdoor, which would have been required for all encrypted phone calls:

The backdoor would put providers in an awkward position with other governments and international customers, weakening the value of their products.

Those who want to hide their conversations from the government for nefarious reasons can get around the backdoor easily.

The only people who would be easy to surveil would be people who didn't care about government surveillance in the first place.

There was no guarantee that someone else wouldn't exploit the backdoor for their own purposes.

As it happened, the Law Enforcement Access Field (LEAF) used by Clipper's key escrow system turned out to be eminently hackable. And the emergence of other, user-configurable tools for encrypted communications rendered Clipper moot in the marketplace—no one ever bought devices based on it. But the idea in itself, as Diffie and others (including Matt Blaze, now the McDevitt Chair of Computer Science and Law at Georgetown University) pointed out, was ludicrous even without the faulty LEAF.
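
To see why, consider a toy LEAF-style scheme, sketched below for illustration only (Clipper's actual design used the classified SKIPJACK cipher and split the escrow key between two agencies): every message carries its own session key wrapped under one escrow key, so a single compromised secret unlocks everything.

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The single escrow keypair the whole system depends on.
escrow_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Encrypt a message with a fresh session key, then attach a LEAF-like
# field: the session key wrapped under the escrow public key.
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(b"hello")
leaf = escrow_key.public_key().encrypt(session_key, oaep)

# Whoever holds the escrow private key (an agency, an insider, a thief)
# recovers every escrowed session key. One secret, total access.
recovered = escrow_key.decrypt(leaf, oaep)
print(Fernet(recovered).decrypt(ciphertext))  # b'hello'
```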

"All key-recovery systems require the existence of a highly sensitive and highly-available secret key or collection of keys that must be maintained in a secure manner over an extended time period," Diffie, Blaze, and their coauthors wrote. "These systems must make decryption information quickly accessible to law enforcement agencies without notice to the key owners. These basic requirements make the problem of general key recovery difficult and expensive, and potentially too insecure and too costly for many applications and many users."


In other words, backdoors are ultimately more costly than the expensive exploits and other techniques used to gain access to encrypted data. And backdoors don't make citizens safer—they do the opposite.
