Ozzie knew that his proposal danced on the third rail of the crypto debate—many before him who had hinted at a technical solution to exceptional access had been greeted with social media pitchforks. So he decided to roll out his proposal quietly, showing Clear to small audiences under an informal nondisclosure agreement. The purpose was to get feedback on his system and, if he was lucky, to jar some people out of the mindset that regarded exceptional access as a crime against science. His first stop, in September 2016, was in Seattle, where he met with his former colleagues at Microsoft. Bill Gates greeted the idea enthusiastically. Another former colleague, Butler Lampson—a winner of the Turing Award, the Nobel Prize of computer science—calls the approach “completely reasonable … The idea that there’s no way to engineer a secure way of access is ridiculous.” (Microsoft had no formal comment.)

Ozzie went on to show Clear to representatives from several of the biggest tech companies—Apple, Google, Facebook—none of whom had any interest whatsoever in voluntarily implementing any sort of exceptional access. Their focus was to serve their customers, and their customers want security. (Or, as Facebook put it in a statement to WIRED: “We have yet to hear of a technical solution to this challenge that would not risk weakening security for all users.”) At one company, Ozzie squared off against a technical person who found the proposal offensive. “I’ve seen this happen to engineers a million times when they get backed into a corner,” Ozzie says. “I told him ‘I’m not saying you should do this. I’m trying to refute the argument that it can’t be done.’ ”

Unsurprisingly, Ozzie got an enthusiastic reception from the law enforcement and intelligence communities. “It’s not just whether his scheme is workable,” says Rich Littlehale, a special agent in the Tennessee Bureau of Investigation. “It’s the fact that someone with his experience and understanding is presenting it.” In an informal meeting with NSA employees at its Maryland headquarters, Ozzie was startled to hear that the agency had come up with something almost identical at some point. They’d even given it a codename.

During the course of his meetings, Ozzie learned he was not alone in grappling with this issue. The names of three other scientists working on exceptional access popped up—Ernie Brickell, Stefan Savage, and Robert Thibadeau—and he thought it might be a good idea if they all met in private. Last August the four scientists gathered in Meg Whitman’s boardroom at Hewlett Packard Enterprise in Palo Alto. (Ozzie is a board member, and she let him borrow the space.) Though Thibadeau’s work took a different course, Ozzie found that the other two were pursuing solutions similar to his. What’s more, Savage has bona fides to rival Ozzie’s. He’s a world-renowned expert on security research, and he and Ozzie share the same motivations. “We say we are scientists, and we let the data take us where they will, but not on this issue,” Savage says. “People I very much respect are saying this can’t be done. That’s not why I got into this business.”

Ozzie’s efforts come as the government is getting increasingly desperate to gain access to encrypted information. In a speech earlier this year, FBI director Christopher Wray said the agency was locked out of 7,775 devices in 2017. He declared the situation intolerable. “I reject this notion that there could be such a place that no matter what kind of lawful authority you have, it’s utterly beyond reach to protect innocent citizens,” he said.

Deputy attorney general Rod Rosenstein, in a speech at the Naval Academy late last year, was even more strident. “Warrant-proof encryption defeats the constitutional balance by elevating privacy above public safety,” he said. What’s needed, he said, is “responsible encryption … secure encryption that allows access only with judicial authorization.”

A Brief History of the Crypto Wars

1976: Scientists introduce public key cryptography, in which private and public complementary keys are used to encrypt and unlock data.

1982: RSA becomes one of the first companies to market encryption to the business and consumer world.

1989: Lotus Notes becomes the first software to obtain a license to export strong encryption overseas.

1993: The Clinton administration announces a plan to use the so-called Clipper Chip.

1994: A computer scientist finds a critical vulnerability in the Clipper Chip. The US abandons the program within two years.

1999: The Clinton administration removes nearly all restrictions on the export of encryption products.

2013: Former NSA contractor Edward Snowden reveals classified information about government surveillance programs.

2014: Apple introduces default encryption in iOS 8.

2016: After a mass shooting in California, the Feds obtain a court order compelling Apple to help access the contents of a shooter’s phone.

Since Apple, Google, Facebook, and the rest don’t see much upside in changing their systems, only a legislative demand could grant law enforcement exceptional access. But there doesn’t seem to be much appetite in Congress to require tech companies to tailor their software to serve the needs of law enforcement agencies. That might change in the wake of some major incident, especially if it were discovered that advance notice might have been gleaned from an encrypted mobile device.

As an alternative to exceptional access, cryptographers and civil libertarians have begun promoting an approach known as lawful hacking. It turns out that there is a growing industry of private contractors who are skilled in identifying flaws in the systems that lock up information. In the San Bernardino case, the FBI paid a reported $900,000 to an unnamed contractor to help it access the data on Farook’s iPhone. Many suspected that the mysterious contractor was an Israeli company called Cellebrite, which has a thriving business in extracting data from iPhones for law enforcement agencies. (Cellebrite has refused to confirm or deny its involvement in the case, and its representatives declined to comment for this story.) A report by the EastWest Institute, a think tank, concluded that lawful hacking is the only workable alternative to exceptional access.

But is it ethical? It seems odd to have security specialists promoting a system that depends on a reliable stream of vulnerabilities for hired hackers to exploit. Think about it: Apple can’t access its customers’ data—but some random company in Israel can fetch it for its paying customers? And with even the NSA unable to protect its own hacking tools, isn’t it inevitable that the break-in secrets of these private companies will eventually fall into the hands of criminals and other bad actors? There is also a danger that forces within the big tech companies could enrich themselves through lawful hacking. As one law enforcement official pointed out to me, lawful hacking creates a marketplace for so-called zero-day flaws: vulnerabilities discovered by outsiders that the manufacturers don’t know about and that can therefore be exploited by lawful and criminal attackers alike. So we shouldn’t be surprised if malefactors inside tech companies create and bury these trapdoors in products, with hopes of selling them later to the “lawful hackers.”