It’s up to tech firms and privacy advocates to show “exactly, specifically, and technically” how the government’s proposed backdoors into encrypted cellphones and other products would hurt data security, current and former law enforcement officials told lawmakers last week. At least one member of the Senate Armed Services Committee appears to agree, and that’s going to propel the long-running debate into some strange new territory.

First, a bit of background. In September 2014, Apple announced that its iOS 8 operating system would encrypt phone data so that the company itself could not read it. “Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data,” the company wrote on its website. “So it’s not technically feasible for us to respond to government warrants.” Shortly afterward, Google announced a similar update for its Android 5.0 Lollipop operating system.

At the July 14 hearing, Manhattan District Attorney Cyrus Vance, who says his office holds more than 200 iPhones in evidence that it cannot unlock, told lawmakers that law enforcement had an “urgent” need for legislation compelling technology companies to build backdoors into their encryption features.

“Each of our cases in state court have a statute of limitations,” Vance said.

Kenneth L. Wainstein, a former assistant attorney general for national security at the Department of Justice, told lawmakers that the burden is on technology companies and privacy advocates to show how backdoors would harm user security, rather than on law enforcement to prove that altering the encryption scheme would be safe.

“For the tech industry and civil liberties groups, this means laying out technically specific support for the contention that a government accommodation would undermine the integrity of default encryption. They should provide hard data that demonstrates exactly how—and how much—each possible type of accommodation would impact their encryption systems. It is only when Congress receives that data that it can knowledgeably perform its deliberative function and balance the potential cybersecurity dangers posed by a government accommodation against the national security and law enforcement benefits of having such an accommodation in place,” he said.

“There have been arguments raised as to why this [meaning backdoors or legal accommodations] might end up unduly compromising encryption, which really is an important thing for society. But the only way that you’re going to be able to do your job and balance the need for an accommodation against the impact it might [have] on encryption is for them to show exactly, specifically, and technically, how this damage would come about. … We haven’t heard that yet and until we hear that, you can’t do your job and come up with a solution,” he said.

Vance was quick to agree. “It has been one of our frustrations that there has not been an ability or the willingness to quantify the increased loss of security,” he said.

Bruce Schneier is one of the 15 luminaries of the data-security and encryption world who a year ago published “Keys Under Doormats,” a 32-page argument against backdoors. In a recent email to Defense One, he said that it would be difficult to answer a demand for precisely calculated risks.

“What we have are some rough metrics,” Schneier wrote. “Every additional 1,000 lines of production code is thought to add between 0.5 and 3 code defects, depending on who writes it. This helps us understand the flaws in making a system more complex, by adding new features. But exceptional access is even worse, because by definition it involves not just accidentally adding weaknesses in encryption code, but deliberately engineering them [emphasis Schneier’s]. The goal is to build a weakness that can only be exploited by the courts (in some cases, every single court in America) but that can’t be exploited by the [Russian Federal Security Service] or organized crime or any of our other foreign adversaries.”
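To make Schneier’s rough metric concrete, the sketch below applies the quoted rate of 0.5 to 3 defects per additional 1,000 lines of production code. The 50,000-line feature size is hypothetical, chosen purely for illustration; it is not a figure from the article.

```python
def expected_defect_range(added_lines, low=0.5, high=3.0):
    """Estimate (min, max) new defects from added production code.

    Uses the rough rate Schneier cites: 0.5 to 3 defects per
    additional 1,000 lines, depending on who writes the code.
    """
    per_kloc = added_lines / 1000
    return per_kloc * low, per_kloc * high

# Hypothetical: an exceptional-access feature adds 50,000 new lines.
lo, hi = expected_defect_range(50_000)
print(f"{lo:.0f} to {hi:.0f} expected new defects")  # 25 to 150
```

Even this back-of-the-envelope arithmetic only captures accidental flaws from added complexity; as Schneier notes, exceptional access additionally requires engineering a weakness on purpose, which these defect-rate metrics do not measure.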

Making such a system work would be astonishingly difficult, he said. “I’m not aware of anyone ever deploying something like this at the kind of scale proponents want it deployed at. In fact, that’s the primary reason that it's so difficult to quantify the risk: because nobody has ever tried anything this risky before.”

Matthew Green, a computer scientist and cryptographer at Johns Hopkins University, said that Wainstein’s assertion that it was the technology community’s job to prove that backdoors were harmful, rather than law enforcement’s job to prove that they were safe, reflected a faulty understanding of technology. “The problem is that this is not how computer security works,” Green said.

“You can't tell how secure something is until it’s been reliably attacked by people with the resources and expertise that you expect to attack your protocol in real life. In the case of exceptional access systems, that means attackers with nation-state level resources, like foreign intelligence agencies. No tech company has the resources or the time to perform a penetration test to those standards,” he said.

But at least one senator is open to the idea. Sen. John McCain, R-Ariz., the chair of the Senate Armed Services Committee, promised “more hearings” on the issue as the committee prepares to draft legislation or, possibly, organize a commission. Members of the technology community would be compelled to testify.

“Even if they don’t want to come here. This committee has subpoena power,” McCain said.