Australia's parliament passed controversial legislation on Thursday that will allow the country's intelligence and law enforcement agencies to demand access to end-to-end encrypted digital communications. This means that Australian authorities will be able to compel tech companies like Facebook and Apple to make backdoors in their secure messaging platforms, including WhatsApp and iMessage. Cryptographers and privacy advocates—who have long been staunch opponents of encryption backdoors on public safety and human rights grounds—warn that the legislation poses serious risks, and will have real consequences that reverberate far beyond the land down under.

For months, the bill has faced criticism that it is overly broad, vaguely worded, and potentially dangerous. The tech industry, after all, is global; if Australia compels a company to weaken its product security for law enforcement, that backdoor will exist universally, vulnerable to exploitation by criminals and governments far beyond Australia. Additionally, if a company makes an access tool for Australian law enforcement, other countries will inevitably demand the same capability.



The new law also allows officials to approach specific individuals—such as key employees within a company—with these demands, rather than the institution itself. In practice, they can force the engineer or IT administrator in charge of vetting and pushing out a product's updates to undermine its security. In some situations, the government could even compel the individual or a small group of people to carry this out in secret. Under the Australian law, companies that fail or refuse to comply with these orders will face fines up to about $7.3 million. Individuals who resist could face prison time.

Australian lawmakers nonetheless lauded the bill, saying it will provide crucial capabilities for organized crime and counterterrorism investigations. Even the bill's opponents within parliament, who had initially called for significant amendments to the draft, eventually relented on Thursday.

“We will pass the legislation, inadequate as it is, so we can give our security agencies some of the tools they say they need,” Bill Shorten, the opposition Labor party leader, told reporters.

Global Impact

Though Australia will become the testing ground, technologists and privacy advocates warn that the law will swiftly impact global policy. All of Australia's intelligence allies—the United States, the United Kingdom, Canada, and New Zealand, known collectively as the Five Eyes—have spent decades lobbying for these mechanisms.

"The debate about simplifying lawful access to encrypted communication carries a considerable risk of regulations spilling to other countries," says Lukasz Olejnik, a security and privacy researcher and member of the W3C Technical Architecture Group. "Once the capabilities exist, there will be many parties interested in similar access. It would spread."

Just last week, US deputy attorney general Rod Rosenstein advocated what he called "responsible encryption" at a Washington, DC symposium. And the UK already passed the Investigatory Powers Act at the end of 2016—often called the Snoopers' Charter—which attempts to set up a framework for compelling companies to give investigators access to users' encrypted communications. So far, the UK law has been dogged by judicial challenges, and it doesn't allow government requests to be made of individuals, as Australia's law will. But efforts to develop a legal framework for such surveillance requests continue to proliferate.

Privacy advocates note that the Five Eyes have increasingly used euphemisms like "responsible encryption," implying some sort of balance. For example, Australia's new law has a section called "Limitations," which says, "Designated communications provider must not be requested or required to implement or build a systemic weakness or systemic vulnerability."


Which sounds promising in theory. But the definition reveals some doublespeak. "Systemic vulnerability means a vulnerability that affects a whole class of technology, but does not include a vulnerability that is selectively introduced to one or more target technologies that are connected with a particular person," the Australian law says. In other words, intentionally weakening every messaging platform out there with the same backdoor wouldn't fly, but developing tailored access to individual messaging programs, like WhatsApp or iMessage, is allowed.

Increasingly, intelligence and law enforcement seem to want tech companies to be able to silently loop government officials into a suspect's encrypted communications. For example, an iMessage conversation that you think is just between you and your friend might actually be a group chat that includes an investigator who was invisibly added. The messages would all still be end-to-end encrypted, just among the three of you instead of the two of you.
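The "ghost user" scenario can be illustrated with a toy sketch. This is not real cryptography and not any vendor's actual protocol: the names, the `Member`/`GroupChat` classes, and the XOR "encryption" standing in for per-recipient public-key encryption are all invented for illustration. The point is structural: if clients encrypt to whatever member list the server supplies, a hidden extra member reads everything while the visible roster still shows a two-person chat.

```python
# Toy model of the "ghost user" idea described above.
# XOR with a per-member key stands in for real per-recipient
# public-key encryption; nothing here is secure.
from dataclasses import dataclass, field

@dataclass
class Member:
    name: str
    key: int              # stand-in for a real public key
    visible: bool = True  # ghost members are hidden from the UI roster

@dataclass
class GroupChat:
    members: list = field(default_factory=list)

    def roster(self):
        # What the participants' apps display.
        return [m.name for m in self.members if m.visible]

    def send(self, text):
        # "Encrypt" once per member the server lists, visible or not.
        return {m.name: bytes(b ^ (m.key & 0xFF) for b in text.encode())
                for m in self.members}

    def read(self, name, ciphertexts):
        m = next(m for m in self.members if m.name == name)
        return bytes(b ^ (m.key & 0xFF) for b in ciphertexts[name]).decode()

chat = GroupChat([Member("alice", 0x21), Member("bob", 0x42)])
# The service silently adds an investigator as a hidden third member.
chat.members.append(Member("ghost", 0x7F, visible=False))

msg = chat.send("meet at noon")
print(chat.roster())          # ['alice', 'bob'] -- still looks one-to-one
print(chat.read("bob", msg))  # 'meet at noon'
print(chat.read("ghost", msg))  # 'meet at noon' -- the ghost reads it too
```

The sketch also shows why critics call this a backdoor: the encryption itself is untouched, but the trust in the membership list, which the provider controls, is what's being exploited.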

Cryptographers and privacy advocates are quick to note, though, that as with any such mechanism, criminals and other adversaries would figure out how to exploit it as well, creating an even larger public safety issue—and potentially endangering the operations of the entity that requested the workaround in the first place.

"They say, ‘we agree that we’re not going to put in backdoors or undermine encryption, but we do reserve the right to compel companies to assist us in getting all the data,'" says Danny O'Brien, international director of the Electronic Frontier Foundation. "And everyone in the technical community is somewhat confused by this, because there really isn’t a great deal of space between compelling people to give up plaintext and creating a backdoor. That’s just the definition of a backdoor."

Cryptographers have spent decades articulating a fundamental objection to backdoors, including in the seminal 2015 paper "Keys Under Doormats". But the recent rise in legislation like Australia's has prompted a fresh wave of rebuttals. For example, IEEE, the international professional engineering association, said unequivocally in a June position statement that, "Exceptional access mechanisms would create risks...Efforts to constrain strong encryption or introduce key escrow schemes into consumer products can have long-term negative effects on the privacy, security and civil liberties of the citizens so regulated."

Privacy advocates say that Australia's new law has other problems, too, especially in its vagueness about when and how often investigators can make data requests. This could lead to overreach, they say, especially since the law also restricts what companies can disclose about the number of requests they've received in some situations.

"One country's demands of a global provider or a global device maker can impact their operations on a global scale," says Greg Nojeim, director of the Freedom, Security and Technology Project at the Center for Democracy & Technology. "And there is a risk that other countries will enact similar legislation to compel companies to build in backdoors into encryption. The Australian legislation is particularly broad and vague, and would serve as an extremely poor model."

The Other Shoe

For people on both sides of the debate, the question now is how laws like Australia's will function in practice, and whether tech companies will comply with encryption-weakening orders or resist. For its part, Apple wrote statements objecting to both the UK's Investigatory Powers Act and Australia's new legislation before they were passed. And the company went to the mat over the issue in the US as well, when it refused to build a tool to help the FBI access one of the San Bernardino shooters' iPhones in 2016.

It is not clear that companies will be able to effectively resist as more laws emerge, though, particularly if Australia has success targeting individuals. Australian Parliament will consider amendments to the law next year, but privacy advocates and technologists say the situation so far is worrying. "It’s just shocking to see this happen in Australia," EFF's O'Brien says. "The other shoe is dropping."

Fines, and especially prison time, are already draconian punishments for failing or refusing to break the security of a digital product. But the even deeper danger of Australia's new law, and the broader movement to enact backdoor-friendly legislation, is the logical extreme in which countries simply block access to technology that offers robust privacy and security protections to users. Authoritarian states like China, Russia, and Iran already do this. Now the Five Eyes are closer to it than ever.
