The government already has the power to force technology firms to comply with its demands over end-to-end encryption, but is avoiding using existing legislation because doing so would draw it into a battle it would eventually lose, security experts have said.

The Investigatory Powers Act, made law in late 2016, allows the government to compel communications providers to remove “electronic protection applied … to any communications or data”.

On Sunday the home secretary, Amber Rudd, called on “organisations like WhatsApp”, which is owned by Facebook, to make sure that they “don’t provide a secret place for terrorists to communicate with each other”. Rudd hinted at new legislation if they did not cooperate, despite the existing legislation already allowing the government to force such cooperation.

Alec Muffett, a technical adviser and board member for the Open Rights Group, said that using the existing legislation would lead the government into an argument it would lose, “though they may buy some time forcing people to pay lip-service to it”.

“Eventually they will lose the battle because they will never (for instance) coerce the global open-source community to comply,” Muffett said. “Government time and money would be better spent elsewhere – pursuing criminals through ‘human’ means and by building upon metadata – than in attempting to combat ‘secure communication across the internet’ as an abstract entity.”

Muffett, who previously worked at Facebook and was the lead engineer for adding end-to-end encryption to Facebook Messenger, added that actually attempting to enforce the law as it stands would require “a massively illiberal and misconceived business case … to be thrust upon Facebook/WhatsApp in order to force it to undermine its own security technologies”.

“It would be an ugly battle, and (win or lose) it would be self-defeating,” Muffett said. “People would flee a less secure, less competitive Facebook and move to other platforms – ones with less cordial government relationships, or with no corporate presence at all.”

Antony Walker, the deputy CEO of techUK, added that the existing law already gives the UK a strong range of powers “that enable the security services to do their job”. He said: “This legislation was put in place following an extensive and rigorous process of parliamentary scrutiny focused on ensuring the checks necessary to keep a democratic society secure.

“End-to-end encryption is the best defence we have available to keep the data and services we all rely on safe from misuse. From storing data on the cloud to online banking to identity verification, end-to-end encryption is essential for preventing data being accessed illegally in ways that can harm consumers, business and our national security.”

Tony Anscombe, senior security evangelist at information security firm Avast, said that any attempt to actually use the powers would be bound to introduce major security vulnerabilities. “Banning encryption in order to get to the communications of a select few opens the door to the communications of many, and renders us all less secure and our lives less private,” he said.

“If you build a backdoor, it’s there for everybody to access. And if you store that data you collect, even in encrypted form, how secure is it? All these data breaches we hear about show our privacy is regularly being breached by hackers, so the action suggested by the home secretary would only open us all up to further invasions of privacy.”

In the initial draft of the investigatory powers bill, the only limits on the government’s power to force the removal of electronic protection were a requirement that it consult an advisory board beforehand, and that any specific obligation be “reasonable” and “practicable”. Such a technical capability notice can even be issued to people outside the UK, and require them to do, or not to do, things outside the UK.

After technology firms warned that the law could end electronic privacy in Britain, the government made a small concession, promising that no company would be compelled to remove encryption from its own services if it was not technically feasible. It did not, however, provide a definition of technical feasibility.