This week senior government figures from the ‘Five Eyes’ nations - the US, UK, Canada, Australia and New Zealand - met in Ottawa to consider further measures on counter-terrorism cooperation. Top of the substantive agenda was what, if anything, to do about encrypted communications.

This topic resurfaced with the most recent UK attacks, as it became clear that terrorists are embracing freely available methods of communication (such as WhatsApp, Signal and Facebook Messenger) to frustrate interception, surveillance and pre-emptive arrest under post-9/11 anti-terror laws.

This is a big deal that taps into an age-old problem. Technology is always subject to abuse, and in the modern age law enforcement and national security agencies have struggled with internet technologies that empower the bad guys along with everyone else.

Coding and concealing messages is almost as old as communication itself. The Ancient Greek and Roman use of ciphers, the coded hieroglyphs of the Egyptians and the encrypted tablet messages of the Mesopotamians all show a refined appreciation of the value of military or commercial secrets. The famous Enigma machine and its decoding by British Intelligence is now the stuff of Hollywood, but was of pivotal military significance in its day.

The later development of electronic scrambling of information, culminating in public key encryption and encryption standards in the 1970s, combined with the democratising power of the internet to level the field. Powers and functions that previously stood behind high barriers to entry became available to all.

An aspect that may have exercised minds in Ottawa is crypto’s cousin, steganography: hiding messages inside innocuous objects or forms so that they avoid detection even if intercepted. This is not encryption per se, but its utility to terrorists is clear.

Again, it’s ancient. Herodotus, in The Histories, refers to three fascinating examples, all instrumental in warfare. The first: a message tattooed on a slave’s shaved head; once his hair had regrown, he was sent across enemy lines and re-shaved on arrival to reveal the instruction. The second: a message stitched into the belly of a hare. The third: an inscription on a wooden tablet that was then coated with wax.

The modern-day equivalents include concealing a text file in music or image content using freely available software. This means you don’t even need to encrypt a message to put it beyond the reach of prying eyes. If the message is a terrorist call to arms or a signal to initiate an attack, then clearly we have a problem.
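To illustrate just how simple the underlying idea is, here is a minimal sketch of least-significant-bit (LSB) steganography in Python. It operates on a raw byte buffer standing in for image or audio sample data, and the function names (`hide`, `reveal`) are illustrative inventions, not the interface of any particular tool:

```python
def hide(carrier: bytearray, message: bytes) -> bytearray:
    """Hide a message in the least significant bits of the carrier bytes.
    Each carrier byte stores one bit; the payload is length-prefixed."""
    payload = len(message).to_bytes(4, "big") + message
    # Flatten the payload into individual bits, most significant first.
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for message")
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        # Clear the lowest bit and replace it with a payload bit;
        # each carrier byte changes by at most 1.
        out[i] = (out[i] & 0xFE) | bit
    return out

def reveal(carrier: bytes) -> bytes:
    """Recover a message hidden by hide()."""
    def read_bytes(offset: int, n: int) -> bytes:
        value = 0
        for i in range(n * 8):
            value = (value << 1) | (carrier[offset + i] & 1)
        return value.to_bytes(n, "big")
    length = int.from_bytes(read_bytes(0, 4), "big")
    return read_bytes(32, length)
```

Because each carrier byte shifts by at most one unit, the change is imperceptible in an image or audio file, which is precisely why detection is so hard.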

Why encrypt?

The current debate around providing agencies backdoors to encryption protocols and products is not new.

It first arose in 1993, when the Clinton administration proposed the ill-fated ‘clipper chip’ as a way to unlock coded illicit messaging. Efforts to treat encryption algorithms as munitions failed, amid outcry that they would stifle legitimate commercial development of the internet. The same arguments apply today, except that encryption algorithms now underpin most of our commerce and much of our communications.

What has changed however, is that since about 2005, organised crime and nation-state sponsored IP theft has entered the scene. Stealing credentials and the value associated with them, including valuable corporate intellectual property, is now an industry with secondary markets, and a state sponsored activity where inflicting economic harm is used as a geopolitical weapon. Encryption is a check against these forces.

Cyber criminals have increasingly come to rely on flaws in operating systems and software to steal information and credentials. Were governments to mandate backdoors in software and devices, there is no guarantee that these weaknesses would not also be exploited by the bad guys.

Why break encryption?

Yet breaking encryption may be what is needed to prevent terrorist attacks where prior intelligence indicates an attack is likely and law enforcement agencies need to know more.

Here in Australia, the data retention laws, finally enacted last year, were the product of over 15 years of government pressure on industry to hand over transactional information - but not the contents - of mobile, internet and fixed telephony communications. The contents enjoy judicial protection and are therefore only available under warrant.

As head of the internet industry’s national representative body, I was first approached in 1998 to facilitate a request from law enforcement, which wanted the industry to voluntarily agree to protocols for metadata retention and provision. The agencies said they were stymied by the standard practice of ISPs and telcos of overwriting data that had little value beyond short-term billing functions.

Our initial position was reflexively pro-privacy. We were not convinced of the efficacy of the proposal, yet we were prepared to entertain alternatives that could be agreed between industry and agencies if only to retain a hand in the final outcome.

We prepared a draft code of practice on cybercrime, but were unable to sell it to our member telcos, who at the time argued they would take on these obligations only if they were legislated. We failed to convince them that legislation could be far worse than self-regulation. After successive attempts to get industry to play ball, a stalemate was reached, principally over costs, as well as the lack of definition of data sets.

The watershed was the 2014 Lindt siege, when the ground shifted and the moral high ground opposing regulation became untenable. We may be about to witness this all over again, this time with encryption.

Last year Apple very publicly refused to hand over access to a locked iPhone, citing the same pro-encryption arguments outlined above. Law enforcement officials subsequently unlocked the phone using third-party technology. While Apple’s position was lauded by other companies in the tech sector, each successive terrorist attack facilitated by some form of technology makes the moral high ground around privacy harder to occupy.

The dilemma

The tensions between privacy and security are not difficult to understand, but are very hard to balance. Essentially, it’s the terrorists on one side, the cybercriminals on the other, and governments and the rest of us in the middle.

Weakening encryption - by requiring encryption providers to afford governments backdoors - would open the way to criminality. Governments themselves rely on encryption; they use the very technologies they seek to regulate in their own communications. The Turnbull Cabinet reputedly relies on WhatsApp or Signal to secure its own end-to-end communications. But once you weaken encryption for one purpose, you weaken it for all.

The communique issued at the end of the Five Eyes meeting is carefully nuanced on the subject of encryption. It says:

Ministers and Attorneys General also noted that encryption can severely undermine public safety efforts by impeding lawful access to the content of communications during investigations into serious crimes, including terrorism. To address these issues, we committed to develop our engagement with communications and technology companies to explore shared solutions while upholding cybersecurity and individual rights and freedoms.

It comes as no surprise that no definitive position was reached, given the complexity of the issues involved. Not least of these is that the means of encryption and communication are now almost exclusively in the private sector’s domain. The statement implicitly recognises that governments cannot resolve this dilemma without the cooperation of technology and communications providers.

The commitment to ‘develop our engagement with’ technology companies should be read as a concession that the power of governments to compel outcomes may be nearing its expiry. In that sense, it’s a tacit acknowledgement that ubiquitous technology platforms are becoming more powerful than governments. The encryption story is thus a subset of a much bigger story, one that not even Herodotus could have predicted.