About once or twice a year someone in the INFOSEC community exclaims loudly that they’re finished with PGP, that OpenPGP is obsolete and, along with SMTP, should roll over and die to make way for the brave new world of modern apps. I do not agree with those who do that, especially if they are privacy advocates. Relying on apps in a hostile surveillance state might get you in trouble, or worse.

The threat model

Unlike most of the people who write about the shortcomings of OpenPGP, I was raised in a country where the government wasn’t your friend, especially if you wanted to exercise what Americans call the freedoms of speech and religion. It wasn’t a full-blown North Korea-type dictatorship, but a police state where the people subconsciously knew that the Party was listening all the time, and this was enough to keep them in line, as it usually is. As one of the most cynical officials once said, we had freedom of thought without the freedom of expression for that thought. Some people I knew experienced that personally — they went to jail for hacking into the TV signal to express their forbidden thoughts. It was no joke.

Why am I telling you this? Because I’ve seen and experienced what an oppressive political system is really like (it is no „Hunger Games”), and this is what I’ll base my threat model on for the rest of this article. And you should too, because if America’s current hysterical nightmares come true, the result will look like my childhood.

Signal vs PGP, the false dichotomy

From my point of view, Signal has two major weaknesses, security-wise. You could call them vulnerabilities, but they’re different from your everyday buffer overflow, and (luckily) can’t be exploited via code. And you (as a citizen, or an INFOSEC geek) can do jack shit about them.

The first vulnerability is that Signal as a communication medium has a single point of failure, called Moxie Marlinspike. If the government wanted to get rid of Signal, it would be enough to put Moxie in jail [1] along with the rest of the Whisper Systems folks, and after some time (as the hosting for the central Signal servers ran out) there would be no Signal to worry about. But the US government has no reason to, because of the second vulnerability: Signal requires a mobile device to operate.

And mobile devices as we have them now are platforms that run on a business model of surveillance, where every tiniest app wants access to the device’s location (!), contact and call list (!!), and much more, and every secret policeman pisses his pants with joy at the thought of having that kind of metadata on any person of interest. And it is warrant-free! Even if the suspect is not using apps with ads in them, Google and Facebook know most of it, and the rest is logged on the mobile carriers’ infrastructure, which always knows who and where you are [2], who you are talking to, what pages you are browsing, and that you have something to hide (inferred from your use of Signal). I think the reason the US government has not tried to meddle with Signal, unlike with encrypted email services (remember what happened to Ladar Levison?), is that the overall metadata footprint of Signal makes interfering with the app itself unnecessary. For a mobile device, the overall metadata footprint should be called the surveillance footprint, to make it more obvious what it really is.

I do not disagree with Matthew D. Green’s notion that [PGP’s] lack of forward secrecy kills or may kill someone. I am just saying that the massive metadata trail of a mobile device will kill you, or at least maim you severely, much earlier than the lack of forward secrecy in its crypto protocols will. As General Hayden said, metadata is enough for the US government to kill people. And to quote Kurt Opsahl:

Why Metadata Matters

* They know you rang a phone sex service at 2:24 am and spoke for 18 minutes. But they don’t know what you talked about.
* They know you called the suicide prevention hotline from the Golden Gate Bridge. But the topic of the call remains a secret.
* They know you spoke with an HIV testing service, then your doctor, then your health insurance company in the same hour. But they don’t know what was discussed.

There’s a word in my language, relevant to these issues, that I can’t find a direct English translation for: pokątnie. Google Translate renders it as „secret”, which is one meaning of the word, but not the main one. Literally it is an adverb meaning „in the corner of the room”, and it metaphorically denotes that something happens in the out-of-view corner of the room, away from public view. It can mean „secret” but also „underhanded”, „shady”, „backroom” (as in backroom deal), „off-handed” or even „low-profile”. It implies a certain grey area but not necessarily illegality. It also implies a personal context and lacks the officialness that would make it the equivalent of „clandestine” (a word that has no direct translation into my language). Harry Potter’s Diagon Alley was called „ulica Pokątna” in the Polish translation.

The point is: you can pokątnie use PGP over email or any other transfer protocol, but you can’t pokątnie use Signal. When using Signal you need to be straightforward. You need to put your contact list on the table. You need to give your Signal contacts your phone number, which is tied to a device you more or less carry on your person. On the other hand, in times of trouble you can put your PGP key on a Yubikey, bind it together with your father’s golden watch, and put it in your body cavity for safekeeping; you can’t do anything like that with your Signal identity and messages, because while the SIM card is small enough, the encryption keys are stored on the device, and the device is not meant to be durable. And the whole package is definitely too big and fragile for some methods of concealment.

I’m not saying that PGP is in any way better than Signal, or that the design of Signal is wrong. Signal is a great app and I use it every day. But its threat model covers only quite friendly threats — not the situation where I do not want to disclose my phone number to my correspondent, or my location and identity as a Signal user to a nation-state adversary. That friendly threat model does not fit most of the activists and whistleblowers who are often touted as the people who need cryptographically secured privacy and secrecy. Recent events have also shown that hostile network operators can block access to the Signal servers, making the app and its perfect cryptography unusable unless a big player steps up, but in a hostile political situation there may be no big players willing to support the cause.

I’ve seen PGP used to transmit sensitive information in a hostile environment, and that was a win for the good guys. Signal could not have been used in those situations, because at least one of the parties would not reveal their phone number for the others to investigate.

Signal is crypto for the masses, with the underlying assumption that the masses are not breaking the local law. I do not agree with that approach: the definition of criminality is too vague even in stable Western-style democracies [3].

It is easy to hate PGP. It is old, and the wear shows through its once-future-proof design decisions. Its key management is a mess — I can name most of the use cases that turned it into that mess — but it is still a pile of ugly. PGP is a child born a little prematurely, and its deformity is of an idealistic nature: it is a generic public-key crypto tool forced into use as an encryption layer for messaging. But its general aim was to put public-key cryptography into the hands of the masses, and it succeeded at that task. The privacy it offered was pretty good but not perfect, and it says so right there on the packaging.

The State of PGP

The beginning of 2017 is a good moment to have a look at what we’ve got.

The good stuff

Widespread software support: plugins for major email clients, lots of implementations and libraries; even Facebook profiles have a slot for a PGP key.

Impressive hardware support on all major platforms: smart cards, USB tokens, even NFC tokens for use with mobile apps without exposing the key to the device itself.

No infrastructure single point of failure.

There’s a social network based on it! [4][6]

Time-tested design.

The bad stuff

Many arguments exposing OpenPGP flaws are really arguments about the default settings of the tools. An OpenPGP key need not disclose a user name or email; that disclosure is enforced by the tools’ key-generation processes, and those need to change.
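As an illustration: nothing in GnuPG itself (2.1 or later) requires a name or an address — only the interactive wizard insists on them. A minimal sketch, with „nightowl” as a made-up pseudonym and a throwaway keyring:

```shell
# Generate a key whose user ID is a bare pseudonym: no real name, no email.
# Unprotected key and temporary keyring for demonstration only.
export GNUPGHOME="$(mktemp -d)"
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'nightowl' default default never
gpg --list-keys nightowl
```

The resulting key carries nothing but the pseudonym, which is all a correspondent needs to encrypt to you.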

The same goes for exposing the recipient’s key ID in the encrypted packet. This can easily be turned off, but exposing it simplifies the UI.
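GnuPG already exposes this as an option: `--hidden-recipient` (or `--throw-keyids` for all recipients) zeroes out the key ID in the encrypted session-key packet, at the cost of the recipient having to try each of their secret keys on decryption. A sketch, generating a throwaway key for a hypothetical „alice” first:

```shell
# Encrypt without embedding the recipient's key ID in the output packets.
export GNUPGHOME="$(mktemp -d)"   # throwaway keyring for the demo
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'alice' default default never
echo 'meet at the usual corner' > message.txt
gpg --encrypt --throw-keyids --recipient alice message.txt
# The pubkey-enc packet now carries an all-zero key ID:
gpg --batch --list-packets message.txt.gpg | grep 'pubkey enc'
```

A passive observer of the ciphertext then learns that something was encrypted, but not to whom.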

The Web of Trust is one of those ideas that look great on paper but don’t work in practice. Tutorials and docs preaching it should be discarded; the SSH-like trust-on-first-use model introduced in recent GnuPG versions should be used instead.

Keyservers: it was great while it lasted, but now there’s a malicious actor duplicating short key IDs in the wild, and the keyservers are useless. The protocol has provisions for specifying a master URL for every key, and we should put them to use.
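In the meantime, a partial mitigation is to refuse to deal in short IDs at all: the colliding IDs are 32-bit, while full fingerprints are not practically forgeable. A gpg.conf fragment (option names as in GnuPG 2.1):

```
# ~/.gnupg/gpg.conf
keyid-format 0xlong   # never display the collision-prone 32-bit short IDs
with-fingerprint      # show full fingerprints in key listings
```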

GNU Privacy Guard is the reference implementation, and its need to implement every nook and cranny of the specification makes it unusable for anyone who doesn’t know the spec by heart. I know the difference between the sign, lsign and nrsign commands, but since I stopped following the project closely some time ago, I have no idea what tsign does or what it is good for. This needs simplifying. A tool implementing the basic messaging-related operations while skipping the more esoteric stuff would be a great solution.

There are some protocol flaws and features stemming from esoteric use cases that should be eliminated. I have used PGP since Phil Zimmermann’s version 2.3, and I have no idea what key annotations are for. Or are they key-ID annotations? Whatever.

Since the public wants an app, an app is needed. PGP tools have always lacked in the UI department. Probably the best environment for PGP use is Linux + mutt + GnuPG, but that is not usable for ordinary citizens, so a messaging solution as easy as Signal but based on PGP is needed. For hackers, Ben Nagy’s setup is quite good [5].

Some e-mail providers are using OpenPGP to secure their customers’ messages. What is needed is a federation protocol that lets me discover and fetch a HushMail or ProtonMail user’s key before sending a message there.
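Recent GnuPG versions contain an early attempt at exactly this: the Web Key Directory, where a provider serves its users’ keys over HTTPS at a well-known URL derived from the email address. Whether any given provider publishes keys there is another matter, but the client side is a one-line gpg.conf fragment:

```
# ~/.gnupg/gpg.conf -- when the key for a recipient is missing, check the
# local keyring first, then the recipient's provider via Web Key Directory
auto-key-locate local,wkd
```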

A paranoid conclusion

Many years ago I tried to set up a Linux host to use IPsec in opportunistic encryption mode. Those were the times of FreeS/WAN, and there was some info on how to do it, but it didn’t work. Most documentation was focused on creating site-to-site VPNs, and it seemed as if setting up peer-to-peer IPsec was quietly discouraged. My queries about how to deploy IPsec when I don’t trust the network itself went unanswered.

Only much later did we learn that the messengers of the Fort Meade Sauron’s Eye infiltrated standards groups to weaken encryption protocols. E-mail providers who wanted to run zero-knowledge services were forced to add backdoors. And I feel that it is a little too convenient for the Sauron’s Eye that the architecture of Apple Mail (which I love as an email client almost as much as mutt, and which I use every day) makes using Gmail with the GPG plugin moot, as the drafts are sent to Google — unencrypted. And boy, does it save a lot of drafts.

As far as I know, PGP is one of the handful of crypto protocols that are attacked in the wild, on the crypto level, for real. That means it is worth attacking, and the form of the attacks implies that the attackers can intercept network traffic easily.

It is a complicated tool, like a tank, but, again like a tank, it is very useful if you know how to handle it. I can give you my PGP key ID if you want to discuss this in private, but I won’t give out my Signal phone number in public. That’s why I keep my PGP kung fu sharp.

May we not have to use all this in 2017!

If you want to discuss this in private, get my key from Keybase: https://keybase.io/weirdnik

Footnotes

[1] For what? For anything, really. And if it later turns out to be a legal mistake and compensation is paid years down the line, Signal is gone anyway.

[2] Did you know that GCHQ guidelines for their operatives state that you need to be 60 miles (roughly 100 km) from your everyday cellphone to safely activate your covert phone with the least possibility of later telco-metadata correlation? Add automatic number-plate cameras and the cameras in public transit and train stations, and you probably can’t use anything mobile-phone-related without a significant risk of being identified at some later point. This includes Signal.

[3] See, for example, recent decriminalisations of nonheteronormative sexual behaviour and some aspects of use of marijuana.

[4] https://keybase.io/: it feels great, but I have no idea what it is useful for.

[5] https://gist.github.com/bnagy/8914f712f689cc01c267

[6] I still have invites left.