Peter Ganten, Backgrounds, Univention Blog

Apple is fundamentally opposed to creating a backdoor that would allow US authorities to bypass smartphone encryption. However, leaving it to companies alone to decide whether backdoors are acceptable will not give us greater data security. We need open encryption mechanisms that users can, if need be, modify themselves.

A showdown between Apple and the US government has been raging for some time now. In the aftermath of the San Bernardino attack, the FBI wants to investigate the perpetrator’s possible terrorist links. However, the memory of his cell phone, an iPhone 5C, is password-protected. Brute-force attacks are of no use here: after several failed attempts, the smartphone is locked for a period of time. There is even the risk that the perpetrator enabled an option that erases the entire memory after ten failed password attempts.
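The throttling mechanism described here can be sketched in a few lines. The following is a minimal illustration of the idea, not Apple's actual implementation; the delay schedule, the `WIPE_THRESHOLD` constant, and the `PasscodeGuard` class are assumptions made purely for demonstration.

```python
# Illustrative sketch of a passcode-throttling scheme: escalating lockout
# delays after repeated failures, plus an optional full wipe after ten.
# All parameters here are hypothetical, not Apple's real values.

WIPE_THRESHOLD = 10  # erase the device after this many failures (if enabled)

# Escalating lockout delays (in seconds), keyed by failed-attempt count.
DELAY_SCHEDULE = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}


class PasscodeGuard:
    def __init__(self, passcode: str, erase_after_ten: bool = False):
        self._passcode = passcode
        self._failures = 0
        self.erase_after_ten = erase_after_ten
        self.wiped = False

    def attempt(self, guess: str):
        """Process one passcode attempt; return (unlocked, delay_seconds)."""
        if self.wiped:
            return False, 0  # nothing left to unlock
        if guess == self._passcode:
            self._failures = 0
            return True, 0
        self._failures += 1
        if self.erase_after_ten and self._failures >= WIPE_THRESHOLD:
            self.wiped = True  # all data irrecoverably erased
            return False, 0
        # Lock the device for a growing delay as failures accumulate.
        return False, DELAY_SCHEDULE.get(self._failures, 0)
```

Even this toy version shows why brute force is hopeless: the delays alone stretch an exhaustive search over years, and with the wipe option enabled, ten wrong guesses destroy the data entirely.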

Apple has already released a statement saying that it is cooperating with investigators. However, even following a decisive court ruling, the company is refusing to build a function for the FBI that would make it possible to bypass the encryption and export the memory directly. Apple CEO Tim Cook took aim at the court’s decision in an open letter to Apple customers: “They have asked us to build a backdoor to the iPhone.” The software demanded by the FBI could “potentially unlock every iPhone” and fall into the wrong hands – in principle including the US government’s. After all, Mr. Cook warned that once the trick has been revealed, government authorities could exploit it to “collect your messages, health data, and financial information, locate you, or even activate your iPhone’s camera or microphone without being detected.”

So is Apple nipping things in the bud? Michael Hayden, former Director of the National Security Agency (NSA), is already convinced of this. He said of FBI Director James Comey: “Jim would like a backdoor available to American law enforcement in all devices globally.” And Apple has already achieved something with the extensive public debate: in stark contrast to its otherwise notorious technical secretiveness, the company is now taking a stand with a single priority – safeguarding the privacy of its customers.

However, there is one counterargument to the Apple spectacle: while backdoors do pose an enormous risk, no company should be allowed to place its own authority above that of a democratic state. If anyone is to decide who can access which data, it should be the state, not a private company. And the state will – as the initial court ruling already suggests – ultimately manage to gain access to the data it wants. As such, the amusement with which many people in this country are following the confrontation between Apple and the US government is ill-founded.

It is not enough to leave the protection of individual privacy to companies once governments have made clear that they regard private information as inherently suspicious. It is then only a matter of time before the backdoors follow. Effective encryption is too important to be left to managers whose authority is not legitimized by any democratic vote. Effective protection against snooping and industrial espionage is only possible if the encryption software’s code is public and users can modify it if need be. That makes building backdoors much more difficult. And people who truly require security and privacy can replace the encryption mechanisms with ones they trust more.