Cothority, a new software project designed to make secret backdoored software updates nearly impossible, is offering to help Apple ensure that any secret court orders to backdoor its software cannot escape public scrutiny.

Currently, when Apple or any software maker issues a software update, it signs the update with its private signing keys. But those keys can be stolen, and a government could coerce the company to sign a backdoored software update for a targeted subset of end users, and to do so in secret.

Cothority decentralises the signing process, and scales to thousands of cosigners. For instance, in order to authenticate a software update, Apple might require 51 percent of 8,000 cosigners distributed around the world.

"Before accepting any software image the device’s update mechanism verifies that it has been signed not only by the software maker but also by a threshold number of the designated witnesses," Bryan Ford of the Cothority project said in a blog post that will be published later today. [Update: it's now published.] "In essence, the device does not accept any software image unless it arrives with a cryptographic 'proof' that this particular software image has been publicly observed by—and placed under the scrutiny of—a decentralised group of independent parties scattered around the world in different jurisdictions."
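The device-side check Ford describes can be sketched in Python. This is a toy model under stated assumptions, not the project's actual protocol: an HMAC stands in for real public-key signatures so the demo needs no crypto library, the witness names and keys are illustrative, and Cothority itself produces a single compact aggregate Schnorr signature over the whole witness set rather than thousands of individual values.

```python
import hashlib
import hmac

def sign(key: bytes, image: bytes) -> bytes:
    # Stand-in "signature": a keyed hash. Real deployments use
    # public-key signatures; Cothority aggregates them into one.
    return hmac.new(key, image, hashlib.sha256).digest()

def verify_update(image, maker_key, maker_sig, witness_keys,
                  witness_sigs, threshold):
    # The software maker's own signature is still required.
    if not hmac.compare_digest(sign(maker_key, image), maker_sig):
        return False
    # Count designated witnesses whose signature over this exact
    # image verifies; accept only at or above the threshold.
    valid = sum(
        1
        for name, sig in witness_sigs.items()
        if name in witness_keys
        and hmac.compare_digest(sign(witness_keys[name], image), sig)
    )
    return valid >= threshold

# Example: 10 witnesses, majority threshold of 6.
witness_keys = {f"w{i}": bytes([i + 1]) * 16 for i in range(10)}
image = b"os-update-1.2.3"
maker_key = b"maker-signing-key"
cosigs = {w: sign(k, image) for w, k in list(witness_keys.items())[:6]}
print(verify_update(image, maker_key, sign(maker_key, image),
                    witness_keys, cosigs, threshold=6))  # True
```

The key property is that the maker's signature alone is never sufficient: an update missing the witness quorum is rejected even if the maker's keys are valid (or stolen).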

The problem of multi-party cryptographic signatures has long been solved, Ford noted. What Cothority has done is solve the problem at scale. Even an urgent software update containing a critical security patch could be logged and signed by thousands of witnesses in a matter of seconds.

"This issue of whether governments should be allowed to force companies to create backdoored versions of their software, however it's answered, it's great that it's being debated in public," Ford told Ars. "We need to make sure that the issue remains public."

Cosigners are not expected to review source code, and indeed, in the case of a proprietary operating system like iOS, they would not be able to do so. Source code transparency is not the goal of the project, however. Rather, Cothority witnesses can proactively guarantee transparency by publicly logging any binaries they are requested to sign, even if they do no checking of the software update itself.
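That proactive-transparency step might be sketched like this. Everything here is illustrative rather than the project's API: the in-memory list stands in for a replicated, tamper-evident public log, and the stand-in signature is a plain hash rather than real public-key cryptography.

```python
import hashlib
import time

# Illustrative append-only log; a real witness would publish entries
# to a replicated, tamper-evident public timeline.
PUBLIC_LOG = []

def witness_cosign(image: bytes, witness_key: bytes) -> bytes:
    """Log the exact binary, then sign it. The witness does no code
    review; transparency comes from the public record that this
    particular image exists and was submitted for signing."""
    PUBLIC_LOG.append({
        "sha256": hashlib.sha256(image).hexdigest(),
        "logged_at": time.time(),
    })
    # Stand-in signature (hash of key + image); real cosigning uses
    # public-key signatures.
    return hashlib.sha256(witness_key + image).digest()
```

Because the hash of every submitted binary lands in the log before any signature is returned, an update shipped to only a handful of devices would still leave a public trace.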

So if Apple requested Cothority witnesses to cosign a software update that was then only distributed to a small group of people, that would immediately draw public scrutiny.

Of course, the reliability of such a system depends on how trustworthy the witnesses are. Ford suggests that witnesses should be distributed around the world, in multiple jurisdictions, and include civil society groups like EFF, ACLU, CDT, and others.

Technical end users could even tweak their client software trust settings to only install updates signed by witnesses they especially trust.
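Such a client-side policy might look like the following sketch; the policy shape, witness names, and function are hypothetical, not an actual Cothority client setting.

```python
def meets_trust_policy(witnesses_who_signed, trusted_witnesses, min_trusted):
    """Hypothetical client-side policy: of the witnesses that cosigned
    an update, require at least `min_trusted` from the user's own
    trusted set before the update is installed."""
    trusted_signers = set(witnesses_who_signed) & set(trusted_witnesses)
    return len(trusted_signers) >= min_trusted

# A user who trusts two civil-society witnesses could demand both:
meets_trust_policy({"eff", "aclu", "w4"}, {"eff", "aclu"}, 2)  # True
```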

The declining half-life of secrets

Not everyone agrees that Cothority will solve the problem of government-ordered backdoors, though. "Ultimately it's a hurdle that our legal system could still abuse," Jonathan Zdziarski, an iOS forensics expert, told Ars. "There are plenty of cases where false witness has been given in the real world. Even worse, the idea would give a false sense of security to people that the system was not rigged, when indeed it can most certainly still be rigged, so it reinforces a system that could in fact be broken."

Ford acknowledged as much. Cothority does not make it impossible for a powerful adversary to compel Apple, or another software maker, to issue a backdoored software update in secret, Ford said, but it does make it much more difficult. A nation-state attacker could, in theory, bribe thousands of witnesses, or coerce them to sign a targeted software update in secret. Or such an attacker could hack those witnesses' computers and forge their signatures.

Given the declining half-life of secrets, though, it seems likely that any such coercion, bribery, or hacking would eventually come to light—defeating the purpose of doing so in the first place.

Ford also points out that Cothority can't defend against a "bug door" slipped into iOS by, say, an undercover NSA employee working for Apple. Nor can it prevent the government from coercing Apple to backdoor all iOS devices.

Inserting such a general-purpose backdoor, however, runs the risk of making iOS devices vulnerable to any black hat or nation-state attacker who managed to discover that backdoor. The recent revelation that a probable NSA backdoor in Juniper routers was subsequently exploited by another nation-state spy agency—likely China—serves as a cautionary tale.

The Cothority project has been peer-reviewed, and Ford and his team will present their research at the IEEE Symposium on Security and Privacy in May in San Jose, California. The draft paper, set to be formally published on March 18, is already online if you want to check it out.