Cothority, a new software project designed to make secret backdoored software updates nearly impossible, is offering to help Apple ensure that any secret court orders to backdoor its software cannot escape public scrutiny.
Currently, when Apple or any software maker issues a software update, they sign the update with their encryption keys.
But those keys can be stolen, and a government could coerce the company to sign a backdoored software update for a targeted subset of end users—and do so in secret.
Cothority decentralises the signing process, and scales to thousands of cosigners.
For instance, in order to authenticate a software update, Apple might require 51 percent of 8,000 cosigners distributed around the world.
“Before accepting any software image the device’s update mechanism verifies that it has been signed not only by the software maker but also by a threshold number of the designated witnesses,” Bryan Ford of the Cothority project said in a blog post that will be published later today. [Update: it’s now published.] “In essence, the device does not accept any software image unless it arrives with a cryptographic ‘proof’ that this particular software image has been publicly observed by—and placed under the scrutiny of—a decentralised group of independent parties scattered around the world in different jurisdictions.”
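The verification rule Ford describes can be sketched in a few lines: the device accepts an image only if the vendor signed it and at least a threshold of designated witnesses also signed it. The sketch below is illustrative only — the function names are invented, and HMAC stands in for real digital signatures (Cothority actually uses Schnorr-style collective signing) so the example runs without external libraries.

```python
import hashlib
import hmac

def toy_sign(secret: bytes, image: bytes) -> bytes:
    # Stand-in for a real signature scheme (e.g. Ed25519/Schnorr);
    # HMAC-SHA256 is used here only so the sketch is self-contained.
    return hmac.new(secret, image, hashlib.sha256).digest()

def toy_verify(secret: bytes, image: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(toy_sign(secret, image), sig)

def accept_update(image, vendor_key, vendor_sig,
                  witness_keys, witness_sigs, threshold):
    """Accept only if the vendor signed this exact image AND at least
    `threshold` designated witnesses attest they have publicly logged it."""
    if not toy_verify(vendor_key, image, vendor_sig):
        return False
    valid = sum(
        1 for wid, sig in witness_sigs.items()
        if wid in witness_keys and toy_verify(witness_keys[wid], image, sig)
    )
    return valid >= threshold

# Example: 8 designated witnesses, 51 percent threshold (5 of 8).
image = b"example-os-update"
vendor_key = b"vendor-secret"
witness_keys = {f"w{i}": f"witness-secret-{i}".encode() for i in range(8)}

vendor_sig = toy_sign(vendor_key, image)
witness_sigs = {wid: toy_sign(sk, image)
                for wid, sk in list(witness_keys.items())[:5]}

print(accept_update(image, vendor_key, vendor_sig,
                    witness_keys, witness_sigs, threshold=5))  # True
witness_sigs.pop("w0")  # drop one cosignature: now only 4 of 8
print(accept_update(image, vendor_key, vendor_sig,
                    witness_keys, witness_sigs, threshold=5))  # False
```

In the real protocol the thousands of witness signatures are aggregated into one compact collective signature rather than checked one by one, which is what lets the scheme scale.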
The problem of multi-party cryptographic signatures has long been solved, Ford noted. What Cothority has done is solve the problem at scale.
Even an urgent software update containing a critical security patch could be logged and signed by thousands of witnesses in a matter of seconds.
[Figure: What a classic petition would look like as a cryptographic multisignature]
“This issue of whether governments should be allowed to force companies to create backdoored versions of their software, however it’s answered, it’s great that it’s being debated in public,” Ford told Ars. “We need to make sure that the issue remains public.”
Cosigners are not expected to review source code, and indeed in the case of a proprietary operating system like iOS, would not be able to do so. Source code transparency is not the goal of the project, however. Rather, Cothority witnesses can proactively guarantee transparency by publicly logging any binaries they are requested to sign, even if they do no checking of the software update itself.
So if Apple requested Cothority witnesses to cosign a software update that was then only distributed to a small group of people, that would immediately draw public scrutiny.
Of course, the reliability of such a system depends on how trustworthy the witnesses are.
Ford suggests that witnesses should be distributed around the world, in multiple jurisdictions, and include civil society groups like EFF, ACLU, CDT, and others.
Technical end users could even tweak their client software trust settings to only install updates signed by witnesses they especially trust.
The declining half-life of secrets
Not everyone agrees that Cothority will solve the problem of government-ordered backdoors, though. “Ultimately it’s a hurdle that our legal system could still abuse,” Jonathan Zdziarski, an iOS forensics expert, told Ars. “There are plenty of cases where false witness has been given in the real world.
Even worse, the idea would give a false sense of security to people that the system was not rigged, when indeed it can most certainly still be rigged, so it reinforces a system that could in fact be broken.”
Ford acknowledged as much.
Cothority does not make it impossible for a powerful adversary to compel Apple, or another software maker, to issue a backdoored software update in secret, Ford said, but does make it much more difficult.
A nation-state attacker could, in theory, bribe thousands of witnesses, or coerce them to sign a targeted software update in secret. Or such an attacker could hack those witnesses’ computers and issue fake signatures.
Given the declining half-life of secrets, though, it seems likely that any such coercion, bribery, or hacking would eventually come to light—defeating the point of doing so in the first place.
Ford also points out that Cothority can’t defend against a “bug door” slipped into iOS by, say, an undercover NSA employee working for Apple. Nor can it prevent the government from coercing Apple to backdoor all iOS devices.
Inserting such a general-purpose backdoor, however, runs the risk of making iOS devices vulnerable to any black hat or nation-state attacker who managed to discover that backdoor.
The recent revelation that a probable NSA backdoor in Juniper routers was subsequently exploited by another nation-state spy agency—likely China—serves as a cautionary tale.
The Cothority project has been peer-reviewed, and Ford and his team will present their research at the IEEE Symposium on Security and Privacy in May in San Jose, California.
The draft paper, set to be formally published on March 18, is already online if you want to check it out.
“I would do anything I can to make it happen”
Ford says Cothority could be ready to deploy in less than six months, and is eager to see Apple adopt decentralised witness cosigning. “I would do anything I can to make it happen even without [a financial incentive],” Ford said. “I would absolutely work with them in an instant if I saw a willingness to make it happen.”
“In the case of Apple, if they wanted to deploy it, the obvious way to do so would be to integrate collective signing into their software update and verification process,” he added. “I don’t know the technical details of how their process works, and a lot would depend on that.”
Even if Apple loses its current dispute with the FBI, Ford is still confident Cothority can guarantee transparency.
If the FBI sets a precedent in this case, he said, “the floodgates will open to hundreds of similar demands from state and local agencies, foreign governments, etc.”
In such a scenario, Ford said, the FBI would likely demand that Apple hand over their signing keys, in the same way they demanded Lavabit hand over their TLS/SSL keys in 2013.
But Cothority’s decentralised witness cosigning model would deny the FBI, or any other government in possession of Apple’s signing keys, the power to unilaterally sign and issue targeted backdoored software updates.
“So even if the FBI gets its way, software transparency through collective signing can ensure at least that the public remains aware how far down the slippery slope from device-specific to general-purpose backdoors we might be at any given time,” Ford said. “While it is not a slippery slope we want to be on at all, if we’re forced onto it, software transparency could at least make it less slippery.”
Apple did not respond to our request for comment.
J.M. Porup is a freelance cybersecurity reporter who lives in Toronto. When he dies his epitaph will simply read “assume breach.” You can find him on Twitter at @toholdaquill.
This post originated on Ars Technica UK