[Photo: The FBI wants Apple to shut down the retry limits on the San Bernardino shooter's work phone. Both sides have attempted to claim the moral high ground. Credit: John Karakatsanis]
There’s been a lot of bluster about the ongoing encryption saga between the FBI and Apple. “So Apple recently joined ISIS,” The Daily Show’s Trevor Noah joked this week.

CIA Director John Brennan’s view was a tad more serious. “What would people say if a bank had a safe deposit box that individuals could use, access, and store things, but the government was not able to have any access to those environments?” he told NPR’s Morning Edition. “Criminals, terrorists, whatever could use it. What is it about electronic communications that makes it unique in terms of it not being allowed to be accessed by the government when the law, the courts say the government should have access?”
Let’s start with the facts.

Apple is currently fighting a court order obtained by the FBI.

The FBI wants Apple to build software to help bypass security software on a specific iPhone 5C.

The FBI is trying to unlock this device, a phone provided by San Bernardino County to employee Syed Farook, who with his wife shot 36 people, killing 14. The bureau is blocked by a security feature that may erase the phone's contents after 10 failed attempts to guess the PIN passcode. For now, Apple is resisting the court order, which asks the company to write code that would disable the auto-erase feature so the FBI can "brute-force" the passcode.
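To see why that auto-erase feature is the crux, consider the arithmetic: a four-digit PIN has only 10,000 possible values, so without a retry limit an attacker can simply try them all. The sketch below illustrates the idea; `try_pin` is a hypothetical stand-in for whatever interface would submit guesses to the device, not anything in iOS.

```python
# Minimal brute-force sketch. try_pin() is a hypothetical callback
# that submits one guess to the device and reports success; nothing
# here is real iOS code.

def brute_force_pin(try_pin, digits=4):
    """Try every PIN of the given length, in ascending order."""
    for candidate in range(10 ** digits):
        pin = str(candidate).zfill(digits)  # e.g. 42 -> "0042"
        if try_pin(pin):                    # hypothetical device call
            return pin
    return None  # keyspace exhausted without a match
```

Apple's security documentation has put passcode derivation at roughly 80 milliseconds per attempt, so even guessing on the device itself, a four-digit PIN would fall in under 15 minutes. The 10-try erase, not computation, is the real barrier, which is why the FBI's order centers on it.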
Beyond the facts are various arguments about things like the limits of government power or the legal authority of law enforcement to gain access to evidence believed to be related to what has been labeled a terrorist act.

Those questions will be resolved by the courts eventually.

But both the FBI and Apple have tried to take the high ground in different ways within the court of public opinion—the FBI emphasizes the moral imperative of honoring the victims and fighting terrorism, while Apple proclaims an ethical duty it has to protect the privacy and security of millions of iPhone users worldwide.
Like many interested parties, Ars staffers have debated this aspect of the Apple court order many times over the past week without arriving at a consensus. Where staffers have come down on the topic has largely depended upon their views of whether we can take the government at its word—or whether we should share the fear expressed by Apple. Will this request actually create potential security and privacy risks with widespread consequences for millions worldwide? And if so, in what ways?
[Photo: FBI Director James Comey. Credit: FBI]
The feds’ moral appeal
The moral element of the FBI’s request was perhaps brought to light most prominently by FBI Director James Comey in a post to the blog Lawfare.
“We don’t want to break anyone’s encryption or set a master key loose on the land,” he wrote. “I hope thoughtful people will take the time to understand that. Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t. But we can’t look the survivors in the eye, or ourselves in the mirror, if we don’t follow this lead.”
Invoking the horror of Farook’s terrorist act, Comey said he hoped that “folks will remember… why the FBI simply must do all we can under the law to investigate that. And in that sober spirit, I also hope all Americans will participate in the long conversation we must have about how to both embrace the technology we love and get the safety we need.”
Some Ars staffers believe that Comey is on firm legal and moral ground—and that’s all that matters.

As one put it:

Search warrants are already the established mechanism for balancing privacy and the investigative needs of the government, and there is precedent for compelling third parties to assist.

There’s no credible claim that this directly jeopardizes other phones. You’re worried about misuse? Sure, it’s a risk, but this particular instance ain’t misuse.

Certainly, that’s how the larger government and law enforcement community see it.

At the recent Suits & Spooks forum (an event held under the Chatham House rule), a panel of lawyers talked about the government’s view of this problem. “You will never prevent law enforcement from getting into your house,” one attorney framed it. “So people [in government] say, ‘Why should it be different in cyberspace?'”
Another legal expert noted that it’s unprecedented in history for people to be able to hide things completely from the government just by “scattering a little digital pixie dust on them.” And while document shredders have been around much longer than encryption, shredding permanently denies both the owner and government access to the information it destroys.

Encryption, on the other hand, is like magically reversible shredding.
But as Supreme Court Justice Antonin Scalia once put it, “There is nothing new in the realization that the Constitution sometimes insulates the criminality of a few in order to protect the privacy of us all.” And many of the lawyers participating in that Suits & Spooks forum were in agreement—there’s little the government could do within the bounds of law to force companies to hand over a backdoor.
A recent report from the Berkman Center for Internet & Society at Harvard University cast doubt on whether law enforcement actually needs special access to encrypted communications and data to gather evidence.
In their summary, the report signatories wrote: "Are we really headed to a future in which our ability to effectively surveil criminals and bad actors is impossible? We think not. The question we explore is the significance of this lack of access to communications for legitimate government interests. We argue that communications in the future will neither be eclipsed into darkness nor illuminated without shadow."
That position was echoed in a conversation I recently had with Rebecca Herold, a faculty member at the Institute for Applied Network Security.
She said law enforcement's problem could likely be solved by using metadata that already exists and can be collected, or by having software and Internet providers add metadata that doesn't break the privacy of the conversation inside encrypted messages. Herold likened what the FBI might find on the iPhone to Geraldo Rivera's "Al Capone's Vault" fiasco: the FBI is likely to find nothing, because this was the only device Farook didn't destroy. He likely knew his employer had access to its contents.
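To make Herold's distinction concrete, here is a generic sketch of how metadata can remain visible to a provider even when message content is end-to-end encrypted. It models no particular protocol; the field names are illustrative.

```python
# Generic model of an encrypted message "envelope": the body is opaque
# ciphertext, but routing metadata stays readable to the provider and
# could be produced under legal process without touching the encryption.
from dataclasses import dataclass

@dataclass
class EncryptedMessage:
    sender: str        # visible: the provider needs it to route
    recipient: str     # visible: the provider needs it to deliver
    timestamp: float   # visible: logged as the message transits
    ciphertext: bytes  # opaque: only the endpoints hold the keys

def metadata_record(msg: EncryptedMessage) -> dict:
    """What a provider could hand over without breaking encryption."""
    return {
        "from": msg.sender,
        "to": msg.recipient,
        "when": msg.timestamp,
        "size": len(msg.ciphertext),  # even message length can be telling
    }
```

This is the shape of the argument: who talked to whom, and when, is often more investigatively useful than the content itself, and collecting it requires no backdoor.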

Waiting for monsters
It’s for that reason—the extremely low probability of anything valuable being on the phone the FBI so desperately wants Apple to provide access to—that several Arsians believe the FBI’s real reason for the request is strategic, not tactical.

That line of thinking is bolstered by a recent report by BuzzFeed quoting former FBI and NSA officials who stated that the government already has the capability to break into the phone.

The wrinkle? The feds would currently be reluctant to use the information acquired from those capabilities in court.
“The big thing to keep in mind when considering ‘morality’ is that this particular case is basically a proxy battle over encryption,” said one Ars staffer. “This is all about the government wanting to be able to access this stuff whenever it wants in other cases.
Viewed from that angle, I can’t really side with the institution that brought us the TSA, Patriot Act, Snowden-era NSA, etc. We’re afraid of monsters under our beds.”
Another staffer put it this way:

The Feds’ request for Apple to make their little custom backdoor reminds me a lot of the swift way Republicans whipped out the USA Patriot Act in the days following 9/11.

They had a wishlist of things they wanted and were waiting for the right moment to get it. You know that the Feds have been waiting to get this kind of access to mobile devices, and at last they have a case where they can cry “Terrorism! Apple is immoral!” (… despite the fact that the two shooters appear to have been loners with no real connections to actual working terrorists.)

Along those lines, some Ars staffers see the "moral" positions taken by both sides as "inevitably political":

The arguments boil down to Apple and its supporters saying, “Once we build this tool, you will use it (or force us to use it) a lot, you will abuse that power and use it in inappropriate situations.” Whereas supporters of the government are saying, “We are working to keep the public safe; you’re working to sell gadgets. We are choosing our targets responsibly, and in any case, the decision isn’t yours to make.
If you don’t like our approach, go win an election.
In the meantime, you need to be responsible and help us.”

Complicating all of this, of course, is individual experience. How we morally view the arguments of Apple and the government is directly related to our personal experiences with power and its application: specifically, whom are we most comfortable letting wield power? At face value, it seems somehow ironic that people would put their faith in a multinational corporation rather than the government they presumably elected.

But at its crux, that’s how the “moral” aspect of this argument plays out.

Either you put your faith in government to act more in your interest, or you put your faith in Apple.
In the post-Snowden world, a sizeable number of people believe that the US intelligence and federal law enforcement apparatus has lost (if it ever held) a claim to the moral high ground because of how the privacy of American citizens has been abused.

But technology companies have been complicit in those efforts, whether willingly or not.

The motives of both sides are open to suspicion, which is why some perceive Apple CEO Tim Cook’s strong position against the FBI’s request as largely propaganda.
In this line of thinking, resisting the government allows Apple to appeal to its global audience and generate attention for a privacy position that might be as much about profit as ethics. One Arsian put it this way: “I’m not saying Apple isn’t turning this into a publicity stunt, but there’s nothing wrong with advertising yourself by saying you’ll protect user privacy when the government comes knocking.”
The precedent question

[Photo: Apple CEO Tim Cook. Credit: Chris Foresman]
Part of the reason some people lean toward trusting Apple in this case is fear of what might be unleashed if the request sets a precedent. The potential precedent isn't limited to US law; the case could also influence how other governments, with less stellar protections for individual privacy, press Apple to do things in their countries.
Several Arsians are convinced that once the FBI gains access to the San Bernardino iPhone, that will just be the start. Law enforcement agencies across the country hold hundreds of iPhones they can't currently access, "sitting in evidence waiting to be cracked." Manhattan District Attorney Cyrus Vance recently said that the New York Police Department has more than 150 iPhones waiting to be unlocked, for example.
Vance expressed outrage that Apple hadn't opened those phones for prosecutors.
“Make no mistake—once Apple loses this case, the government will line up those phones to get cracked next,” one Arsian said.
Another agreed: “As soon as Apple caves, the FBI will start sending requests to unlock those phones.

This is how precedents work.
I would bet you anything that most of those phones are related to drug cases.

The government talks about terrorism and child molesters when it wants more power. When it gets that power, it uses it to prosecute drug dealers… that’s the only thing that makes sure enough people are criminals.”
There is a precedent for pushing technology companies to assist law enforcement.

For example, the Communications Assistance for Law Enforcement Act (CALEA) required telecommunications companies, Internet service providers, and some software providers to modify or install components that allow law enforcement, when legally authorized, to tap digital and analog voice communications and perform deep packet inspection of Internet traffic.
Still, the precedent that some Ars staffers were most concerned about was international in scope.
If Apple was forced to provide access to a single phone by the FBI, would the company then be forced to provide even greater access to phones obtained by other governments?
“Have you not noticed that China (and many other countries, like Iran and Russia) are all explicitly, publicly concerned with being ‘respected’ in the world?” one staffer said. “When the US critiques them for something, it appears to sting, and they go out of their way to agree (publicly) to many international norms around freedom of speech (it’s in China’s constitution!) and the sanctity of international borders (Russia) while denying as hard as possible that they contravene those norms.”
As the New York Times recently noted, foreign pressure, norms, and precedent may have no legal force abroad, but they can still matter. (“Beijing last year backed off on some proposals that would have required foreign companies to provide encryption keys for devices sold in the country after facing pressure from foreign trade groups,” the paper wrote.)
Regardless of whether Apple prevails in this case, the company will certainly continue to deal with both political and technical efforts by other governments to gain access to its customers’ personal data.
Apple is apparently already looking at modifications to future products that would make it impossible to shut off the retry-limit code.

From a purely practical, not moral, standpoint, perhaps Apple should treat the current effort by the government as a penetration test and provide the access requested… then immediately push out an iOS 9 patch that defends everyone against future use of the tool. Of course, a hardware fix would mean that the company could sell more iPhones.