
On Tuesday, The Guardian reported that the Federal Bureau of Investigation (FBI) has changed its rules regarding how it redacts Americans’ information when it takes international communications from the National Security Agency’s (NSA) database.

The paper confirmed the classified rule change with unnamed US officials, but details on the new rules remain murky. The new rules, which were approved by the secret US Foreign Intelligence Surveillance Court (FISC), deal with how the FBI handles information it gleans from the National Security Agency (NSA).

Although the NSA is technically tasked with surveillance of communications involving foreigners, information on US citizens is inevitably sucked up, too.

The FBI is then allowed to search through that data without any “minimization” from the NSA—a term that refers to redacting Americans’ identifiable information unless a warrant justifies surveillance of that person. The FBI enjoys privileged access to this trove of information, which includes e-mails, texts, and phone call metadata sent or received internationally. Recently, the Obama administration said it was working on new rules to allow other US government agencies similar access to the NSA’s database.

But The Guardian notes that the Privacy and Civil Liberties Oversight Board (PCLOB), which was organized by the Obama administration in the wake of the Edward Snowden leaks, took issue with how the FBI accessed and stored NSA data in 2014. "As of 2014, the FBI was not even required to make note of when it searched the metadata, which includes the ‘to' or ‘from' lines of an e-mail,” The Guardian wrote. "Nor does it record how many of its data searches involve Americans’ identifying details." However, a recent report from PCLOB suggested that the new rules approved by FISC for the FBI involve a revision of the FBI's minimization procedures.

Spokespeople from both the FBI and PCLOB declined to comment on that apparent procedure change, saying it was classified, but PCLOB’s spokesperson, Sharon Bradford Franklin, told The Guardian that the new rules "do apply additional limits.” A spokesperson for the Office of the Director of National Intelligence said that the new procedures may be publicly released at some point.
It’s probably already happened, but you just haven't seen it... Technology moves quickly, not just in legitimate business, but in the cybercriminal world too.

Advanced attack tools are now available on the black market, lowering the barrier to entry for the average online lowlife.

They are happy to target large and small organizations alike, and they only have to be lucky once. Security pros have been forced to prepare for a world of constant, sustained attack by understanding the threats and choosing the right measures to prepare for them.

Companies are realising the extent of the threat and gearing up for it, say experts. “We have seen information security budgets increasing in the last 12 months to address the challenges that cyber crime is bringing to the organisation,” said Steve Durbin, managing director of the Information Security Forum. So what kinds of threats are they dealing with, and how can they prepare? What are the threats and where are they coming from? The cyberthreats facing modern companies fall into various categories, and they’re loosely linked to the type of cybercriminal that you’re dealing with and the kind of information that they’re after. Hacktivism has traditionally been characterised by attacks with a relatively low barrier to entry such as DDoS and web site defacements, for example. While hackers’ motives are frequently political or ideological, financial cybercriminals are interested purely in money, and are adept in their pursuit of it.
Some will attempt to transfer money out of an organization, while others will focus on saleable information. Malware typically underpins a financial cybercrime attack. One notable recent example is Carbanak, an extensive attack on financial institutions that netted $1bn in stolen assets.
It was a devilish attack, starting with a backdoor sent as an attachment that then moved through the network until it found an administrative machine. Then, the malware intercepted clerks’ computers, recording their sessions, and subsequently used that information to transfer money fraudulently using online banking sessions and to dispense money from ATMs. Carbanak was a sophisticated attack that sought to directly manipulate systems, but cybercriminals typically look to steal specific types of information such as personally identifiable information (PII) when they attack. Malware delivery via phishing and drive-by downloads is still a highly effective tool to steal this data.

Exploit kits designed to target enterprise clients with malicious payloads are on the rise.
In its 2015 Threat Report, Forcepoint found three times more exploit kits in circulation than it had in 2013. The stolen information can be about your customers or your employees.

The latter can be just as damaging, because you’re likely to hold financial and other data about the people who work for you. One of the most egregious recent attacks on employee data was the Office of Personnel Management hack, which compromised 5.6 million fingerprint records and personal data on more than 21 million current and former government employees, harvesting Social Security numbers and addresses. PII isn’t the only threat category, though.
Intellectual property is another rich seam for online criminals to mine. Often the subject of targeted attacks, this information can take many forms, from email archives through to launch plans for new products, or details of new products currently under development. “We see a lot of intellectual property theft out there, coming from assumed nation states based on the IPs that they’re coming from, and from industry, too,” said Eric Stevens, director of strategic security consulting services at Forcepoint. “It’s a lot cheaper to steal development time than it is to do that development yourself,” he pointed out. While these different groups will typically seek different types of information, there is also an increasing amount of overlap. Hacktivists have begun targeting both customer data and intellectual property where it suits their needs.

Anonymous was behind the theft of ticketholder data for the 2012 F1 Grand Prix in Montreal, which was posted online. Hacktivist faction Lulzsec mined intellectual property from private security firm Stratfor in 2011. How do you live with attackers getting in, and continue to fight them? Over the years, the focus on keeping attackers out at all costs has shifted towards managing them when they break into an organization.
Security professionals seem to be tacitly admitting that network intrusion is a question of ‘when’, rather than ‘if’. “Fifteen years ago, the focus was keeping them out.

Today, organizations are starting to realize they have to deal with a certain degree of compromise,” explained Stephen Northcutt, director of academic advising for the SANS Technology Institute. This is something that at least one of the three-letter agencies has understood for years.
In 2010, Debora Plunkett, then-head of the Information Assurance Directorate at the NSA, said that the agency assumed that there were already intruders inside its network.

Considering itself already compromised forced it to protect critical data inside the network, rather than relying on a single ring of iron. The Open Group’s Jericho Forum focused on containing rather than preventing threats with its de-perimeterization principle, first espoused in the mid-2000s, which stated that the traditional trusted network boundary had eroded. One of the group’s commandments to survive in a de-perimeterized future was the assumption that your network was untrusted. Clearly, the NSA didn’t protect its resources especially well, though.

Ed Snowden, working for third-party contractor Booz Allen Hamilton, happily vacuumed up gigabytes of sensitive data for a sustained trickle-feed campaign to the media. No matter what side of the Snowden debate you’re on, for CISOs his case highlights the need for controls to stop the theft of information through authorized accounts. “Over the next few years, you will see a lot of growth in privilege and identity management,” said Northcutt. “At the network level you are going to see more segmentation and isolation.” To fully protect themselves with these techniques, though, organizations need a deep understanding of the data that they have and how it is used in their business, said Stevens.

There are many roles and sets of responsibilities in an organisation.
Some of them may even transcend internal employees altogether. “You have to understand what your business processes are surrounding that data,” he said.
It’s necessary to understand what a normal process looks like.

A hospital may send data to a third party company that produces its invoices for it. How can you distinguish between a legitimate business process like that, and an illegitimate one that is sending sensitive data to bad people? How do you distinguish between normal behaviour and threats? Distinguishing between these different modes of behaviour is an important skill for IT departments trying to spot attackers inside their network, but it’s doable with the right tools, say experts.
It’s all a question of mathematics, said Northcutt. “Twenty years ago the US Navy spent about a million dollars for a bunch of PhD statisticians to determine that like groups of people using like systems have a very similar network traffic footprint,” he said, adding that we have been using statistical techniques to baseline normal behaviour for years now. One form of attack involves malware that enters a network and then moves laterally, trying to find any data it can, and then exfiltrating it.
Software designed to baseline regular employee behaviour and then spot anything that deviates from the norm may be able to catch the unusual patterns this malware generates. Is a user account sending large amounts of data when it normally doesn’t? Is it encrypting that data, when it is normally sent over the internal company network in plain text? Why is it sending it at 2am, when all employees are normally long gone? All of these things can raise flags in a suitably equipped system. Where do you start when choosing tools? Training people to be security aware is an important part of stopping breaches, but CISOs will never eradicate those problems entirely.
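The statistical baselining described above can be sketched in a few lines. This is a toy illustration only, with hypothetical traffic figures and a simple z-score rule, not any vendor's actual detection logic:

```python
import statistics

def baseline(values):
    """Fit a per-user baseline: mean and standard deviation of daily outbound bytes."""
    return statistics.mean(values), statistics.stdev(values)

def is_anomalous(value, mean, stdev, threshold=3.0):
    """Flag transfers more than `threshold` standard deviations above the norm."""
    if stdev == 0:
        return value != mean
    return (value - mean) / stdev > threshold

# Hypothetical history: one employee's daily outbound traffic in MB
history = [120, 135, 110, 142, 128, 131, 125]
mean, stdev = baseline(history)

print(is_anomalous(130, mean, stdev))   # a typical day: prints False
print(is_anomalous(4800, mean, stdev))  # a 2am bulk transfer: prints True
```

Real products layer many more signals (time of day, destination, encryption) on top of this idea, but the principle is the same: model normal, then alert on deviation.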

A technology layer provides a vital layer of protection.

Don’t be distracted by emotions or industry buzzwords when choosing these tools, said Stevens. He recommends first identifying what data you want to protect (adding that this is more difficult than you’d imagine for many companies).

Talk to compliance managers and line-of-business owners to identify this information, and then work out what category of tool would best block the egress of that data. Companies can hone their priorities by focusing on a security framework like NIST’s, using it to establish areas where they need to improve. “Then it’s about ensuring that those purchases are improving your security posture as well as catering to compliance requirements that you may have,” he said. At the very least, though, he recommends a web and email security gateway, along with a data leak prevention (DLP) tool to monitor and prevent things from leaving. “Essentials are always going to be network monitoring tools,” said the ISF’s Durbin, adding that companies can build out their tool sets as they become more sophisticated. “The more advanced will focus on big data and trying to anticipate breaches and identify weaknesses in the security perimeter.” Best of breed or a holistic approach? Should companies buy a single security platform offering a holistic approach, or focus on point solutions instead? “I would always vote holistic, mainly because we aren’t seeing point channel solutions that are very effective,” said Stevens.

The main problem with best of breed solutions is visibility, he argued.
If you’re purchasing point solutions from multiple vendors, then integrating them to create a coherent view of your organization’s security incidents can be challenging. Your view of security needs to be watertight, not least because incidents in one domain that seem incongruous might suddenly gain more significance if you can correlate them with other incidents happening elsewhere. A single pane of glass can help ensure a consistent view of everything happening across the various aspects of your infrastructure, from email scanning through to web gateways. The good news is that while many of the threats facing companies are sophisticated, most attackers rely on the least amount of effort to infiltrate a company.

Attackers will go for unpatched, out of date software versions and misconfigured machines if they can, to avoid giving away their zero-day secrets. Using tools to keep a watchful eye on your network, endpoints and data is one part of the solution.

Good threat intelligence is another. Just as important, though, are proper conversations with business counterparts to understand what data you should be trying to protect in the first place. ®
The cyber attacks of the future may be hard to spot, and nations may fight over fiber. In recent weeks, the digital security discussion has been focused on a certain fruit-flavored company's public battle with a three-letter agency.

But Kaspersky Principal Security Analyst Vicente Diaz is considering the far larger, and far more complicated, fights that nations might carry on in the digital world.

You Don't Need Stuxnet

In his presentation at RSA, Diaz made a distinction between three kinds of attacks.

The first were exotic attacks, developed and deployed at great expense by nation states.

Think Stuxnet, the complex malware allegedly developed by the U.S. and Israel to physically disable Iranian nuclear enrichment machinery. The second were so-called "middle-class" attacks, which are assembled by knowledgeable teams of hackers.

The third category encompassed all other attacks, usually carried out by individuals with little to no technical knowledge, who purchase malicious payloads and delivery mechanisms from the digital black market.

The problem with complicated nation-state campaigns like Stuxnet is that they make attribution easier. When it comes to determining who is capable of developing and deploying such an attack, "the list of countries is very short," said Diaz. In the future, Diaz predicted, nation states will move away from exotic attacks and focus on middle-class attacks that are as simple and stealthy as possible. "Now you don't need to develop Stuxnet-like malware just to attack," said Diaz. "Ukraine was attacked by BlackEnergy, which is not in the same league as Stuxnet." The key is obtaining the physical and digital infrastructure, like the cables that connect the global Internet. "It's good for cyber espionage but also good for attacking an adversary," said Diaz. "You can use it in an offensive way, or you can use it to get information from the people who are using this infrastructure." As an example, Diaz said that if you control the Internet infrastructure, you can simply snatch passing data rather than having to target specific devices.

This approach sounds similar to the one used by the NSA in its massive data collection operations exposed by Edward Snowden, which exploited the United States' position in the global Internet infrastructure to intercept data traveling around the world.

The Fight for Digital Territory

Diaz believes that the importance of Internet infrastructure will spark conflict between nations. "Control over physical infrastructure is where the next big battles will happen," he said. He pointed to efforts by Brazil to construct its own trans-Atlantic Internet connection, and efforts within Europe to foster the development of Internet business and infrastructure within national borders.
Conflicts over control of the Internet could take many forms, and need not be offensive.
Instead, countries might form alliances to create spheres of influence over the Internet.

For example, Diaz pointed to a diplomatic agreement between the U.S. and China, in which the two countries agreed not to engage in cyber attacks for financial gain. Diaz said this agreement was an example of one such alliance, and hinted that it would have wide-ranging consequences. "Obviously these alleged attacks will probably move to some other country, because they still need to get this data," he said. Digital resources are already playing a role in warfare and politics.

This week saw confirmation from the Department of Defense that the U.S. was bringing cyber capabilities to bear against ISIS.

Also speaking at the RSA conference, Secretary of Defense Ashton Carter declined to go into specifics about these operations, but said they were focused on disrupting ISIS's command and communications capabilities. What Diaz is describing is more like the groundwork for larger operations.
It's also a shift in how diplomacy, as well as warfare, will be carried out since the fiber traveling through a stretch of land (or ocean) may be as a valuable as the land, its people, or its resources to a nation state developing its cyber capabilities. But perhaps the most important point is Diaz's prediction that attacks will simplify, rather than increase, in complexity.
If Diaz is correct, then the kind of cyber attack that worries NSA Director Rogers might be indistinguishable from the everyday work of a hacker and nearly impossible to spot.
Whistleblower's revelations mean vendors make bank

RSA 2016 This year's RSA conference was the busiest on record, with over 40,000 people cramming the halls (and later, bars) of San Francisco, and more than a few of them were raising glasses to NSA whistleblower Edward Snowden. "The Snowden effect has had an undeniable effect on the business," Pravin Kothari, CEO of cloud encryption specialists CipherCloud, told The Reg. "I was on a two-day business trip overseas when the first stories broke, and we had to extend the trip to two weeks because there was so much interest in getting encrypted." As more stories came out, interest in encryption spiked still further, he explained, with a lot of non-US companies realizing quite how exposed they were to surveillance.

The Snowden effect has been very beneficial to security firms' bottom lines. Jon Callas, cofounder and CTO of secure comms company Silent Circle, agreed, telling El Reg that the news of NSA surveillance had had a big effect on sales. However, this only really holds true for non-US sales, and to a lesser extent in the EU. "For the last two or three years, we have been spending time on overseas sales because of the Snowden effect," he said. "It does help with US sales a bit, but the main areas where we see spikes in demand are the Middle East and Africa." He explained that in the US, the NSA revelations weren't as strong a driver, because most of the spying was done overseas, not within America.
Stories of Chinese hacking, the OPM breach, or the Sony breach had a much bigger effect in driving traffic to the firm's website and boosting sales domestically. "It's the personal experience phenomenon," he said. "It's like the difference between reading about dog attacks overseas and having your neighbor down the street savaged." ®
Uncle Sam can't argue against science

Analysis Apple versus the FBI has generated much discussion and conjecture lately. The vast majority of it has centered on the rights and the wrongs, about the loss of privacy, and of the precedent that breaking one iPhone would create. Many are hanging on the blow-by-blow developments for an outcome, to see which side trumps: Apple – and by implication, increasingly, the tech industry – or law enforcement and the government.

But this misses the point and the ultimate outcome: victory for Apple. That's because there is a higher law beyond what FBI director James Comey sought to enforce on Apple last month. It was described almost 20 years ago by Harvard professor Larry Lessig, then largely unknown, in a book called Code and Other Laws of Cyberspace, since updated as Code v2. Lessig called law as defined in computer code "West Coast Law." This is as opposed to "East Coast Law," which is defined by statute. Encryption is one such West Coast Law.
It was defined by Whitfield Diffie and Martin Hellman 40 years ago in a paper called "New Directions in Cryptography." Their Diffie-Hellman protocol brought us the concept of public key cryptography, messages encrypted first with a key everyone knows, then decrypted with a private key controlled by the recipient. Or vice versa. East Coast Law is analog.
It changes and it has exceptions.

Arguments can be made – on either side of a question – that define or change East Coast Law or that shift its interpretation, as happens in courts. West Coast Law, like encryption, is binary.
It's science.
It uses facts that can't be denied or altered through the relative strength or weakness of an argument.
So we have learned from that day to this. Soon after the Diffie-Hellman paper was published, Ron Rivest, Adi Shamir, and Len Adleman created an implementation known by their initials: RSA.

They defied the wishes of the US National Security Agency and published an article on it in Scientific American in 1977. In 1991, programmer Phil Zimmermann wrote a program called Pretty Good Privacy, implementing RSA. Zimmermann launched PGP Inc in 1996, defying attempts by RSA Security (now part of EMC) to claim patent rights over the two-key method, then fighting the US government over rights to export it. The first version of the encrypted Web standard, https, also using Diffie-Hellman keys, was written into Netscape Navigator in 1994.
It evolved into a full Internet specification in 2000.
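The two-key exchange Diffie and Hellman described can be illustrated with deliberately tiny, insecure numbers; real deployments use primes hundreds of digits long, but the mechanism is the same:

```python
# Toy Diffie-Hellman key exchange (illustration only; never use such small numbers).
p = 23   # public prime modulus
g = 5    # public generator

a = 6    # Alice's private value, never transmitted
b = 15   # Bob's private value, never transmitted

A = pow(g, a, p)   # Alice sends A = g^a mod p over the open channel
B = pow(g, b, p)   # Bob sends B = g^b mod p over the open channel

# Each side combines the other's public value with its own secret
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)

assert shared_alice == shared_bob
print(shared_alice)  # prints 2: both now hold a secret an eavesdropper can't derive
```

An observer who sees p, g, A, and B still faces the discrete logarithm problem to recover the shared key, which is the "fact that can't be denied or altered" the article refers to.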

After encrypting its own traffic, Google began preferring the encrypted pages of web sites it indexed late last year. Why did Google do this? Partly in response to the revelations of Edward Snowden, whose document dump in 2013 showed that the NSA has been ignoring privacy routinely ever since 9/11.
Snowden's point was that the government's promises on this issue can't be trusted. Snowden says we can't trust government with our secrets, and we don't have to. You might as well pass a law telling glaciers not to melt. We all want our privacy and security. West Coast Law says the only way you get it is if everyone does. But, Comey says, he just wants Apple to disable PIN protection on one iPhone.

But this, too, is an encryption case.

The PIN serves as a shorter key.

This phone will self-destruct after 10 failures, just like the messages in Mission: Impossible. If Apple unlocks the phone because of terrorism, the district attorney for New York County (Manhattan) alone has 175 Apple devices in his lab that he wants to open, in hopes of solving crimes. And it's not just America.
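The arithmetic behind the PIN-plus-retry-limit design above is simple to sketch (the guess rate below is an assumption for illustration, not a measured figure):

```python
# Back-of-the-envelope: a short PIN survives only because of the retry limit.
digits = 4
keyspace = 10 ** digits          # 10,000 possible 4-digit PINs
tries_allowed = 10               # device wipes itself after 10 failures

# Chance of guessing the PIN before the wipe, trying PINs at random
p_success = tries_allowed / keyspace
print(p_success)  # 0.001, i.e. a 0.1% chance

# Without the limit, exhausting the keyspace is trivial:
# at an assumed ~12 guesses per second, a 4-digit PIN falls in minutes.
seconds = keyspace / 12
print(round(seconds / 60))  # about 14 minutes
```

This is why the FBI wanted the 10-try limit disabled: the PIN itself offers almost no protection once unlimited guessing is possible.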
If Apple broke its own phone's security because of US legal demands, China would demand that right.
So would Russia.
So would every other dictatorship. Many "crimes" being investigated in these countries are political.
If Comey gets his way, then so does Vladimir Putin. This is why Bruce Schneier, a security expert who became an IBM employee last week when his employer was bought by Big Blue, writes that "our national security needs strong encryption." He adds: "I wish I could give the good guys the access they want without also giving the bad guys access, but I can't. If the FBI gets its way and forces companies to weaken encryption, all of us – our data, our networks, our infrastructure, our society – will be at risk." That's West Coast Law in a nutshell.
It's science.
It's binary. Resistance to it is futile. The decision by Judge James Orenstein to deny a government demand against Apple, based on the arguments used in San Bernardino, is thus theater.
So, too, with the House hearing.

Congress could pass a law, and the President could sign it, mandating that all security have a back door, just as was sought in 1991. But even if Tim Cook were not allowed to defy such a demand (he says he will, replacing the PIN with something "even Apple" can't crack), unbreakable security is possible. Which means unbreakable security will exist. Will only criminals and governments have it? Or will you? Will everyone? It's all or nothing.

That's the ruling of West Coast Law. And what of Diffie and Hellman, who launched this ship 40 years ago? They were just awarded the Turing Award, computing's equivalent of the Nobel. Law can't defy science. ®
NEWS ANALYSIS: The EU's finding is a critical first step in allowing data sharing between Europe and the U.S., but a number of review steps remain, and everything depends on U.S. actions. The European Commission issued a draft adequacy decision to the ...
The New York Times is reporting that Obama administration officials are close to agreeing on new rules that would allow the National Security Agency (NSA) to share surveillance information more freely with other federal agencies, including the FBI and the CIA, without scrubbing Americans’ identifying information first. In 2008, President George W. Bush put forth an executive order that said such a change to the rules governing sharing between agencies could occur once procedures had been put in place. When the Obama administration took over, it started "quietly developing a framework” to carry out the proposed change in 2009, according to the Times. For the past decade, the NSA has collected massive amounts of phone metadata, e-mail, and other information from a variety of sources—sometimes directly from the companies that make such communication possible, sometimes through overseas taps on lines that connect to data centers outside of the US.

Currently when an agency wants information on a foreign citizen, it requests that data from the NSA, and the NSA theoretically scrubs it of any incidental references to American citizens who are not being targeted.

This process is known as “minimization.” "The new system would permit analysts at other intelligence agencies to obtain direct access to raw information from the NSA’s surveillance to evaluate for themselves,” the Times reported. Minimization would then theoretically happen at the agency level. Already, a variety of agencies have access to raw data that’s collected by the NSA under the Foreign Intelligence Surveillance Act, which encompasses information collected from any wire on American soil.

But the new guidelines would give those agencies access to the rest of the NSA’s expansive purview. The general counsel for the office of the Director of National Intelligence, Robert Litt, confirmed to the Times that the Obama administration had a 21-page draft outlining the new procedures. However, Litt declined to share the draft with the paper.

The changes can be implemented without Congressional approval under the Reagan-era Executive Order 12333. The Obama administration reportedly sought input last month from the five-person Privacy and Civil Liberties Oversight Board (which has drawn criticism in the past from civil liberties leaders for approving surveillance measures pushed by Obama), and the draft will have to be approved by Director of National Intelligence James Clapper, Attorney General Loretta Lynch, and Defense Secretary Ashton Carter. Ultimately, the goal is to give federal agencies leeway to trawl through NSA data to find worthwhile leads. "That also means more officials will be looking at private messages—not only foreigners’ phone calls and e-mails that have not yet had irrelevant personal information screened out, but also communications to, from, or about Americans that the NSA’s foreign intelligence programs swept in incidentally,” the Times writes. Many of the NSA’s practices have been kept secret, and what we know about the state of surveillance in America is in large part due to leaked documents released by former NSA contractor Edward Snowden.
Since the revelation of those documents in 2013, the Obama Administration ostensibly moved to limit the NSA’s powers but ultimately decided on a new program that would essentially just force telecommunications companies to keep metadata and supply it to the NSA on demand.
So far, however, the judicial system has allowed the NSA to keep its original system of collecting information in place. The Times reported that the new guidelines could go into effect in "months."
DAVIS, Calif.—Ben Wizner, a top attorney at the American Civil Liberties Union, is probably best known as one of the lawyers representing Ed Snowden, the former National Security Agency contractor. On Tuesday, he told Ars that representing the world's most famous whistleblower has consumed a substantial portion of his professional life over the last 2.5 years.

But he framed his passion for civil liberties and fighting surveillance as part of a larger struggle that continues to play out over the proper balance not only between surveillance and privacy, but between surveillance and democracy itself. Wizner was in this college town outside Sacramento to speak at the University of California, Davis law school as part of an ongoing public lecture series on surveillance. (Full disclosure: yours truly spoke as part of the same series last year.) In a 30-minute talk followed by questions from an audience primarily made up of law students, Wizner outlined a history of surveillance in America, going back to the 1971 Citizens' Commission to Investigate the FBI and extending through to the Snowden-era NSA. "This, to me, is what's so frustrating about the current debate between civil liberties and state security," Wizner said during the lecture. "It's become standard to approach the debate as if our challenge is to set the dial at precisely the right place that most efficiently maximizes both values.

But that ignores that the framers of the Constitution already put their thumbs on the scale.

And for good reason.

There's a good reason why, in the 4th Amendment, suspicion of wrongdoing comes before search.

And it's not only because of the presumption that we should generally be left alone—but because of the danger that a government, with enough data about any of us, can find some basis for being suspicious. 'Show me the man, and I will show you the crime,' said Stalin's secret police chief."

Ars had a chance to sit down with him prior to the talk and touch on a range of surveillance-related topics, including the ongoing case in San Bernardino, Snowden, and the best way to think about future dystopias. What follows is a transcript of our conversation, lightly edited for clarity and brevity.

Ars: Unlike prior cases, I feel like what's different about the case in San Bernardino is that it touches on the population as a whole, simply because a lot of people inherently understand: I have a smartphone, and I may or may not want the government to be able to have access to my stuff.

Ben Wizner: I'm not surprised when I see polls that say that Apple should turn over the information to the FBI. What our community has failed to do effectively is to change the framing, so that people understand that security isn't on one side and rights on the other in this kind of dispute.

But that, actually, security is on both sides—different kinds of security. The main reason Apple objects to this kind of order is not the constitutional rights of its customers, but that it is going to make its systems vulnerable in ways that will be exploited by others.
I have been urging colleagues to respond any time anybody asks about the FBI and San Bernardino by saying: the real conversation is about China wanting to unlock the phone of a US Embassy employee who they suspect is a CIA agent—and don’t we want Apple to be able to say it can’t do that? Don’t we want Apple to be able to say that it can’t help China, or other repressive regimes, track the communications of dissidents? And it’s exactly the same question if you move it from one place to another—I think people’s intuitions might flip. So long as the focus is on a terrorism investigation in the US, I think it’s going to be hard to get high levels of support for what Apple is doing. I think that’s true. But what’s amazing to watch is how much the government is essentially contriving test case litigation.

This is the kind of thing that, candidly, groups like the ACLU do. We pick our ideal plaintiffs, we find our moment, and we present the case in a way that is most advantageous to the right we’re trying to uphold.

But I don’t know that I’ve ever seen the government be this transparent. It was reported this week that, a week before the dispute went public, the government recruited amicus counsel for the San Bernardino victims so that they could participate in this litigation.

And if you see the short piece that FBI Director Comey wrote on Lawfare, which is basically about looking the victims in the eye, it’s so emotional and manipulative, and it doesn’t address any of the core arguments against what the FBI is saying. But do any of the victims’ statements mean anything, legally speaking? Everyone in this debate recognizes that the magistrate judge is only one decision maker here.

And there is a much larger battle for hearts and minds that is taking place outside of that courtroom.

And it’s not just the hearts and minds of the public. We’ve seen the battles that have been taking place inside the Obama administration. Remember over the summer when The Washington Post obtained an e-mail from Bob Litt saying: "we may have lost this round, in the administration, of crypto wars, but let’s stay ready, and if there’s another attack, and we can show some link to encryption, let’s live to fight another day?" And sure enough as soon as the Paris terrorist attacks took place, the intelligence community was putting out there that the terrorists were communicating with encrypted phones with the suggestion that if they hadn’t been, this attack could have been prevented.

This really is the government recognizing that, over time, encryption is going to be an obstacle to certain kinds of investigations—it will.

There’s no question about that.

And then finding the most emotionally resonant battlefield on which to have this skirmish. But an amicus from a deceased person’s relative—does that carry any weight legally? It just depends.
If you follow amicus practice in the Supreme Court, it’s not at all uncommon for briefs to be filed by groups of people who have an interest in the case but no particular legal expertise to offer.
In the Texas abortion case that is before the Supreme Court, there is a courageous brief filed by women lawyers who themselves had abortions, describing how that affected their lives and their careers. So does it carry legal weight? Who's to say? But it certainly raises the stakes for the judges that are being asked to adjudicate the dispute. Are you aware of a case that compels a physical lock maker or safe maker to comply with the government? No.

But the case that everyone cites is the 1970s case, New York Telephone.

Everyone is playing the game of analogies.

But I guess we’ll have more to say about this in our brief next week. It’s a really wacky case.
It’s consumed my life for the last week, I’m sure yours too. What did you make of The Guardian’s editorial yesterday? The Guardian tried to be Solomonic and say that no right is absolute.

And this particular request may be reasonable. I think the debate of “is this request reasonable” in isolation is a different question than “if you allow this case to go forward, as the government wants, then what does that then spawn in terms of other cases?” Right.

And even this week you have to think that the FBI is doing a facepalm when it sees Cyrus Vance’s interviews, in which he says he’s sitting on 170 phones that he wants Apple to unlock and that he’s just waiting for the resolution of this case. I imagine every district attorney in the country has that on their minds. Not just in the country.

That’s the key point.
If the FBI were really honest that this was something they only wanted to do in extraordinary circumstances, on very rare occasions—it seems to me that the consensus in the tech community is that it’s achievable, if not cheap.

And that if they brought in the NSA—and here there’s no 4th Amendment question, it’s the government’s phone—and asked them to engineer a solution, they’d be able to do it.

They might not have the resources to be able to do it 170 times a week in New York.

But they certainly could use targeted hacking rather than company conscription to achieve it in this case.

And I think that they have not done that.

And the question that people should really be asking them is: have you asked the NSA for assistance? I actually did send that question to the FBI yesterday.
I haven’t gotten an answer. But the fact that they haven’t done that yet just shows how much more they’re concerned with the precedent than with the actual phone.

That is my real takeaway here: the FBI is trying to get the precedent, not the contents of this phone. What would stop them from doing both? Because they have to make a representation under the All Writs Act to the court that they don’t have alternative means. Part of the analysis is how burdensome is this? And in analyzing how burdensome it is, if Apple could say to the court: look, the FBI could easily get in this other way rather than forcing us to write this code, that would bear on the statutory analysis and, I would say, on the constitutional analysis.
If the FBI is able to get in with NSA assistance, what basis do they have for compelling Apple to do something that it doesn’t want to do? That’s the question people like Snowden and Soghoian and others have been saying needs to be asked of the FBI: has the NSA said it can’t help you? Turning to Snowden: other than through Twitter, has he influenced the way you guys think about this? I communicate with him regularly, and he influences my thinking on all of these issues. My colleagues like Chris Soghoian are pretty confident in their views about these issues.

But certainly I learn a lot from him.

And it’s not like I’m getting any kind of secret information that looks different than what he’s putting out there on Twitter, it’s just that I’m able to have a conversation and I can ask follow-up questions.

And he has the rare quality of being both an extraordinary security technologist and someone who can talk about it to non-tech-savvy types like me.

And Soghoian has that, too.

They’re unicorns. Has he been able to provide better context for you in terms of the relationship between FBI and NSA, in terms of their technical capabilities? On points like that, what I know from him is what the public knows from him. I think Snowden has been pretty open about the technologies that he relies on for communication. SecureChat via OTR. He has endorsed Signal. So there are ways in which he’s able to use that.

For things that are less critical in terms of security—his public appearances—he generally uses Google Hangouts, because it’s easier for the venues and there’s no need for confidential communication when you’re speaking to a room of 1,000 people. So we use different things at different times depending on how sensitive the communication needs to be. Do you have regular times, or does he talk to you whenever he wants? Do you have regular chats? How does that work? I would say that we are frequently in touch with each other. Is his situation intractable? It seems that, presuming the Russian government continues to extend his visa and the charges aren’t dropped, his legal situation could be the same five years from now, 10 years from now, or 50 years from now. Of course that’s a possibility, yeah. I don’t have any better means of predicting that than any ordinary citizen, except that I have been privy to some communications with the government that the ordinary citizen has not been. But I do think that history tends to be kind to people like Snowden and much less kind to the claims of national security damage that are always levied against them.

And I already think that Snowden’s position is considerably stronger than it was a few years ago.

Even former Attorney General Eric Holder recognized the link between Snowden’s actions and historic legislative and judicial reform on these issues.

The disclosures led to the highest awards in journalism and film. He’s won the alternative Nobel Prize, and he’s been short-listed for the Nobel Peace Prize. Over time, as the impact of the disclosures is more and more appreciated and the purported harms are revealed to be so much smoke, it’s going to be more difficult to argue that a felony prosecution is the appropriate response. This last year a majority of the European Parliament voted to call on member states not to extradite Snowden.

That’s not legally binding—every member state makes its own decision for itself.

But it’s still a watershed moment that the body that represents the continent of Europe and all of its democracies has decided that punishment is not the appropriate response. We will continue to make the argument that this is an extraordinary case and is worthy of an extraordinary resolution. Powers like pardon and clemency don’t exist for people who didn’t break the law.

They exist for cases where they did, but there are powerful extenuating circumstances.
I have not been harshly critical of the deal that Gen. David Petraeus got; I do think that his career and service should bear on what an appropriate punishment is.

But I just think that the same kinds of considerations should be extended to people who don’t have friends in high places. Are you able to say if, in Obama’s last year in office, an appeal for clemency or a pardon has been made? All of the major human rights groups in the world—the ACLU, Amnesty International, Human Rights Watch—are involved; whether there will be some other focused push is something we’re thinking about. What exactly is the nature of Snowden’s relationship with the Russian government? He has temporary legal residence, which I guess is the equivalent of a green card here.

But it’s renewable after three years.

Beyond that, his relationship is nothing, as he has said. He has a legal right to be there. He has no other relationship with the Russian government. But surely he must be under watch locally.
I don’t know if your Russian counterpart has connections to the Kremlin, or things of that nature. I don’t know what the question is. I guess what I’m trying to figure out is if there is an understanding beyond the visa that he has been granted. The understanding is that he has a legal right to be there.
I don’t really have any more to say on the subject. He moves around freely. He doesn’t need anybody’s permission to do anything there. He meets with foreign journalists and politicians. His longtime girlfriend lives there with him.
I don’t have any more insight to offer you. As somebody who does communicate with him regularly, I’d love to know more about what he’s like personally.

Are there personal elements of him that aren’t well communicated over Twitter? I was asked this question more before Citizenfour and before he joined Twitter. What I liked so much about Citizenfour is that everybody got to see him up close during a week of maximum stress and see how calm and how funny he is.

And I think Twitter is a place that he is really showing the range of his qualities. What I used to have to say more is: you wouldn’t believe how funny this guy is. His first week on Twitter I was getting lots of questions: who is writing his tweets? No one is asking that anymore. How much of your time is spent as a lawyer dealing with his case versus others? I’m middle management these days.
It’s really true. When you direct a project at the ACLU—in my case, the Speech, Privacy, and Technology Project—your primary job is to supervise that project.

And some of the project directors at the ACLU litigate cases and some just oversee.

But the time that I’ve devoted to Snowden’s case over the last 2.5 years is basically the time I would have spent litigating other cases.
It’s a significant amount of time—though it’s diminishing somewhat as his circle of trusted confidants has grown. Are there other cases besides Snowden and San Bernardino that people like me should be paying more attention to? I think the momentum in state legislatures to enact privacy legislation has proceeded somewhat under the radar. People noticed that California, because it’s a mammoth state, passed CalECPA, but people didn’t realize that California was not the first state to do that.

And people didn’t realize that it was happening in places like Montana and Utah and Colorado before it was happening in places like California. Mostly at the local level, this alliance between civil libertarians and more traditional libertarians, and its ability to impose appropriate restrictions on law enforcement surveillance, is something that deserves a little more attention. One of the things that I find difficult about writing about privacy is that people are worried about the potential for abuse.

But there isn’t always a clear-cut example to point to—to say that scary government agency X abused its authority in this way. Instead, the concern is that, given this volume of data, it could be abused. I think that’s correct.

And it’s not that there are not examples of abuse. Surely there are. We saw through the Snowden documents that the NSA had a codename for some of those abuses—LOVEINT—when its analysts would use their capabilities to track lovers and others. But those cases were self-reported, so we don’t know how widespread they were.

But yes, I think the concerns that we have rest on a theory of power and the experience of history which is that authorities migrate and they are abused. What I mean by migrate is that these authorities that are promulgated as counterterrorism measures and are defended as narrowly necessary to prevent one threat will before long be in the hands of ordinary law enforcement agencies.

And we’ve already seen that in the Snowden documents. You see the FBI and the DEA popping up: "You have that incredible thing—we want access to it!" And think about the dynamic of that.
If there is a major terrorist attack—not on the level of San Bernardino but closer to the level of 9/11—it will most likely be shown that information that could have helped prevent the attack was residing in NSA databases because that’s what happens when you collect on a bulk scale. And the failure to prevent it will be blamed on rules that didn’t allow criminal investigations access to these files and this lock box that is now protecting our privacy, according to the government, is going to be opened up. That’s what happened after 9/11. Remember, what was blamed for 9/11? The wall between law enforcement and intelligence.

And the Patriot Act was written to tear down that wall.

And so I think that if you want to make a realistic assessment of harm from mass surveillance you can’t take a snapshot of today’s practices. You have to have a clear-headed assessment of where these capabilities are heading. So one of them is this mission creep.
It will be a very different world if every cop on the beat can pull out his smartphone and have access to something like an NSA metadata database.

And I worry about chill.
I worry about what it’s going to feel like to live in a world like that.
It’s obviously an abstract concept compared to a Hoover-era blackmail story.

But I do think that people grasp it intuitively.
I think people already express concern about having law enforcement drones flying overhead all the time. You’ve seen 40 states consider legislation for a problem that doesn’t even exist yet. And I think the leap that is going to be required is realizing that there may not be drones yet, but there’s already a cop in everyone’s pocket.

And that they’re already being tracked almost as intrusively as they would be by those cameras, that in some sense we’re already at the bottom of the slippery slope.

And yes there is a lot more that could be done.

But I think you’ve put your finger on a problem, which is that most people don’t experience in their daily life the harms that advocates are worried about. One of the things that I also struggle with is: my feeling about the NSA, the FBI, the Oakland Police Department is that they have a hard job that I probably couldn’t do.

The vast majority of those officers and people are trying to do it to the best of their ability and they’re trying to use whatever tools are at their disposal to do that job and the law is slow to catch up with the technological reality of stingrays and crypto. So they push the envelope as much as they can. I think that’s true.

But I think it’s also important to remember that the abuses of the Hoover era that seem so unfathomable to us today were carried out by people of goodwill, for reasons that they considered to be patriotic and valid.

They were persuaded that there was a threat to the country and that it required them to do things like break up marriages, infiltrate groups of non-violent protesters, and try to get MLK to commit suicide.
I’m not saying that there weren’t individually malicious people including Hoover himself but within these institutions, even ones that purport to be rule-bound, they’re always going to push and they’re always going to believe in the rightness of what they’re doing.

That’s why external oversight is so fundamentally critical. Has there ever been a police agency that didn’t abuse its powers? Has there ever been one? I honestly don’t know the answer to that. Do you worry about the dystopian surveillance future, in 10 years? Like I think that facial recognition will be ridiculous in a decade. Is it because of the number of photos we’re uploading to Facebook and other social networks? Yeah it’s that and just the science is getting better. It’s interesting that Google Glass just face planted. Yeah, but that’s just one way you would do it.

Another way you would do it would be a facial recognition device like a license plate reader. Maybe there’s an app that cops have or maybe there’s a device that sits on the car or on the officer’s uniform. Maybe it’s integrated into the body cam that can compare to a DMV registry. Matching unknown person X against the DMV database.

To me that seems inevitable. Yeah, and that probably each person will walk around with that capability in some way. So that your device will recognize a stranger on the street and link to that person’s social media or other information that may or may not be correct about them. It certainly seems like we’re headed that way.

From the government standpoint, the question is whether bulk collection is inevitable or whether it can be constrained. Should we just—as companies like Palantir want us to—accept the fact that mass surveillance is here to stay and focus all of our energy on limiting access and having tight controls? Or can we persuade government entities that it’s a liability? A security liability and a democratic liability for them to be sitting on these types of capabilities.

And that they need to actually constrain themselves and tie a hand behind their back. So that’s going to be the interesting debate going forward: are we as a society going to tell our security officials, we don’t want you to do these things?

Even though you can, and even though these capabilities might benefit security in some ways.
I’m somewhat encouraged by the debate that we’ve had over the last few years, particularly compared with the decade after 9/11, when it seemed like all anyone had to say was "X will make us safer" and that was the end of the discussion—which is not how a constitutional democracy operates. At heart I’m an optimist, but the longer I spend thinking about this, the more I think the capabilities get better faster than the law can catch up. Then you have the need to have public-minded technologists on the right side, trying to use their skills to go into the spaces where the law may be too slow.

From the government's point of view, the biggest fallout from the Snowden revelations is not the USA Freedom Act, it's the number of communications platforms that are moving to end-to-end encryption.

And the advantage of those kinds of solutions is that they are not geographically bound.

They don’t only help people who live in a democracy, they may also benefit people who are trying to create one. And they may benefit bad guys. Sure.

And I will say this later: the Constitution was written by people who were more worried about a government with too much power than they were about bad guys getting away.

There's no other way to read the Bill of Rights.

At every turn it is making the government’s job deliberately more difficult. Not because they hated government, but because they understood it too well.
Timothy Krause

Apparently the US Army is interested in a zealous interpretation of copyright protection, too. According to the Electronic Frontier Foundation, a Chelsea Manning supporter recently attempted to mail Manning a series of printed EFF articles about prisoner rights.

Those materials were withheld and not delivered to her because, according to the EFF, the correspondence contained “printed Internet materials, including email, of a volume exceeding five pages per day or the distribution of which may violate U.S. copyright laws.” Other materials, including lengthy Bureau of Prisons documents, were allowed through, and so the EFF concludes that "it was potentially copyright concerns that resulted in Manning’s mail being censored." Manning, who is serving a 35-year military prison term for leaking classified military documents to WikiLeaks, has previously had run-ins with military prison authorities over alleged “reading contraband.” On February 11, EFF Executive Director Cindy Cohn wrote to the commandant of the US Disciplinary Barracks (USDB) at Fort Leavenworth, explaining that not only did EFF grant permission for Manning to receive the materials, but that all EFF content is published under a Creative Commons license. On Tuesday, EFF wrote: As of Monday morning, EFF has received no response from the Army explaining the matter or clarifying why the material was withheld. Manning has also not received the documents. We have since put the documents in the mail ourselves. … It is tremendously important to EFF that people who are incarcerated have access to our materials.

For example, our Creative Commons license allows Prison Legal News to regularly republish our work in its periodical, which is widely circulated in corrections facilities nationwide. We also believe that there are many pages on the Internet, freely available to anyone with a Web browser, which would prove edifying to prisoners. We would be deeply concerned by a prison policy that blocked any copyrighted works from the Web being printed and distributed to prisoners, as this would block the overwhelming majority of news articles and academic publications.
In the case of the materials denied to Manning, we hope that the Army made a mistake and does not have a policy of misusing copyright to deny prisoners access to important materials that the general public can freely access. Fort Leavenworth did not immediately respond to Ars’ request for comment.
The geeks are approaching the whole Apple vs. FBI battle over encryption and privacy all wrong.

This is a golden opportunity to get John Q. Public on board regarding data privacy and online security.

But instead, we have a cacophony of conflicting information and noise, and the FBI is winning in the court of public opinion. It's high time Jane Q. Citizen got to see a clear example of how the U.S. government is slowly but surely chipping away at personal privacy under the guise of national security.

And you couldn't have a better company standing up to the government: The one behind some of the most popular consumer electronics devices today.

There's none of the squickiness of Google and its constant slurping of data, or Facebook's desire to collect information about people you know and things you like.

Apple is not just a tech company -- hate it or love it, Apple is indubitably a lifestyle brand. But there is a stark difference in how Apple and the FBI and the Justice Department, along with their allies, are framing the conversation.

And as a result, Apple and the techies are losing John's and Jane's attention by railing about backdoors, encryption, and legal precedents. Those detailed explainers and FAQs do lay out what's at stake.

But it's the FBI that comes off looking reasonable.

The FBI is, it regularly reminds us, trying to find out why two people killed 14 people and injured 22 a few months ago as part of a mass shooting, which it describes as terrorism. So reasonable, in fact, that there's this headline: "San Bernardino terror attack victims' families ask Apple to cooperate with FBI." The side relying on emotions and fear is always going to win against the side carefully crafting logical arguments.
It may be in the nature of technical people to avoid emotions and favor logic, but that's one reason why the FBI is winning the hearts and minds of Americans here.  The thing is, even with all the secret documents that Snowden stole from the NSA, the average user isn't any more concerned about government surveillance today than he or she was three years ago. Sure, it's terrible, but when it comes to user privacy it's still a world of weak passwords, mobile devices with no passcode (or TouchID) enabled, and an overall lack of urgency. So skip the arguments about how if the FBI wins this round, law enforcement will keep coming back with more requests against more devices.  If there is something the Janes and Johns are scared of, it's the foreign other, the faceless enemies sitting in China, Russia, and Iran (why not throw North Korea in the mix, too?).
It's the criminals siphoning money from banks, the nation-state actors stealing personal information from government agencies, and adversaries trying to stop a movie release.  If the FBI gets its way on bypassing this iPhone 5c's protections, what would stop other governments from coming to Apple, Dell, and other companies and asking for help modifying the devices we use to further their own purposes? It won't be the first time a government tried to compel a company to modify technology in the name of national security. Remember BlackBerry?  "While the FBI's request seems to go beyond what other governments have sought from Apple so far, if Apple is forced to develop code to exploit its own phones, it will only be a matter of time before other countries seek to do the same," Jennifer Granick, the director of civil liberties at the Stanford Center for Internet and Society, wrote on the NYU School of Law's Just Security blog. She's right.

And that's a scary enough prospect to justify supporting Apple.

Techies may not like the emotionalism and may consider it FUD.

But it's not FUD.
It truly is scary -- and should be talked about that way.
It’s not only Apple. Hundreds of technology companies large and small are engaged in a historic battle to determine how much access governments can have to your personal information.

This includes Google, Microsoft, and nearly every technology company that has significantly impacted your life over the last two decades. The fight for personal privacy versus the state’s right to know has been a battle for millennia.

Aristotle made the key distinction between the public and private spheres thousands of years ago.

Benjamin Franklin is famously quoted by privacy advocates for saying, “Those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety.” Governments have always tried to erode personal privacy.

They think that in order to protect the state and its citizens, the veil of personal privacy should be pulled back whenever necessary. I understand the impulse to eliminate privacy protections.

At least half of my friends and acquaintances -- even my wife -- can’t understand my passion for the topic.

They say they aren’t doing anything illegal, and that those who marshal legal arguments against the government invading privacy must be hiding something. So let me state some of the concerns that privacy advocates have against an even more intrusive government and see if it persuades you one way or the other.

The surveillance state

First, there’s not much left of your privacy as it stands.

Corporations and government agencies already have far more access to our personal lives than most people would imagine or allow.
If you want to understand the complete picture, read Bruce Schneier’s “Data and Goliath: The Hidden Battles to Collect Your Data and Control the World.” Bruce is no conspiracy theorist, but if you read his well-researched books and blog posts, you might have a hard time telling the difference between what he puts out there and the ramblings of the New World Order crew. One of my favorite anecdotes in “Data and Goliath” is about the father who sues a retailer for sending pregnancy information and sales pitches to his teenage daughter. He had to drop the lawsuit after he learned that the online retailer knew more than he did. We’re unknowingly surveilled and often permanently recorded by dozens of electronic devices each day. You may hate stoplight cameras, but did you know these cameras often record and store every license plate that passes by, whether or not you ran the light? Plus, your car’s GPS and various computers can enable police to know exactly where you’ve been. There’s little our governments don’t already know about us.

They know what you read and buy.

They know where you drive, where you go on the Internet, who you communicate with. The problem is that a society without privacy protections is not a free society.

Although the government may tout extreme, individual circumstances that justify violation of privacy, once a new Rubicon is crossed, it's never uncrossed.
In nearly every instance where governments have been given the legal right to invade our privacy, they have exceeded the given authority, exercising those privacy invasions against far more people and in far more instances than the specific case permitted. Read anything written by James Bamford. His first book, “The Puzzle Palace,” published in 1983, reveals almost everything you might learn from a modern NSA whistleblower.

The privacy abuses cited over three decades ago are still occurring -- at even more alarming levels. When the NSA or another spying agency is caught in an illegal act, the most common response, even after public uproar, is for politicians to legalize those illegal actions retroactively.
It seems nothing any spying agency can do is considered truly illegal anymore.

And they want the ability to do more of it. It's not about the iPhone On the face of it, the Apple case, where the FBI seeks more information about the San Bernardino terrorists, would seem like a small intrusion on personal freedom.

After all, the government wants access to a single device of a known terrorist. What could be the harm in that? You might wonder why Apple or anyone else is against it. In fact, the underlying issue is foundational to our freedom. I routinely travel to countries where simply questioning a leader’s strategy in public is enough to get you locked up for a long time.

These aren’t empty threats. People are picked up in bars and restaurants for voicing disagreements and never heard from again. People are locked in prison for talking smack about their employers on Facebook.
In America, you can be fired for being that stupid, but you won’t be arrested unless you make an illegal threat.
I routinely travel to countries where even your supposedly encrypted communications are recorded. No warrants, no suspicion -- because it can be done. That’s why the Apple case is so important. One small decision can have all sorts of implications.
If the government gets the right to insert backdoors or break encryption on a terrorist’s phone, then that decision can be applied to everyone.
It hurts our personal privacy, it hurts our global competitiveness, and it hurts our survival as a free people. So I applaud Apple, Google, Microsoft, and all the other technology firms for fighting on our behalf.

There’s not a whole lot of our privacy left.

They're trying to protect what little remains.
Cook and Comey vie for public opinion but look to Congress/America for resolution

Analysis: In the latest salvo in a very public war, Apple's CEO and the FBI's director have published letters arguing their cases over gaining access to a locked iPhone. In Apple's corner, Tim Cook sent an all-staff email Monday morning in which he argued that the case represents a "precedent that threatens everyone's civil liberties." FBI director James Comey meanwhile wrote a letter published on Sunday in which he argued the opposite: that the legal argument "is actually quite narrow" and its use would become "increasingly obsolete." The war of words highlights the importance that public opinion is going to have on this critical test case, in which the privacy of the individual is going to be weighed directly against the ability of law enforcement to investigate crimes.

Both sides feel they have a strong case. Cook's email was turned into a public Q&A posted on the Apple website.
In it, he highlights the "outpouring of support we've received from across America" since he formally – and publicly – refused to abide by a court order requiring Apple to write a version of its mobile operating system so the FBI could unlock the phone of San Bernardino shooter Syed Farook.

Public debate

Cook is certain that an open public debate on the matter will come down in Apple's favor, given the increasingly personal information American citizens store on their phones and a general distrust of the federal government. As such, he and Apple are painting the case as one with far-reaching impacts. "This case is about much more than a single phone or a single investigation," he wrote. "At stake is the data security of hundreds of millions of law-abiding people, and setting a dangerous precedent that threatens everyone's civil liberties." The FBI's Comey paints the opposite picture: that this is a rare and extraordinary case. "The San Bernardino litigation isn't about trying to set a precedent or send any kind of message," he wrote. "It is about the victims and justice.

Fourteen people were slaughtered and many more had their lives and bodies ruined. We owe them a thorough and professional investigation under law." Interestingly, even as they tear each other down, both sides recognize that they may have hit an impasse that only a larger public debate can resolve. Cook specifically highlighted a proposal before the US Congress to create a special commission to look at the issue of encryption and privacy, which is at the heart of the matter. He wrote: "We feel the best way forward would be for the government to withdraw its demands under the All Writs Act and, as some in Congress have proposed, form a commission or other panel of experts on intelligence, technology and civil liberties to discuss the implications for law enforcement, national security, privacy and personal freedoms.

Apple would gladly participate in such an effort."

'Awesome new technology that creates a serious tension'

Apple would likely gain more from Congress' general inability to achieve anything due to partisan in-fighting, but the FBI's Comey also lends some weight to the idea of taking the issue out of the boxing ring and into more institutional hands. "Although this case is about the innocents attacked in San Bernardino, it does highlight that we have awesome new technology that creates a serious tension between two values we all treasure – privacy and safety," he wrote. "That tension should not be resolved by corporations that sell stuff for a living.
It also should not be resolved by the FBI, which investigates for a living.
It should be resolved by the American people deciding how we want to govern ourselves in a world we have never seen before." There were, of course, some side-swipes by both sides: the FBI accused Apple of putting marketing ahead of the country's security, and Apple implied that the FBI was either incompetent or disingenuous when it caused the iCloud backup of the phone in question to lock up. But both sides appear to have realized that they are not going to reach a solution between themselves, and an endless public fight will rapidly make both of them look bad.

Journalists: Crucial details in the @FBI v. #Apple case are being obscured by officials. Skepticism here is fair: pic.twitter.com/lEVEvOxcNm
— Edward Snowden (@Snowden) February 19, 2016

Survey

In the first of what will likely be numerous surveys on the matter, reputable survey company Pew Research Center found that the FBI's perspective is slightly more popular. A majority – 51 per cent – of those it spoke to said they felt Apple should unlock the phone, whereas 38 per cent said the company should not, and 11 per cent were undecided. The topic had a surprising level of awareness: 75 per cent of the 1,002 people surveyed were aware of the issue, and 39 per cent said they had heard "a lot" about it. Perhaps more surprisingly, there was almost no difference in opinion between Republicans and Democrats on the issue, with 56 and 55 per cent respectively saying Apple should open the phone. There are obviously all sorts of other factors and questions that may come into play.

Most significant will be whether the case sets a precedent. There isn't credible polling data yet on how that factor would change support for the court order, but it is not hard to imagine that if the FBI's order were taken as a legal precedent – one that could lead to the Bureau being granted access to many more people's phones – the balance of opinion could shift the other way. ®