
Apple is vindicated: it turns out the celebrity accounts were compromised by a phishing attack, not a breach of Apple's data center. In September 2014, news first broke that attackers had somehow gained access to private pictures of a number of high-profile Hollywood celebrities stored in Apple's iCloud service. While justice isn't always swift, on March 15 the U.S. Department of Justice announced that it has charged 36-year-old Ryan Collins of Lancaster, Pa., in connection with a phishing scheme that led to the celebrity iCloud image disclosure. The DOJ stated in a release that Collins ran a phishing scheme to collect usernames and passwords from November 2012 until the beginning of September 2014.

Collins simply sent emails that appeared to be from Apple or Google, requesting usernames and passwords from the targeted victims. Once he had access to the victims' accounts, Collins was able to gather personal information and also downloaded the contents of the victims' Apple iCloud backups. According to the DOJ, Collins was able to gain access to at least 50 iCloud accounts, mostly connected to female celebrities in the entertainment industry.

"By illegally accessing intimate details of his victims' personal lives, Mr. Collins violated their privacy and left many to contend with lasting emotional distress, embarrassment and feelings of insecurity," David Bowdich, the assistant director in charge of the FBI's Los Angeles Field Office, said in a statement. "We continue to see both celebrities and victims from all walks of life suffer the consequences of this crime and strongly encourage users of Internet-connected devices to strengthen passwords and to be skeptical when replying to emails asking for personal information."

In the immediate wake of the iCloud incident in 2014, Apple emphasized that its systems were not directly hacked. Now that the DOJ has charged an individual in the case, it is clear that Apple was in fact accurate: it was a phishing scheme that led to the breach of user information. Security experts contacted by eWEEK were not surprised that phishing was the root cause of the 2014 celebrity information disclosure. "The revelation about the method used to hack the accounts, a phishing scheme, is only half surprising," David Melamed, senior research engineer at CloudLock, told eWEEK. "At the time of the hack, there were multiple allegations that the celeb accounts were protected with weak passwords and some brute-force attack was used to compromise them." Melamed added that the phishing scheme used by the hacker is unfortunately quite common, but the finding does dismiss accusations that iCloud's security was breached. Rob Sadowski, director of marketing at RSA, the Security Division of EMC, said the confirmation that the method used to gain unauthorized access to iCloud was phishing rather than an exploit may lessen some concerns about the security of Apple's iCloud service.
Sadowski added that the iCloud celebrity photo incident has increased the pressure on Apple and other cloud app service providers to offer more convenient strong authentication options to further protect users' accounts and information with something stronger than a username and password. Kevin Bocek, vice president of threat intelligence and security strategy at Venafi, noted that he's not surprised that the celebrity picture hacker was tracked down and arrested. "Apple, as does every other cloud provider, collects significant data on us and readily shares this under court order," Bocek told eWEEK. "It seems even this hacker would be surprised about the amount of data collected that can be used to track down and arrest bad guys." Casey Ellis, CEO and founder of Bugcrowd, is also among those not surprised that the iCloud attacker was caught.

Given how public the incident was and the considerable embarrassment it caused, the DOJ would have invested significant resources into catching the perpetrator, he said. Ellis added that it's good to see that there are penalties for malicious hackers. That said, there is still some longer-term impact as a result of the incident. "Apple implemented a number of changes quite rapidly after the hack and definitely worked to make it harder in the future for an attacker," Ellis told eWEEK. "As for the public, I know for a fact that more of my friends are thinking twice about using photo stream and iCloud post-hack."

Sean Michael Kerner is a senior editor at eWEEK and InternetNews.com.

Follow him on Twitter @TechJournalist.
He got shlebs' passwords by asking for them over email, says DoJ

A 36-year-old US man has admitted hacking into the iCloud and Gmail accounts of celebrities through a long-running phishing attack. Ryan Collins, from Lancaster in Pennsylvania, admitted he had illegally accessed and downloaded images from 50 iCloud accounts and 72 Gmail accounts that he had compromised through phishing attacks run between November 2012 and the beginning of September 2014. According to the DoJ: He sent e-mails to victims that appeared to be from Apple or Google and asked victims to provide their usernames and passwords. When the victims responded, Collins then had access to the victims' e-mail accounts. Collins agreed to plead guilty to one count of unauthorised access to a protected computer to obtain information as part of a plea bargaining deal that means he is likely to face 18 months' imprisonment, said the California US Attorney's Office. Collins collected personal information including nude photographs and video of celebrities.
In some cases, the Feds say, he used a "software program to download the entire contents of the victims' Apple iCloud backups". The charges against Collins arose from an investigation into the leaks of photographs of numerous female celebrities including Jennifer Lawrence in September 2014. However, investigators say they have not uncovered any evidence linking Collins to the actual leaks, or showing that he shared the information with third parties. The hack, which became known as "Celebgate," followed the leak of celebrity nude pics to notorious image board 4chan back in September 2014.

At the time, an iCloud security breach was blamed but now we know that phishing was also in play. The FBI investigation into Celebgate is ongoing, as the DoJ statement on Collins' guilty plea explains. Court papers related to the prosecution of Collins can be found here. ®
While many tech companies, cryptographers, and privacy advocates have lined up publicly behind Apple in its ongoing fight with federal prosecutors, the company now has an unexpected ally: a San Bernardino man whose wife was shot and severely injured during the December 2015 terrorist attack. On Thursday, Apple published Salihin Kondoker's letter to the federal judge overseeing the case. In the letter, Kondoker describes how his wife, a San Bernardino County Health Department employee, was shot three times during the attack but survived. Kondoker describes himself as an IT consultant for Pacific Gas & Electric, a public utility that serves much of California. As he wrote: When I first learned Apple was opposing the order I was frustrated that it would be yet another roadblock.

But as I read more about their case, I have come to understand their fight is for something much bigger than one phone.

They are worried that this software the government wants them to use will be used against millions of other innocent people.
I share their fear. I support Apple and the decision they have made.
I don’t believe Tim Cook or any Apple employee believes in supporting terrorism any more than I do.
I think the vicious attacks I’ve read in the media against one of America’s greatest companies are terrible. He went on to explain that his wife also had a county-issued iPhone, noting that both the "iCloud account and the carrier account" were controlled and paid for by the county. "This was common knowledge among my wife and other employees," Kondoker continued. "Why then would someone store vital contacts related to an attack on a phone they knew the county had access to? [The terrorists] destroyed their personal phones after the attack.

And I believe they did that for a reason." As he concluded: Finally, and the reason for my letter to the court, I believe privacy is important and Apple should stay firm in their decision. Neither I, nor my wife, want to raise our children in a world where privacy is the tradeoff for security.
I believe this case will have a huge impact all over the world. You will have agencies coming from all over the world to get access to the software the FBI is asking Apple for.
It will be abused all over to spy on innocent people. America should be proud of Apple. Proud that it is an American company and we should protect them not try to tear them down. I support them in this case and I hope the court will too. Last month, Apple CEO Tim Cook reiterated the company’s firm commitment to privacy and its resolve to fight the unprecedented court order.
If the order stands up to legal challenges, Apple would be forced to create a new customized iOS firmware that would remove the passcode lockout on a seized iPhone 5c as part of the ongoing investigation.

Doing so would allow federal investigators to brute-force the passcode in order to gain access to the phone's content. Last week, Apple filed its formal legal response and set the stage for an important court hearing in nearby Riverside later this month.

Earlier this week, a different judge in a related case ruled in favor of Apple, saying that the company did not have to help the government even if it was able to do so. Both the government and Apple are set to appear before US Magistrate Judge Sheri Pym in Riverside on March 22.
The director of the Federal Bureau of Investigation has conceded it was a mistake to ask San Bernardino County to reset the password of an iCloud account that had been used by gunman Syed Farook. Changing the password to the account prevented the phone from making a backup to an iCloud account, which Apple could have accessed without bypassing the encryption and security settings on the phone. "As I understand it from the experts, there was a mistake made in that 24 hours after the attack where the county, at the FBI's request, took steps that made it impossible later to cause the phone to backup again to the iCloud," James Comey told the House Committee on the Judiciary in Washington, D.C., on Tuesday. But Comey said the issue would still have ended where it is today, because the iCloud backup would not have provided the bureau with all of the information from the smartphone.
On Wednesday, an Arizona county attorney's office announced that it will immediately halt "providing iPhones as an option for replacement or upgrades for existing employees," citing the current legal battle between Apple and the Department of Justice. Last week, Apple CEO Tim Cook again reiterated the company's firm commitment to privacy and its resolve to fight a new court order issued earlier this month.
If the order stands up to legal challenges, Apple would be forced to create a new customized iOS firmware that would remove the passcode lockout on a seized iPhone as part of the ongoing San Bernardino terrorism investigation. Maricopa County, the nation’s fourth most populous county, which encompasses Phoenix and the surrounding area, is also well-known for its very conservative sheriff, Joe Arpaio. "Apple’s refusal to cooperate with a legitimate law enforcement investigation to unlock a phone used by terrorists puts Apple on the side of terrorists instead of on the side of public safety," County Attorney Bill Montgomery said in a statement. "Positioning their refusal to cooperate as having anything to do with privacy interests is a corporate PR stunt and ignores the Fourth Amendment protections afforded by our Constitution." The agency specified that it has 564 smartphones, and of those, 366 are iPhones. "If the potential for unauthorized access to an encryption key is truly motivating Apple’s unwillingness to assist in downloading information from specific iPhones, then let’s define the problem in those terms and work on that concern," Montgomery added. "Otherwise, Apple is proving indifferent to the need for evidence to hold people accountable who have harmed or intend to harm fellow citizens."
Since iOS 8 was released in September 2014, Apple has encrypted the local storage of all iPhones.

That’s not news, but it’s become newly relevant since the company and the FBI started a very loud, very public fight about the data stored on a particular iPhone. Privacy advocates have praised Apple’s commitment to full-device encryption by default, and after a false start last year, all new Android phones shipping with version 6.0 or higher should be encrypted by default as well.
It’s an effective tool for keeping thieves from grabbing your data even if they can take your phone. If you're looking for comprehensive privacy, including protection from law enforcement entities, there’s still a loophole here: iCloud.

Apple encourages the use of this service on every iPhone, iPad, and iPod Touch that it sells, and when you do use the service, it backs up your device every time you plug it into its power adapter within range of a known Wi-Fi network. iCloud backups are comprehensive in a way that Android backups still aren’t, and if you’ve been following the San Bernardino case closely, you know that Apple’s own legal process guidelines (PDF) say that the company can hand iMessages, SMS/MMS messages, photos, app data, and voicemail over to law enforcement in the form of an iOS device backup (though some reports claim that Apple wants to strengthen the encryption on iCloud backups, removing the company's ability to hand the data over to law enforcement). For most users, this will never be a problem, and the convenience of iCloud backups and easy preservation of your data far outweigh any risks.

For people who prefer full control over their data, the easiest option is to stop using iCloud and use iTunes instead.

This, too, is not news, and in some ways is a regression to the days before iOS 5 when you needed to use a computer to activate, update, and back up your phone at all.

But there are multiple benefits to doing local backups, so while the topic is on everyone's mind we'll show you how to do it (in case you don't know) and what you get from it (in case you don't know everything).

How to do it

[Screenshots: the relevant section of iTunes (you'll definitely want to encrypt your local backup, but whether you want to shut iCloud backups off entirely is up to you), backing up to the computer, and the pane where you manage backups. It'll show you the name of the device and the timestamp as well as a padlock icon to signify that the backups are encrypted.]

First, obviously, you'll need iTunes.
If you have OS X, it’s pre-installed; if you have Windows, it’s over here.
Version 12.3 is the first to support iOS 9, and it will run on OS X 10.8 or Windows 7 and anything newer.

Connect your phone to your computer with your Lightning cable and, if you’ve never connected them before, tell both the computer and the phone to trust each other when prompted. From the summary screen, the next thing you’ll want to do is check the “Encrypt iPhone backup” box and set a password—ideally, this would be a unique password separate from your account password or your iOS passcode. Whatever it is, don’t forget it, because if you do, you won’t be able to restore your backup later. Hit the big Back Up Now button, wait for the backup to finish, and you’re done. You can back your phone up to your computer and to iCloud if you want.
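If you want to confirm on disk that a finished backup really is encrypted, the backup folder's Manifest.plist carries an IsEncrypted flag that you can read with a few lines of Python. This is a minimal sketch: the Manifest.plist key matches the standard iTunes backup layout, but treat the example backup path in the comment as an assumption that can vary by iTunes version and platform.

```python
import plistlib
from pathlib import Path

def backup_is_encrypted(backup_dir: str) -> bool:
    """Read the IsEncrypted flag from a backup's Manifest.plist."""
    manifest = Path(backup_dir) / "Manifest.plist"
    with open(manifest, "rb") as f:
        data = plistlib.load(f)  # handles both XML and binary plists
    return bool(data.get("IsEncrypted", False))

# On a Mac, iTunes typically stores backups under (assumption; the
# location differs on Windows):
#   ~/Library/Application Support/MobileSync/Backup/<device-udid>/
```

Point the function at one of the per-device folders and it returns True only when the padlock icon in iTunes would, too.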

This isn’t an all-or-nothing proposition, and there are non-privacy-related reasons for doing a local backup even if you continue to do iCloud backups.
If you don't want your backups on iCloud at all, you can click the "This computer" button to switch backups to local only, or you can turn iCloud backups off from within the settings on your iPhone.

Also remember to delete old backups from iCloud if you don’t want them up there, since they’ll stay there even if you turn backups off, and Apple won’t keep copies of your stuff on its servers after you’ve deleted it. To manage and delete your backups locally, you’ll need to go to the iTunes preferences and hit the Devices tab.

All backups for all devices along with timestamps and lock icons (to denote an encrypted backup) will be listed here, and you can delete them if you're running out of drive space or if you just don't need them anymore.

Why do local, encrypted backups?

Privacy is definitely one reason to use local backups; if your encrypted phone backup is stored on your encrypted laptop that is itself protected with a strong password, there's very little chance that anyone without the right credentials can get access to anything. There are also benefits when you're restoring that backup to your iPhone.

As Apple’s page on encrypted iTunes backups outlines, encrypted local backups are the only ones that contain saved account passwords, Wi-Fi settings, browsing history, and data from the Health app.

Apple doesn’t want this data on its servers for security and privacy reasons, and it’s not stored in unencrypted local backups for the same reason. Use encrypted local backups, and you get that info back if you need to do a restore. It also helps if you’re upgrading to a new phone or using a loaner or replacement phone. When you restore an iCloud backup to a phone or tablet that’s not the phone or tablet you backed it up from, you don’t lose any of your photos or iMessage history or anything like that, but you do lose the credentials for e-mail accounts and any other apps that require authentication. This makes it so that someone with the password for your Apple ID couldn’t restore one of your backups to his or her own phone and gain access to every e-mail account and app that you’ve signed into on your own phone.

But it also makes it a hassle to move over to a new iPhone or one AppleCare just replaced.

Encrypted backups retain and restore that account information even if you’re moving to a different phone. Encrypted local backups won’t be the best choice for everybody. You give up convenience, as is usually the case when you make security and privacy a priority. You need to remember to connect your phone to your computer every day or two, you need to devote local storage space to the backups, and you won’t have access to those backups if anything happens to the computer.
It’s up to you to decide what balance of privacy and security works for you.
Apple is right to push back against government efforts to undermine iPhone security. This week, the notoriously secretive Apple went up against the FBI when the agency requested that the company help break open an iPhone 5c used by the San Bernardino shooters.
Instead, Tim Cook released a letter stating publicly that Apple believed creating a special tool to disable security features on iPhones would set a dangerous precedent. And he was right to do so.

Knock, Knock

To be clear: The FBI is not specifically asking that Apple provide an always-accessible backdoor into everyone's iPhone. What it wants Apple to do is construct a special version of iOS that would bypass the time delay required by iOS between failed attempts and disable a setting that wipes iPhones after 10 failed attempts.

This would allow agents to brute force—or try lots and lots of wrong passwords until stumbling across the right one—the phone's passcode. (It's been suggested that creating a super-long passcode might slow down that process.) The argument that law enforcement and the intelligence community are responsible enough to handle powerful tools like these has been around for a long time.
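To put rough numbers on why those protections matter, here is a back-of-envelope sketch in Python. The ~80 ms per guess figure is an assumption drawn from the per-attempt key-derivation cost Apple has described for iOS; with the inter-attempt delays and auto-wipe disabled, a short numeric passcode falls quickly, while a longer alphanumeric one does not.

```python
# Worst-case brute-force time for an iOS passcode, assuming ~80 ms of
# key-derivation work per guess (assumption) and that the escalating
# delays and 10-attempt wipe have been disabled.

SECONDS_PER_ATTEMPT = 0.08  # assumed per-guess cost

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Seconds needed to try every possible passcode of this length."""
    return (alphabet_size ** length) * SECONDS_PER_ATTEMPT

print(f"4-digit PIN:      {worst_case_seconds(10, 4) / 60:.0f} minutes")
print(f"6-digit PIN:      {worst_case_seconds(10, 6) / 3600:.0f} hours")
print(f"6-char a-z0-9:    {worst_case_seconds(36, 6) / 86400 / 365:.1f} years")
```

Under these assumptions a four-digit PIN falls in about 13 minutes and a six-digit PIN in under a day, while a six-character lowercase-alphanumeric passcode takes years, which is why a super-long passcode slows the attack even with the software protections stripped out.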

The FBI and others have talked about the risk of "going dark," where communications will be carried out via encrypted services that are inaccessible to investigators or surveillance tools. The old adage is that a backdoor for the good guys is a backdoor for the bad guys; the safest way to keep people out is to not give them a way in.
It's the argument that Nico Sell, co-founder and co-chairman of secure messaging service Wickr, made when she was approached by FBI agents to put a backdoor in it.

But it's far from just a hypothetical argument. Broken LocksTake TSA-compliant luggage locks. When you buy one from the store, it's designed to accept one of several possible master keys in the hands of TSA agents.

The idea is that this allows the right people—the TSA inspectors—to open your luggage without having to cut off locks and then safely lock the baggage again. Only you and the inspectors should be able to open it. It's a nice idea, but it only works as long as sole access to the master keys is restricted to the right people.

The good guys.

But these keys were posted online and made into 3D printable objects, providing access to everyone: good guys and bad guys. It's this scenario that Apple cites as its primary reason for fighting the FBI's court order.
If Apple created a special version of iOS and used it to allow the phone in question to be unlocked, it might not stay under the company's control for very long.
If it got loose, it could undermine the hard work Apple has put in developing a smart, secure phone.
If it exists at all, Apple could be compelled to use it again, and again, and again. To be fair, Apple already spends a good deal of time and effort responding to court orders and investigators' requests.

The New York Times reports that the company handed over the shooter's iPhone backup files that were stored on iCloud. When PCMag recently looked closer at encryption and how Apple stores our information, we found that as long as it is stored on Apple's servers it is potentially readable.

But Apple is making it clear that it is only willing to go so far, and developing custom intrusion tools for the FBI is apparently the limit. Security for AllAs our devices become more and more personal, it is no surprise that they'll be targeted by law enforcement and intelligence agencies.

But that's no excuse for the FBI, or anyone else, to weaken existing security tools and claim that digital privacy is the exclusive realm of those in power. Which is, effectively, what it's doing. Back in 2014, FBI director James B. Comey addressed the crowd at the RSA Conference. When it came to surveillance and searches, particularly of the digital kind, he said, "Our goal is to be surgical and precise in what we're looking for, and do whatever we can to protect privacy rights and competitive advantage." Building a magic key for iPhones, or preventing the widespread use of encryption, would do neither. If the FBI wants to get into an iPhone, or any other secure device, it can develop the technology itself. Security experts are often telling me that if someone wants to break into a phone and has physical access to it, they will eventually succeed.
I'm confident that if the FBI rolled up its sleeves, it would get what it's looking for.
If accused murderer and bath salts enthusiast John McAfee thinks he can pull off cracking an iPhone, surely the FBI can, too.
Cook and Comey vie for public opinion but look to Congress and the American people for resolution

Analysis In the latest salvo in a very public war, Apple's CEO and the FBI's director have published letters arguing their cases over gaining access to a locked iPhone. In Apple's corner, Tim Cook sent an all-staff email Monday morning in which he argued that the case represents a "precedent that threatens everyone's civil liberties." FBI director James Comey meanwhile wrote a letter published on Sunday in which he argued the opposite: that the legal argument "is actually quite narrow" and its use would become "increasingly obsolete." The war of words highlights the importance that public opinion is going to have on this critical test case, where the privacy of the individual is going to be weighed directly against the ability of law enforcement to investigate crimes.

Both sides feel they have a strong case. Cook's email was turned into a public Q&A posted on the Apple website.
In it, he highlights the "outpouring of support we've received from across America" since he formally – and publicly – refused to abide by a court order that required Apple to write a version of its mobile operating system so the FBI could unlock the phone of San Bernardino shooter Syed Farook.

Public debate

Cook is certain that an open public debate on the matter will come down in Apple's favor, given the increasingly personal information American citizens store on their phones and a general distrust of the federal government. As such, he and Apple are painting the case as one that has far-reaching impacts. "This case is about much more than a single phone or a single investigation," he wrote. "At stake is the data security of hundreds of millions of law-abiding people, and setting a dangerous precedent that threatens everyone's civil liberties." The FBI's Comey paints the opposite picture: that this is a rare and extraordinary case. "The San Bernardino litigation isn't about trying to set a precedent or send any kind of message," he wrote. "It is about the victims and justice.

Fourteen people were slaughtered and many more had their lives and bodies ruined. We owe them a thorough and professional investigation under law." Interestingly, both sides recognize in the middle of tearing down the other side that they may have hit an impasse that will require a large public debate to decide. Cook specifically highlighted a proposal before the US Congress to create a special commission to look at the issue of encryption and privacy, which is at the heart of the matter. He wrote: "We feel the best way forward would be for the government to withdraw its demands under the All Writs Act and, as some in Congress have proposed, form a commission or other panel of experts on intelligence, technology and civil liberties to discuss the implications for law enforcement, national security, privacy and personal freedoms.

Apple would gladly participate in such an effort."

'Awesome new technology that creates a serious tension'

Apple would likely gain more from Congress' general inability to achieve anything due to partisan in-fighting, but the FBI's Comey also lends some weight to the idea of taking the issue out of the boxing ring and into more institutional hands. "Although this case is about the innocents attacked in San Bernardino, it does highlight that we have awesome new technology that creates a serious tension between two values we all treasure – privacy and safety," he wrote. "That tension should not be resolved by corporations that sell stuff for a living.
It also should not be resolved by the FBI, which investigates for a living.
It should be resolved by the American people deciding how we want to govern ourselves in a world we have never seen before." There were, of course, some side-swipes by both sides: the FBI accusing Apple of putting marketing ahead of the country's security, and Apple implying that the FBI is either incompetent or disingenuous when it caused the iCloud back-up of the phone in question to lock up. But both sides appear to have realized that they are not going to reach a solution between themselves, and an endless public fight is going to rapidly make both of them look bad.

Journalists: Crucial details in the @FBI v. #Apple case are being obscured by officials. Skepticism here is fair: pic.twitter.com/lEVEvOxcNm — Edward Snowden (@Snowden) February 19, 2016

Survey

In the first of what will likely be numerous surveys on the matter, reputable survey company Pew Research Center found that the FBI's perspective is slightly more popular. A majority – 51 per cent – of those it spoke to said they felt Apple should unlock the phone, whereas 38 per cent said the company should not, and 11 per cent were undecided. The topic had a surprising level of awareness: 75 per cent of the 1,002 people it spoke to were aware of the issue and 39 per cent said they had heard "a lot" about it. Perhaps more surprisingly, there was almost no difference in opinion between Republicans and Democrats on the issue, with 56 and 55 per cent respectively saying Apple should open the phone. There are obviously all sorts of other factors and questions that may come into play.

Most significant will be whether the case comes to represent a precedent. There isn't credible polling data yet on how that would change support for the court order, but it is not hard to imagine that if the FBI's order were taken to be a legal precedent, leading to it being granted access to many more people's phones, the balance of opinion could shift the other way. ®
If you are in favor of Apple's staunch resistance to the government, you may be interested to join a rally on Tuesday, February 23 at 5:30pm local time at an Apple Store near you. Last Tuesday, Fight for the Future, an advocacy group, quickly organized a pro-Apple rally at the Apple Store on Stockton Street in downtown San Francisco. Representatives from the Electronic Frontier Foundation and a few dozen people showed up on short notice, so the groups are expanding their efforts. The new rallies promise events in Hong Kong, Munich, London, and many cities around the United States, including Anchorage, San Diego, New York, and Minneapolis. On Monday morning, Apple CEO Tim Cook again reiterated the company's firm commitment to privacy and its resolve to fight a new court order issued last week.
If the order stands up to legal challenges, Apple would be forced to create a new customized iOS firmware that would remove the passcode lockout on a seized iPhone as part of the ongoing San Bernardino terrorism investigation. FBI Director James Comey wrote in an op-ed published Sunday evening: Although this case is about the innocents attacked in San Bernardino, it does highlight that we have awesome new technology that creates a serious tension between two values we all treasure: privacy and safety.

That tension should not be resolved by corporations that sell stuff for a living.
It also should not be resolved by the FBI, which investigates for a living.
It should be resolved by the American people deciding how we want to govern ourselves in a world we have never seen before.
A key justification for last week's court order compelling Apple to provide software the FBI can use to crack an iPhone belonging to one of the San Bernardino shooters is that there's no other way for government investigators to extract potentially crucial evidence from the device. Technically speaking, there are ways for people to physically pry the data out of the seized iPhone, but the cost and expertise required and the failure rate are so great that the techniques aren't practical. In an article published Sunday, ABC News lays out two of the best-known techniques.

The first one is known as decapping.
It involves removing the phone’s memory chip and dissecting some of its innards so investigators can read data stored in its circuitry. With the help of Andrew Zonenberg, a researcher with security firm IOActive, here's how ABC News described the process: In the simplest terms, Zonenberg said the idea is to take the chip from the iPhone, use a strong acid to remove the chip’s encapsulation, and then physically, very carefully drill down into the chip itself using a focused ion beam.

Assuming that the hacker has already poured months and tens of thousands of dollars into research and development to know ahead of time exactly where to look on the chip for the target data -- in this case the iPhone's unique ID (UID) -- the hacker would, micron by micron, attempt to expose the portion of the chip containing exactly that data. The hacker would then place infinitesimally small "probes" at the target spot on the chip and read out, literally bit by bit, the UID data.

The same process would then be used to extract data for the algorithm that the phone normally uses to "tangle" the UID and the user's passkey to create the key that actually unlocks the phone. From there the hacker would load the UID, the algorithm and some of the iPhone's encrypted data onto a supercomputer and let it "brute force" attack the missing user passkey by simply trying all possible combinations until one decrypts the iPhone data. Since the guessing is being done outside the iPhone's operating system, there's no 10-try limit or self-destruct mechanism that would otherwise wipe the phone. But that’s if everything goes exactly right.
If at any point there's even a slight accident in the decapping or attack process, the chip could be destroyed and all access to the phone's memory lost forever. A separate researcher told ABC News it was unlikely the decapping technique would succeed against an iPhone.
Instead, it would likely cause the data to be lost forever.
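The offline guessing step described above can be sketched in miniature. Everything here is a hypothetical stand-in (the real UID is fused into the iPhone's silicon, and the real key-derivation function is far slower and tied to the hardware AES engine), but it illustrates why no 10-try limit or self-destruct applies once the secrets leave the phone:

```python
import hashlib

# Hypothetical stand-in: on a real iPhone the UID is fused into the chip
# and cannot be read out by software at all.
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_key(uid, passcode):
    """'Tangle' the hardware UID with the user's passcode (toy KDF;
    the real one is far more expensive)."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 1000)

def brute_force(uid, target_key):
    """Try every 4-digit passcode. No 10-try limit or auto-wipe applies,
    because the guessing happens outside the phone's operating system."""
    for n in range(10_000):
        guess = f"{n:04d}"
        if derive_key(uid, guess) == target_key:
            return guess
    return None

# Simulate: the victim's passcode is 7351; the attacker holds only the
# key material read, bit by bit, off the decapped chip.
stolen_key = derive_key(DEVICE_UID, "7351")
print(brute_force(DEVICE_UID, stolen_key))  # prints 7351
```

A four-digit space falls in seconds even on a laptop; the "supercomputer" in the ABC News account is only needed because the real derivation function is deliberately slow.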

A slightly less risky alternative is to use infrared laser glitching.

That technique involves using a microscopic drill bit to pierce the chip and then using an infrared laser to access UID-related data stored on it. While the process may sound like it was borrowed from a science fiction thriller, variations of it have been used in the real world.
In 2010, for instance, hardware hacker Chris Tarnovsky developed an attack that completely cracked the microcontroller used to lock down the Xbox 360 game console. His technique used an electron microscope called a focused ion beam workstation (then priced at $250,000 for a used model) that allowed him to view the chip at the nanometer scale. He could then manipulate its individual wires using microscopic needles. While such techniques are technically possible against the iPhone, in this case their practicality is severely lacking.

For one thing, the chances of permanently destroying the hardware are unacceptably high.

And for another, the long and extremely costly hacks would have to be carried out from scratch on any additional iPhones government investigators wanted to probe. By contrast, the software a federal magistrate judge has ordered Apple to produce would work against virtually all older iPhones with almost no modifications. Yes, Apple would have to alter the digital signature to make the software run on different devices, but that would require very little investment. More importantly, the software Apple provided in the current case would all but guarantee that Apple would be expected to provide similar assistance in future cases.

And even when a suspect's iPhone uses "secure enclave" protections not available on the 5C model in this case, the firmware running on the underlying chips can be updated.

Given the precedent that would be set in the current case, it wouldn't be much of a stretch for a court to require Apple to augment the software with functions for bypassing Secure Enclave protections. The process laid out in Sunday's article is interesting, and it shows that it may be technically possible for the FBI to extract the data stored on the seized iPhone without Apple's assistance.

But for the reasons laid out, it will never be seriously considered, let alone used.
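The 10-try limit and self-destruct mechanism that make on-device guessing impractical, and that the ordered firmware would disable, can be modeled as a simple counter. This is a toy sketch, not Apple's implementation (real iOS also inserts escalating delays between attempts), but it shows why the right passcode stops helping once the limit is hit:

```python
class PasscodeLock:
    """Toy model of the on-device passcode check: after 10 wrong guesses
    the data-protection key is discarded, rendering the data unreadable."""

    MAX_ATTEMPTS = 10

    def __init__(self, passcode):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False  # the encryption key is gone; nothing can succeed
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self.wiped = True  # effectively erases the data-protection key
        return False

lock = PasscodeLock("7351")
for n in range(11):                 # a naive on-device brute force
    lock.try_unlock(f"{n:04d}")
print(lock.wiped)                   # prints True: 10 failures triggered the wipe
print(lock.try_unlock("7351"))      # prints False: the right passcode no longer helps
```

The custom firmware the court ordered would, in effect, remove the failure counter and the wipe, which is precisely what makes offline extraction unnecessary if Apple complies.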
Apple CEO Tim Cook has reiterated the company’s firm commitment to privacy and its resolve to fight a new court order issued last week.
If the order stands up to legal challenges, Apple would be forced to create a new customized iOS firmware that would remove the passcode lockout on a seized iPhone as part of the ongoing San Bernardino terrorism investigation. Early Monday morning, Cook released a letter sent to employees and published a Q&A on the issue.

In the letter, which Apple provided to Ars, the CEO wrote:

Some advocates of the government’s order want us to roll back data protections to iOS 7, which we released in September 2013. Starting with iOS 8, we began encrypting data in a way that not even the iPhone itself can read without the user’s passcode, so if it is lost or stolen, our personal data, conversations, financial and health information are far more secure. We all know that turning back the clock on that progress would be a terrible idea. Our fellow citizens know it, too. Over the past week I’ve received messages from thousands of people in all 50 states, and the overwhelming majority are writing to voice their strong support. One email was from a 13-year-old app developer who thanked us for standing up for "all future generations." And a 30-year Army veteran told me, "Like my freedom, I will always consider my privacy as a treasure."

In the Q&A, Apple wrote:

Could Apple build this operating system just once, for this iPhone, and never use it again? The digital world is very different from the physical world.
In the physical world you can destroy something and it’s gone.

But in the digital world, the technique, once created, could be used over and over again, on any number of devices. Law enforcement agents around the country have already said they have hundreds of iPhones they want Apple to unlock if the FBI wins this case.
In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks. Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals.

As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks. Again, we strongly believe the only way to guarantee that such a powerful tool isn’t abused and doesn’t fall into the wrong hands is to never create it. The document closed with a call for Congress to "form a commission or other panel of experts on intelligence, technology, and civil liberties to discuss the implications for law enforcement, national security, privacy, and personal freedoms.

Apple would gladly participate in such an effort." A federal court hearing, which Ars will attend, has been scheduled for March 22 in Riverside, California, to address Apple’s legal challenges.