The custom firmware that the FBI would like Apple to produce in order to unlock the San Bernardino iPhone would be the most straightforward way of accessing the device, allowing the federal agency to rapidly attempt PIN codes until it found the one that unlocked the phone. But it's probably not the only way to achieve what the FBI wants.

There may well be approaches that don't require Apple to build a custom firmware to defeat some of the iPhone's security measures. The iPhone 5c used by the San Bernardino killers encrypts its data using a key derived from a combination of an ID embedded in the iPhone's processor and the user's PIN.

Assuming that a 4-digit PIN is being used, that's a mere 10,000 different combinations to try out. However, the iPhone has two protections against attempts to try every PIN in turn.

First, it inserts delays to force you to wait ever longer between PIN attempts (up to one hour at its longest).
Second, it has an optional capability to delete its encryption keys after 10 bad PINs, permanently destroying access to any encrypted data. The FBI would like to use a custom firmware that allows it to attempt PIN after PIN without triggering either of these protections.
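To put numbers on those two protections: with a four-digit PIN there are only 10,000 candidates, but the escalating delays dominate the cost. The sketch below uses the delay schedule commonly reported for iOS (no delay on early attempts, then 1, 5, 15, and 60 minutes); the exact schedule is an assumption here, not something stated in this article.

```python
# Rough worst-case timing for brute-forcing a 4-digit PIN on-device,
# assuming an escalating delay schedule (an approximation of iOS's
# reported behavior, not an official specification).
def delay_after(attempt: int) -> float:
    """Delay in minutes imposed after the given failed attempt."""
    if attempt < 5:
        return 0.0       # first few attempts: no delay
    if attempt == 5:
        return 1.0
    if attempt == 6:
        return 5.0
    if attempt in (7, 8):
        return 15.0
    return 60.0          # attempt 9 onward: one hour each

keyspace = 10 ** 4       # 4 decimal digits -> 10,000 PINs
total_minutes = sum(delay_after(n) for n in range(1, keyspace + 1))
print(f"{keyspace} PINs, worst case ~{total_minutes / 60 / 24:.0f} days of delays")
```

Even ignoring the time to type each PIN, the delays alone add up to more than a year in the worst case, which is why defeating them matters so much to the FBI.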

This custom firmware would most likely be run using the iPhone's DFU mode.

Device Firmware Update (DFU) mode is a low-level last resort mode that can be used to recover iPhones that are unable to boot.

To use DFU mode, an iPhone must be connected via USB to a computer running iTunes. iTunes will send a firmware image to the iPhone, and the iPhone will run that image from a RAM disk.

For the FBI's purposes, this image would include the PIN-attack routines to brute-force the lock on the device. Developing this firmware should not be particularly difficult—jailbreakers have developed all manner of utilities to build custom RAM disks to run from DFU mode, so running custom code from this environment is already somewhat understood—but there is a problem.

The iPhone will not run any old RAM disk that you copy to it.
It first verifies the digital signature of the system image that is transferred. Only if the image has been properly signed by Apple will the phone run it. The FBI cannot create that signature itself. Only Apple can do so.

This also means that the FBI cannot develop the code itself.

To test and debug the code, it must be possible to run the code, and that requires a signature.
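The gatekeeping logic is simple to state even though the cryptography is not: boot the image only if its signature verifies under a key that only Apple controls. SecureROM's real check is a public-key signature over the firmware; the sketch below substitutes an HMAC (a keyed hash) as a stand-in primitive so it runs with the standard library alone. The accept/reject control flow, not the primitive, is the point.

```python
import hashlib
import hmac

# Stand-in for Apple's signing key. In reality this is a key pair: the
# private half never leaves Apple, and SecureROM holds the public half.
SIGNING_KEY = b"apple-signing-key-stand-in"

def sign_image(image: bytes) -> bytes:
    """What only Apple can do: produce a signature over a firmware image."""
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def secure_rom_boot(image: bytes, signature: bytes) -> str:
    """What the phone does in DFU mode: verify, then (and only then) boot."""
    expected = hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()
    if not hmac.compare_digest(signature, expected):
        return "refused: bad signature"
    return "booting RAM disk"

fbi_image = b"pin-brute-force-ramdisk"
print(secure_rom_boot(fbi_image, b"\x00" * 32))           # unsigned: refused
print(secure_rom_boot(fbi_image, sign_image(fbi_image)))  # signed: boots
```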

This is why it is asking for Apple's involvement: only Apple is in a position to do this development.

Do nothing at all

The first possibility is that there's simply nothing to do.

Erasing after 10 bad PINs is optional, and it's off by default.
If the erase option isn't enabled, the FBI can simply brute force the PIN the old-fashioned way: by typing in new PINs one at a time.
It would want to reboot the phone from time to time to reset the one-hour delay, but as tedious as the job would be, it's certainly not impossible. It would be a great deal slower on an iPhone 6 or 6s.
In those models, the running count of failed PIN attempts is preserved across reboots, so resetting the phone doesn't reset the delay period.

But on the 5c, there's no persistent record of bad PIN trials, so restarting the phone allows an attacker to short-circuit the delay.

Why it might not work

Obviously, if the phone is set to wipe itself, this technique wouldn't work, and the FBI would want to know one way or the other before starting. It ought to be a relatively straightforward matter for Apple to tell, as the phone does store that setting in some accessible way so that it knows what to do when a bad PIN is entered. But given the company's reluctance to assist so far, getting it to help here may be impossible.

Update: It turns out that this bug was fixed in iOS 8.1, so it probably wouldn't work after all.

Acid and laser beams

One risky solution that has been discussed extensively already is to use lasers and acid to remove the outer layers of the iPhone's processor and read the embedded ID. Once this embedded ID is known, it's no longer necessary to enter the PIN directly on the phone itself.
Instead, it would be possible to simply copy the encrypted storage onto another computer and attempt all the PINs on that other computer.

The iPhone's lock-outs and wiping would be irrelevant in this scenario.

Why it might not work

The risk of this approach is not so much that it won't work, but that if even a tiny mistake is made, the hardware ID could be irreparably damaged, rendering the stored data permanently inaccessible.

Jailbreak the thing

The iPhone's built-in lockouts and wipes are unavoidable when running the iPhone's operating system... assuming that the iPhone works as it is supposed to.
It might not.

The code that the iPhone runs to enter DFU mode, load a RAM image, verify its signature, and then boot the image is small, and it should be simple and quite bullet-proof. However, it's not impossible that this code, which Apple calls SecureROM, contains bugs.
Sometimes these bugs can enable DFU mode (or the closely related recovery mode) to run an image without verifying its signature first. There are perhaps six known historic flaws in SecureROM that have enabled jailbreakers to bypass the signature check in one way or another.

These bugs are particularly appealing to jailbreakers, because SecureROM is baked into hardware, and so the bugs cannot be fixed once they are in the wild: Apple has to update the hardware to address them.

Exploitable bugs have been found in the way SecureROM loads the image, verifies the signature, and communicates over USB, and in all cases they have enabled devices to boot unsigned firmware. If a seventh exploitable SecureROM flaw could be found, this would enable jailbreakers to run their own custom firmwares on iPhones.

That would give the FBI the power to do what it needs to do: it could build the custom firmware it needs and use it to brute force attack the PIN.
Some critics of the government's demand have suggested that a government agency—probably the NSA—might already know of such a flaw, arguing that the case against Apple reflects not a genuine need to have Apple sign a custom firmware but merely an attempt to provide cover for the government's own jailbreak.

Why it might not work

Of course, the difficulty with this approach is that it's also possible that no such flaw exists, or that even if it does exist, nobody knows what it is.

Given the desirability of this kind of flaw—it can't be fixed through any operating system update—jailbreakers have certainly looked, but thus far they've turned up empty-handed.

As such, this may all be hypothetical.

Ask Apple to sign an FBI-developed firmware

Apple doesn't want to develop a firmware to circumvent its own security measures, saying that this level of assistance goes far beyond what is required by law.

The FBI, however, can't develop its own firmware because of the digital signature requirements. But perhaps there is a middle ground.

Apple, when developing its own firmwares, does not require each test firmware to be signed.
Instead, the company has development handsets that have the signature restriction removed from SecureROM and hence can run any firmware.

These are in many ways equivalent to the development units that game console manufacturers sell to game developers; they allow the developers to load their games to test and debug them without requiring those games to be signed and validated by the console manufacturer each time. Unlike the console makers, Apple doesn't distribute these development phones.
It might not even be able to, as it may not have the necessary FCC certification.

But they nonetheless exist.
In principle, Apple could lend one of these devices to the FBI so that the FBI would then be responsible for developing the firmware.

This might require the FBI to do the work on-site at Cupertino or within a Faraday cage to avoid FCC compliance concerns, but one way or another this should be possible. Once it had a finished product, Apple could sign it.
If the company was truly concerned with how the signed firmware might be used, it could even run the firmware itself and discard it after use. This would relieve Apple of the burden of creating the firmware, though it could be argued that it would also weaken Apple's First Amendment argument against being compelled to produce it. While source code is undoubtedly expressive and protected by the First Amendment, it seems harder to argue that a purely mechanical transformation such as stamping a file with a digital signature should be covered by the same protection.

Why it might not work

Apple may very well persist in saying no, and the courts may agree.

Stop the phone from wiping its encryption keys

The way the iPhone handles encryption keys is a little more complex than outlined above.

The encryption key derived from the PIN combined with the hardware ID isn't used to encrypt the entire disk directly.
If it were, changing the PIN would force the entire disk to be re-encrypted, which would be tiresome to say the least.
Instead, this derived key is used to encrypt a second key, and that key is used to encrypt the disk.

That way, changing the PIN only requires re-encryption of the second key.
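A minimal model of this two-level hierarchy, with PBKDF2 standing in for Apple's UID-entangled key derivation and a simple XOR standing in for AES key wrapping (both are illustrative stand-ins, not Apple's actual constructions):

```python
import hashlib
import os
import secrets

# Toy model of the two-level key hierarchy (parameter choices are
# illustrative, not Apple's): the disk key never changes; only its
# wrapped copy is recomputed when the PIN changes.
HARDWARE_UID = os.urandom(32)        # fused into the processor, never leaves it
disk_key = secrets.token_bytes(32)   # the key the disk is actually encrypted with

def derive_kek(pin: str) -> bytes:
    """Key-encryption key from PIN + hardware UID (PBKDF2 as a stand-in
    for Apple's UID-entangled derivation)."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), HARDWARE_UID, 100_000)

def wrap(key: bytes, kek: bytes) -> bytes:
    """XOR wrap -- a stand-in for real AES key wrapping."""
    return bytes(a ^ b for a, b in zip(key, kek))

unwrap = wrap  # XOR is its own inverse

stored_blob = wrap(disk_key, derive_kek("1234"))
# Changing the PIN re-wraps the 32-byte disk key; the disk is untouched.
stored_blob = wrap(unwrap(stored_blob, derive_kek("1234")), derive_kek("9999"))
assert unwrap(stored_blob, derive_kek("9999")) == disk_key
```

Note that only the small wrapped blob ever needs rewriting, which is exactly why a PIN change is instant while the disk contents stay encrypted under the same key throughout.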

The second key is itself stored on the iPhone's flash storage. Normal flash storage is awkward to securely erase, due to wear leveling.

Flash supports only a limited number of write cycles, so to preserve its life, flash controllers spread the writes across all the chips. Overwriting a file on a flash drive may not actually overwrite the file but instead write the new file contents to a different location on the flash drive, potentially leaving the old file's contents unaltered. This makes it a bad place to store encryption keys that you want to be able to delete.
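A toy flash controller makes the problem concrete: the logical overwrite succeeds, yet a raw read of the chip still turns up the old key material. This is a deliberately simplified model, not how any real controller is implemented.

```python
# Toy wear-leveling flash controller: logical "overwrites" go to a fresh
# physical page, so the old contents linger until garbage collection.
class WearLeveledFlash:
    def __init__(self, pages: int = 8):
        self.physical = [None] * pages   # raw pages, as a chip reader sees them
        self.mapping = {}                # logical address -> physical page
        self.next_free = 0

    def write(self, logical: int, data: bytes) -> None:
        self.physical[self.next_free] = data   # always a fresh page
        self.mapping[logical] = self.next_free
        self.next_free += 1

    def read(self, logical: int) -> bytes:
        return self.physical[self.mapping[logical]]

flash = WearLeveledFlash()
flash.write(0, b"secret key v1")
flash.write(0, b"secret key v2")           # logical overwrite...
print(flash.read(0))                       # -> b'secret key v2'
print(b"secret key v1" in flash.physical)  # -> True: old copy still on the chip
```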

Apple's solution to this problem is to set aside a special area of flash that is handled specially.

This area isn't part of the normal filesystem and doesn't undergo wear leveling at all.
If it's erased, it really is erased, with no possibility of recovery.

This special section is called effaceable storage. When the iPhone wipes itself, whether due to bad PIN entry, a remote wipe request for a managed phone, or the built-in reset feature, this effaceable storage area is the one that gets obliterated.

Apart from that special handling, however, the effaceable area should be readable and writable just like regular flash memory. That means that, in principle, a backup can be made and safely squirreled away.
If the iPhone then overwrites it after 10 bad PIN attempts, it can be restored from this backup, and that should enable a further 10 attempts.
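Under the article's assumption that the phone keeps no record of the wipe outside the effaceable area itself, the restore loop can be simulated in a few lines:

```python
import itertools

# Simulation of the backup/restore trick, assuming (as the article does
# for the A6) that the phone keeps no record of the wipe outside the
# effaceable area itself. The PIN here is arbitrary.
class Phone:
    def __init__(self, pin: str):
        self.pin = pin
        self.effaceable = b"wrapped-disk-keys"   # what a bad-PIN wipe destroys
        self.failures = 0

    def try_pin(self, guess: str) -> bool:
        if self.effaceable is None:
            raise RuntimeError("keys wiped")
        if guess == self.pin:
            return True
        self.failures += 1
        if self.failures >= 10:
            self.effaceable = None               # obliterate the keys
        return False

phone = Phone(pin="7391")
backup = phone.effaceable                        # image the chip once, up front

for guess in (f"{n:04d}" for n in itertools.count()):
    if phone.effaceable is None:                 # wiped after 10 misses:
        phone.effaceable = backup                # ...restore and keep going
        phone.failures = 0
    if phone.try_pin(guess):
        print(f"PIN found: {guess}")
        break
```

In the real attack each restore means physically rewriting the chip, so the loop body is slow; the logic, though, is exactly this simple.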

This process could be repeated indefinitely. This video from a Shenzhen market shows a similar process in action (we came at it via 9to5Mac after seeing a tweet in February and further discussion in March). Here, a 16GB iPhone has its flash chip desoldered and put into a flash reader.

A full image of that flash is made, including the all-important effaceable area.
In this case, the chip is then replaced with a 128GB chip, and the image restored, with all its encryption and data intact.

The process for the FBI's purposes would simply use the same chip every time. By restoring every time the encryption keys get destroyed, the FBI could—slowly—perform its brute force attack.
It would probably want to install a socket of some kind rather than continuously soldering and desoldering the chip, but the process should be mechanical and straightforward, albeit desperately boring. A more exotic possibility would be to put some kind of intermediate controller between the iPhone and its flash chip that permitted read instructions but blocked all attempts to write or erase data. Hardware write blockers are already routinely used in other forensic applications to prevent modifications to SATA, SCSI, and USB disks that are being used as evidence, and there's no reason why such a thing could not be developed for the flash chips themselves.

This would allow the erase/restore process to be skipped; the phone would simply need to be rebooted every few attempts.

Why it might not work

The working assumption is that the iPhone's processor has no non-volatile storage of its own, so it simply doesn't remember that it is supposed to have wiped its encryption keys and will offer another ten attempts once the effaceable storage area is restored; or that, even if it does remember, it doesn't care.

This is probably a reasonable assumption; the A6 processor used in the iPhone 5c doesn't appear to have any non-volatile storage of its own, and allowing restoration means that even a securely wiped phone can be straightforwardly restored from backup by connecting it to iTunes. For newer iPhones, that's less clear.

Apple implies that the A7 processor—the first to include the "Secure Enclave" function—does have some form of non-volatile storage of its own. On the A6 processor and below, the time delay between PIN attempts resets every time the phone is rebooted. On the A7 and above, it does not; the Secure Enclave somehow remembers that there has been some number of bad PIN attempts earlier on.

Apple also vaguely describes the Secure Enclave as having an "anti-replay counter" for data that is "saved to the file system." It's not impossible that this is also used to protect the effaceable storage in some way, allowing the phone to detect that it has been tampered with.
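If the Secure Enclave really does stamp writes with an internal monotonic counter, the restore trick fails in a detectable way. The following is a hypothetical sketch of such a scheme, inferred from Apple's vague description rather than any documented behavior:

```python
# Hypothetical sketch of how an anti-replay counter could defeat the
# restore trick: the enclave keeps a monotonic counter in its own
# non-volatile storage and stamps every write to the protected blob.
class Enclave:
    def __init__(self):
        self.counter = 0   # internal non-volatile counter, not on the flash chip

    def seal(self, data: bytes):
        """Stamp and store a blob; returns what ends up on the flash chip."""
        self.counter += 1
        return (self.counter, data)

    def open(self, blob):
        """Reject any blob whose stamp doesn't match the internal counter."""
        version, data = blob
        if version != self.counter:      # an old backup carries a stale stamp
            raise RuntimeError("replay detected: stale blob")
        return data

enclave = Enclave()
blob = enclave.seal(b"wrapped keys")
backup = blob                            # attacker images the flash chip
blob = enclave.seal(b"wrapped keys")     # any later write bumps the counter
print(enclave.open(blob))                # -> b'wrapped keys'
# enclave.open(backup) would now raise: the restored image is detectably stale
```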

Even if the Secure Enclave does detect tampering, a full restore of the phone is still likely to be possible. There is also some risk in disassembling the phone, but if the process is reliable enough for Shenzhen markets, the FBI ought to be able to manage it. This last technique in particular should be quite robust.

There's no doubt that Apple's assistance would help a great deal; creating a firmware to allow brute-forcing the PIN would be faster and lower risk than any method that requires disassembly.

But if the FBI is truly desperate to bypass the PIN lockout and potential disk wipe, there do appear to be options available to it that don't require Apple to develop the firmware.
The U.S. Department of Justice has appealed an order by a court in New York that turned down its request that Apple be compelled to extract data from the iPhone 5s of an alleged drug dealer. The case in New York is seen as having a bearing on another high-profile case in California, where Apple is contesting an order that would require the company to assist the FBI, including by providing new software, in its attempts to crack by brute force the passcode of an iPhone 5c running iOS 9. The phone was used by one of the two terrorists in the San Bernardino killings on Dec. 2, and the FBI wants Apple to disable the auto-erase feature on the phone, which erases all data after 10 unsuccessful passcode attempts, if that feature was activated by the terrorist. In the New York case as well, the DOJ argues that it is unable to access the data on the phone, which runs iOS 7, because it is locked with a passcode.

By trying repeated passcodes, the government risks permanently losing all access to the contents of the phone because when an iPhone is locked it is not apparent if the auto-erase feature is enabled, according to the filing Monday. But Apple can help extract the data from the iPhone 5s, which it has done dozens of times in the past for similar iPhone versions at the request of the government, the DOJ argues.

For versions of the operating system that predate iOS 8, Apple has the capability to bypass the passcode feature and access the unencrypted contents of the phone, it wrote in the filing. Invoking user privacy and safety, Apple has said that its previous acquiescence to similar judicial orders does not mean it consents to the process.

But the government, which has a warrant, claims that Apple made a U-turn in this particular instance, having first agreed to extract the data when it responded that "Upon receipt of a valid search warrant pursuant to the instructions laid out in [the legal process guidelines], we can schedule the extraction date within a 1-2 week time frame." At no time during the communications ahead of the government seeking an order did Apple object to the propriety of the proposed order directing its assistance, according to the DOJ filing Monday.

Magistrate Judge James Orenstein of the U.S. District Court for the Eastern District of New York ruled recently that Apple can’t be forced to extract the data from the iPhone.

The government's reading of the All Writs Act, a statute enacted in 1789 and commonly invoked by law enforcement agencies to get assistance from tech companies on similar matters, would change the purpose of the law “from a limited gap-filling statute that ensures the smooth functioning of the judiciary itself into a mechanism for upending the separation of powers by delegating to the judiciary a legislative power bounded only by Congress's superior ability to prohibit or preempt.” But the government argues that the residual authority of the court under the All Writs Act is particularly important where legislation lags behind technology or risks obsolescence. The government also argues that courts have relied on the All Writs Act to mandate third-party assistance with search warrants even in circumstances far more burdensome than what is requested in the New York case.

An unreasonable burden on the third party is a key criterion when a judge considers an order under the Act.

Apple has argued that a burden on the company would be the impact on its ability to protect customer information, which could threaten the trust customers have in the company and tarnish the Apple brand. The DOJ is now asking the court to review the decision by the Magistrate Judge. "Judge Orenstein ruled the FBI’s request would 'thoroughly undermine fundamental principles of the Constitution’ and we agree,” Apple said in a statement on the new filing by the DOJ.

The company said it shared the Judge’s concern “that misuse of the All Writs Act would start us down a slippery slope that threatens everyone’s safety and privacy."
As expected, federal prosecutors in an iPhone unlocking case in New York have now asked a more senior judge, known as a district judge, to countermand a magistrate judge who ruled in Apple’s favor last week. Last week, US Magistrate Judge James Orenstein concluded that what the government was asking for went too far. In his ruling, he worried about a “virtually limitless expansion of the government's legal authority to surreptitiously intrude on personal privacy.”

The case involves Jun Feng, a drug dealer who has already pleaded guilty, and his seized iPhone 5S running iOS 7. Prosecutors have said previously that the investigation was not over and that it still needed data from Feng's phone.

As the government reminded the court, Apple does have the ability to unlock this phone, unlike the seized iPhone 5C in San Bernardino. Moreover, as Department of Justice lawyers note, Apple has complied numerous times previously. In its 51-page Monday filing, the government largely re-hashed its previous arguments, saying that existing law should force Apple’s assistance. In this case, the government arrested a criminal.

The government got a warrant to search the criminal’s phone. Law enforcement agents tried to search the phone themselves, but determined they could not do so without risking the destruction of evidence.

The government then applied for a second court order to ask Apple to perform a simple task: something that Apple can easily do, that it has done many times before, and that will have no effect on the security of its products or the safety of its customers.

This is how the system is supposed to work. In 2014 and 2015, Apple took a two-pronged approach to resisting government pressure: one was to make iOS 8 more resilient than previous versions of the operating system, making it impossible for Apple itself to bypass a passcode lockout.

The other crucial element was to impose firmer legal resistance in court filings.

The New York case is believed to be the first time that Apple openly resisted the government’s attempt to access a seized phone.

Agree to disagree?

This New York case pre-dates Apple's current battle with the government over a locked iPhone 5C that belonged to one of the shooters in the December 2015 terrorist attack in San Bernardino—that case is due to be heard in court next month in nearby Riverside, California. In the California case, federal investigators asked for and received an unprecedented court order compelling Apple to create a new firmware to unlock the device. In February 2016, Apple formally challenged that order, and the outcome is pending.

Both the New York and California cases, however, involve the government’s attempt to use an obscure 18th-century statute known as the All Writs Act, which enables a court to order a person or a company to perform some action. "Judge Orenstein ruled the FBI’s request would 'thoroughly undermine fundamental principles of the Constitution’ and we agree," an Apple spokesman told Ars in a statement. "We share the judge’s concern that misuse of the All Writs Act would start us down a slippery slope that threatens everyone’s safety and privacy."

The New York case, however, marks the first time that a federal judge has ruled in favor of a more privacy-minded Apple. More recent amicus (friend of the court) briefs supporting Apple have cited Judge Orenstein’s ruling. “The government’s argument is: ‘I would have gotten away with it too, if it weren't for you pesky magistrate!’” Riana Pfefferkorn, a legal fellow at the Stanford Center for Internet and Society, told Ars.
The DOJ wants Apple to return to the security levels present in iOS 7, before default encryption. Two weeks ahead of a scheduled court date, Apple continues to publicly battle the FBI's request to unlock one of its iPhones. Senior Vice President of Software Engineering Craig Federighi on Sunday penned an opinion piece for the Washington Post, which suggests that compliance would set mobile security back at least three years.

The U.S. Justice Department, he said, believes security on iOS 7 was "good enough," so Apple should roll back to the security level of that operating system. "But the security of iOS 7, while cutting-edge at the time, has since been breached by hackers," Federighi wrote. "What's worse, some of their methods have been productized and are now available for sale to attackers who are less skilled but often more malicious."

Apple decided to encrypt its mobile operating system by default beginning with iOS 8, meaning device-level data is inaccessible even to Cupertino, so the company cannot turn over things like phone passcodes and iMessage chats to the feds. But following a December terrorist attack in California, the government is itching to access an iPhone 5c issued to one of the shooters, Syed Rizwan Farook, by his employer, the San Bernardino Health Department.

The FBI wants Apple to create a new mobile operating system that could disable a feature that wipes the gadget after 10 incorrect password guesses—"intentionally creating a vulnerability that would let the government force its way into an iPhone," Federighi said. "Once created, this software—which law enforcement has conceded it wants to apply to many iPhones—would become a weakness that hackers and criminals could use to wreak havoc on the privacy and personal safety of us all," he added.

The tech titan is even willing to take its fight against the FBI over iPhone backdoors all the way to the Supreme Court, where it would have the support of numerous industry heavyweights. Oral arguments are set for March 22 in federal court.
The term "cyber pathogen," however, seems to exist only in Harry Potter fan fiction. Does the San Bernardino shooter's iPhone contain anything of value for investigators? The FBI doesn't know, but the San Bernardino District Attorney suggests the county-owned handset could have been used as a weapon of mass cyber destruction.

"The iPhone…may have connected to the San Bernardino County computer network," DA Michael Ramos said in a court filing. "The seized iPhone may contain evidence that it was used as a weapon to introduce a lying dormant cyber pathogen that endangers San Bernardino County's infrastructure."

Local residents shouldn't be too quick to panic, though: iPhone forensics expert Jonathan Zdziarski debunked the DA's claims. "I quickly Googled the term 'cyber pathogen' to see if anyone had used it in computer science," Zdziarski wrote in a blog post.

The first result: Harry Potter fan fiction. "That's right, a Demigod from Gryffindor is the closest thing Google could find about cyber pathogens." Zdziarski said even CSI: Cyber is not bold enough to use "wildly non-existent terms" like "cyber pathogen" in its TV scripts. "There is absolutely nothing in the universe that knows what a cyber pathogen is," Zdziarski wrote. "Fagan's statements are not only misleading to the court, but amount to blatant fear mongering.

They are designed to manipulate the court into making a ruling for the FBI." The device in question—an iPhone 5c issued to Syed Rizwan Farook as part of his San Bernardino Health Department duties—is currently in the possession of the FBI, which wants Apple to disable a feature that wipes the gadget after 10 incorrect password guesses so that it may use an automated system to guess the phone's passcode and break in. According to Ramos, information contained on the smartphone could provide evidence to help the government identify co-conspirators "who would be prosecuted for murder and attempted murder." But to do that, Cupertino would need to create another mobile operating system that could open the encrypted device—a slippery slope, according to CEO Tim Cook, who is worried the workaround might end up in the wrong hands. Apple is even willing to take its fight against the FBI over iPhone backdoors all the way to the Supreme Court, where it would have the support of numerous industry heavyweights. Oral arguments are set for March 22 in federal court.
A Florida congressman has introduced a new bill that would forbid federal agencies from purchasing Apple products until the company cooperates with the federal court order to assist in unlocking a seized iPhone 5C associated with the San Bernardino terrorist attack. In a statement released on Thursday, Rep. David Jolly (R-Fla.) blasted Apple. "Taxpayers should not be subsidizing a company that refuses to cooperate in a terror investigation that left 14 Americans dead on American soil," he said. "Who did the terrorist talk to? Who did he message with? Did he go to a safe house? Is there information on the phone that might prevent a future attack on US soil? Following the horrific events of September 11, 2001, every citizen and every company was willing to do whatever it took to side with law enforcement and defeat terror.
It’s time Apple shows that same conviction to further protect our nation today." Last month, Apple was given a controversial court order to create a customized firmware that would enable investigators to brute force a seized iPhone 5C and get past its passcode.

Apple has vowed to fight the order in court, and the company is set to appear before a judge later this month. At least for now, Jolly’s bill is unlikely to advance very far in a Congress that can barely agree on the time of day; GovTrack gives it a 1 percent chance of passage. On a related note, there are currently two state bills in California and New York that seek to ban sales of phones with unbreakable encryption. Plus, while Apple does provide discounts for federal government employees, military personnel, and their families, it’s unlikely that Apple has made much headway against the entrenched legacy sales of BlackBerry.

The new legislation was announced the same day that many tech companies, including Twitter, Airbnb, and eBay, lined up behind Apple in their own court filings. Earlier in the day, Apple also published a support letter by a San Bernardino man whose wife was shot and severely injured during the December 2015 terrorist attack. President Barack Obama is known to use an iPad that he was given in 2011 by then-CEO Steve Jobs himself.
While many tech companies, cryptographers, and privacy advocates have lined up publicly behind Apple in its ongoing fight with federal prosecutors, the company now has an unexpected ally: a San Bernardino man whose wife was shot and severely injured during the December 2015 terrorist attack. On Thursday, Apple published Salihin Kondoker’s letter to the federal judge overseeing the case. In the letter, Kondoker describes how his wife, a San Bernardino County Health Department employee, was shot three times during the attack but survived. Kondoker describes himself as an IT consultant for Pacific Gas & Electric, a public utility that serves much of California. As he wrote: When I first learned Apple was opposing the order I was frustrated that it would be yet another roadblock.

But as I read more about their case, I have come to understand their fight is for something much bigger than one phone.

They are worried that this software the government wants them to use will be used against millions of other innocent people.
I share their fear. I support Apple and the decision they have made.
I don’t believe Tim Cook or any Apple employee believes in supporting terrorism any more than I do.
I think the vicious attacks I’ve read in the media against one of America’s greatest companies are terrible. He went on to explain that his wife also had a county-issued iPhone, noting that both the "iCloud account and the carrier account" were controlled and paid for by the county. "This was common knowledge among my wife and other employees," Kondoker continued. "Why then would someone store vital contacts related to an attack on a phone they knew the county had access to? [The terrorists] destroyed their personal phones after the attack.

And I believe they did that for a reason." As he concluded: Finally, and the reason for my letter to the court, I believe privacy is important and Apple should stay firm in their decision. Neither I, nor my wife, want to raise our children in a world where privacy is the tradeoff for security.
I believe this case will have a huge impact all over the world. You will have agencies coming from all over the world to get access to the software the FBI is asking Apple for.
It will be abused all over to spy on innocent people. America should be proud of Apple. Proud that it is an American company and we should protect them not try to tear them down. I support them in this case and I hope the court will too. Last month, Apple CEO Tim Cook reiterated the company’s firm commitment to privacy and its resolve to fight the unprecedented court order.
If the order stands up to legal challenges, Apple would be forced to create a new customized iOS firmware that would remove the passcode lockout on a seized iPhone 5c as part of the ongoing investigation.

Doing so would allow federal investigators to brute-force the passcode in order to gain access to the phone's content. Last week, Apple filed its formal legal response and set the stage for an important court hearing in nearby Riverside later this month.

Earlier this week, a different judge in a related case ruled in favor of Apple, saying that the company did not have to help the government even if it was able to do so. Both the government and Apple are set to appear before US Magistrate Judge Sheri Pym in Riverside on March 22.
At the RSA Conference, Defense Secretary Ashton Carter offers his take on encryption, the importance of innovation to the nation's defense, and more.

SAN FRANCISCO—U.S. Secretary of Defense Ashton Carter provided his views on encryption and on why being open to innovation is important for the nation's defense.

Addressing attendees at this year's RSA Conference, Carter also detailed the new "Hack the Pentagon" effort, aimed at engaging security researchers in helping identify potential security vulnerabilities. "We have to think outside the five-sided box," he said. The five-sided box is a reference to the Pentagon, home of the Defense Department, and the Hack the Pentagon project is essentially a bug-bounty program for the DOD. "There are black hats out there, but we're looking for white hats," Carter said.

The DOD is trying to adopt a best practice and invite people to attack it—an effort designed to find vulnerabilities, said Carter, who described the Hack the Pentagon program as a way to crowdsource expertise and get access to skilled people. "You'd much rather find the vulnerabilities in your networks this way than the other way," meaning through a breach or some other attack from malicious threat actors, Carter said.

Bug bounties are increasingly common in the commercial sector, with individual companies running programs.
Some vendors like HackerOne and Bugcrowd run bug-bounty programs on behalf of companies. The Hack the Pentagon effort is the first such program at the DOD, which is seeking to learn from the endeavor, Carter said. "It's an example of using best practices in the government. I'm not doing this for fun. I'm doing it for utility."

Carter said he's trying to create a culture of innovation at the DOD to help it execute on its mission of defending the nation. "If you don't take risks and are not willing to fail, you won't get anywhere."

Apple-FBI Case

Carter also made a few brief comments on the ongoing Apple-FBI debate over encryption related to the iPhone 5C used by one of the terrorists involved in the San Bernardino, Calif., shooting late last year.

The FBI-Apple case is not a DOD matter; it's a legal and law-enforcement issue, he explained. That said, he noted that data security, including encryption, is essential to the DOD. "We're behind data security and strong encryption—no question about it. I'm not a believer in backdoors or a single technical approach to what is a complex issue."

Industry and government must work together to achieve a sensible result when it comes to data security and privacy, Carter said, adding that it is a solemn trust to be the U.S. government. "We are not anybody else; we are the government, and it's fair that people hold us to a higher standard."

Carter also commented on the use of cyber-weapons in the battle against the Islamic State. While he did not provide any specifics about the weapons, he said the goal is nothing less than achieving victory over the terrorists. "We will and we must defeat ISIL. We're looking at all the ways we can accelerate that defeat."

Sean Michael Kerner is a senior editor at eWEEK and InternetNews.com.

Follow him on Twitter @TechJournalist.
A recent Pew Research Center report finds that 51 percent of Americans agree with the FBI.

While Silicon Valley execs rally around Apple in its stand-off with the FBI, one industry heavyweight is not exactly jumping to Apple's defense: Bill Gates. "Apple has access to the information, they're just refusing to provide the access," he told the Financial Times. "You shouldn't call the access some special thing," he said, adding that this case is no different from the government asking for phone or bank records. "Any time a bank is told, 'Hey, turn over bank account information,' as soon as they do that on one person, then they're admitting they can do it on many people," Gates said.

In a later interview with Bloomberg (video below), Gates said he was "disappointed" that his comments had been characterized as backing the FBI "because that doesn't state my view on this," but he stopped short of saying Apple should not comply with the court's order. "The courts are going to decide this, and I think Apple said whatever the court decision is, they'll abide by [it]," Gates said when asked what Cupertino should do. "In the meantime, that gives us this opportunity to get the discussion" started. He urged both sides not to act rashly. "You want to strike that balance [and] set an example" for the rest of the world.

Last week, a Los Angeles District Court judge ruled that the tech titan must assist the U.S. government in the search of an iPhone 5c owned by San Bernardino shooter Syed Rizwan Farook. The controversial court order does not explicitly ask Apple to break the phone's encryption, but rather to develop and install a new mobile operating system to allow the government to bypass a setting that wipes the phone after 10 incorrect passcode guesses.
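The setting at issue can be modeled as a simple failed-attempt counter: after ten wrong guesses, the encryption keys are destroyed and the data becomes unrecoverable. The following toy sketch in Python is purely illustrative; the class and method names are assumptions, not Apple's actual iOS or Secure Enclave logic.

```python
# Toy model of a wipe-after-10-failures passcode check.
# Names and behavior are illustrative assumptions only.

class WipedError(Exception):
    """Raised once the data-protection keys have been destroyed."""

class LockedPhone:
    MAX_ATTEMPTS = 10

    def __init__(self, pin):
        self._pin = pin
        self._failures = 0
        self._wiped = False

    def try_pin(self, guess):
        if self._wiped:
            raise WipedError("data keys destroyed")
        if guess == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._wiped = True  # keys gone; data unrecoverable
        return False

# Ten wrong guesses trigger the wipe; the eleventh attempt fails outright.
phone = LockedPhone("4821")
for g in range(10):
    phone.try_pin(f"{g:04d}")
try:
    phone.try_pin("4821")
except WipedError:
    print("wiped")  # brute force fails while the safeguard is intact
```

With the counter in place, even the correct passcode is useless after ten failures, which is exactly why the order asks for firmware that removes the counter rather than for the passcode itself.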

The FBI would then use a brute force attack to figure out the phone's passcode and unlock it, without fear of deleting its data. But Apple CEO Tim Cook says that's a slippery slope, suggesting that a backdoor created for the FBI could very easily land in the hands of those with nefarious intent.
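With the wipe and retry-delay protections gone, the brute-force step is trivial: a four-digit passcode allows only 10,000 candidates. A minimal sketch, where `try_pin` is a hypothetical callback standing in for whatever interface would test a passcode on the device (no such public API exists):

```python
# Illustrative 4-digit passcode brute force, assuming the retry
# delays and the wipe-after-10-failures safeguard are disabled.
# try_pin is a hypothetical stand-in for a device interface.

def brute_force_pin(try_pin):
    """Try all 10,000 four-digit passcodes; return the first match."""
    for candidate in range(10_000):
        pin = f"{candidate:04d}"  # zero-pad, e.g. 7 -> "0007"
        if try_pin(pin):
            return pin
    return None

# Demo against a simulated device whose passcode is "2589":
print(brute_force_pin(lambda pin: pin == "2589"))  # -> 2589
```

Enumerating the whole space takes seconds in software; the only things standing between an attacker and the data are the delay and wipe features the requested firmware would remove.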

Cupertino has already handed over everything it had on the shooter from Apple's servers, but the actual phone is encrypted for security purposes. Apple fears that if it makes an exception for this case, more requests will follow. Law enforcement agents around the country have already said they have hundreds of iPhones they want unlocked if the FBI wins this case. "In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks," Cupertino said in a Q&A posted to its website. "Of course, Apple would do our best to protect that key, but … it would be relentlessly attacked by hackers and cybercriminals."

To the contrary, FBI boss James Comey said on Sunday that the government does not "want to break anyone's encryption or set a master key loose on the land." Comey and the Bureau seem to have the ear of the American public: According to a recent Pew Research Center study, 51 percent of people say Apple should unlock the iPhone; only 38 percent think the company should stand its ground.

Late last week, the Department of Justice filed a motion to force Apple to comply with the court's original order.

The agency said it shared the iDevice maker's concerns that information needs to be protected, but insists the FBI's order does not compromise that goal.

Apple has until Feb. 26 to submit its response to the court.

Editor's Note: This story was updated at 10:30 a.m. ET with more comments from Gates.
Apple's refusal to help the FBI get into the San Bernardino shooter's iPhone 5c is the most public, but the company is resisting similar court orders in 12 more cases. The Wall Street Journal reported that the Justice Department is trying to compel App...
Microsoft founder Bill Gates says he supports the U.S. government in its efforts to unearth the contents of a terrorist's iPhone, countering a trend by other tech leaders to back Apple's refusal to code a backdoor into its iOS operating system. Gates appears to favor the government's request, however, because he feels it is narrowly worded. "This is a specific case where the government is asking for access to information," Gates told the Financial Times in a story published Monday night Pacific time. "They are not asking for some general thing; they are asking for a particular case." "It is no different than [the question of] should anybody ever have been able to tell the phone company to get information, should anybody be able to get at bank records," Gates added.

Why this matters: The issue concerns data stored on the iPhone, which may or may not have national security implications.

The FBI originally confiscated an Apple iPhone 5c issued to San Bernardino shooter Syed Rizwan Farook by his employer, the San Bernardino County Department of Public Health.

The FBI then asked a federal court for an order compelling Apple to defeat the hardware safeguards it built into the iPhone by sideloading code that would remove those safeguards and allow the government to recover the PIN protecting the data by brute force. Otherwise, the iPhone could automatically delete all of its data after 10 incorrect passcode attempts.

Set for a showdown

Apple has refused, however, and has said that the case involves the "freedom and liberties" of its customers.
It has asked the government to form a commission to decide the matter.

Legally, the issue is set to come to a head in March, when lawyers from both sides present their arguments. Gates is aligned with most Americans, who apparently side with the government against Apple's position. However, Facebook chief executive Mark Zuckerberg, Google executive Sundar Pichai, and Twitter CEO Jack Dorsey have all come out in favor of Apple and consumer privacy.

Gates told the FT, however, that he encouraged a "debate" so that citizens of individual countries did not feel that they would be forced to restrict the government's access to information.

Additional reporting by Susie Ochs. This story, "Bill Gates backs the U.S. government in Apple's iPhone privacy standoff" was originally published by PCWorld.
The geeks are approaching the whole Apple vs. FBI battle over encryption and privacy all wrong. This is a golden opportunity to get John Q. Public on board regarding data privacy and online security. But instead, we have a cacophony of conflicting information and noise, and the FBI is winning in the court of public opinion.

It's high time Jane Q. Citizen got to see a clear example of how the U.S. government is slowly but surely chipping away at personal privacy under the guise of national security.

And you couldn't have a better company standing up to the government: the one behind some of the most popular consumer electronics devices today. There's none of the squickiness of Google and its constant slurping of data, or Facebook's desire to collect information about people you know and things you like. Apple is not just a tech company -- hate it or love it, Apple is indubitably a lifestyle brand.

But there is a stark difference in how Apple and the FBI and the Justice Department, along with their allies, are framing the conversation.

And as a result, Apple and the techies are losing John's and Jane's attention by railing about backdoors, encryption, and legal precedents. Those detailed explainers and FAQs do lay out what's at stake. But it's the FBI that comes off looking reasonable. The FBI is, it regularly reminds us, trying to find out why two people killed 14 people and injured 22 a few months ago as part of a mass shooting, which it regularly describes as terrorism.

So reasonable, in fact, that there's this headline: "San Bernardino terror attack victims' families ask Apple to cooperate with FBI." The side relying on emotions and fear is always going to win against the side carefully crafting logical arguments.
It may be in the nature of technical people to avoid emotions and favor logic, but that's one reason why the FBI is winning the hearts and minds of Americans here.

The thing is, even with all the secret documents that Snowden stole from the NSA, the average user isn't any more concerned about government surveillance today than he or she was three years ago. Sure, it's terrible, but when it comes to user privacy it's still a world of weak passwords, mobile devices with no passcode (or Touch ID) enabled, and an overall lack of urgency. So skip the arguments about how, if the FBI wins this round, law enforcement will keep coming back with more requests against more devices.

If there is something the Janes and Johns are scared of, it's the foreign other, the faceless enemies sitting in China, Russia, and Iran (why not throw North Korea in the mix, too?).
It's the criminals siphoning money from banks, the nation-state actors stealing personal information from government agencies, and adversaries trying to stop a movie release.

If the FBI gets its way on bypassing this iPhone 5c's protections, what would stop other governments from coming to Apple, Dell, and other companies and asking for help modifying the devices we use to further their own purposes? It wouldn't be the first time a government has tried to compel a company to modify technology in the name of national security. Remember BlackBerry?

"While the FBI's request seems to go beyond what other governments have sought from Apple so far, if Apple is forced to develop code to exploit its own phones, it will only be a matter of time before other countries seek to do the same," Jennifer Granick, the director of civil liberties at the Stanford Center for Internet and Society, wrote on the NYU School of Law's Just Security blog. She's right.

And that's a scary enough prospect to justify supporting Apple. Techies may not like the emotionalism and may consider it to be FUD. But it's not FUD. It truly is scary -- and should be talked about that way.