The custom firmware that the FBI would like Apple to produce in order to unlock the San Bernardino iPhone would be the most straightforward way of accessing the device, allowing the federal agency to rapidly attempt PIN codes until it found the one that unlocked the phone.
But it’s probably not the only way to achieve what the FBI wants.

There may well be approaches that don’t require Apple to build a custom firmware to defeat some of the iPhone’s security measures.
The iPhone 5c used by the San Bernardino killers encrypts its data using a key derived from a combination of an ID embedded in the iPhone’s processor and the user’s PIN.

Assuming that a 4-digit PIN is being used, that’s a mere 10,000 different combinations to try out. However, the iPhone has two protections against attempts to try every PIN in turn.

First, it inserts delays to force you to wait ever longer between PIN attempts (up to one hour at its longest).
Second, it has an optional capability to delete its encryption keys after 10 bad PINs, permanently denying access to any encrypted data.
The FBI would like to use a custom firmware that allows attempting multiple PINs without either of these features.
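To put rough numbers on these protections, here is a back-of-the-envelope calculation; the delay schedule used is the one widely reported for iOS, and the exact figures should be treated as an assumption:

```python
# Worst-case wall-clock cost of the enforced delays for all 10,000 PINs.
# The schedule below is the one widely reported for iOS (1 min after the
# 5th failure, 5 min after the 6th, 15 min after the 7th and 8th, 1 hour
# from the 9th on); treat the exact figures as an assumption.

def delay_after(attempt: int) -> int:
    """Delay in seconds imposed after the given failed attempt."""
    if attempt < 5:
        return 0
    if attempt == 5:
        return 60
    if attempt == 6:
        return 5 * 60
    if attempt in (7, 8):
        return 15 * 60
    return 60 * 60

total = sum(delay_after(n) for n in range(1, 10_000))
print(f"{total / 86_400:.0f} days of enforced delay")   # ~416 days
```

The enforced delays alone add up to more than a year of wall-clock time, which is why the FBI wants them stripped out: without them, exhausting 10,000 PINs is a job for an afternoon at most.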

This custom firmware would most likely be run using the iPhone’s DFU mode.

Device Firmware Update (DFU) mode is a low-level, last-resort mode that can be used to recover iPhones that are unable to boot.

To use DFU mode, an iPhone must be connected via USB to a computer running iTunes. iTunes will send a firmware image to the iPhone, and the iPhone will run that image from a RAM disk.

For the FBI’s purposes, this image would include the PIN-attack routines to brute-force the lock on the device.
Developing this firmware should not be particularly difficult—jailbreakers have developed all manner of utilities to build custom RAM disks to run from DFU mode, so running custom code from this environment is already somewhat understood—but there is a problem.

The iPhone will not run any old RAM disk that you copy to it.
It first verifies the digital signature of the system image that is transferred. Only if the image has been properly signed by Apple will the phone run it.
The FBI cannot create that signature itself. Only Apple can do so.
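The principle at work is ordinary public-key signing: the verifying key baked into SecureROM can check a signature but cannot create one. The toy sketch below uses textbook RSA with tiny primes and a byte-sum in place of a real hash; it bears no resemblance to Apple's actual implementation, but it shows why holding the public half doesn't help the FBI:

```python
# Toy illustration of why the signature check stops the FBI: verification
# needs only the public key (baked into SecureROM), but producing a valid
# signature needs the private exponent, which only Apple holds.
# Textbook RSA with tiny primes; byte-sum in place of a real hash.

p, q = 61, 53                      # toy primes; real keys are 2048+ bits
n = p * q                          # modulus, 3233
e = 17                             # public exponent (in the device's ROM)
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (held only by Apple)

def digest(image: bytes) -> int:
    # stand-in for a real cryptographic hash, kept below the tiny modulus
    return sum(image) % n

def apple_sign(image: bytes) -> int:
    return pow(digest(image), d, n)          # needs d: Apple only

def secure_rom_will_boot(image: bytes, sig: int) -> bool:
    return pow(sig, e, n) == digest(image)   # needs only e and n

firmware = b"legitimate Apple RAM disk"
assert secure_rom_will_boot(firmware, apple_sign(firmware))
assert not secure_rom_will_boot(b"FBI-built RAM disk", apple_sign(firmware))
```

Anyone can run the boot-time check, but forging a signature without `d` is exactly the problem RSA is designed to make intractable at real key sizes.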

This also means that the FBI cannot even develop the code itself.

To test and debug the code, it must be possible to run the code, and that requires a signature.

This is why it is asking for Apple’s involvement: only Apple is in a position to do this development.
Do nothing at all
The first possibility is that there’s simply nothing to do.

Erasing after 10 bad PINs is optional, and it’s off by default.
If the erase option isn’t enabled, the FBI can simply brute force the PIN the old-fashioned way: by typing in new PINs one at a time.
It would want to reboot the phone from time to time to reset the one-hour delay, but as tedious as the job would be, it's certainly not impossible.
It would be a great deal slower on an iPhone 6 or 6s.
In those models, the running count of failed PIN attempts is preserved across reboots, so resetting the phone doesn’t reset the delay period.

But on the 5c, there’s no persistent record of bad PIN trials, so restarting the phone allows an attacker to short-circuit the delay.
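The difference between the two generations can be modeled in a few lines. This simulation is an assumption about the mechanism, not Apple's code: the only change between the two phones is whether the failed-attempt counter is written to flash before a reboot:

```python
# Simulated 5c vs. 6/6s behavior (a model, not Apple's implementation):
# the only difference is whether the failed-attempt counter is persisted
# to flash, and therefore whether a reboot clears it.

class Phone:
    DELAYS = [0, 0, 0, 0, 0, 60, 300, 900, 900, 3600]  # seconds per attempt

    def __init__(self, persists_counter: bool):
        self.persists_counter = persists_counter
        self.failed = 0   # counter held in RAM
        self.stored = 0   # counter persisted to flash (6/6s behavior)

    def try_pin(self):
        self.failed += 1
        if self.persists_counter:
            self.stored = self.failed

    def reboot(self):
        self.failed = self.stored if self.persists_counter else 0

    def current_delay(self) -> int:
        return self.DELAYS[min(self.failed, len(self.DELAYS) - 1)]

five_c = Phone(persists_counter=False)
six = Phone(persists_counter=True)
for phone in (five_c, six):
    for _ in range(9):
        phone.try_pin()   # nine bad guesses: next delay would be an hour
    phone.reboot()

print(five_c.current_delay(), six.current_delay())   # 0 3600
```

After the reboot, the simulated 5c's delay drops back to zero while the simulated 6 still demands its hour.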
Why it might not work
Obviously, if the phone is set to wipe itself, this technique wouldn’t work, and the FBI would want to know one way or the other before starting.
It ought to be a relatively straightforward matter for Apple to tell, as the phone must store this setting in some accessible way so that it knows what to do when a bad PIN is entered. But given the company's reluctance to assist so far, getting it to help here may be impossible.

Update: It turns out that this loophole was closed in iOS 8.1, so this approach probably wouldn't work after all.
Acid and laserbeams
One risky solution that has been discussed extensively already is to use lasers and acid to remove the outer layers of the iPhone’s processor and read the embedded ID. Once this embedded ID is known, it’s no longer necessary to try to enter the PIN directly on the phone itself.
Instead, it would be possible to simply copy the encrypted storage onto another computer and attempt all the PINs on that other computer.

The iPhone’s lock-outs and wiping would be irrelevant in this scenario.
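Once the hardware ID is off the chip, the attack becomes an ordinary offline key search. The sketch below is illustrative only: the UID value and the PIN are made up, and PBKDF2 stands in for Apple's real derivation, which tangles the PIN with the UID inside the hardware AES engine:

```python
# Offline brute force once the UID has been read off the chip. All
# values here are made up, and PBKDF2 is a stand-in for Apple's real
# derivation, which runs the PIN through the hardware AES engine.
import hashlib

UID = bytes.fromhex("00112233445566778899aabbccddeeff")  # recovered by decap

def derive_key(pin: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), UID, 100)

# In reality each candidate key is tested by trying to unwrap the key
# bag; here we cheat and compare against the key for the "true" PIN.
true_key = derive_key("7351")

def crack() -> str:
    for n in range(10_000):
        candidate = f"{n:04d}"
        if derive_key(candidate) == true_key:
            return candidate

print("PIN:", crack())   # no delays, no wipe limit, just CPU time
```

Because the phone's software never runs, neither the escalating delays nor the 10-attempt wipe can intervene; the only cost is compute time.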
Why it might not work
The risk of this approach is not so much that it won’t work, but that if even a tiny mistake is made, the hardware ID could be irreparably damaged, rendering the stored data permanently inaccessible.
Jailbreak the thing
The iPhone’s built-in lockouts and wipes are unavoidable if running the iPhone’s operating system… assuming that the iPhone works as it is supposed to.
It might not.

The code that the iPhone runs to enter DFU mode, load a RAM image, verify its signature, and then boot the image is small, and it should be simple and quite bullet-proof. However, it’s not impossible that this code, which Apple calls SecureROM, contains bugs.
Sometimes these bugs can enable DFU mode (or the closely related recovery mode) to run an image without verifying its signature first.
There are perhaps six known historic flaws in SecureROM that have enabled jailbreakers to bypass the signature check in one way or another.

These bugs are particularly appealing to jailbreakers, because SecureROM is baked into hardware, and so the bugs cannot be fixed once they are in the wild: Apple has to update the hardware to address them.

Exploitable bugs have been found in the way SecureROM loads the image, verifies the signature, and communicates over USB, and in all cases they have enabled devices to boot unsigned firmware.
If a seventh exploitable SecureROM flaw could be found, this would enable jailbreakers to run their own custom firmwares on iPhones.

That would give the FBI the power to do what it needs to do: it could build the custom firmware it needs and use it to brute force attack the PIN.
Some critics of the government’s demand have suggested that a government agency—probably the NSA—might already know of such a flaw, arguing that the case against Apple is not because of a genuine need to have Apple sign a custom firmware but merely to give cover for their own jailbreak.
Why it might not work
Of course, the difficulty with this approach is that it’s also possible that no such flaw exists, or that even if it does exist, nobody knows what it is.

Given the desirability of this kind of flaw—it can't be fixed through any operating system update—jailbreakers have certainly looked, but thus far they've come up empty-handed.

As such, this may all be hypothetical.

Ask Apple to sign an FBI-developed firmware
Apple doesn’t want to develop a firmware to circumvent its own security measures, saying that this level of assistance goes far beyond what is required by law.

The FBI, however, can’t develop its own firmware because of the digital signature requirements.
But perhaps there is a middle ground.

Apple, when developing its own firmwares, does not require each test firmware to be signed.
Instead, the company has development handsets that have the signature restriction removed from SecureROM and hence can run any firmware.

These are in many ways equivalent to the development units that game console manufacturers sell to game developers; they allow the developers to load their games to test and debug them without requiring those games to be signed and validated by the console manufacturer each time.
Unlike the console makers, Apple doesn't distribute these development phones.
It might not even be able to, as it may not have the necessary FCC certification.

But they nonetheless exist.
In principle, Apple could lend one of these devices to the FBI so that the FBI would then be responsible for developing the firmware.

This might require the FBI to do the work on-site at Cupertino or within a Faraday cage to avoid FCC compliance concerns, but one way or another this should be possible. Once it had a finished product, Apple could sign it.
If the company was truly concerned with how the signed firmware might be used, it might even run the firmware itself and discard it after use.
This would relieve Apple of the burden of creating the firmware, and it would arguably weaken Apple's First Amendment argument against being compelled to assist. While source code is undoubtedly expressive and protected by the First Amendment, it seems harder to argue that a purely mechanical transformation, such as stamping a file with a digital signature, should be covered by the same protection.
Why it might not work
Apple may very well persist in saying no, and the courts may agree.

Stop the phone from wiping its encryption keys
The way the iPhone handles encryption keys is a little more complex than outlined above.

The encryption key derived from the PIN combined with the hardware ID isn’t used to encrypt the entire disk directly.
If it were, changing the PIN would force the entire disk to be re-encrypted, which would be tiresome to say the least.
Instead, this derived key is used to encrypt a second key, and that key is used to encrypt the disk.

That way, changing the PIN only requires re-encryption of the second key.
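The shape of that two-level scheme is easy to sketch. The code below is a stand-in, not Apple's design: real iOS uses AES key wrapping, whereas here a SHA-256-derived pad plays the same structural role of locking the never-changing volume key under the PIN-derived key:

```python
# Stand-in sketch of the two-level key hierarchy. Real iOS wraps keys
# with AES; here a SHA-256-derived pad plays the same structural role.
import hashlib
import os

def kdf(pin: str, uid: bytes) -> bytes:
    return hashlib.sha256(uid + pin.encode()).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

uid = os.urandom(32)         # burned into the processor at manufacture
volume_key = os.urandom(32)  # actually encrypts the disk; never changes

wrapped = xor(volume_key, kdf("1234", uid))      # what sits on flash

# A PIN change unwraps with the old PIN and re-wraps with the new one;
# the disk, encrypted under volume_key, is untouched.
recovered = xor(wrapped, kdf("1234", uid))
rewrapped = xor(recovered, kdf("9999", uid))

assert recovered == volume_key
assert xor(rewrapped, kdf("9999", uid)) == volume_key
```

Re-wrapping after a PIN change touches 32 bytes; the gigabytes encrypted under `volume_key` never need to be rewritten.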

The second key is itself stored on the iPhone’s flash storage.
Normal flash storage is awkward to securely erase, due to wear leveling.

Flash supports only a limited number of write cycles, so to preserve its life, flash controllers spread the writes across all the chips. Overwriting a file on a flash drive may not actually overwrite the file but instead write the new file contents to a different location on the flash drive, potentially leaving the old file’s contents unaltered.
This makes it a bad place to store encryption keys that you want to be able to delete.
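A toy flash controller makes the problem concrete; this is a deliberate caricature of wear leveling, not any real controller's logic:

```python
# Deliberately naive wear-leveling controller: every logical write goes
# to a fresh physical block, so "overwritten" data lingers in the flash.

class Flash:
    def __init__(self, n_blocks: int):
        self.blocks = [None] * n_blocks
        self.mapping = {}        # logical block number -> physical block
        self.next_free = 0

    def write(self, logical: int, data: bytes):
        self.mapping[logical] = self.next_free   # never overwrite in place
        self.blocks[self.next_free] = data
        self.next_free += 1

    def read(self, logical: int) -> bytes:
        return self.blocks[self.mapping[logical]]

flash = Flash(8)
flash.write(0, b"secret key")
flash.write(0, b"\x00" * 10)           # "erase" by overwriting with zeros

assert flash.read(0) == b"\x00" * 10   # the filesystem sees zeros...
assert b"secret key" in flash.blocks   # ...but the key is still there
```

The filesystem reads back zeros and believes the secret is gone, but the original contents are still sitting in a retired physical block.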

Apple’s solution to this problem is to set aside a special area of flash that is handled specially.

This area isn’t part of the normal filesystem and doesn’t undergo wear leveling at all.
If it’s erased, it really is erased, with no possibility of recovery.

This special section is called effaceable storage. When the iPhone wipes itself, whether due to bad PIN entry, a remote wipe request for a managed phone, or the built-in reset feature, this effaceable storage area is the one that gets obliterated.
Apart from that special handling, however, the effaceable area should be readable and writeable just like regular flash memory, which means that in principle a backup can be made and safely squirreled away.
If the iPhone then overwrites it after 10 bad PIN attempts, it can be restored from this backup, and that should enable a further 10 attempts.

This process could be repeated indefinitely.
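The whole desolder-and-restore loop can be simulated in miniature. Everything here is a model rather than Apple's implementation: the effaceable area is reduced to 32 random bytes, and re-flashing the backup stands in for rewriting the chip:

```python
# Simulation of the desolder-and-restore attack (a model, not Apple's
# implementation): the wipe destroys only the effaceable area, so
# re-flashing a pre-attack image of it buys ten fresh guesses per round.
import os

class Phone5c:
    def __init__(self, pin: str):
        self.pin = pin
        self.effaceable = os.urandom(32)   # stands in for the wrapped keys
        self.failed = 0

    def try_pin(self, guess: str) -> bool:
        if self.effaceable is None:
            raise RuntimeError("keys wiped")
        if guess == self.pin:
            return True
        self.failed += 1
        if self.failed >= 10:
            self.effaceable = None         # the 10-strike wipe
        return False

phone = Phone5c(pin="4231")
backup = phone.effaceable                  # image read off the desoldered chip

found = None
for n in range(10_000):
    if phone.try_pin(f"{n:04d}"):
        found = f"{n:04d}"
        break
    if phone.effaceable is None:           # wiped: re-flash the backup image
        phone.effaceable = backup
        phone.failed = 0

print("PIN recovered:", found)
```

Each wipe costs the attacker only a restore, so the ten-attempt limit never bites.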
This video from a Shenzhen market shows a similar process in action (we came at it via 9to5Mac after seeing a tweet in February and further discussion in March). Here, a 16GB iPhone has its flash chip desoldered and put into a flash reader.

A full image of that flash is made, including the all-important effaceable area.
In this case, the chip is then replaced with a 128GB chip, and the image restored, with all its encryption and data intact.

The process for the FBI’s purposes would simply use the same chip every time.
By restoring every time the encryption keys get destroyed, the FBI could—slowly—perform its brute force attack.
It would probably want to install a socket of some kind rather than repeatedly soldering and desoldering the chip, but the process should be mechanical and straightforward, albeit desperately boring.
A more exotic possibility would be to put some kind of intermediate controller between the iPhone and its flash chip that permitted read instructions but blocked all attempts to write or erase data. Hardware write blockers are already routinely used in other forensic applications to prevent modifications to SATA, SCSI, and USB disks that are being used as evidence, and there’s no reason why such a thing could not be developed for the flash chips themselves.

This would allow the erase/restore process to be skipped, requiring the phone to be simply rebooted every few attempts.
Why it might not work
The working assumption is that the iPhone's processor has no non-volatile storage of its own: either it simply doesn't remember that it is supposed to have wiped its encryption keys, and thus will offer another ten attempts if the effaceable storage area is restored, or, even if it does remember, it doesn't care.

This is probably a reasonable assumption; the A6 processor used in the iPhone 5c doesn’t appear to have any non-volatile storage of its own, and allowing restoration means that even a securely wiped phone can be straightforwardly restored from backup by connecting it to iTunes.
For newer iPhones, that’s less clear.

Apple implies that the A7 processor—the first to include the “Secure Enclave” function—does have some form of non-volatile storage of its own. On the A6 processor and below, the time delay between PIN attempts resets every time the phone is rebooted. On the A7 and above, it does not; the Secure Enclave somehow remembers that there has been some number of bad PIN attempts earlier on.

Apple also vaguely describes the Secure Enclave as having an “anti-replay counter” for data that is “saved to the file system.” It’s not impossible that this is also used to protect the effaceable storage in some way, allowing the phone to detect that it has been tampered with.

Even so, full restoration is still likely to be possible.
There is also some risk to disassembling the phone, but if the process is reliable enough for Shenzhen markets, the FBI ought to be able to manage it reliably enough.
This last technique in particular should be quite robust.

There’s no doubt that Apple’s assistance would help a great deal; creating a firmware to allow brute-forcing the PIN would be faster and lower risk than any method that requires disassembly.

But if the FBI is truly desperate to bypass the PIN lockout and potential disk wipe, there do appear to be options available to it that don’t require Apple to develop the firmware.