File System Crypto

From The iPhone Wiki
Revision as of 22:04, 9 June 2015 by Http (initial article, copied directly from Jonathan Zdziarski's blog (with permission), includes Andrey Belenko's corrections - feel free to rewrite, add graphics, etc.)

Here’s how iOS file system / PIN encryption works, as Jonathan Zdziarski understands it.

Block 0 of the NAND is used as effaceable storage, and a series of encryption “lockers” are stored on it. This is the portion that gets wiped when a device is erased, as it is the base of the key hierarchy. These lockers are encrypted with a hardware key that is derived from a unique hardware ID fused into the secure space of the chip (the Secure Enclave on newer devices). Only the hardware AES routines have access to this key, and there is no known way to extract it without deconstructing the chip.
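
To make that concrete, here is a minimal Swift/CryptoKit sketch of the locker idea. It assumes a software key can stand in for the fused hardware key, which is not true on a real device (that key is only usable by the AES engine and never visible to software); hardwareUID, lockerContents and wrappedLocker are hypothetical names.

 import CryptoKit
 import Foundation

 // Hypothetical stand-in for the fused device key; on real hardware this key
 // never leaves the AES engine and cannot be read out by software.
 let hardwareUID = SymmetricKey(size: .bits256)

 // The secret kept in one locker, e.g. the key that protects the system keybag.
 let lockerContents = SymmetricKey(size: .bits256)

 // Wrapping the locker: only something able to use hardwareUID can recover it.
 let wrappedLocker = try! AES.GCM.seal(
     lockerContents.withUnsafeBytes { Data($0) },
     using: hardwareUID)

 // Erasing the device only has to destroy the wrapped lockers in block 0;
 // without hardwareUID the ciphertext is useless, so everything built on top
 // of those lockers becomes unrecoverable.
 let recovered = try! AES.GCM.open(wrappedLocker, using: hardwareUID)
 assert(recovered == lockerContents.withUnsafeBytes { Data($0) })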

One specific locker, called BAGI, contains an encryption key that encrypts what’s called the system keybag. The keybag contains a number of encryption “class keys” that ultimately protect files in the file system; they’re locked and unlocked at different times, depending on user activity. This lets developers choose whether files should be locked whenever the device is locked, stay accessible after the user first enters their PIN, and so on. Every file on the file system has its own random file key, and that key is encrypted with a class key from the keybag. The keybag keys are themselves encrypted with a combination of the key in the BAGI locker and the user’s PIN.
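
The developer-facing end of those class keys is the data protection API. The write options and file attributes below are real iOS API; the path and payload are only illustrative:

 import Foundation

 let url = URL(fileURLWithPath: NSTemporaryDirectory() + "token.bin") // illustrative path
 let secret = Data("account token".utf8)

 // Class A ("complete protection"): the file key is wrapped with a class key
 // that is evicted whenever the device locks.
 try! secret.write(to: url, options: .completeFileProtection)

 // Class C: readable any time after the first unlock following boot, e.g. for
 // data a background process needs while the phone sits locked in a pocket.
 try! FileManager.default.setAttributes(
     [.protectionKey: FileProtectionType.completeUntilFirstUserAuthentication],
     ofItemAtPath: url.path)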

There’s another locker in the NAND containing what Apple calls the class 4 key, and what we call the Dkey. The Dkey is not encrypted with the user’s PIN, and in previous versions of iOS (earlier than 8) it was used as the foundation for encrypting any files that were not specifically protected with “data protection”. Most of the file system at the time used the Dkey instead of a class key, by design. Because the PIN wasn’t involved in the crypto (as it is with the class keys in the keybag), anyone with root-level access (such as Apple) could easily open the Dkey locker and therefore decrypt the vast majority of the file system, which used it for encryption. Up until iOS 8, the only files protected with the PIN were those with data protection explicitly enabled, which did not include the majority of Apple’s own files storing personal data. In iOS 8, Apple finally pulled the rest of the file system out of the Dkey locker, and now virtually the entire file system uses class keys from the keybag that *are* protected with the user’s PIN. Hardware-accelerated AES has made fast encryption and decryption of the entire file system technologically possible since the 3GS; nevertheless, for no valid reason other than its own design decisions, Apple chose not to properly encrypt the file system until iOS 8.
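
Here is a rough Swift/CryptoKit sketch of that difference. It is not Apple’s actual key derivation (Apple entangles the passcode with the hardware key inside the AES engine), and bagiKey, dkey and classKeyUnwrapKey are hypothetical names, but it shows why root access alone was enough for Dkey-protected data and is not enough for PIN-entangled class keys:

 import CryptoKit
 import Foundation

 let bagiKey = SymmetricKey(size: .bits256)   // stand-in for device-held key material
 let pin     = Data("4321".utf8)              // the user's PIN

 // Pre-iOS 8 style (Dkey / class 4): derivable with no PIN at all, so anyone
 // with root code execution can recreate it and decrypt everything under it.
 let dkey = HKDF<SHA256>.deriveKey(inputKeyMaterial: bagiKey,
                                   info: Data("Dkey".utf8),
                                   outputByteCount: 32)

 // PIN-entangled class key: the unwrapping key cannot be recreated without the
 // PIN, so root access alone buys nothing; the passcode must be guessed.
 let classKeyUnwrapKey = HKDF<SHA256>.deriveKey(inputKeyMaterial: bagiKey,
                                                salt: pin,
                                                info: Data("class A".utf8),
                                                outputByteCount: 32)
 _ = (dkey, classKeyUnwrapKey)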

In order to deduce the PIN in iOS 8 (and thereby access the file system), you need to iterate through all 10,000 possibilities. This takes about 20 minutes with root code execution on the device, and can be done in the kernel without triggering a wipe after too many failed attempts. Because newer devices’ boot loaders have been stripped down and several vulnerabilities have been addressed, getting root execution is no longer feasible (as it was on the iPhone 4 and earlier) unless you are the rare owner of a low-level 0day, or you have Apple’s signing keys (to sign your own RAM disk). Jailbreaking a device removes a number of security mechanisms, which is what allows certain forensics tools to boot unsigned code and crack the PIN; however, the exploits typically used in jailbreaks require that the user first authenticate with their PIN, which is why law enforcement can’t just jailbreak a seized phone. There have, however, been recent documents released in the Snowden corpus suggesting that the NSA was attempting to target developers through Xcode, and possibly Apple itself. If the NSA is in possession of Apple’s signing keys, they’d be able to sign and boot a RAM disk of their own (without a jailbreak), or use private 0days to obtain root execution privileges on the device.
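
A sketch of what that brute force amounts to, with unlockKeybag as a hypothetical stand-in for trying a derived key against the PIN-protected class keys (the real derivation is entangled with the hardware key, which is why the guessing has to run on the device itself):

 import CryptoKit
 import Foundation

 // Hypothetical: returns true if the derived key successfully unwraps the
 // PIN-protected class keys in the system keybag.
 func unlockKeybag(with candidate: SymmetricKey) -> Bool {
     return false   // placeholder
 }

 // Only 10,000 candidates exist for a 4-digit PIN. At roughly 0.12 s of
 // key-derivation work per guess, the whole space takes about 20 minutes,
 // and running below the OS means no escalating delays and no wipe counter.
 for guess in 0..<10_000 {
     let pin = String(format: "%04d", guess)
     let candidate = HKDF<SHA256>.deriveKey(
         inputKeyMaterial: SymmetricKey(data: Data(pin.utf8)),
         info: Data("passcode key".utf8),
         outputByteCount: 32)
     if unlockKeybag(with: candidate) {
         print("PIN found: \(pin)")
         break
     }
 }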

This is one reason it is very important to use a complex passcode. 10,000 iterations only takes about 20 minutes, but a very long passcode could take years, decades, or longer to brute force. Thanks to the fingerprint reader in newer devices, combined with Apple’s 24-hour timeout and other protections, users can benefit from a strong, complex passphrase without the inconvenience of having to type it in very often. It also keeps video surveillance from capturing your PIN, since you’re rarely typing it in.
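
For a rough sense of the scaling, assuming the same ~0.12 seconds of on-device derivation per guess as the sketch above (an assumed figure, chosen only to match the ~20-minute number for 10,000 PINs):

 import Foundation

 let secondsPerGuess = 0.12
 let alphabet = 62.0   // a-z, A-Z, 0-9

 // Average time to brute force a random alphanumeric passcode of each length.
 for length in [6, 8, 10] {
     let combinations = pow(alphabet, Double(length))
     let years = combinations * secondsPerGuess / 2 / (3600 * 24 * 365)
     print(String(format: "%d characters: ~%.0f years on average", length, years))
 }
 // A 4-digit PIN averages about 10 minutes at this rate; 8 random alphanumeric
 // characters already average on the order of 400,000 years.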

In spite of all this great file system encryption goodness, PIN/passcode protection is only designed to provide encryption for data at rest. If the NSA were targeting you and had a 0day to gain remote code execution on your device, then whatever lockers were unlocked at a given time could easily be exploited for data. A simple program could even be injected that waits for the crypto to become unlocked and then harvests the data back to a C&C server when that happens. Code execution is tricky to obtain, but given Apple’s patch history, it is a very real threat that PINs and passcodes won’t offer protection against. Such a remote exploit is rare, but definitely plausible; again, Apple’s patch history is proof that such vulnerabilities have been found recently.

NOTE: There are open-source proof-of-concept exploits (POCs) that obtain remote code execution on iOS 7.

Additionally, certain parts of the file system encryption can be unlocked using the escrowbag (a kind of backup keybag) included with an iTunes pair record. If your device is seized at an airport along with your laptop, for example, the pair record on your desktop could be used to access data on your device. This is much more involved than it used to be, however, as iOS 7 and lower had a number of encryption backdoors that would allow someone to bypass the backup encryption on the device. Even though those “diagnostic services” (cough) have been closed, it’s still possible to decrypt and harvest most third-party application data with that pair record, so long as the device has not been power cycled since the PIN/passcode was last typed in. This is why most agencies now keep devices powered on while they’re transported back to forensics.

iOS 8 devices can be upgraded to beta versions, which re-enable these encryption backdoors, so if the investigating agent obtains the user’s PIN or passcode, they could potentially dump all information from the device, even if backup encryption is enabled. Fortunately, upgrading to a beta at least requires a reboot… users who are already running Apple public betas should be aware that these encryption backdoors can be accessed WITHOUT a reboot, meaning a forensics investigator only needs a pair record to access that data on your device. Fortunately, Apple has at least shut down wireless access to the most critical parts of that data. You can better protect these interfaces by pair-locking your device, so that even while unlocked it cannot create any new pairing relationships. I have instructions on my blog for doing this.

Also of note, the fingerprint reader shuts down after 24 hours of inactivity, so compelling a user’s fingerprint will do no good unless the court system is streamlined to grant such warrants quickly. Even this, however, is easy to thwart if the user sets an obscure finger as the only authenticated finger; after a few failed attempts with an index finger or thumb, for example, the reader also shuts down.

So the current state of encryption is this: a four-digit PIN is in no way NSA-proof, and neither is your device. If you’re targeted by them, the NSA likely has 0days they could use to remotely backdoor your device, and if they don’t, I’d put my money on their ability to sign their own root code to run on the device and brute force a PIN. A passcode will protect your data at rest much better than a PIN; however, nothing will protect you from remote code injected into your device without your knowledge. Due to the way Apple has designed its key hierarchy, offline attacks are infeasible, unless you’re a really bad person and the government is willing to invest serious time and money into deconstructing the chip on your device to extract the hardware key. Whether or not that’s yet possible is presently unknown; rumors suggest “sometimes”.

If you are actually targeted by the NSA, however, chances are your iPhone wouldn’t be the most vulnerable device, and data could be leaked from your desktop and other devices instead. There are also warrants to access your iCloud data, including iCloud backups, which contain an obscene amount of data that most people underestimate.

In terms of a non-NSA/CIA agency, things are very different. If they’ve seized your locked device and either do not have your desktop pairing records or your phone was shut down, then unless you’ve jailbroken it, the capabilities of the forensics tools available to these agencies are far more limited. A four-digit PIN can still be deduced by dusting latent prints off the screen, by video surveillance, or with tools such as IP-BOX, which shuts down the device between attempts, before the failed attempts can be flushed to disk. (I am not convinced that Apple has completely fixed this vulnerability.) Outside of that, there’s not much they can do to get into the device itself, especially if you’ve set a complex passcode. A complex passcode is far more secure, and no law enforcement agency I’m aware of has any tools capable of brute forcing it, or even attacking it, on a non-jailbroken phone.

None of this accounts for the beat-him-with-a-wrench-until-he-talks approach, or court-compelled passwords. Just because your data is secure doesn’t mean they can’t make your life a living hell.

