In all fairness, General Michael Hayden, former head of the NSA, actually disagrees with FBI Director James Comey and sides with Apple. The reason is fascinating.
Apple’s formal statement is here.
Zetter – Wired:
The news this week that a magistrate ordered Apple to help the FBI hack an iPhone used by one of the San Bernardino shooter suspects has polarized the nation—and also generated some misinformation.
Those who support the government say Apple has cooperated in the past to unlock dozens of phones in other cases—so why can’t it help the FBI unlock this one?
But this isn’t about unlocking a phone; rather, it’s about ordering Apple to create a new software tool to eliminate specific security protections the company built into its phone software to protect customer data. Opponents of the court’s decision say this is no different than the controversial backdoor the FBI has been trying to force Apple and other companies to build into their software—except in this case, it’s an after-market backdoor to be used selectively on phones the government is investigating.
The stakes in the case are high because it draws a target on Apple and other companies embroiled in the ongoing encryption/backdoor debate that has been swirling in Silicon Valley and on Capitol Hill for the last two years. Briefly, the government wants a way to access data on gadgets, even when those devices use secure encryption to keep it private.
Apple specifically introduced security features in 2014 to ensure that it would not be able to unlock customer phones and decrypt the data on them; but it turns out it overlooked a loophole in those security features that the government is now trying to exploit. The loophole is not about Apple unlocking the phone but about making it easier for the FBI to attempt to unlock it on its own. If the controversy over the San Bernardino phone causes Apple to take further steps to close that loophole so that it can’t assist the FBI in this way in the future, it could be seen by Capitol Hill as excessive obstinacy and obstruction. And that could be the thing that causes lawmakers to finally step in with federal legislation that prevents Apple and other companies from locking the government out of devices.
If the FBI is successful in forcing Apple to comply with its request, it would also set a precedent for other countries to follow and ask Apple to provide their authorities with the same software tool.
In the interest of clarifying the facts and correcting some misinformation, we’ve pulled together a summary of the issues at hand.
What Kind of Phone Are We Talking About?
The phone in question is an iPhone 5c running the iOS9 version of Apple’s software. The phone is owned by the San Bernardino Department of Public Health, which gave it to Syed Rizwan Farook, the shooter suspect, to use for work.
What Is the Issue?
Farook created a password to lock his phone, and due to security features built into the software on his device, the FBI can’t unlock the phone and access the data on it using the method it wants to use—a brute-force password-guessing technique wherein they enter different passcodes repeatedly until they guess the right one—without running the risk that the device will lock them out permanently.
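To make the idea concrete, here is a toy sketch of that brute-force loop in Python. The `try_passcode` function is hypothetical, a stand-in for whatever mechanism actually submits a guess to the device; the protections described below are exactly what makes this loop impractical against a real phone.

```python
from itertools import product
from string import digits

def brute_force_four_digit(try_passcode):
    """Try every 4-digit passcode (0000-9999) until one unlocks."""
    for combo in product(digits, repeat=4):
        guess = "".join(combo)
        if try_passcode(guess):
            return guess
    return None  # keyspace exhausted without a match

# Example against a stand-in "device" that accepts one hard-coded code:
if __name__ == "__main__":
    print(brute_force_four_digit(lambda g: g == "7345"))  # -> '7345'
```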
How Would It Do That?
Apple’s operating system uses two factors to secure and decrypt data on the phone: the password the user chooses and a unique 256-bit AES secret key that’s embedded in the phone when it’s manufactured. As cryptographer Matthew Green explains in a blog post, the user’s password gets “tangled” with the secret key to create a passcode key that both secures and unlocks data on the device. When the user enters the correct password, the phone performs a calculation that combines these two codes and if the result is the correct passcode, the device and data are unlocked.
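As a rough illustration of that “tangling,” consider the conceptual sketch below. Apple’s real derivation runs inside the phone’s hardware AES engine and is not this algorithm; PBKDF2 and the parameters here are stand-ins. The point is the shape: neither the user’s password nor the device’s embedded key alone is enough to produce the key that protects the data, and because the device key never leaves the phone, guesses have to be run on the device itself.

```python
import hashlib
import os

# Stand-in for the unique 256-bit secret fused into the phone at
# manufacture; in reality it never leaves the hardware.
DEVICE_UID_KEY = os.urandom(32)

def derive_passcode_key(password: str) -> bytes:
    """'Tangle' the user's password with the device-unique secret.

    PBKDF2 is used purely for illustration; the iteration count is an
    arbitrary work factor, not Apple's actual derivation.
    """
    return hashlib.pbkdf2_hmac(
        "sha256", password.encode(), DEVICE_UID_KEY, 100_000
    )
```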
To prevent someone from brute-forcing the password, the device has a user-enabled function that limits the number of guesses someone can try before the passcode key gets erased. Although the data remains on the device, it cannot be decrypted and therefore becomes permanently inaccessible. The number of password tries allowed before this happens is unclear. Apple says on its web site that the data becomes inaccessible after six failed password attempts. The government’s motion to the court (.pdf) says it happens after 10 failed guesses.
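The logic amounts to a simple counter wrapped around the key check. A minimal sketch, with the attempt limit left as a parameter since, as noted, the exact number is disputed:

```python
class PasscodeGuard:
    """Illustrative auto-erase logic: too many failures destroys the key."""

    def __init__(self, passcode_key: bytes, max_attempts: int = 10):
        self._key = passcode_key
        self._failures = 0
        self._max_attempts = max_attempts

    def attempt(self, derived_key: bytes) -> bool:
        if self._key is None:
            raise RuntimeError("passcode key erased; data is unrecoverable")
        if derived_key == self._key:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self._max_attempts:
            # The encrypted data stays on the device, but without the
            # passcode key it can never be decrypted.
            self._key = None
        return False
```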
The government says it does not know for certain if Farook’s device has the auto-erase feature enabled, but notes in its motion that San Bernardino County gave the device to Farook with it enabled, and the most recent backup of data from his phone to iCloud “showed the function turned on.”
A reasonable person might ask why, if the phone was backing data up to iCloud, the government can’t simply get everything it needs from iCloud instead of breaking into the phone. The government did obtain some data backed up to iCloud from the phone, but authorities allege in their court document that Farook may have disabled iCloud backups at some point. They obtained data backed up to iCloud a month before the shootings, but none closer to the date of the shooting, when they say he is most likely to have used the phone to coordinate the attack.
Is This Auto-Erase the Only Security Protection Apple Has in Place?
No. In addition to the auto-erase function, there’s another protection against brute-force attacks: time delays. Each time a password is entered on the phone, it takes about 80 milliseconds for the system to process that password and determine if it’s correct. This helps prevent someone from quickly entering a new password to try again, because they can only guess a password every 80 milliseconds. This might not seem like a lot of time, but according to Dan Guido, CEO of Trail of Bits, a company that does extensive consulting on iOS security, it can be prohibitively long depending on the length of the password.
“In terms of cracking passwords, you usually want to crack or attempt to crack hundreds or thousands of them per second. And with 80 milliseconds, you really can only crack eight or nine per second. That’s incredibly slow,” he said in a call to reporters this week.
With a four-digit passcode, he says, there are only about 10,000 different combinations a password-cracker has to try. But with a simple six-digit passcode, there are about one million different combinations a password cracker would have to try to guess the correct one—Apple says it would take more than five-and-a-half years to try all combinations of a six-character alpha-numeric password. The iOS9 software, which appears to be the software on the San Bernardino phone, asks you to create a six-digit password by default, though you can change this requirement to four digits if you want a shorter one.
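Those figures are easy to check. A back-of-the-envelope calculation at one guess per 80 milliseconds, assuming the six-character figure refers to lowercase letters plus digits (36 symbols):

```python
SECONDS_PER_GUESS = 0.08          # one attempt per 80 milliseconds
SECONDS_PER_YEAR = 31_557_600

for label, keyspace in [
    ("4-digit passcode", 10**4),                # ~13 minutes to exhaust
    ("6-digit passcode", 10**6),                # ~22 hours
    ("6-char lowercase alpha-numeric", 36**6),  # ~5.5 years
]:
    seconds = keyspace * SECONDS_PER_GUESS
    print(f"{label}: {keyspace:,} combinations, "
          f"{seconds / SECONDS_PER_YEAR:.2f} years worst case")
```

The last case works out to roughly 5.5 years, which matches Apple’s figure.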
Later models of phones use a different chip than the iPhone 5c and have what’s called a “secure enclave” that adds even more time delays to the password-guessing process. Guido describes the secure enclave as a “separate computer inside the iPhone that brokers access to encryption keys” increasing the security of those keys.
With the secure enclave, after each wrong password guess, the amount of time you have to wait before trying another password grows with each try; by the ninth failed password you have to wait an hour before you can enter a tenth password. The government mentioned this in its motion to the court, as if the San Bernardino phone has this added delay. But the iPhone 5c does not have a secure enclave, so the delay would really only be the usual 80 milliseconds in this case.
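A sketch of what such an escalating schedule looks like. Only the one-hour wait at the ninth failure comes from the text above; the intermediate values are illustrative.

```python
# Failed-attempt count -> seconds to wait before the next try.
ESCALATING_DELAYS = {
    1: 0, 2: 0, 3: 0, 4: 0,
    5: 60,          # 1 minute
    6: 5 * 60,      # 5 minutes
    7: 15 * 60,     # illustrative value
    8: 15 * 60,     # illustrative value
    9: 60 * 60,     # 1 hour before a tenth guess is allowed
}

def delay_after(failures: int) -> int:
    """Seconds to wait after the given number of consecutive failures."""
    return ESCALATING_DELAYS.get(failures, 60 * 60)
```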
Why None of This Is an Issue With Older iPhones
With older versions of Apple’s phone operating system—that is, phones using software prior to iOS8—Apple has the ability to bypass the user’s passcode to unlock the device. It has done so in dozens of cases over the years, pursuant to a court order. But beginning with iOS8, Apple changed this so that it can no longer bypass the user’s passcode.
According to the motion filed by the government in the San Bernardino case, the phone in question is using a later version of Apple’s operating system—which appears to be iOS9. We’re basing this on a statement in the motion that reads: “While Apple has publicized that it has written the software differently with respect to iPhones such as the SUBJECT DEVICE with operating system (“iOS”)9, Apple yet retains the capacity to provide the assistance sought herein that may enable the government to access the SUBJECT DEVICE pursuant to the search warrant.”
The government is referring to the changes that Apple initially made with iOS8, which exist in iOS9 as well. Apple released iOS9 in September 2015, three months before the San Bernardino attacks occurred, so it’s very possible this is indeed the version installed on the San Bernardino phone.
What Does the Government Want?
A lot of people have misconstrued the government’s request and believe it asked the court to order Apple to unlock the phone, as Apple has done in many cases before. But as noted, the particular operating system installed on this phone does not allow Apple to bypass the passcode and unlock the phone. So the government wants to try brute-forcing the password without having the system auto-erase the decryption key and without the additional time delays. To do this, it wants Apple to create a special version of its operating system, a crippled version of the firmware that essentially eliminates the brute-forcing protections, and install it on the San Bernardino phone. It also wants Apple to make it possible to enter password guesses electronically rather than through the touchscreen, so that the FBI can run a password-cracking script that races through the password guesses automatically. And it wants Apple to design this crippled software to be loaded into memory instead of onto disk, so that the data on the phone remains forensically sound and won’t be altered.
Note that even after Apple does all of this, the phone will still be locked unless the government’s brute-forcing operation succeeds in guessing the password. And if Farook opted for a complex alpha-numeric password instead of the default six-digit passcode, the FBI might never be able to crack it even with everything it has asked Apple to do.
Apple CEO Tim Cook described the government’s request as “asking Apple to hack our own users and undermine decades of security advancements that protect our customers—including tens of millions of American citizens—from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.”
What Exactly Is the Loophole You Said the Government Is Exploiting?
The loophole is the fact that Apple even has the ability to load crippled firmware onto a device like this without requiring the user to approve it first, the way ordinary software updates do. If this required user approval, Apple would not be able to do what the government is requesting.
How Doable Is All of This?
Guido says the government’s request is completely doable and reasonable.
“They have to make a couple of modifications. They have to make it so that the operating system boots inside of a RAM disk…[and] they need to delete a bunch of code—there’s a lot of code that protects the passcode that they just need to trash,” he said.
Making it possible for the government to test passwords with a script instead of typing them in would take a little more effort, he says. “[T]hat would require a little bit of extra development time, but again totally possible. Apple can load a new kernel driver that allows you to plug something in over the Thunderbolt port… It wouldn’t be trivial but it wouldn’t be massive.”
Could This Same Technique Be Used to Undermine Newer, More Secure Phones?
There has been some debate online about whether Apple would be able to do this for later phones that have newer chips and the secure enclave. It’s an important question because these are the phones that most users will have in the next one or two years as they replace their old phones. Though the secure enclave has additional security features, Guido says that Apple could indeed also write crippled firmware for the secure enclave that achieves exactly what the FBI is asking for in the San Bernardino case.
“It is absolutely within the realm of possibility for Apple themselves to tamper with a lot of the functionality of the secure enclave. They can’t read the secure private keys out of it, but they can eliminate things like the passcode delay,” he said. “That means the solution that they might implement for the 5c would not port over directly to the 5s, the 6 or the 6s, but they could create a separate solution for [these] that includes basically crippled firmware for the secure enclave.”
If Apple eliminates the added time delays that the secure enclave introduces, then such phones would only have the standard 80-millisecond delay that older phones have.
“It requires more work to do so with the secure enclave. You have to develop more software; you have to test it a lot better,” he said. “There may be some other considerations that Apple has to work around. [But] as far as I can tell, if you issue a software update to the secure enclave, you can eliminate the passcode delay and you can eliminate the other device-erase [security feature]. And once both of those are gone, you can query for passcodes as fast as 80 milliseconds per request.”
What Hope Is There for Your Privacy?
You can create a strong alpha-numeric password for your device that would make brute-forcing it essentially infeasible for the FBI or anyone else. “If you have letters and numbers and it’s six, seven or eight digits long, then the potential combinations there are really too large for anyone to brute-force,” Guido said.
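Extending the earlier back-of-the-envelope math to mixed-case letters and digits (a 62-symbol alphabet, an assumption on our part) shows why: even at the bare 80-millisecond delay, the keyspace quickly becomes astronomical.

```python
SECONDS_PER_GUESS = 0.08
SECONDS_PER_YEAR = 31_557_600

for length in (6, 7, 8):
    combos = 62 ** length  # upper- and lowercase letters plus digits
    years = combos * SECONDS_PER_GUESS / SECONDS_PER_YEAR
    print(f"{length} characters: {combos:.2e} combinations "
          f"≈ {years:,.0f} years worst case")
```

Six characters already takes on the order of 140 years to exhaust; eight characters pushes past half a million.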
And What Can Apple Do Going Forward?
Guido says Apple could and should make changes to its system so that what the FBI is asking it to do can’t be done in future models. “There are changes that Apple can make to the secure enclave to further secure their phones,” he said. “For instance, they may be able to require some kind of user confirmation, before that firmware gets updated, by entering their PIN code … or they could burn the secure enclave into the chip as read-only memory and lose the ability to update it [entirely].”
These changes would prevent Apple, in the future, from having the ability either to upload crippled firmware to a device without the phone owner’s approval or to upload new firmware to the secure enclave at all.
“There’s a couple of different options that they have; I think all of them, though, are going to require either a new major version of iOS or new chips on the actual phones,” Guido said. “But for the moment, what you have to fall back on is that it takes 80 milliseconds to try every single password guess. And if you have a complex enough password then you’re safe.”
Is the Ability to Upload Crippled Firmware a Vulnerability Apple Should Have Foreseen?
Guido says no.
“It wasn’t until very recently that companies had to consider: What does it look like if we attack our own customers? What does it look like if we strip out and remove the security mitigations we put in specifically to protect customers?”
He adds: “Apple did all the right things to make sure the iPhone is safe from remote intruders, or people trying to break into the iPhone.… But certainly after today, technology vendors need to consider that they might be the adversary they’re trying to protect their customers from. And that’s quite a big shift.” (Great job on this, Kim)