Cook wrote that the backdoor, which removes limitations on how often an attacker can incorrectly guess an iPhone passcode, would set a dangerous precedent and “would have the potential to unlock any iPhone in someone’s physical possession,” even though in this instance, the FBI is seeking to unlock a single iPhone belonging to one of the killers in a 14-victim mass shooting spree in San Bernardino, California, in December.
It’s true that ordering Apple to develop the backdoor will fundamentally undermine iPhone security, as Cook and other digital security advocates have argued. But it’s possible for individual iPhone users to protect themselves from government snooping by setting strong passcodes on their phones — passcodes the FBI would not be able to unlock even if it gets its iPhone backdoor.
The technical details of how the iPhone encrypts data, and how the FBI might circumvent this protection, are complex and convoluted, and are being thoroughly explored elsewhere on the internet. What I’m going to focus on here is how ordinary iPhone users can protect themselves.
The short version: If you’re worried about governments trying to access your phone, set your iPhone up with a random, 11-digit numeric passcode. What follows is an explanation of why that will protect you and how to actually do it.
If it sounds outlandish to worry about government agents trying to crack into your phone, consider that when you travel internationally, agents at the airport or other border crossings can seize, search, and temporarily retain your digital devices — even without any grounds for suspicion. And while a local police officer can’t search your iPhone without a warrant, cops have used their own digital devices to get search warrants within 15 minutes, as a Supreme Court opinion recently noted.
The most obvious way to try to crack into your iPhone, and what the FBI is trying to do in the San Bernardino case, is simply to run through every possible passcode until the correct one is discovered and the phone is unlocked. This is known as a “brute force” attack.
For example, let’s say you set a six-digit passcode on your iPhone. There are 10 possibilities for each digit in a numbers-based passcode, and so there are 10⁶, or 1 million, possible combinations for a six-digit passcode as a whole. It is trivial for a computer to generate all of these possible codes. The difficulty comes in trying to test them.
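To make the idea concrete, here’s a minimal brute-force loop sketched in Python. It’s an illustration only, not how the FBI’s tooling works; the try_passcode function is a hypothetical stand-in for whatever mechanism actually submits a guess to the phone:

def brute_force(try_passcode):
    # Walk through all 1 million six-digit codes, from "000000" to "999999".
    for n in range(10**6):
        guess = str(n).zfill(6)
        if try_passcode(guess):
            return guess  # stop as soon as a guess unlocks the phone
    return None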
One obstacle to testing all possible passcodes is that the iPhone intentionally slows down after you guess wrong a few times. An attacker can try four incorrect passcodes before she’s forced to wait one minute. If she continues to guess wrong, the time delay increases to five minutes, 15 minutes, and finally one hour. There’s even a setting to erase all data on the iPhone after 10 wrong guesses.
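To get a feel for how much those delays matter, here’s a toy model of the schedule just described (on a real iPhone, the phone itself enforces these waits, not the attacker’s script):

def delay_minutes(failed_attempts):
    # No delay for the first four wrong guesses, then 1, 5, 15, and
    # finally 60 minutes for each additional failure, as described above.
    if failed_attempts <= 4:
        return 0
    schedule = [1, 5, 15]
    position = failed_attempts - 5
    return schedule[position] if position < len(schedule) else 60

# Total forced wait across the first 10 wrong guesses -- the point at
# which the optional erase-data setting would wipe the phone:
print(sum(delay_minutes(i) for i in range(1, 11)))  # 201 minutes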
This is where the FBI’s requested backdoor comes into play. The FBI is demanding that Apple create a special version of the iPhone’s operating system, iOS, that removes the time delays and ignores the data erasure setting. The FBI could install this malicious software on the San Bernardino killer’s iPhone, brute force the passcode, unlock the phone, and access all of its data. And that process could hypothetically be repeated on anyone else’s iPhone.
(There’s also speculation that the government could make Apple alter the operation of a piece of iPhone hardware known as the Secure Enclave; for the purposes of this article, I assume the protections offered by this hardware, which would slow an attacker down even more, are not in place.)
Even if the FBI gets its way and can clear away iPhone safeguards against passcode guessing, it faces another obstacle, one that should help keep it from cracking passcodes of, say, 11 digits: It can only test potential passcodes for your iPhone using the iPhone itself; the FBI can’t use a supercomputer or a cluster of iPhones to speed up the guessing process. That’s because iPhone models, at least as far back as May 2012, have come with a Unique ID (UID) embedded in the device hardware. Each iPhone has a different UID fused to the phone, and, by design, no one can read it and copy it to another computer. The iPhone can only be unlocked when the owner’s passcode is combined with the UID to derive an encryption key.
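Here’s a rough sketch of that design in Python. It is not Apple’s actual construction (Apple tangles the passcode with the UID using a dedicated hardware AES engine), but it shows why a key derived from both the passcode and a device-bound secret can only be brute-forced on the device itself:

import hashlib

# Hypothetical stand-in for the UID: a secret fused into the hardware
# that software on the phone can use but can never read out or copy.
DEVICE_UID = b"\x7f" * 32  # illustrative value only

def derive_key(passcode):
    # Tie the encryption key to both the passcode and the device UID.
    # The iteration count stands in for the ~80 milliseconds of work
    # the real hardware performs per guess.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 200_000)

# Without the UID, an attacker can't compute derive_key("123456") on a
# supercomputer -- every guess has to run on the phone that holds it.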
So the FBI is stuck using your iPhone to test passcodes. And it turns out that your iPhone is kind of slow at that: iPhones intentionally encrypt data in such a way that they must spend about 80 milliseconds doing the math needed to test a passcode, according to Apple. That limits them to testing 12.5 passcode guesses per second, which means that guessing a six-digit passcode would take, at most, just over 22 hours.
You can calculate the time for that task simply by dividing the 1 million possible six-digit passcodes by 12.5 guesses per second. That’s 80,000 seconds, or roughly 1,333 minutes, or just over 22 hours. But the attacker doesn’t have to try each passcode; she can stop when she finds one that successfully unlocks the device. On average, that will happen about halfway through, after roughly 11 hours.
But the FBI would be happy to spend mere hours cracking your iPhone. What if you use a longer passcode? Here’s how long the FBI would need (a short script after this list reproduces the math):
- seven-digit passcodes will take up to 9.2 days, and on average 4.6 days, to crack
- eight-digit passcodes will take up to three months, and on average 46 days, to crack
- nine-digit passcodes will take up to 2.5 years, and on average 1.2 years, to crack
- 10-digit passcodes will take up to 25 years, and on average 12.6 years, to crack
- 11-digit passcodes will take up to 253 years, and on average 127 years, to crack
- 12-digit passcodes will take up to 2,536 years, and on average 1,268 years, to crack
- 13-digit passcodes will take up to 25,367 years, and on average 12,683 years, to crack
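The arithmetic behind these figures is easy to reproduce yourself. Here’s a short script that recomputes them, assuming the same rate of 12.5 guesses per second:

GUESSES_PER_SECOND = 12.5
SECONDS_PER_DAY = 60 * 60 * 24

for digits in range(6, 14):
    worst_days = 10**digits / GUESSES_PER_SECOND / SECONDS_PER_DAY
    print("%2d digits: up to %.1f days (%.1f years), %.1f days on average"
          % (digits, worst_days, worst_days / 365, worst_days / 2))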
Nerd tip: If you’re using a Mac or Linux, you can securely generate a random 11-digit passcode by opening the Terminal app and typing this command:
python -c 'from random import SystemRandom as r; print(str(r().randint(0, 10**11 - 1)).zfill(11))'
(The zfill(11) pads the number with leading zeros when necessary, so the result is always exactly 11 digits long.)
It’s also important to note that we’re assuming the FBI, or some other government agency, has not found a flaw in Apple’s security architecture that would allow them to test passcodes on their own computers or at a rate faster than 80 milliseconds per passcode.
Once you’ve created a new 11-digit passcode, you can start using it by opening the Settings app, selecting “Touch ID & Passcode,” and entering your old passcode if prompted. If you already have a passcode, tap “Change passcode” and enter your old one; if you’re setting a passcode for the first time, tap “Turn passcode on.”
Then, in either case, tap “Passcode options,” select “Custom numeric code,” and enter your new passcode.
Here are a few final tips to make this long-passcode thing work better:
- Within the “Touch ID & Passcode” settings screen, make sure to turn on the Erase Data setting to erase all data on your iPhone after 10 failed passcode attempts.
- Make sure you don’t forget your passcode, or you’ll lose access to all of the data on your iPhone.
- Don’t use Touch ID to unlock your phone. Your attacker doesn’t need to guess your passcode if she can push your finger onto the home button to unlock it instead. (At least one court has ruled that while the police cannot compel you to disclose your passcode, they can compel you to use your fingerprint to unlock your smartphone.)
- Don’t use iCloud backups. Your attacker doesn’t need to guess your passcode if she can get a copy of all the same data from Apple’s server, where it’s no longer protected by your passcode.
- Do make local backups to your computer using iTunes, especially if you are worried about forgetting your iPhone passcode. You can encrypt the backups, too.
This article was originally published in The Intercept.