Why Apple Opposing the FBI's Request Is Not About Privacy

On Friday, Apple formally declared its intent to fight the court order requiring it to weaken the security of an iPhone 5C tied to the San Bernardino attack. It’s not a surprise, as Apple has been quite vocal about its opposition to the FBI’s request, but now the legal fight is official, and the media circus is truly about to begin.

Over the past week, tech media outlets took this news story under its wing, declaring support for Apple’s defiance with near-unanimous approval. Yet I feel like it’s all for the wrong reasons. This last week there has been an ongoing outcry about government overreach and the death of privacy, but none of that holds up if you look at the actual facts of this case. While I, too, support Apple’s decision here, it’s because I feel the FBI’s request is unfair to Apple. There are no privacy implications.

At least there weren’t when this Apple vs. FBI snafu first became public. Now that the tech press has taken over, I feel like it has become a privacy issue. The problem is, it’s entirely possible that Apple may lose this case. And if they lose when the entire nation believes privacy is on the line, the loss will have privacy implications, even though the case itself really doesn’t.

What the Hell Am I Talking About?

This all started with an attack by what may or may not be a terrorism-motivated group of individuals. The actual details of that attack and the investigation don’t really matter here, and I’d caution people against taking the “terrorism” part of this particular case too seriously. It shouldn’t matter whether this has to do with terrorists, an enemy invasion, or a speeding ticket. The only reason it does today is because of the Emperor Palpatine-war-powers-like overreach that came along with the Patriot Act.

FBI Director James Comey

Regardless, the FBI is investigating the case. Despite FBI Director James Comey going on record to say that the evidence shows the attackers were only inspired by foreign groups, not connected with them, the Bureau is still taking every step to determine whether there was contact with any other terrorist cells. And it’s pretty easy to see why, considering the couple managed to obliterate their own PC hard drives, personal phones, and pretty much every other piece of digital storage they had. You don’t do that if you don’t have something to hide.

But one device survived their digital holocaust: the work iPhone of the male attacker, Syed Farook. He was an employee of the County of San Bernardino, specifically the Health Department, which was hosting the party that was attacked. He also had a bright blue work-issued iPhone 5C, which was still functional and not at all smashed.

Syed Farook

I would guess it’s unlikely that this guy was stupid enough to make any terrorist booty calls while on the clock with his work iPhone, but the FBI doesn’t really have anything else to go on, so why not, I guess.

Techno Babble

The physical storage of Apple’s more recent iPhones, like the iPhone 5C, is encrypted. The data is, by its very nature, also encrypted, but the point is that the encryption is hardware-based. In addition, the key is generated on the phone when the user sets up their 4-digit PIN (or fingerprint, in the case of the 6-series iPhones). In order for Apple to communicate with the device, whether pushing app downloads or updates or uploading iCloud data, the company uses an RSA-token-based system, using your Apple ID to authenticate your phone with Apple’s servers.

By using separate methods of user authentication, Apple ensures that it never receives the key that decrypts the hardware encryption on the iPhone. This is why the FBI can’t just pull out the flash storage and hook it up to something: they can’t decrypt the data without that key, which also happens to be a key that only a dead man knows. But encryption is only as secure as its weakest link.
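To make that concrete, here is a minimal sketch of the idea that the key only ever exists on the phone. This is not Apple’s actual scheme; the real design entangles the passcode with a device-unique hardware key inside the phone’s crypto engine, and DEVICE_UID and derive_key below are just illustrative stand-ins:

    import hashlib
    import os

    # Stand-in for the device-unique secret burned into the hardware; it can
    # never be read off the phone, so the derived key can't be recreated elsewhere.
    DEVICE_UID = os.urandom(32)

    def derive_key(pin: str) -> bytes:
        # Key stretching: repeated hashing makes each guess deliberately slow
        # (on the real phone this work is done in hardware).
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 100_000)

    storage_key = derive_key("1234")  # exists only on the device, never at Apple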

iCloud would normally be the weakest link in this equation, since Apple can decrypt iCloud backups stored on its own servers, and most of the data the FBI would want would be backed up to iCloud anyway. We know that Apple has been collaborating with the government on this case for a while, with Tim Cook even stating that they’ve done everything within their power to provide the information the FBI requires. Presumably, this includes iCloud data.

But the last iCloud backup Mr. Farook made was on October 19th. And in order to even know that, and to access the data that had already been backed up, the county needed to reset the password on the county-owned Apple ID. But by resetting the Apple ID, they also eliminated any chance of getting the device to upload a new backup. So there’s potentially several weeks’ worth of un-backed-up data on that phone that can only be reached by breaking the encryption.

The next weakest link is the 4-digit passcode, of which there are only 10,000 possible combinations. With a brute-force tool, the FBI could simply guess every possible combination until it got it right, which wouldn’t take long. But there are a few problems with this approach:

  1. The iPhone is set up to automatically erase its contents after 10 unsuccessful attempts.
  2. The iPhone extends the delay before another passcode can be entered after every unsuccessful attempt.

10 times is a lot less than 10,000. So, the FBI asked Apple to help.
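For anyone who wants to see those two protections in one place, here is a minimal sketch of the policy a brute-force tool runs into. The delay values are illustrative rather than Apple’s exact schedule, and check_passcode stands in for the phone’s real hardware check:

    import time

    MAX_FAILED = 10  # after this many misses the phone erases its own keys
    # Illustrative escalating delays (seconds); Apple's real schedule differs.
    DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]

    def brute_force(check_passcode):
        failures = 0
        for guess in (f"{n:04d}" for n in range(10_000)):
            if failures >= MAX_FAILED:
                raise RuntimeError("device wiped after 10 failed attempts")
            if check_passcode(guess):
                return guess
            time.sleep(DELAYS[min(failures, len(DELAYS) - 1)])
            failures += 1
        return None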

An Unconventional Request

Apple has been collaborating with the County of San Bernardino and the FBI consistently throughout this case, as publicly available court documents now show. Starting only three days after the attack, Apple began providing the information it had available, including text message, contact, e-mail, and iCloud data.

I’d also expect that Apple helped the county reset the Apple ID and provided vital iCloud data afterwards, and I’d expect that the FBI’s request to unlock the iPhone came to Apple before it was made public. It’s widely speculated that Apple asked the FBI to keep the request out of the public eye, since Apple actually breaking the hardware encryption of an iPhone would seriously damage its undeserved reputation for security and privacy. But the FBI made its court-mandated demands public, perhaps hoping it could use the power the word “terrorism” holds over the mob mentality to get Apple to do what it wants.

What’s odd is that the actual request is very technically precise, spelling out exactly what Apple needs to do to help the FBI get past the encryption. Many of us in the tech press didn’t even know Apple could do these things, yet there it is, with Apple spinning around and crying foul. I think this is how the FBI got that information:

FBI Dude: “So… if we were to, say, try to use a brute-force tool to break the 4-digit passcode, what would happen?”

Apple Dude: “Uh, it’d wipe itself after 10 tries. Even if that were removed, each entry would take exponentially longer. It starts at the hardware limit of 80 milliseconds. So clearly it’s impossible.”

FBI Dude: “Huh. Is there a way to remove the 10 guess limit?”

Apple Dude: “Heck no, man. Well, okay, we can push updates without unlocking the phone. Theoretically it’s possible to remove the 10-guess limit with an update. But that’d be ridiculous, right? That’d take an entire software team and shit, and nah, that’s impossible.”

FBI Dude: “Hm…” *Goes to court* “Gimme a court order saying Apple needs to push an update that removes the 10 guess limit and keeps the timeframe for guesses at 80 milliseconds.”

Court Dude: “K.”

FBI Dude: *Goes to Apple Dude’s place* “Hey you know the thing you told me would be ridiculous and stuff? Yeah here’s a court order to do it.”

Apple Dude: “God damn it….”

Never give out the answer to a problem if you’re not willing to do it. Anyway, that’s what happened. The FBI got a court order requiring Apple to push an update to the phone that removes the 10-guess limit and keeps the time per passcode guess at 80 milliseconds. That way, the FBI could use a brute-force tool and crack the 4-digit passcode in about 15 minutes.
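That 15-minute figure is just arithmetic. With the wipe and the escalating delays out of the way, each guess costs only the roughly 80 milliseconds the hardware needs to check a passcode, so a quick back-of-the-envelope estimate looks like this (a sketch, not anything from the court filings):

    # Back-of-the-envelope estimate for the configuration the order asks for:
    # no wipe, no escalating delays, just the ~80 ms hardware check per guess.
    guesses = 10_000           # every possible 4-digit passcode
    seconds_per_guess = 0.080  # ~80 ms hardware floor

    worst_case = guesses * seconds_per_guess  # 800 seconds, about 13 minutes
    average_case = worst_case / 2             # 400 seconds, about 7 minutes
    print(f"worst: {worst_case / 60:.1f} min, average: {average_case / 60:.1f} min")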

Why This Isn’t a Privacy Issue

Okay, let’s first break down why people think this is a privacy issue, and why Apple wants people to believe it is:

  1. This manufactured software update could leak out, providing a tool for hackers to break into or modify any pre-iPhone 6 device (and there are a lot of those).
  2. The tool would need to be delivered to the County of San Bernardino or the FBI, and would eventually trickle down to the level of beat cops, which would mean iPhones could no longer secure information from the authorities.
  3. This is essentially akin to asking Apple to create a “backdoor” into its encryption, leading to a slippery slope in which unbreakable encryption becomes effectively illegal and impossible.
  4. It would set a precedent allowing the government to break into the encrypted data on users’ devices without their permission, by requesting that the company who provided the encryption break it.
  5. The FBI has produced a list of 12 other iPhones waiting to be cracked by Apple, proof of the desire for a precedent.

None of this is true.

FIRST, let’s mention the most important piece of information, which is hardly being reported anywhere in the press. In fact, the only way I found out about it at first was by listening to Steve Gibson’s Security Now podcast on TWiT. Which is a ridiculous way to get information. But I’ve since researched it and determined it to be true. Because Apple devices are hardware encrypted, and Apple has to use an RSA token to communicate with the device, all software updates are by necessity customized for each device. If someone got hold of this tool, it could only be installed on this one single iPhone 5C. A hacker would essentially need to break into the device in order to get the information they’d need to modify the code so they could break into it. Catch-22.
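Here is a toy illustration of that point. It is not Apple’s actual signing mechanism (which uses asymmetric signatures and different identifiers); it only shows the shape of a check that binds a signed update to exactly one device:

    import hashlib
    import hmac

    APPLE_SIGNING_KEY = b"stand-in for Apple's private signing key"

    def sign_update(update_blob: bytes, device_id: bytes) -> bytes:
        # The signature covers the device ID, so it is personalized per phone.
        return hmac.new(APPLE_SIGNING_KEY, device_id + update_blob, hashlib.sha256).digest()

    def device_accepts(update_blob: bytes, signature: bytes, my_device_id: bytes) -> bool:
        expected = sign_update(update_blob, my_device_id)
        return hmac.compare_digest(expected, signature)

    blob = b"firmware-with-the-10-guess-limit-removed"
    sig_for_a = sign_update(blob, b"DEVICE-A")
    print(device_accepts(blob, sig_for_a, b"DEVICE-A"))  # True
    print(device_accepts(blob, sig_for_a, b"DEVICE-B"))  # False: can't be replayed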

SECOND, the court order actually specifies that Apple be given the device, that Apple load the software onto it themselves, and that the device then be handed back to the FBI for brute-forcing. In other words, this tool would never be handed over to the government.

THIRD, this is not a backdoor. A backdoor is an inherent weakness built into the encryption itself, essentially like leaving a port open for the government to walk through. That’s not what’s happening here. Apple is being asked, on a one-time basis, to weaken the security of this device so the government can break in through the front door. The biggest security flaw in the iPhone 5C is that, even with all this extensive hardware encryption, everything still rides on a 4-digit passcode. That’s a pretty flawed setup to begin with. I’m assuming this is one of the reasons Apple went with fingerprint-based authentication on the 6-series devices.
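For a sense of scale, here is a rough comparison, reusing the same ~80 ms-per-guess assumption from above, of how much the passcode format alone changes the brute-force workload:

    # Rough comparison: the passcode format, not the hardware encryption,
    # sets the real difficulty of a brute-force attempt at ~80 ms per guess.
    SECONDS_PER_GUESS = 0.080

    for label, keyspace in [
        ("4-digit PIN", 10**4),
        ("6-digit PIN", 10**6),
        ("6-char lowercase+digits", 36**6),
    ]:
        worst_hours = keyspace * SECONDS_PER_GUESS / 3600
        print(f"{label:<24} {keyspace:>13,} codes, ~{worst_hours:,.1f} hours worst case")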

FOURTH, this phone was not owned by the attacker. It was purchased and owned by the County of San Bernardino, and in this case the owner gave Apple permission to do whatever was needed to help the FBI get into it. So this has nothing to do with consent.

So, for the ease of the internet, let’s list these in order:

  1. The requested “tool” will, by necessity, only work on this one device.
  2. The FBI will never be given physical access to the tool.
  3. This is not a backdoor, just an exploit of Apple’s already shitty security.
  4. The phone is owned by the damn government, which gave complete permission.

The Precedent

You might have noticed that I didn’t address the fifth point above. That’s because it’s true, but it doesn’t mean what you think it means. Yes, the FBI wants a precedent here; it would make getting access to those other 12 phones much easier. But what do they want a precedent for? Well, we don’t know anything about those 12 phones other than which jurisdictions they came from.

My guess is that they are all government-owned iPhones that were vital pieces of cases that either haven’t been solved or weren’t solved to the government’s satisfaction. Once again, Apple giving them access to these phones would not be a privacy issue, since it would be the same as if you locked yourself out of your iPhone, went to an Apple Store, somehow proved that you were the owner, and asked them to help you get back into it. If all of those conditions were met, you can bet a Genius Bar employee would try to help you out.

The only difference is that in the case above, you’d also have to specify that you don’t want to lose any data on the device. Most consumers couldn’t demand that, because it’s not technically possible for some low-level retail employee to do the kind of thing the government is asking of Apple.

But it’s important to note that this request is not immoral or inappropriate. If it were possible for Apple to help a user reset their passcode without losing data, it would be reasonable for the owner to ask Apple to do so, as long as they could prove the iPhone belongs to them. It would only be immoral if the iPhone were owned by a different person, and that is not what’s at stake here.

Why Apple Is Bitching

So here is where we come to the heart of the matter. Since Apple declared its intent to fight this request, Google and Microsoft have both thrown their support behind Apple. I believe their fight is the right one, and I believe these companies are doing the right thing. However, it’s not because of privacy; it’s because this order would place an unfair burden on Apple’s resources.

Creating and implementing what the order demands would require the time and work of programmers, QA engineers, technical writers, and so on. Essentially, Apple would have to assemble and pay an entire team dedicated to creating custom software for these government iPhones so the FBI could crack into them. And presumably, the FBI isn’t footing the bill.

I may be a pretty liberal guy, but I also don’t believe a private company should be working for the government without getting paid for its time. If the government were to compensate Apple for that time, that would be another matter. However, I don’t expect that to happen, since it would require congressional approval.

The Danger of Escalation

I fear the damage has already been done. The meager words of this lone man will probably never echo much further than a small set of readers, and this case will be heralded as a definitive battle between Security and Privacy.

Please, no.

It’s not that I disagree with Apple in this case; it’s that I fear Apple is going to lose. That despite all of their resources and all of their support, the evidence is against them. The only argument Apple has to stand on is that the government is placing an unfair burden on its employees’ time, an argument that will probably fall on deaf ears given Apple’s enormous vault of cash and record profits.

This case will almost certainly go to the Supreme Court, which hasn’t exactly been the most privacy-conscious institution lately. Maybe if Obama gets a chance to choose a replacement for Scalia, a liberal shift might give privacy a chance. But given Republican opposition, and given the likelihood of a new Democratic president, I doubt we’ll get a replacement until after this case has been decided. Which, of course, means there is a high likelihood that the Supreme Court will either rule in favor of the FBI or deadlock in a 4-4 split. Should that happen, the lower court’s decision will stand, and I doubt that will be favorable to Apple either.

This is not the case to hold up as Security v. Privacy. It’s like taking a last stand with a rusty sword and a gimp leg. If we’re going to make that case happen, I’d rather the deck be stacked in privacy’s favor.