Yes, The Backdoor That The FBI Is Requesting Can Work On Modern iPhones Too
from the beware dept
So... over the past couple of days, plenty of folks (including us) have reported that the backdoor demanded by the FBI (and currently granted by a magistrate judge) would likely work on the older iPhone model in question, the iPhone 5C, but that it would not work on modern iPhones that have Apple's "Secure Enclave" -- basically a separate chip that stores the key. However, earlier this morning Apple started telling a bunch of people, including reporters, that this is not true. Effectively, they're saying that, yes, the new software could update the Secure Enclave firmware and keep the key intact -- meaning that this backdoor absolutely can be used against modern iPhones. One of the guys who helped design the whole Secure Enclave setup in the first place, John Kelley, has basically said the same thing, admitting that updating the firmware will not delete the key:
@AriX Not true, if Apple can be forced to modify iOS, they can be forced to modify SEP firmware as well. @trailofbits has SEP details wrong
— John Kelley (@JohnHedge) February 17, 2016
@AriX I have no clue where they got the idea that changing SPE firmware will destroy keys. SPE FW is just a signed blob on iOS System Part
— John Kelley (@JohnHedge) February 17, 2016
Apple can update the SE firmware, it does not require the phone passcode, and it does not wipe user data on update. Apple can disable the passcode delay and disable auto erase with a firmware update to the SE. After all, Apple has updated the SE with increased delays between passcode attempts and no phones were wiped.

I've asked some security folks if it's possible that future iPhones could be designed to work the way people thought the Secure Enclave worked, and the basic answer appears to be "that's a fairly difficult problem." People have some ideas of how it might work, but all came back with reasons why it might not. I asked one security expert if there was a way for Apple to build a more secure version that was immune to such an FBI request, and the response was: "I don't know. I sure hope so."
Update: I should add that this backdoor still just makes it easier for the FBI to then try to brute force a user's PIN or passcode. If the user sets a significantly strong passcode, they have a better chance of protecting their data, but that's on the user (and, also, many users likely find it hellishly inconvenient to have a strong passcode on their phone).
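For a sense of scale, here's a rough back-of-envelope sketch of what "brute force the passcode" means once the delays and auto-erase are gone. The ~80ms-per-guess figure comes from Apple's published description of its passcode key-derivation cost; treat all the numbers as illustrative.

```python
import string

# Back-of-envelope brute-force estimates. Apple has said each passcode
# guess costs roughly 80ms of key-derivation work on the device; that
# figure (and everything else here) is illustrative, not exact.
SECONDS_PER_GUESS = 0.08

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Time to try every possible passcode of the given length."""
    return (alphabet_size ** length) * SECONDS_PER_GUESS

print(f"4-digit PIN:         {worst_case_seconds(10, 4) / 60:.0f} minutes")
print(f"6-digit PIN:         {worst_case_seconds(10, 6) / 3600:.1f} hours")
alnum = len(string.ascii_letters + string.digits)  # 62 symbols
years = worst_case_seconds(alnum, 8) / (3600 * 24 * 365)
print(f"8-char alphanumeric: {years:,.0f} years")
```

In other words, with the delay and auto-erase gone, a short numeric PIN falls in minutes, while even a modest alphanumeric passcode pushes the attack well out of reach.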
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: backdoors, crypto, doj, encryption, fbi, going dark, iphone, secure enclave
Companies: apple
Reader Comments
Re: Re: Re:
I'm soooo glad you speak for 99% of Apple's customers. I was a bit worried that they might actually take their privacy seriously and not want the Gov. poking around in their phone whenever they please. Thanks for clearing that up for me.
The terrible truth is that iPhones are already backdoored.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control. -- Apple's Public Statement
If the security bypass can be built, then we should be assuming that it will be built, by the FBI, by China, by someone, and this is hobbled encryption. This is insufficient security.
Imagine if the iPhone in question didn't belong to a dead terrorist, but a live rape victim assaulted by an Apple executive or someone that Apple officials would want to protect. How long would it be before the phone was cracked then?
The answer should be: Millennia. Eons. Forever.
Re: The terrible truth is that iPhones are already backdoored.
That is not completely true. In addition to being able to build the firmware, anyone other than Apple would also have to figure out how to properly sign the firmware so it would actually work on the iPhone. That adds a huge barrier for anyone else.
Re: Re: The terrible truth is that iPhones are already backdoored.
The data isn't smart. The critical algo on which Apple depended is not tied to Apple or any signature and can be written and applied externally.
Re: Re: Re: The terrible truth is that iPhones are already backdoored.
It's not an emulator, it's a simulator, and I don't think you can use it to attack an actual phone. What did you have in mind?
Emulators vs. Simulators.
Essentially, a team of hackers would break open the iPhone, extract the flash chips and pull the data directly from those chips. They could then revise how the emulator interacts with the data any way they want (e.g. allow unlimited password guesses without delays).
If we were at war and this was the phone of an enemy operative, that's how I'd guess the military would approach it: you don't trust the original unit any more than you have to.
Re: Emulators vs. Simulators.
There are two keys involved: the key encryption key and the data encryption key. The data encryption key is itself encrypted and stored in the secure portion of the CPU - the "trust zone" or "secure enclave". The data encryption key is used to encrypt and decrypt all the "main" information on the phone.
The key encryption key is used to encrypt / decrypt the data encryption key. The key encryption key is generated using several inputs - the PIN and some hardware identifiers. Some of these hardware identifiers are only stored in the trust zone. The boot code calls upon the trust zone and feeds it user input (the PIN code), the trust zone combines that with hardware identifiers, and runs it through some super expensive (in terms of time and RAM) math to recreate the key encryption key. The key encryption key is used to decrypt the data encryption key and some known string of bits. The data encryption key is handed back to the boot code after the known string of bits is verified as decrypted correctly. If the verification fails, the trust zone doesn't hand back anything. 10 wrong attempts and the trust zone wipes the encrypted data encryption key.
So far everyone agrees that it is practically impossible to extract the necessary hardware identifiers and/or the encrypted data encryption key from the trust zone. In short, cloning / emulating / simulating will not work, because necessary components to recreating the key encryption key will not be available. This is why it doesn't matter if the code knows whether or not it's being emulated. All it has to do is attempt to decrypt something known beforehand; if the inputs were wrong (user PIN and / or machine identifiers), the decryption simply fails to decrypt a known value.
Running an old coinop is in no way comparable because those old coinops did not have 1) data that could not be feasibly replicated, and therefore 2) code that used mathematics that would only yield a correct value if run in the correct environment. If someone could clone the trust zone (with all the data), then this whole argument would be moot - the government could take any similar iphone and clone the trust zone (and data) to it.
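For anyone who wants to see the shape of that flow, here's a toy model in Python. Everything in it is illustrative: the names are invented, the real Secure Enclave uses hardware-fused identifiers rather than a random byte string, a far more expensive derivation than PBKDF2, and a real cipher rather than XOR.

```python
import hashlib
import os

# Toy model of the two-key design described above. All names and
# primitives are illustrative stand-ins, not Apple's actual scheme.
HARDWARE_ID = os.urandom(32)        # never leaves the "trust zone"
KNOWN_BITS = b"known-check-string"  # decrypted and verified on each try

def derive_kek(pin: str) -> bytes:
    """Key-encryption key = expensive function of PIN + hardware ID."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), HARDWARE_ID, 50_000)

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Stand-in cipher (symmetric: the same call encrypts and decrypts)."""
    return bytes(d ^ k for d, k in zip(data, key))

# Provisioning: a random data-encryption key, stored only under the KEK.
dek = os.urandom(16)
kek = derive_kek("1234")
vault = {"dek": xor_crypt(kek, dek),
         "check": xor_crypt(kek, KNOWN_BITS),
         "attempts": 0}

def try_unlock(pin: str):
    """Return the DEK on success, None on failure; wipe after 10 misses."""
    if vault["dek"] is None:
        return None                    # already wiped
    candidate = derive_kek(pin)
    if xor_crypt(candidate, vault["check"]) == KNOWN_BITS:
        vault["attempts"] = 0
        return xor_crypt(candidate, vault["dek"])
    vault["attempts"] += 1
    if vault["attempts"] >= 10:
        vault["dek"] = None            # wipe the encrypted DEK
    return None
```

Note how a wrong PIN doesn't produce "wrong data" that an emulator could sift through; it produces a key-encryption key that simply fails to decrypt the known check value, which is the commenter's point about why cloning doesn't help.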
Re: Re: Emulators vs. Simulators.
Thanks!
Re: Re: Emulators vs. Simulators.
Apple may disagree with me.
Doing it this way may speed things up for the Feds.
But the magitech that seals data in the trust zone sounds to me as dubious as moon cheese. I wouldn't trust it to keep a cryptanalysis team out of my phone.
Re: Re: Re: Emulators vs. Simulators.
I don't know if their trust zone is or is not breakable, but the consensus is that if you physically tamper with it to physically "read" the bits, the data retrieval is less than 100%. And of course the process physically destroys the chip. No second chances. And the trust zone is actually part of the main CPU die, so that may or may not complicate matters a bit.
But as far as software / firmware goes, yeah the trust zone is the weak link. This whole article talks about how we were all mistaken in one assumption. The common wisdom would say that the trust zone should wipe the data encryption key if told to reflash itself. But it doesn't. Normal operation wouldn't be affected, because the flash rewriter could feed the data encryption key back to it.
Re: Re: Re: Re: Emulators vs. Simulators.
The process it uses to protect other data makes sense. But yeah, it makes me curious about the tamper resistance tech.
I learn something new every day.
Re: The terrible truth is that iPhones are already backdoored.
Remember, Apple already complied with the legitimate warrant in this case.
So, a good Diceware passphrase is the last defence against APTs
Only something that is "the product of your mind and to which your mind is the sole custodian"
Granted, entering a 14-word diceware passphrase (out of N diceware wordlists) + 6 (or more) randomly selected chars, randomly positioned along the passphrase + 2 (or more) personal pads... that's a major deal breaker.
Nothing less than 300 bits of entropy will do.
The take away is always the same: words are not going anywhere if you care about your privacy.
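For the curious, here's how those bit counts add up, assuming the standard 7776-word Diceware list and 95 printable ASCII characters for the extra symbols (both assumptions are mine; the positional entropy of the inserted characters is ignored, so this slightly undercounts):

```python
import math

# Rough entropy tally for a passphrase like the one described above.
# Assumes a standard 7776-word Diceware list (6^5 entries) and 95
# printable ASCII characters; position entropy is ignored.
WORDLIST = 7776
PRINTABLE = 95

words = 14 * math.log2(WORDLIST)    # ~12.9 bits per word
extras = 6 * math.log2(PRINTABLE)   # ~6.6 bits per random character
print(f"14 words: {words:.0f} bits")
print(f"+6 chars: {extras:.0f} bits")
print(f"total:    {words + extras:.0f} bits")
```

Even this heavyweight scheme lands around 220 bits, which gives a sense of just how demanding a 300-bit bar really is.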
Two words; parallel construction
It just means it already has been, as far as agencies like the NSA are concerned. As the Snowden revelations should have made clear, the only problem with the tin foil hat wearing paranoid conspiracy proclaiming subculture was that they weren't paranoid enough.
Assuming the FBI gets its way, the only thing that will change is that every LEO, government agency, and divorce attorney will now be able to get access to anyone's iPhone contents. Well, that and the data that just happens to fall out of an NSA briefcase and into an LEO/CIA operative's hands will be attributed to this perfectly legal and above-board, strongly tailored, and judicially approved source.
Well that and people who care anything about security will migrate to other platforms.
Re: Two words; parallel construction
"tin foil hat wearing paranoid conspiracy proclaiming"
Also see: observant
Re: Two words; parallel construction
Are they more secure?
Strong Passwords on iOS
It's not really inconvenient at all to have a strong passcode: just create an easy (for you) to remember 15-character passcode, set it, enable TouchID, and you're done. The only time you need to enter the passcode now is at firmware updates and reboots; the rest of the time, your TouchID does the trick. So it still gets to be convenient, while encrypted with a strongish code.
Re: Re: Re: Strong Passwords on iOS
Unless you are talking about Tim Cook.....
Re: Re: Re: Re: Strong Passwords on iOS
Once you're dealing with a group with no oversight and no morals (the mob, ISIL, the NSA...), there are ways to get in.
And for the record, that isn't meant as a solution any more than 1984 is supposed to be a guidebook.
Re: Strong Passwords on iOS
There was a big and growing issue of people being robbed or mugged over their iPhones. You could plug them into a PC, wipe the phone, and it would be as good as new. iPhones have a high resale value. Since iOS 8, you can no longer do that. You need the passcode to get on the phone and wipe it. Otherwise it's worthless. About all it's good for is a paperweight. Not going to get much parting it out.
Apple has had hardware encryption on iPhones since the 3G or 3GS, I believe. Apple has expanded on it every year. The whole Secure Enclave these days for security on the phone, Apple Pay, and Touch ID is all part of Apple's custom A* processors.
So, while this is terrible, it's not quite as bad as it seems.
The argument is "if we are made to crack this phone, all other phones (the 5C) will be compromised", but it's worth bearing in mind that the FBI have stated they do not want possession of the hack-update that will unlock this phone. See here: http://www.bbc.co.uk/news/technology-35601035 So if anything, Apple are at liberty to destroy the hack after it's all unlocked and finished.
It's also worth bearing in mind that several phones have previously been unlocked at the request of a court: http://www.nydailynews.com/news/national/apple-unlocked-70-iphones-refusal-article-1.2536178
What's being asked is, essentially, to customise the iOS source code and compile it for this device only, which, again, it must be stressed, is not an update required to be in the possession of the FBI. If this were not the case I would probably protest against that, but that alone, not the hacking at the behest of a warrant. Because it would be the equivalent of - don't think "masterkey" - giving a copy of all unique keys to everybody's houses for the sake of allowing law enforcement to find one key into a single house. But it is quite possible to be against the former and for the latter, and to suspect that the keybearer is incorrect when he says the handing over of one key will lead to the handover of all keys as some kind of "slippery slope".
Slippery slope fallacies exist too, and this could be one of them.
"Keys" isn't even the right metaphor here anyway. What we are talking about is not encryption, but hacking. It's also not right to say "if someone else has a way in, bad guys will exploit it" - well, does the fact that "good guys" can change their source code for updates on their end also mean a slippery slope where "bad guys" will exploit? In other words, is the fact that source code is updateable at ALL in itself a "backdoor"? It can't be, unless you really stretch it. I mean, criminals break into companies and steal source code all the time, e.g. Half Life 2, but we wouldn't talk about that in the rhetoric of "You SEE? This is why we need end-to-end encry-*cough*, sorry, source code security!" There is no such thing as "end-to-end" source code which is unhackable by a good or bad middleman.
The point about true "end-to-end" encryption is that not even the encryption designer, Apple, can get in. But if Apple can get in here, the "backdoor" is already there, but that's because of source code, not asymmetric keys or anything.
As for this slippery slope, again no court or even government agency would seriously say that because we need to break into this guy's house, therefore all citizens from now on must have no locks on their doors. It wouldn't get far.
Re:
And the next time the FBI want it applied to a phone, they will say:
"You have the hack, and we can get a court order, so be good boys and do it for us and lets avoid all the fuss of using the courts. Oh, and by the way here is a gag order to stop you talking about it."
Re: Re:
I mean, state forces can bully anyone into spying without a warrant, in secrecy, under any kind of pretence. You could therefore say the slippery slope applies to any court order, and do away with prosecutions completely.
Re: Re:
So I don't believe it is honourable to hack a company through its security hole in order to make them aware of that security hole.
However, I am in favour of raising awareness of those security holes (discreetly, so that no one may take advantage of them in the meantime). And ultimately that means Apple need to make it so that they can't hack their way into a phone whatsoever.
If they are still able to hack into the phone, it's a flaw in the initial design. And if you acknowledge that this flaw is unavoidable because that's the nature of coding - that is, someone with the source code can recode and infiltrate - then you have to accept that it's the case regardless, and the FBI are entitled to get Apple to hack this single phone without hacking others.
Asking for one key is not the same as asking for all keys. If it were, we wouldn't be trusting our door locks at all.
Re: Re: Re:
...Unless it's "build us a master key and we'll use it on this one lock." Now you understand the problem with all of this.
All those iPhone users have been misled
Re: All those iPhone users have been misled
Newer iPhones are even harder. I don't see Apple being able to break the encryption. Check out this PDF from Apple on security in iOS 9; pretty interesting how it works.
https://www.apple.com/business/docs/iOS_Security_Guide.pdf
This is of course an older iPhone 5C with iOS 7 and no TouchID. So maybe Apple could get around it, but should they? NO! The Federal Government can't even protect itself from hacks.
Re: Re: All those iPhone users have been misled
My interest is to stimulate discussion. I haven't heard whether Apple ever claimed the encrypted data and its decryption key could not be obtained. It looks like they can be, regardless of what iPhone owners thought.
They did it wrong
If the security hardware can have its firmware updated, all you have is security through obscurity, which is no security at all.
Now that we know the truth, it's time to start the countdown until some hacker exploits the Secure Enclave.
This is a non-issue, isn't it?
* An update must be signed by a secret key secured by Apple
* S/N in HW can be unspoofably interrogated in signed update
If the signed update says "if s/n equals bad-guy-phone open command pipe over wifi and bypass authentication-attempt delays", why does it matter whether the FBI (or anyone else) has it?
This order should change nothing in either legal precedent or user security, and may be performed with a modicum of low-complexity code.
What am I missing here?
-BNAD
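The gate being described can be sketched in a few lines; every name below is invented for illustration, not taken from any real firmware:

```python
# Sketch of the serial-gated update described above. All names are
# hypothetical; this models the commenter's argument, not real firmware.
TARGET_SERIAL = "TARGET-SN-0001"  # the one court-specified device

def apply_update(device_serial: str, signature_valid: bool) -> dict:
    """Security settings a signed, serial-gated update would leave active."""
    if not signature_valid:
        raise ValueError("unsigned update rejected")  # Apple's key required
    if device_serial == TARGET_SERIAL:
        # The contested behavior: protections off for exactly one phone
        return {"attempt_delay": False, "auto_erase": False}
    # Every other device keeps its protections
    return {"attempt_delay": True, "auto_erase": True}
```

The technical gate really is that simple; the objection elsewhere in the thread is legal rather than technical: once the signed tool exists, only Apple's willingness to re-sign it with a different serial number separates one phone from every phone.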
Re: This is a non-issue, isn't it
The FBI is trying to establish the precedent that companies are required to install backdoors on specified devices when the FBI demands them to do so. That is only half a step away from demanding that they install them on all devices.
indirect backdoor
Isn't there a way to prevent those "sneak" updates? That is quite a large breach of security, since all it requires is a single certificate to fool the system into accepting anything.