Yes, The Backdoor That The FBI Is Requesting Can Work On Modern iPhones Too

from the beware dept

So... over the past couple of days, plenty of folks (including us) have reported that the backdoor demanded by the FBI (and currently granted by a magistrate judge) would likely work on the older iPhone model in question, the iPhone 5C, but that it would not work on modern iPhones that have Apple's "Secure Enclave" -- basically a separate coprocessor that stores the key.
Plenty of reports -- including the Robert Graham post that we linked to, and a story by Bruce Schneier -- suggested that an attempt to follow through with the FBI's request in the presence of the Secure Enclave would effectively eliminate the key and make decryption nearly impossible.

However, earlier this morning Apple started telling a bunch of people, including reporters, that this is not true. Effectively they're saying that, yes, the new software could update the Secure Enclave firmware and keep the key intact -- meaning that this backdoor absolutely can be used against modern iPhones. One of the guys who helped design the whole Secure Enclave setup in the first place, John Kelley, has basically said the same thing, admitting that updating the firmware will not delete the key.


A blog post by Dan Guido -- which originally asserted that the Secure Enclave would be wiped on update -- now admits that's not true and, yes, this backdoor likely works on modern iPhones as well:
Apple can update the SE firmware, it does not require the phone passcode, and it does not wipe user data on update. Apple can disable the passcode delay and disable auto erase with a firmware update to the SE. After all, Apple has updated the SE with increased delays between passcode attempts and no phones were wiped.
I've asked some security folks if it's possible that future iPhones could be designed to work the way people thought the Secure Enclave worked, and the basic answer appears to be "that's a fairly difficult problem." People have some ideas of how it might work, but everyone came back with reasons why it might not hold up. I asked one security expert if there was a way for Apple to build a more secure version that was immune to such an FBI request, and the response was: "I don't know. I sure hope so."
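To give a sense of what such a design might look like, here's a purely hypothetical sketch -- an illustration of one idea that gets floated, not anything Apple has described -- in which the key-wrapping key is derived from the passcode, a per-device secret, and a hash of the enclave firmware itself, so that flashing different firmware would render the stored key useless:

    # Hypothetical sketch only: bind the wrapping key to a measurement of the
    # enclave firmware. Names are invented for illustration; a real device
    # would use a slower, memory-hard KDF rather than PBKDF2.
    import hashlib

    def derive_wrapping_key(device_uid: bytes, passcode: str, firmware_image: bytes) -> bytes:
        firmware_measurement = hashlib.sha256(firmware_image).digest()
        return hashlib.pbkdf2_hmac(
            "sha256",
            passcode.encode() + firmware_measurement,
            device_uid,   # salt: a per-device secret that never leaves the chip
            100_000,      # iterations: deliberately slow
        )

    # If anyone -- Apple included -- replaces the firmware, firmware_measurement
    # changes, the derived key changes, and the previously wrapped data key can
    # no longer be unwrapped: the behavior many people assumed the Secure
    # Enclave already had.

The catch, and presumably part of why the experts above call it a hard problem, is that this would also brick the data across legitimate firmware updates unless the key were carefully re-wrapped first.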

Update: I should add that this backdoor still just makes it easier for the FBI to then try to brute force a user's PIN or passcode. If you set a sufficiently strong passcode, you have a better chance of protecting your data, but that's on the user (and many users likely find it hellishly inconvenient to have a strong passcode on their phone).
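To put some numbers on that, here's a quick back-of-the-envelope sketch, assuming the roughly 80 milliseconds per guess that Apple's iOS Security Guide cites for the hardware-entangled key derivation, and assuming the retry delays and auto-erase have been stripped away:

    # Worst-case brute-force time at ~80 ms per passcode guess (the guess rate
    # is bounded by on-device key derivation, so it can't be offloaded).
    SECONDS_PER_GUESS = 0.08

    def worst_case(search_space: int) -> str:
        seconds = search_space * SECONDS_PER_GUESS
        if seconds < 3600:
            return f"{seconds / 60:.0f} minutes"
        if seconds < 86400 * 365:
            return f"{seconds / 3600:.0f} hours"
        return f"{seconds / (86400 * 365):.1f} years"

    print("4-digit PIN:         ", worst_case(10 ** 4))   # ~13 minutes
    print("6-digit PIN:         ", worst_case(10 ** 6))   # ~22 hours
    print("6-char alphanumeric: ", worst_case(36 ** 6))   # ~5.5 years
    print("10-char alphanumeric:", worst_case(36 ** 10))  # ~9 million years

In other words, the backdoor turns a four-digit PIN into a lunch-break problem, while a long alphanumeric passcode remains out of reach -- which is exactly why the burden lands on the user.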


Filed Under: backdoors, crypto, doj, encryption, fbi, going dark, iphone, secure enclave
Companies: apple


Reader Comments



  1. Whatever (profile), 18 Feb 2016 @ 11:05am

    ...and it starts to come out. This is the sort of thing Apple really didn't want anyone to know.


  2. Uriel-238 (profile), 18 Feb 2016 @ 11:12am

    The terrible truth is that iPhones are already backdoored.

    The debate is whether Apple should hand the FBI the key.

    The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control. -- Apple's Public Statement

    If the security bypass can be built, then we should be assuming that it will be built, by the FBI, by China, by someone, and this is hobbled encryption. This is insufficient security.

    Imagine if the iPhone in question didn't belong to a dead terrorist, but a live rape victim assaulted by an Apple executive or someone that Apple officials would want to protect. How long would it be before the phone was cracked then?

    The answer should be: Millennia. Eons. Forever.


  3. Kenpachi, 18 Feb 2016 @ 11:18am

    So, a good Diceware passphrase is the last defence against APTs

    No PIN, no password, no fingerprint ID, no touch pattern, no iris recognition, no any other BS.

    Only something that is "the product of your mind and to which your mind is the sole custodian"

    Granted, entering a 14-word diceware passphrase (out of N diceware wordlists) + 6 (or more) randomly selected chars, randomly positioned along the passphrase, + 2 (or more) personal pads... that's a major deal breaker.

    Nothing less than 300 bits of entropy will do.

    The takeaway is always the same: words are not going anywhere if you care about your privacy.
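    A quick sanity check on those numbers (a rough sketch, assuming the standard 7,776-word diceware list and 95 printable ASCII characters for the extra symbols; the "personal pads" are left out since their entropy can't be judged from the outside):

        # Entropy arithmetic for the passphrase scheme described above.
        from math import log2

        DICEWARE_WORDS = 7776      # 6^5 entries in the standard diceware list
        PRINTABLE_ASCII = 95

        word_bits = 14 * log2(DICEWARE_WORDS)   # ~12.9 bits per word -> ~181 bits
        char_bits = 6 * log2(PRINTABLE_ASCII)   # ~6.6 bits per char  -> ~39 bits

        print(f"~{word_bits + char_bits:.0f} bits total")                           # ~220 bits
        print(f"words needed for 300 bits alone: {300 / log2(DICEWARE_WORDS):.0f}")  # ~23

    So the 14-word recipe lands around 220 bits before the pads - already far beyond anything brute-forceable - while the 300-bit bar quoted above would take roughly 23 diceware words on its own.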


  4. Michael, 18 Feb 2016 @ 11:30am

    Re: The terrible truth is that iPhones are already backdoored.

    If the security bypass can be built, then we should be assuming that it will be built, by the FBI, by China, by someone, and this is hobbled encryption

    That is not completely true. In addition to being able to build the firmware, anyone other than Apple would also have to figure out how to properly sign the firmware so it would actually work on the iPhone. That adds a huge barrier for anyone else.


  5. AJ, 18 Feb 2016 @ 11:33am

    Re:

    The backdoor will only work if Apple creates it. If they do create it, then China, Russia, and pretty much every other government will be next in line for their copy of it. No one will trust Apple again. It will crush Apple into sauce!


  6. Anonymous Coward, 18 Feb 2016 @ 11:36am

    Re: Re: The terrible truth is that iPhones are already backdoored.

    Just how long do you think Apple can keep their key away from the hands of governments? Given the perceived value of such a facility to governments, how many hours will it remain out of government hands?


  7. jilocasin (profile), 18 Feb 2016 @ 11:50am

    Two words; parallel construction

    Two words; parallel construction

    It just means it already has been, as far as agencies like the NSA are concerned. As the Snowden revelations should have made clear, the only problem with the tin foil hat wearing paranoid conspiracy proclaiming subculture was that they weren't paranoid enough.

    Assuming the FBI gets its way, the only thing it will change is that every LEO, government agency, and divorce attorney will now be able to get access to anyone's iPhone contents. Well, that and the data that just happens to fall out of an NSA briefcase and into an LEO/CIA operative's hands will be attributed to this perfectly legal, above-board, narrowly tailored, and judicially approved source.

    Well that and people who care anything about security will migrate to other platforms.


  8. Anonymous Coward, 18 Feb 2016 @ 12:08pm

    Re: The terrible truth is that iPhones are already backdoored.

    No, that answer should be, "Why are these criminals not in jail yet?" as a response to the FBI's request.

    Remember, Apple already complied with the legitimate warrant in this case.


  9. Anonymous Coward, 18 Feb 2016 @ 12:10pm

    Strong Passwords on iOS

    Update: I should add that this backdoor still just makes it easier for the FBI to then try to brute force a user's PIN or passcode. If you set a sufficiently strong passcode, you have a better chance of protecting your data, but that's on the user (and many users likely find it hellishly inconvenient to have a strong passcode on their phone).


    It's not really inconvenient at all to have a strong passcode; just create an easy-for-you-to-remember 15-character passcode, set it, enable TouchID, and you're done. The only time you need to enter the passcode now is for firmware updates and reboots; the rest of the time, your TouchID does the trick. So it still gets to be convenient, while encrypted with a strongish code.


  10. Lord_Unseen (profile), 18 Feb 2016 @ 12:16pm

    Let's make one thing perfectly clear. The FBI already had the means to crack this iPhone. All this backdoor does is make it slightly easier to do. There are software/hardware tools out there that can crack a 4-6 digit PIN, even with the lockouts/erase enabled. It just takes longer. That's really what this is about. The FBI didn't want to take the amount of time it would take to brute force the PIN without Apple's help, so they used the courts to force Apple to backdoor the lockout/secure erase functions, shaving quite a bit of time off the brute force attempt.
    So, while this is terrible, it's not quite as bad as it seems.


  11. Uriel-238 (profile), 18 Feb 2016 @ 12:39pm

    Re: Re: The terrible truth is that iPhones are already backdoored.

    Why would the iPhone emulator used to generate the AES key know it was being unlocked by an iPhone or an emulator? Why would it care if the emulated firmware was signed?

    The data isn't smart. The critical algo on which Apple depended is not tied to Apple or any signature and can be written and applied externally.


  12. Anonymous Coward, 18 Feb 2016 @ 12:40pm

    So since Apple already HAS a backdoor, it comes down to whether we trust the government, and the requirements of a court order, to use it; or Apple, who can use it any time they damned well choose without anyone's knowledge. We don't get to choose "C: neither" (unless we choose not to purchase anything from Apple, and then we're at someone else's mercy).


  13. jameshogg (profile), 18 Feb 2016 @ 12:40pm

    I'm starting to have second thoughts about this particular case. Don't get me wrong, I'm all for civil liberties and "warrant or gtfo", but a warrant was delivered here and I'm beginning to suspect Apple's "masterkey" theory is a bit far-fetched.

    The argument is "if we were made to crack this phone, all other phones (the 5C) will be compromised", but it's worth bearing in mind that the FBI have stated they do not want possession of the hack-update that will unlock this phone. See here: http://www.bbc.co.uk/news/technology-35601035 - so if anything, Apple are at liberty to destroy the hack after it's all unlocked and finished.

    It's also worth bearing in mind that several phones beforehand were hacked open at the request of a court: http://www.nydailynews.com/news/national/apple-unlocked-70-iphones-refusal-article-1.2536178

    What's being asked is, essentially, to customise the iOS source code and compile it for this device only, which, again, it must be stressed, is not an update the FBI is required to possess. If that were not the case I would probably protest, but against that alone, not against the hacking at the behest of a warrant. Because it would be the equivalent of - don't think "masterkey" - giving law enforcement a copy of the unique keys to everybody's houses for the sake of letting it find one key into a single house. But it is quite possible to be against the former and for the latter, and to suspect that the keybearer is incorrect when he says the handing over of one key will lead to the handover of all keys as some kind of "slippery slope".

    Slippery slope fallacies exist too, and this could be one of them.

    "Keys" isn't even the right metaphor here anyway. What we are talking about is not encryption, but hacking. It's also not right to say "if someone else has a way in, bad guys will exploit it" - well, does the fact that "good guys" can change their source code for updates on their end also mean a slippery slope where "bad guys" will exploit it? In other words, is the fact that source code is updateable at ALL in itself a "backdoor"? It can't be, unless you really stretch it. I mean, criminals can break into companies and steal source code all the time, e.g. Half Life 2, but we wouldn't talk about that in the rhetoric of "You SEE? This is why we need end-to-end encry-*cough*, sorry, source code security!" There is no such thing as "end-to-end" source code which is unhackable by a good or bad middleman.

    The point about true "end-to-end" encryption is that not even the encryption designer, Apple, can get in. But if Apple can get in here, the "backdoor" is already there, but that's because of source code, not asymmetric keys or anything.

    As for this slippery slope, again no court or even government agency would seriously say that because we need to break into this guy's house, therefore all citizens from now on must have no locks on their doors. It wouldn't get far.


  14. streetlight (profile), 18 Feb 2016 @ 12:42pm

    All those iPhone users have been misled

    I'm not an iPhone user and pretty ignorant about what's going on here with regard to Apple's ability to undo the encryption on its phones, old or new, but my guess is that a very high percentage (>>99%?) of iPhone purchasers believed that no one, not even Apple, could gain access to the encrypted data on an iPhone that was properly secured. I'm not sure Apple ever said that, but it seems to be the conventional wisdom. It's also possible that a lot of folks didn't care, but for knowledgeable people who use their iPhones for things that ordinary, law abiding folks do, such as banking, retail purchases at brick and mortar stores, etc., such security was very, very important. Apple needs to match that perception or admit it can't be done.


  15. Anonymous Coward, 18 Feb 2016 @ 12:44pm

    Re: Re:

    The backdoor already exists. The only thing apple needs to create is a specific key to it to unlock this phone. And, of course, that all can be changed so it's not possible in a future hardware update. Why would anyone think the data that's stored on a portable electronic device was secure in the first place? 99% of apple's customers don't care about whether the government can get into their phones, they're worried about the spouse they're cheating on doing so.


  16. Anonymous Coward, 18 Feb 2016 @ 12:48pm

    Re: Strong Passwords on iOS

    And then someone just needs your finger or fingerprints, which can be gotten without even a court order!


  17. Anonymous Coward, 18 Feb 2016 @ 12:49pm

    How about I just don't buy a cell phone, and we can see how well the tracking, back doors, and metadata work on that?


  18. Anonymous Coward, 18 Feb 2016 @ 1:32pm

    Re: Re: Strong Passwords on iOS

    As usual, Randall gets it right, even when no one else does: https://xkcd.com/538/


  19. jilocasin (profile), 18 Feb 2016 @ 1:48pm

    Re: Re: Re: Strong Passwords on iOS

    The only problem with the xkcd solution you've provided is that, seeing as the owner is already dead, there's no one left to drug or hit with a $5 wrench.

    Unless you are talking about Tim Cook.....


  20. Anonymous Coward, 18 Feb 2016 @ 2:02pm

    Re: Re: Re: Re: Strong Passwords on iOS

    I suppose I wasn't clear; the implication is grotesque violence in general. Cutting off fingers (even on a corpse) to get past the fingerprint scanner in the more modern phones is what I was thinking. If you look at the comment I was responding to, he was specifically talking about fingerprint readers in modern phones.

    Once you're dealing with a group with no oversight and no morals (the mob, ISIL, the NSA...), there are ways to get in.

    And for the record, that isn't meant as a solution any more than 1981 is supposed to be a guidebook.


  21. AJ, 18 Feb 2016 @ 2:07pm

    Re: Re: Re:

    "99% of apple's customers don't care about whether the government can get into their phones, they're worried about the spouse they're cheating on doing so."

    I'm soooo glad you speak for 99% of Apple's customers. I was a bit worried that they might actually take their privacy seriously and not want the Gov. poking around in their phone whenever they please. Thanks for clearing that up for me.


  22. Anonymous Coward, 18 Feb 2016 @ 2:09pm

    They did it wrong

    The hardware securing the keys should be built in such a way that it is impossible to change its firmware.

    If the security hardware can have its firmware updated, all you have is security through obscurity, which is no security at all.

    Now that we know the truth, it's time to start the countdown until some hacker exploits the Secure Enclave.


  23. Anonymous Coward, 18 Feb 2016 @ 2:18pm

    Re: Re: Re: Re: Re: Strong Passwords on iOS

    1984*


  24. morganwick (profile), 18 Feb 2016 @ 4:03pm

    Re:

    These days, it seems like nothing is far-fetched when it comes to the government getting whatever information they want.


  25. JBDragon, 18 Feb 2016 @ 5:19pm

    Re: Strong Passwords on iOS

    Yeah, I use an 8-digit PIN on my iPhone, but most of the time I'm logging in with TouchID, which is almost as fast as no password at all. Which was the whole point of Apple adding TouchID in the first place: to get people to password lock their phones!!!

    There was a big and growing issue of people being robbed, or mugged, over their iPhones. You could plug them into a PC, wipe the phone, and it would be as good as new. iPhones have a high resale value. Since iOS 8, you can no longer do that. You need the passcode to get on the phone and wipe it. Otherwise it's worthless. About all it's good for is a paperweight. Not going to get much parting it out.

    Apple has had hardware encryption on iPhones since the 3G or 3GS, I believe, and Apple has expanded on it every year. The whole Secure Enclave these days, for security on the phone and Apple Pay and Touch ID, is all part of Apple's custom A-series processors.


  26. JBDragon, 18 Feb 2016 @ 5:21pm

    Re:

    Apple doesn't already have a backdoor. They want Apple to create one to use on this one phone. Which really means that once the FBI gets its hands on that phone, it's a free-for-all for the FBI and CIA to get their hands on it!


  27. JBDragon, 18 Feb 2016 @ 5:28pm

    Re: All those iPhone users have been misled

    You don't seem to know what you're talking about. They are trying to get Apple to create a hack, whether Apple can do it or not. It's only a hack to stop the wipe-after-10-failed-tries feature (if it's turned on), and the delay that grows worse and worse as you enter the wrong passcode. Those make it almost pointless to brute force the phone's 4-digit code, and if it's longer, almost impossible.

    Newer iPhones are even harder. I don't see Apple being able to break the encryption. Check out this PDF from Apple on security in iOS 9. Pretty interesting how it works.

    https://www.apple.com/business/docs/iOS_Security_Guide.pdf

    This is of course an older iPhone 5C with iOS 7 and no TouchID. So maybe Apple could get around it, but should they? NO! The Federal Government can't even protect itself from hacks.


  28. Anonymous Coward, 18 Feb 2016 @ 7:11pm

    Re: Two words; parallel construction

    It's funny how history changes meanings.

    "tin foil hat wearing paranoid conspiracy proclaiming"

    Also see: observant


  29. streetlight (profile), 18 Feb 2016 @ 8:26pm

    Re: Re: All those iPhone users have been misled

    I admitted I was somewhat ignorant of what Apple could do and what folks who bought iPhones thought Apple or anyone else could not do, that is, get at the decrypted data in the phone. According to Mike Masnick's post above, even new phones can be hacked without destroying the data or the encryption key. Updating the phone's OS does not harm the data or the encryption key if done properly. Apple can do this and apparently has done this at least 70 times for law enforcement.

    My interest is to stimulate discussion. I haven't heard whether Apple ever claimed the encrypted data and its decryption key could not be obtained. It looks like they can be, regardless of what iPhone owners thought.


  30. Anonymous Coward, 19 Feb 2016 @ 12:38am

    Re:

    but it's worth bearing in mind that the FBI have stated they do not want possession of the hack-update that will unlock this phone

    And next time the FBI wants it applied to a phone, they will say:
    "You have the hack, and we can get a court order, so be good boys and do it for us and let's avoid all the fuss of using the courts. Oh, and by the way, here is a gag order to stop you talking about it."


  31. jameshogg (profile), 19 Feb 2016 @ 1:07am

    Re: Re:

    Well you could say that could happen after any crime. You wouldn't therefore dismiss the validity of warranted searches on that basis.

    I mean, state forces can bully anyone into spying, without a warrant, in secrecy, under any kind of pretence. You could therefore say the slippery slope would apply to any court order and do away with prosecutions completely.


  32. Bemused Non-Apple Developer (profile), 19 Feb 2016 @ 2:13am

    This is a non-issue, isn't it

    I believe these two things to be true,

    * An update must be signed by a secret key secured by Apple
    * S/N in HW can be unspoofably interrogated in signed update

    If the signed update says "if s/n equals bad-guy-phone open command pipe over wifi and bypass authentication-attempt delays", why does it matter whether the FBI (or anyone else) has it?

    This order should change nothing in either legal precedent or user security, and may be performed with a modicum of low-complexity code.

    What am I missing here?

    -BNAD
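    Roughly what that check could look like, as a sketch (the signing scheme and identifiers below are illustrative stand-ins, not Apple's actual update format): the signature covers both the firmware image and the target serial number, so the image is inert on any other device.

        # Illustrative sketch of a per-device-signed update. HMAC stands in for
        # Apple's real code signing; the serial numbers are made up.
        import hashlib
        import hmac

        SIGNING_KEY = b"stand-in for Apple's signing secret"

        def sign_update(firmware: bytes, target_serial: str) -> bytes:
            # The signature binds the code to exactly one device.
            return hmac.new(SIGNING_KEY, firmware + target_serial.encode(),
                            hashlib.sha256).digest()

        def device_accepts(firmware: bytes, signature: bytes,
                           target_serial: str, my_serial: str) -> bool:
            expected = sign_update(firmware, target_serial)
            return hmac.compare_digest(signature, expected) and target_serial == my_serial

        fw = b"unlock-assist firmware image"
        sig = sign_update(fw, "SERIAL-OF-SUSPECT-PHONE")
        print(device_accepts(fw, sig, "SERIAL-OF-SUSPECT-PHONE", "SERIAL-OF-SUSPECT-PHONE"))  # True
        print(device_accepts(fw, sig, "SERIAL-OF-SUSPECT-PHONE", "ANY-OTHER-PHONE"))          # False

    The replies below argue that the worry is less about the binary itself than about the precedent of being able to demand one.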


  33. jameshogg (profile), 19 Feb 2016 @ 2:41am

    Re: Re:

    And it's worth asking as well: we always talk on this site about the importance of security holes in programs coming to light, precisely so that they can be patched and fixed with greater security in the end. Sometimes not in ways I approve of - I don't exactly think that because someone's front door is left wide open you therefore need to walk into the house and trespass in order to make the point that there's a security hole. In fact local police here in the U.K. did that exact thing and were rightly criticised for it: http://www.huffingtonpost.co.uk/2016/01/26/coventry-police-criticised-over-burglary-patrol-tactic_n_9075722.html

    So I don't believe it is honourable to hack a company through its security hole in order to make them aware of that security hole.

    However, I am in favour of raising awareness of those security holes (discreetly, so that no one may take advantage of them in the meantime). And ultimately that means Apple need to make it so that they can't hack their way into a phone whatsoever.

    If they are still able to hack into the phone, it's a flaw with the initial design in the first place. And if you acknowledge that this flaw is unavoidable because that's the nature of coding - that is, someone with the source code can recode and infiltrate - you have to accept that it's the case regardless, and the FBI are entitled to get Apple to hack this single phone without hacking others.

    Asking for one key is not the same as asking for all keys. If it were, we wouldn't be trusting our door locks at all.


  34. Anonymous Coward, 19 Feb 2016 @ 3:00am

    Re: This is a non-issue, isn't it

    What am I missing here?

    The FBI is trying to establish the precedent that companies are required to install backdoors on specified devices when the FBI demands that they do so. That is only half a step away from demanding that they install them on all devices.


  35. Anonymous Coward, 19 Feb 2016 @ 4:44am

    Another thing to consider is that they may not actually need to brute force the passcode, and could just claim they did. With all the information the various three letter agencies suck up, who is to say that they don't already have what they want, but need a justification to use it without revealing where they got it from? We've already seen several stories on Techdirt about such scenarios actually occurring.


  36. nasch (profile), 19 Feb 2016 @ 6:48am

    Re: Re: Re: The terrible truth is that iPhones are already backdoored.

    Why would the iPhone emulator used to generate the AES key know it was being unlocked by an iPhone or an emulator?

    It's not an emulator, it's a simulator, and I don't think you can use it to attack an actual phone. What did you have in mind?


  37. nasch (profile), 19 Feb 2016 @ 6:51am

    Re: Two words; parallel construction

    Well that and people who care anything about security will migrate to other platforms.

    Are they more secure?


  38. Anonymous Coward, 19 Feb 2016 @ 7:16am

    Re:

    If they already have the passcode, it is easy to disguise where they got it, by claiming for instance that it was written down on something in his wallet.


  39. Wyrm (profile), 19 Feb 2016 @ 9:04am

    indirect backdoor

    So there already is a backdoor in iOS, and that is the simple fact that iOS can be updated without approval from the signed-in user. From there, anything is possible, even if the "anything" has yet to be built.
    Isn't there a way to prevent those "sneak" updates? That is quite a large breach of security, since all it requires is a single certificate to fool the system into accepting anything.


  40. Uriel-238 (profile), 19 Feb 2016 @ 11:09am

    Emulators vs. Simulators.

    When you want to run an old coin-op video game chipset on a PC, you run an emulator which emulates / simulates / imitates the motherboard that those games used.

    Essentially, a team of hackers would break open the iPhone, extract the flash chips and pull the data directly from those chips. They could then revise how the emulator interacts with the data any way they want (e.g. allow unlimited password guesses without delays).

    If we were at war and this was the phone of an enemy operative, that's what I'd guess the military approach would be: you don't trust the original unit any more than you have to.


  41. beltorak (profile), 19 Feb 2016 @ 1:24pm

    Re: Emulators vs. Simulators.

    Ok; here's a quick (probably a little oversimplified) rundown. [I'm only really speaking to the modern implementation, there are likely details that are wrong about the specific phone in question.] It doesn't really matter that it knows or doesn't know that it is running in an emulator.

    There's two keys involved. The key encryption key, and the data encryption key. The data encryption key is itself encrypted and stored in the secure portion of the CPU - the "trust zone" or "secure enclave". The data encryption key is used to encrypt and decrypt all the "main" information on the phone.

    The key encryption key is used to encrypt / decrypt the data encryption key. The key encryption key is generated using several inputs - the PIN and some hardware identifiers. Some of these hardware identifiers are only stored in the trust zone. The boot code calls upon the trust zone and feeds it user input (the PIN code), the trust zone combines that with hardware identifiers, and runs it through some super expensive (in terms of time and RAM) math to recreate the key encryption key. The key encryption key is used to decrypt the data encryption key and some known string of bits. The data encryption key is handed back to the boot code after the known string of bits is verified as decrypted correctly. If the verification fails, the trust zone doesn't hand back anything. 10 wrong attempts and the trust zone wipes the encrypted data encryption key.

    So far everyone agrees that it is practically impossible to extract the necessary hardware identifiers and/or the encrypted data encryption key from the trust zone. In short, cloning / emulating / simulating will not work, because necessary components to recreating the key encryption key will not be available. This is why it doesn't matter if the code knows whether or not it's being emulated. All it has to do is attempt to decrypt something known beforehand; if the inputs were wrong (user PIN and / or machine identifiers), the decryption simply fails to decrypt a known value.

    Running an old coinop is in no way comparable because those old coinops did not have 1) data that could not be feasibly replicated, and therefore 2) code that used mathematics that would only yield a correct value if run in the correct environment. If someone could clone the trust zone (with all the data), then this whole argument would be moot - the government could take any similar iphone and clone the trust zone (and data) to it.
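    A compressed toy version of that flow, just to make the moving parts visible (a sketch of the description above, not Apple's implementation: XOR stands in for real key wrapping and PBKDF2 for whatever KDF the enclave actually uses):

        import hashlib
        import hmac
        import secrets

        class ToyEnclave:
            MAGIC = b"known-plaintext-check"   # the "known string of bits"

            def __init__(self, pin: str):
                self.hardware_uid = secrets.token_bytes(32)  # never leaves the chip
                self.attempts_left = 10
                dek = secrets.token_bytes(32)                # data encryption key
                kek = self._derive_kek(pin)                  # key encryption key
                # XOR stands in for wrapping the DEK under the KEK with AES.
                self.wrapped_dek = bytes(a ^ b for a, b in zip(dek, kek))
                self.check = hmac.new(kek, self.MAGIC, hashlib.sha256).digest()

            def _derive_kek(self, pin: str) -> bytes:
                # Deliberately slow, and entangled with the hardware UID, so the
                # guessing can't be moved onto an external cracking rig.
                return hashlib.pbkdf2_hmac("sha256", pin.encode(),
                                           self.hardware_uid, 200_000)

            def unlock(self, pin: str):
                if self.attempts_left == 0:
                    raise RuntimeError("wiped")
                kek = self._derive_kek(pin)
                attempt = hmac.new(kek, self.MAGIC, hashlib.sha256).digest()
                if hmac.compare_digest(self.check, attempt):
                    self.attempts_left = 10
                    return bytes(a ^ b for a, b in zip(self.wrapped_dek, kek))
                self.attempts_left -= 1
                if self.attempts_left == 0:
                    self.wrapped_dek = None      # 10 strikes: wipe the wrapped DEK
                return None

    The piece this week's news changed is the counter: attempts_left and the escalating delays live in replaceable firmware, while hardware_uid does not - so Apple can apparently neuter the former without touching the latter.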


  42. nasch (profile), 19 Feb 2016 @ 1:44pm

    Re: Re: Emulators vs. Simulators.

    Ok; here's a quick (probably a little oversimplified) rundown.

    Thanks!


  43. Anonymous Coward, 19 Feb 2016 @ 2:42pm

    Re: Re: Re:

    "Asking for one key is not the same as asking for all keys. If it were, we wouldn't be trusting our door locks at all."

    ...Unless it's "build us a master key and we'll use it on this one lock." Now you understand the problem with all of this.


  44. Uriel-238 (profile), 19 Feb 2016 @ 3:22pm

    Re: Re: Emulators vs. Simulators.

    Essentially, I suspect the trust zone is easy to crack.

    Apple may disagree with me.

    Doing it this way may speed things up for the Feds.

    But the magitech that seals data in the trust zone sounds to me as dubious as moon cheese. I wouldn't trust it to keep a CRYPTAN team out of my phone.


  45. beltorak (profile), 19 Feb 2016 @ 4:17pm

    Re: Re: Re: Emulators vs. Simulators.

    I would love to see that. I've seen a few presentations (DEFCON and the like) about defeating shoddy TPM chips. It's quite fascinating. The trust zone is essentially a TPM chip, so I'd imagine the same sorts of hardware tamper resistances have been built into it (which is why you can't just pop the cover off and hook up a jtag).

    I don't know if their trust zone is or is not breakable, but the consensus is that if you physically tamper with it to physically "read" the bits, the data retrieval is less than 100%. And of course the process physically destroys the chip. No second chances. And the trust zone is actually part of the main CPU die, so that may or may not complicate matters a bit.

    But as far as software / firmware goes, yeah the trust zone is the weak link. This whole article talks about how we were all mistaken in one assumption. The common wisdom would say that the trust zone should wipe the data encryption key if told to reflash itself. But it doesn't. Normal operation wouldn't be affected, because the flash rewriter could feed the data encryption key back to it.


  46. Uriel-238 (profile), 19 Feb 2016 @ 5:55pm

    Re: Re: Re: Re: Emulators vs. Simulators.

    Thanks for turning me on to the whole TPM thing. Reading up on it now.

    The process it uses to protect other data makes sense. But yeah, it makes me curious about the tamper resistance tech.

    I learn something new every day.


  47. Anonymous Coward, 22 Feb 2016 @ 4:21am

    Re: Strong Passwords on iOS

    The problem with fingerprint unlock is that, in the US, you can be compelled to provide your finger for the purpose of unlocking the phone. A password is protected by the First and Fifth Amendments. If your concern is privacy from government or law enforcement during an adversarial engagement, I would recommend leaving TouchID out of it.


  48. Uriel-238 (profile), 22 Feb 2016 @ 9:42am

    Fridge logic

    Also, you generally don't want your eye or extremity to suddenly become more valuable than the rest of you.


