Apple Responds To Order To Help Decrypt Phone, As More Details Come To Light
from the both-not-as-bad-and-just-as-bad dept
Update: Please see our more recent article detailing why it appears that this attack could apply to more modern iPhones as well.

Last night, we wrote about a judge's order commanding Apple to help the FBI effectively decrypt the contents of Syed Farook's iPhone 5C. Farook, of course, along with his wife, was responsible for the San Bernardino attacks a few months ago. Many of the initial reports about the order suggested that it simply ordered Apple to break the encryption -- which made many people scoff. However, as we noted, that was not accurate. Instead, it ordered something much more specific: that Apple create special firmware that would disable two distinct security features within iOS -- one that effectively wipes the encrypted contents after 10 failed attempts at entering the unlocking PIN (by throwing away the stored decryption key), and a second that progressively increases the delay between repeated attempts at entering the PIN.
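Mechanically, the two protections the order targets can be modeled as a small state machine. This is an illustrative sketch only, not Apple's code: the `PasscodeGuard` name and the delay schedule are assumptions made up for the example.

```python
# Illustrative model of the two iOS protections the order asks Apple to
# disable. The delay schedule below is an assumption, not Apple's actual
# escalation curve.
ESCALATING_DELAYS_S = [0, 0, 0, 0, 60, 300, 900, 3600, 3600]
MAX_FAILURES = 10  # with "Erase Data" enabled, iOS wipes after 10 failures


class PasscodeGuard:
    """Tracks failed PIN attempts; discards the key after too many."""

    def __init__(self):
        self.failures = 0
        self.key_destroyed = False

    def attempt(self, guess: str, correct: str) -> bool:
        if self.key_destroyed:
            raise RuntimeError("decryption key discarded; data unrecoverable")
        if guess == correct:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= MAX_FAILURES:
            # The flash contents are untouched -- but without the key they
            # are just ciphertext, which is what "wipe" means here.
            self.key_destroyed = True
        return False

    def current_delay(self) -> int:
        """Seconds the device would force you to wait before the next try.
        (A real device enforces the wait; this model only reports it.)"""
        i = min(self.failures, len(ESCALATING_DELAYS_S)) - 1
        return ESCALATING_DELAYS_S[i] if i >= 0 else 0
```

The firmware the FBI wants would, in effect, set `MAX_FAILURES` to infinity and every delay to zero, leaving the PIN open to rapid brute forcing.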
Late last night, Apple's Tim Cook also posted a very direct Message to Our Customers that highlights the importance of strong encryption and why this order is such a problem (some of which we'll discuss below).
Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

He notes -- as I did in my original post -- that the FBI is demanding (and the court has agreed to order) that Apple create a backdoor, and that this creates a number of concerns:
For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

Having spent a bunch of time overnight reading through the details of the DOJ's original application, as well as a few public discussions by security experts -- especially Robert Graham's useful thoughts on the order -- I have some further thoughts. Among them: Cook may be slightly overstating his case, because it's not actually clear that the backdoor the FBI is asking for would work on most modern iPhones, though it might work on older phones. The larger concerns about the order, however, are still very much valid.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.
In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
- One of the concerns I raised last night was probably inaccurate: that this could force Apple into creating a tool that would put many people's privacy at risk. The order does seem fairly specific to just this phone. Yes, as Cook notes, if Apple is forced to do this and Apple does it successfully, that would open up similar orders for other phones, but the impact of that may be somewhat limited in that it only applies to older phones, and quite possibly older iPhones that have not updated their operating systems.
- It does seem clear that if this were a newer iPhone, one that includes Apple's Secure Enclave, this request would likely be impossible to meet. It's quite interesting to read the details of how Apple's security now works: the Secure Enclave basically cuts off this possibility, because the firmware update itself would wipe out the encryption key, effectively making it impossible to decrypt the content. It's also possible that this order is impossible to comply with even on the older phone, if its operating system was properly updated -- in part because Apple may not be able to update the firmware without entering the passcode, which is exactly the problem this new firmware is supposed to solve.
- I keep seeing people say "why can't they just copy the contents of the memory and brute force it elsewhere" but that's not possible with the iPhone, since a part of the key comes from the hardware itself, and there doesn't appear to be any way to extract it (and Apple does not keep it).
- The whole focus seems to be on allowing the FBI to brute-force the passcode, which is in the realm of possibility should the two impediments above be removed, as opposed to brute-force cracking the AES encryption itself, which is not currently in the realm of possibility.
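These last two points can be made concrete with a short sketch. The UID value, the PBKDF2 parameters, and the ~80 ms per-guess figure below are illustrative assumptions standing in for Apple's actual hardware-bound derivation; the point is only that the passcode key is tangled with a secret that never leaves the chip, so all guessing must happen on the device itself, at the device's own speed:

```python
import hashlib

# Hypothetical per-device UID: the real one is fused into the phone's AES
# engine and is not readable by software, by Apple, or from a copied
# flash image.
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")


def derive_unlock_key(passcode: str, uid: bytes = DEVICE_UID) -> bytes:
    # Stand-in derivation: tangle the passcode with the device secret.
    # Without the UID, an attacker holding only the encrypted flash
    # contents has nothing to test passcode guesses against offline.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 100_000)


# Why on-device PIN guessing becomes feasible once the delays and the
# 10-try wipe are gone: the derivation costs on the order of 80 ms per
# guess (treat the figure as an assumption for this estimate).
PER_GUESS_S = 0.08
print(f"4-digit PIN, worst case: {10**4 * PER_GUESS_S / 60:.0f} minutes")
print(f"6-digit PIN, worst case: {10**6 * PER_GUESS_S / 3600:.1f} hours")
```

Under those assumptions, exhausting a 4-digit PIN takes around 13 minutes and a 6-digit PIN under a day -- which is why the order targets the retry protections rather than the encryption.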
That said, there are still serious concerns. While the DOJ insists that its use of the 18th Century All Writs Act in this case is pretty ordinary and standard, they're exaggerating in the extreme. Some of the previous examples they discuss do include requirements to use certain machinery in order to execute a warrant, but that's quite different from ordering Apple to write entirely new software. The DOJ again insists that there are examples of All Writs Act requests in the past that have required software, but it's notable that when the DOJ says this, it does not immediately cite any cases, but rather says just that sometimes providers have to "write code in order to gather information in response to subpoenas or other process." There's a pretty big difference between writing some scraping or search code, which this implies, and creating a special firmware as the DOJ is asking for in this case.
Also, as Cook notes, the unprecedented nature of this is that it's not at all similar to previous cases, because this would involve actively undermining the security of devices, rather than just helping to gather information that is readily available.
In fact, even more ridiculous is the idea, as laid out in the DOJ's application for this order, that this will not be burdensome to Apple simply because Apple writes software:
While the order in this case requires Apple to provide modified software, modifying an operating system - writing software code - is not an unreasonable burden for a company that writes software code as part of its regular business.

Uh, yeah, it can be when what you're asking for is fairly complex and may not even be possible depending on the specifics of the way the security in the iPhone 5C is designed. And, seriously, just saying "Apple writes software, therefore any request for it to write software is not burdensome" is ridiculous on its face.
There's a separate question, raised by people such as Chris Soghoian, about whether or not this particular use of the All Writs Act to force Apple to use its code signing keys to "sign" this new firmware violates the First Amendment in compelling speech. It will be interesting to see if Apple raises this issue in its inevitable appeal of the order.
In the end, this is both a big deal and potentially not a big deal. It's a big deal in that after a few previous attempts to use the All Writs Act to force Apple to "decrypt" content on a phone, a court has not only done so, but done so with fairly specific instructions on what Apple has to do to create a very specific bit of software that removes a couple security features. That raises a bunch of legal questions, that I would imagine Apple will quickly raise in response as well. However, from a technological standpoint, it appears that many of these questions will soon be moot, so long as people have more modern and updated phones. But the bigger concern, as Cook notes, is the precedent here that a court can order, at the behest of the FBI, that a tech company undermine the security of a device. As he notes, once you start down that slippery slope it's not hard to see where that can lead:
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

This particular legal battle is going to get very, very interesting.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: all writs act, backdoors, decryption, doj, encryption, fbi, going dark, iphone, syed farook, tim cook
Companies: apple
Reader Comments
Not just a promise, a /pinky/ promise
If the code can be used to unlock one device, it can be used to do the same to any other device with similar security. And given how utterly pathetic government computer security seems to be, I wouldn't give it so much as a month, max, before someone else got their hands on the code and started using it.
No Stone Unturned?
The DOJ and FBI ignored possible critical evidence in the shooters' apartment when they released it to the landlord just *two* days after the shooting.
NYPD Detective Harry Houck said he was "shocked" the FBI released the apartment. Houck told Anderson Cooper on CNN: "This apartment clearly is full of evidence. I don’t see any fingerprint dust on the walls where they went in there and checked for fingerprints for other people that might have been connected to these two. You’ve got documents laying all over the place — you’ve got shredded documents that need to be taken out of there and put together to see what was shredded.” “You have passports, drivers licenses — now you have thousands of fingerprints all over inside this crime scene.” (http://newyork.cbslocal.com/2015/12/04/san-bernardino-killers-apartment-media/)
Yea, the victims' families should be angry that their loved ones are being used as an excuse to undermine civil liberties.
NYT article: http://www.nytimes.com/2016/02/18/technology/apple-timothy-cook-fbi-san-bernardino.html
Re: No Stone Unturned?
Just the Families?
I think it is fair to say WE ALL should be a bit miffed about this whole problem. Terrorism has become the wet dream of our TYRANT leaders and 3 letter agencies!
Re: Not just a promise, a /pinky/ promise
Tell the FBI to fuck off, and stop being lazy.
There is another more problematic issue here
This is essentially tyranny, abuse of power, and recklessness to the point that the judge should be considered to have willfully sullied the bench on which he sits, and should be removed, disbarred, and criminally charged! This is no different than a court telling a safe manufacturer that it must use a specific set of locks that accept a key, which it must then hand over to the court to use on said safe.
There are so many things wrong with this "Order" that it is despicable! The fact that this type of stuff will continue is why those idiots in the federal land cases are going to gain traction. All of the government's branches are beginning to lean into a full assault on "The People" and their "Liberties". It's like the US Government wants another Civil War.
Re: There is another more problematic issue here
Actually that issue is not raised by this order.
The reason is simple.
If it is possible for Apple to comply with this order in respect of the particular device - then it is also possible for someone else to do it - in other words this particular cat is already out of the bag.
On the other hand if Apple cannot do it then they cannot comply.
What would be problematic would be for the government to order Apple to make sure that its future phones are crackable - in which case the issue that you raise would be valid - and very troubling - however it doesn't seem that that is what is being requested here.
Re: Re: There is another more problematic issue here
How is that even legal?
Sure the government could hire a locksmith (developer/cryptographer) and pay them to break through the security.
But to demand that someone perform the work for them, even with compensation, just seems like something that should not be legal in a free society.
Re: Re: There is another more problematic issue here
No, because the program has to be digitally signed by Apple in order to run on the phone. Unless they give away their digital signature key (which would pretty much render any security on their phones useless, because anyone could just update the phone with whatever code they wanted) they are, in fact, the only ones who can do this.
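The commenter's point about signing can be modeled in a few lines. Real iOS firmware is verified with asymmetric signatures chained to Apple's root certificate; this sketch substitutes an HMAC with a secret only the vendor holds, which captures the relevant property (nobody without Apple's key can produce firmware the boot chain will accept) but not the full public-key mechanics. The key and function names are invented for the example.

```python
import hashlib
import hmac

# Hypothetical stand-in for Apple's signing key. In reality this is an
# asymmetric private key; the device holds only the public half.
VENDOR_SIGNING_KEY = b"known-only-to-the-vendor"


def sign_firmware(image: bytes) -> bytes:
    """What only the vendor can do: produce a valid signature."""
    return hmac.new(VENDOR_SIGNING_KEY, image, hashlib.sha256).digest()


def boot_rom_accepts(image: bytes, signature: bytes) -> bool:
    """What the device does at boot: refuse unsigned or tampered images."""
    expected = hmac.new(VENDOR_SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)
```

Even if the FBI wrote the modified firmware itself, `boot_rom_accepts` would reject it without a signature only Apple can generate, which is why the order runs through Apple rather than around it.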
Judge to Apple...
A: Even if you are fast enough to try a thousand passwords a second the hardware probably couldn't withstand you pushing so many buttons in so little time. Chances are you'll physically break something.
B: Any half decent security system will introduce a delay between successive password attempts after you failed to enter the correct password three times or so.
C: Any half decent security system will then proceed to disable you from attempting any more passwords and alert the administrator of your many failed attempts.
But those science fiction movies never consider these things.
Re: Re:
This kind of incompetence is not reserved to government.
AT&T for example, had a web site that would show you your information. If you increased or decreased a number in the address bar by typing over it, you could see another customer's information. In fact, you could easily get information for hundreds or thousands of customers.
If you were to tell anyone of this bad security -- then YOU ARE THE CRIMINAL!!!
Re:
Secondly, this isn't about using it to brute-force one iPhone; it's that, if this is forced to happen, the government could use the same legal tools to force developers to include backdoors in all their software "in the name of national security", and it would have a precedent to force that through.
Re: Re: Re: Re:
A vulnerability is a security weakness that wasn't necessarily put there intentionally.
A backdoor is a vulnerability that was intentionally put there prior to deployment so that it can later be exploited.
The government is not asking Apple to create a backdoor. They are either asking Apple to exploit an existing vulnerability or to exploit an existing backdoor (an already-existing way for Apple to weaken a security feature). Apple can't create a backdoor after deployment; it can only exploit an existing vulnerability or backdoor.
Re: Re: Re: Re: Re:
If they replace the firmware and/or OS with a new version, isn't that creating a new backdoor? You could say that the fact that it's possible to do is an existing vulnerability, but I don't see how that makes this any less a new backdoor installed on the phone after release.
Re: Re: Re: Re: Re: Re:
I think you place too much faith in their marketing.
Re: Re: Re: Re: Re: Re:
and that point can be argued separately. But to claim that this is bad because it's a backdoor distracts from the other points that could be argued and ends up just discrediting the source that's conflating a back door with something that's not a backdoor. A backdoor has a specific meaning and we shouldn't misuse it just because of headlines.
While I'm not a fan of Apple's walled garden approach to their products, I have huge respect for them taking a stance for private security and it's certainly something I'll keep in mind when I'm eventually shopping for a new phone.
It all starts with just one...
I don't fancy Apple that much, but I value the stand they take here immensely, because if they give in, the escalating effect is sure to come to the detriment of us all.
Re: It all starts with just one...
You are failing to see the big picture here.
Piracy!!!
Re: It all starts with just one...
I am inclined to believe that it is already too late. The escalating effect is sure to come to the detriment of us all, as you say, whether Apple wins or loses this one. It will come up again. This battle over the Clipper chip and mandatory weak encryption with a government key already happened in the '90s. Now the government wants more. Much more.
Sorry to sound like a pessimist. But I think I am actually a realist.
This request might be the start of something
On the other hand, if Apple insists they have to have the device in-shop to do the install, the courts will insist they keep on doing this until hell freezes over. "You did it once, it can't be burdensome to do it again."
On the gripping hand, if there's a flaw with the implementation of the fix that causes the device to be wiped anyway, well, "oops, these things happen in software development. Sorry." And "no, we HAD to test it on the target device and none other." And maybe, "so get YOUR software development experts to prove this was deliberate, if you feel so strongly about it. We'll see you in court. Again."
UB Trippin', DoJ
Like it wouldn't be an unreasonable burden for the DoJ to operate within the law since that's part of its regular business? Oh, right.
Re: Re: UB Trippin', DoJ
That's what he meant by "Oh, right".
Re:
'We want you to fork your entire OS to change an underlying security feature, run it through all your normal processes, force the phone to update to it, and then dump the fork.'
'Also, we're not going to pay you for this.'
'Also, once we've done this once, we'll come back again and again, and eventually try to force you to make this a standard feature in your OS. After all, won't that be easier?'
Re: Re:
Can't Apple request compensation?
Re:
For the FBI to brute force the phone, Apple has to give it to them with the compromised firmware in place and once it's out of Apple's control, all bets are off.
Re:
But even if, then this simply becomes the camel's nose under the tent. Or the foot in the door.
All law enforcement agencies will now want a revolving door into Apple for an endless stream of 'break into this iPhone' demands.
Of course, with the newest phone hardware this is simply impossible. What makes this case more interesting is that this is an older iPhone.
Re: Re:
I hope Apple sticks to their guns, but I think in the end Apple will lose. If the FBI can't force Apple to comply, new telecom rules will come into place that bans wireless operators from activating devices that can't be decrypted by law enforcement. Telecom companies are notoriously friendly to the feds.
Re: Re: Re:
Forcing US companies to make insecure products makes the other 96% of the world's population aware that US products are mandated to be insecure by design.
Two sides to this for Apple
As for brute forcing:
I would think they could copy all the data off the existing phone, encrypted though it may be, just in case they "accidentally" cause a data wipe; whether they could copy that (encrypted) data back may be an issue, but it would seem to be a way to start.
Re: Two sides to this for Apple
What the FBI wants here is to get hold of the key. But wouldn't it be easier to simply torture the defendant? It seems the court could simply order that; it is more likely to be effective, and would cost significantly less. And it is in accordance with the values of the Untied Police States of America.
Re: Re: Two sides to this for Apple
[ link to this | view in chronology ]
Re: Re: Re: Two sides to this for Apple
[ link to this | view in chronology ]
Re: Re: Re: Re: Two sides to this for Apple
I'm sure that that will work - I've seen it on TV cop shows - that "Medium" program!
Re: Re: Re: Two sides to this for Apple
That varies by jurisdiction, it hasn't been settled by the Supreme Court.
Re: Two sides to this for Apple
Because that wouldn't do them any good. They are currently trying to brute force the phone's access key, which requires the phone hardware. If all they had was the encrypted data, they would have to directly brute force the encryption, which isn't feasible in any useful time frame. I don't know if it's hundreds of years, or millions, or billions, but way too long. That's my understanding anyway.
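That understanding is right, and a back-of-envelope calculation shows the scale. The guess rate below is a deliberately generous assumption, far beyond any real hardware:

```python
# Even granting an attacker a trillion key guesses per second, searching
# a 128-bit AES keyspace takes astronomically long. (iPhone file keys
# are 256-bit, which is vastly harder still.)
GUESSES_PER_SECOND = 1e12
SECONDS_PER_YEAR = 365 * 24 * 3600

keyspace = 2 ** 128
years = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR
print(f"{years:.1e} years")  # on the order of 10**19 years
```

So it is neither hundreds nor millions of years but roughly ten billion billion, which is why the FBI needs the phone hardware and the passcode path rather than the ciphertext alone.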
Unduly Burdensome
Will Apple be compensated for the opportunity cost of lost time to market? What if Apple were to miss a major market opportunity because they have to divert significant resources into a development effort? Would Apple be compensated for that?
Can the court require the FBI to post a several billion dollar bond to cover this possibility?
Or maybe the FBI can use their own resources to recruit and employ the developers who would do this work? (Oh, yeah, a government run software development project. Those always work out well.)
Is there some standard dollar amount for how high a barrier Apple should be required to jump over? What if Apple designs a device where the cost to break into it is absurdly high? Why should Apple be considered more capable of breaking it than our good fiends at the NSA? [sic]
In this present case, is the cost so absurdly high that requiring Apple to comply with this order without compensation is unreasonable?
The ultimate question: can the government require that nobody is allowed to build devices that are secure enough that it is infeasible to break into them? Or should the world be aware that US devices are mandated to be insecure by design?
Re: Unduly Burdensome
http://arstechnica.com/tech-policy/2016/02/kids-forget-console-gaming-play-the-fbis-browser-based-game-instead/
Re: Unduly Burdensome
According to the order, "Apple shall advise the government of the reasonable cost of providing this service." So I would guess that means the government (and therefore the taxpayers) are paying.
According to the order, "To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the Order." So if they stood to lose a billion dollars from a missed market opportunity, they could inform the court and get this order delayed or something.
The FBI cannot create this program - even if they had the knowledge allowing them to create it, they cannot provide the signature that would allow it to run, since only Apple has that. And Apple, given the choice, would much rather write this themselves than blindly sign something that the FBI gives them. There's a huge difference in security risk.
Some government officials do think that backdoors should be mandated, but that's not currently the law.
risks and penalties
Now, let's consider just who is supposed to have custody of said backdoor. We're now learning how our past four secretaries of state mishandled critical state secrets. This isn't a partisan issue, and it's more than just one person. Powell, Rice, Clinton, and Kerry seem to have issues with keeping critical information to appropriately secure communications pathways. Those guys are at the top of the security food chain, and they can't stick to procedure.
Such backdoors should never happen, and even pressuring for such a security weakness should be considered providing material support when that backdoor is eventually misused. A government that cannot protect the secrets that it has certainly can't be trusted to protect secrets when we weaken security even further.
Let's blame Sheri Pym
Need we say that the Riverside branch is one of the most corrupt and abusive branches in one of the most corrupt and abusive federal jurisdictions?
even scarier
Re: even scarier
You prefer people who disagree with you should not be allowed to vote?
Re: Re: even scarier
At the point any or all of the 3 are implemented, I would thus have to make a decision as to whether I wished to support the Government that implemented them any longer. Support with my votes, with my taxes, or with my consent that this Government represented me.
They are:
- Banning of guns
- Banning of physical cash
- Banning encryption
It sure appears that several of these are soon to be on the table to be banned. And when one goes, the next...and all 3...are likely soon to follow.
If my nightmare scenario comes true, there may be a point at which all those dildos and lube sent to the Malheur Mormon Crazies are no longer so damn humorous.
They may have been (in my opinion) the wrong people, at the wrong time, in the wrong place, but at their heart...all they are saying is that the Federal Government is overreaching.
Reddit bathroom humor and derision aside...is there not the slightest grain of truth to this fear?
Once useful software exists it doesn't ever go away.
It's All About the Attitude Here
And since enhanced interrogation seems to be OK in the U.S., I'm sure (not really, but some think it works) they could have extracted the password in time.
Some will say the terrorists were heavily armed and maybe had explosives with them, and so force begets force. But if you think more tactically, and with a bit of logic, the SUV could easily have been trapped, and the police could have hidden inside their armored vehicles and waited the terrorists out, if they had thought about this before they "Bonnie and Clyde"-ed the crap out of them.
And maybe they would have martyred themselves anyway, but at least try to use non-lethal means in a situation. What is the real cost of a little bit of time and a little bit of non-lethal persuasion?
Re: It's All About the Attitude Here
Having to prove a case in court.
PR and Propaganda circle jerk.
Here's a much better press release:
http://www.americansfortaxfairness.org/badapple/
I wonder how many favours a company has to do to get away with shit like that....
Apple doesn't need to protect customers- they need customers to believe they're trying to protect them; and they need plausible deniability when it's shown they didn't.
Re: PR and Propaganda circle jerk.
Well go ahead, tell us what's wrong with it.
History Repeating Itself
Can anyone here remember the clipper chip?
Need I say more?
What scares me is their mentality: they see absolutely nothing wrong with this... so I ask myself, what are these people going to turn our world into?
THAT's what's scary! A million times more so than the distractions they pile on us... THIS, going down this road, has the potential to be SOOOO much worse than anything we face today.
Absolute power, unrestrained, with no meaningful safeguards... and even then, they should only be granted, say, 0.1% of everything they're asking... as some of the shit they're asking to have the ability to do should be straight up denied... ON INSTINCT
[ link to this | view in chronology ]
With an observation like
"DOJ again insists that there are examples of All Writs Act requests in the past that have required software, but it's notable that when the DOJ says this, it does not immediately cite any cases"
I immediately think
...because they're lying, or stretching a "truth"
...or they're afraid of opening another can of secret worms
[ link to this | view in chronology ]
Usually works
[ link to this | view in chronology ]
Moreover, if Apple can do it easily (say a supplier does keep a list of keys, or Apple has in fact kept a list or an algo which can be re-run to obtain the key), then they would take a seriously huge hit.
See, if the truth is that Apple cannot do it, they would just comply with the court order by attempting to do it, prove that they cannot, and it would end at some point (say a year or two from now) when Apple shows that brute-forcing would take years of machine time because there is no other choice.
The choice to fight the court order may be philosophical, may be part of their belief system, but the vigor and aggressiveness of the response makes me think that they already know the answer, and that they don't want you to know it.
[ link to this | view in chronology ]
Re:
I don't think anyone, including Apple, is arguing that it cannot be done. Only that it should not be done.
[ link to this | view in chronology ]
Re:
Why would Apple spend a year trying to do something they know they can't do, rather than tell the court now that it can't be done? That would be insane. And the order itself tells Apple that it should tell the court within 5 days if compliance is not reasonable. Why shouldn't they just do that?
Even if Apple did secretly have the key or a way to regenerate it, this order doesn't even ask for that. The order does not direct them to attempt to unlock the phone or provide the key; it orders them to give the government a program to do specific things to make it easier for the government to brute-force the passcode.
"Apple's reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware."
[ link to this | view in chronology ]
Re: Re:
[ link to this | view in chronology ]
Re: Re:
The key of course is that this would render Apple's encryption scheme moot for law enforcement, as having the physical device (and an applied patch) would allow any device to be viewed within a reasonable period of time.
Apple may find itself fighting a losing battle on this one.
[ link to this | view in chronology ]
[ link to this | view in chronology ]
[ link to this | view in chronology ]
Re:
Do you have any references indicating that this is possible?
I doubt that Apple would totally change the security architecture of their devices between releases.
ARM (which Apple uses) substantially changed the security architecture of their chipset, and Apple took advantage of that new architecture.
https://www.quora.com/What-is-Apple%E2%80%99s-new-Secure-Enclave-and-why-is-it-important
I'm sure there are plenty of other references on Secure Enclave if you're interested.
[ link to this | view in chronology ]
They don't need a firmware change
and yes, 200 not 200k.
[ link to this | view in chronology ]
Re: They don't need a firmware change
What is a $uk?
Interesting though, I wonder why they went to Apple rather than using this hack.
[ link to this | view in chronology ]
Re: Re: They don't need a firmware change
[ link to this | view in chronology ]
Re: Re: They don't need a firmware change
[ link to this | view in chronology ]
Re: Re: Re: They don't need a firmware change
[ link to this | view in chronology ]
[ link to this | view in chronology ]
What if Apple fails?
[ link to this | view in chronology ]
"Regular business"
"While the order in this case requires Apple to provide a mathematical proof that P=NP, creating a mathematical proof -- doing maths -- is not an unreasonable burden for a company that does maths as part of its regular business."
[ link to this | view in chronology ]
Bad mistake by the FBI
[ link to this | view in chronology ]
Fools
turn on the phone (no password), and see what it is communicating with (MITM attack).
this way, we, the people, can truly see who our friends are (right now, apparently Apple)
[ link to this | view in chronology ]
New article..
[ link to this | view in chronology ]