Alternate Titles: Apple Now Looking To Close The Backdoor The FBI Discovered
from the real-talk dept
Yesterday the NY Times put out a story claiming that Apple Is Said to Be Working on an iPhone Even It Can't Hack, the underlying thrust being that this is a response to the big DOJ case against it, in which the court has ordered Apple to undermine key security features, which would then enable the FBI to brute force the (almost certainly weak) passcode used by Syed Farook on his work iPhone. But here's the thing: prior to that order and its details coming to light, many people were under the impression that existing iPhones were already phones Apple "couldn't hack." After all, Apple was offering full disk encryption tied to the device, where it didn't hold the key.

And thus, a key reality of this debate is that Apple already had a bit of a backdoor in its devices: it could update the code on the device, without that update wiping the key, and the updated operating system could, theoretically, remove the security protections that made the iPhone's security workable. It's just that the FBI found the backdoor.
So, really, it appears that what Apple is doing is what a few of us asked about as the details became clear: why can't Apple build a phone that works the way many people assumed it worked prior to this court order, and not allow such a software update to take effect without first being approved by the end user?
In other words, this is just Apple closing the backdoor that the FBI revealed. Nice work, FBI: thanks for disclosing this vulnerability.
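To put rough numbers on just how weak that passcode protection becomes once the OS-level safeguards (the escalating retry delays and the optional ten-attempt wipe) are stripped out: Apple has said the passcode is entangled with a device-specific hardware key, calibrated so that each guess takes roughly 80 milliseconds. The sketch below is purely illustrative, and the passcode lengths are assumptions rather than details from this case.

    # Rough, illustrative math: brute-forcing a numeric passcode when the only
    # remaining cost is the ~80 ms hardware-entangled key derivation per guess.
    # Figures are assumptions for illustration, not details of the actual case.

    DERIVATION_SECONDS = 0.080  # approximate per-guess cost Apple has cited

    def worst_case_seconds(digits):
        """Time to try every passcode of the given length, in seconds."""
        return (10 ** digits) * DERIVATION_SECONDS

    for digits in (4, 6):
        total = worst_case_seconds(digits)
        print("%d-digit passcode: %s guesses, worst case ~%.1f hours"
              % (digits, format(10 ** digits, ","), total / 3600))

    # Output:
    # 4-digit passcode: 10,000 guesses, worst case ~0.2 hours
    # 6-digit passcode: 1,000,000 guesses, worst case ~22.2 hours

With the retry limits gone, a four-digit passcode falls in minutes; it is the OS-enforced protections, not the passcode itself, doing the real work.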
Of course, as this so-called "arms race" continues, the surveillance state apologists are coming out of the woodwork to insist that the law must stop what the technology allows:
“We are in for an arms race unless and until Congress decides to clarify who has what obligations in situations like this,” said Benjamin Wittes, a senior fellow at the Brookings Institution.

Or, Congress can leave things as they are, and Apple and others can continue to better protect the security of all of us. That seems like a good idea.
Filed Under: backdoor, doj, encryption, fbi, iphones, security
Companies: apple
Reader Comments
And by disclosing the vulnerability, the FBI cut its own throat with regard to ever using that vulnerability again. Way to go. If you were a spy and found a way to spy without being detected, then by revealing how you spy you lose that method of undetected spying forever, because the hole will be plugged as soon as it's revealed.
Re: Re:
* Firmware must be approved by the user.
* That approval can only be given by first unlocking the phone.
* Firmware updates to the phone OS should NOT be able to compromise security.
* Firmware updates to the security apparatus should require destruction of all keys -- effectively wiping the phone.
* Make sure customers are fully aware: if you lose your password you have lost all your family photos forever. Make backups of things like this.
That would still allow you to test new security apparatus updates in your labs as many times as you want. But you would want the security apparatus firmware to be effectively 'baked in', or updating it must at a minimum require that you back up everything first and be willing to completely wipe the phone. A rough sketch of that gate logic follows.
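Something like the following, where all the names are hypothetical and this is only a sketch of the policy being described, not anything Apple actually ships:

    # Hypothetical sketch of the update gate described above. All names are
    # made up for illustration; this is not Apple's actual update mechanism.

    from dataclasses import dataclass

    @dataclass
    class FirmwareUpdate:
        vendor_signature_valid: bool   # signed by the vendor's release key
        touches_security_module: bool  # modifies passcode/key-handling firmware

    def apply_update(update, device_unlocked, user_approved, wipe_keys):
        # A valid vendor signature alone is not enough under this policy.
        if not update.vendor_signature_valid:
            return False
        # The owner must unlock the phone and explicitly approve the update,
        # so a signed image pushed by anyone else simply never installs.
        if not (device_unlocked and user_approved):
            return False
        # Updates touching the security apparatus destroy all keys first,
        # so they can never be used to sneak past an existing passcode.
        if update.touches_security_module:
            wipe_keys()
        return True

    # Example: a signed image pushed without the owner's approval is rejected.
    assert not apply_update(FirmwareUpdate(True, False),
                            device_unlocked=False, user_approved=False,
                            wipe_keys=lambda: None)

Under that policy, the worst a forced update could do to a locked phone is wipe it, which is exactly the point.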
Re:
Microsoft has a wide open backdoor there.
Just when I thought I could have a reason to like Apple
So, yeah, I guess no surprises here after all.
Re: Just when I thought I could have a reason to like Apple
Interesting how over time Microsoft has become more like Apple in that sense.
Re: Re: Just when I thought I could have a reason to like Apple
Thank goodness Microsoft no longer has time to copy Apple.
They are too busy copying Google.
Utterly disgraceful
Can I scream
I am one of the ACs that was asking why Apple was not already telling the FBI that they could not hack their shit, and if they could hack it then their encryption was already fucking bunk!
The real news story all along should have been this headline...
"FBI is asking Apple to hack a phone and Apple's response WAS NOT 'we do not have the ability to decrypt our phones, sorry'"
Now that we've got that out of the way... the judicial overreach should still be getting a very hefty slapdown for its tyrannical approach to so-called 'Justice'.
For me, it looks much more like something Apple came to realize at some point in the past: the paradox of making the encryption "harder" while relying on a user-supplied PIN code as the access method. No matter how big the wall, how strong the lockbox, if you secure it with a 29-cent lock, that is the weakest point.
You are the weakest link, goodbye.
I also think Apple may have been keeping this one in their back pocket as a way of dealing with issues that might come up in a country like China. You can bet pretty solidly that the Chinese government made it abundantly clear to Apple that their position in the marketplace comes with great responsibility - read into that what you like, I guess.
Tim Cook's massive aggro approach in the last week - the beyond-full-court press of horror stories and the end-of-personal-privacy narrative - seems to be there mostly to cover things up and stop us from paying attention to the basic concept that Apple's encryption ain't all that good or all that secure. Don't look at the issue; let's just scare you with this stuff over here instead.
Mike, it would be great if Techdirt took as critical a look at Apple as you have at the FBI and its approach. Apple doesn't come off lily-white innocent in all of this either.
Re:
I like where you're going with this, but I'm not so sure this would be the best issue to beat them over the head with. Apple very well may have decided not to patch this particular vulnerability because they were the only ones capable of exploiting it. I would suggest that Apple had a reasonable expectation that the government would never require them to create software that could exploit a vulnerability in their own products. So it was probably a known vulnerability, but deemed not worth the effort to fix. So, evil? No. Lazy? Maybe so.
Re:
Right on cue. Here come "the surveillance state apologists".
Re:
But it is secure enough that the FBI can't break into it, unless Apple helps them. Unless you're not paying attention to that little tidbit of information...
Re: Re:
I think in part it's a question of what all would have to be hacked to make it happen. Not only would the FBI have to develop the patch, they would also have to hack the updating process, which in itself is a whole lot of work. It is way more expedient and way safer to get Apple to do the work.
I also think the FBI thinks that Apple already has such a patch made for countries like China. So it's not exactly that it would take long to create.
Re: Re: Re:
So like I said in another post - the FBI's just lazy?
And you think because the FBI's lazy, they should be able to compel a company to do their work for them?
Re:
So lemme make sure I understand your "logic:"
1. Apple having a product that you feel is insecure makes Apple a villain.
2. The fact that the FBI can't break into it without Apple's help also makes Apple the villain.
3. The FBI, in demanding that Apple develop something that weakens the security further is just fine, despite the 1st point.
How do you reconcile such a stupid viewpoint in that head of yours?
Re: Re:
1 - Apple has a product that is supposed to be super secure, but they have a fairly obvious attack vector open that they have likely known about for a long time
2 - Apple's response initially wasn't "oh, a hole, let's fix it" but instead "What is at stake here is can the government compel Apple to write software that we believe would make hundreds of millions of customers vulnerable around the world, including the US". Rather than worrying about fixing the hole and issuing a patch, they seem more concerned with protecting the hole.
3 - Nobody is asking Apple to write code and release it to all of the public's phones. Apple tightly controls the update process through intense levels of security (see the story about a week ago about replacement parts on an Apple phone). Nobody is asking Apple to break security for millions of Americans.
The level of arm waving and histrionics coming from Apple make me think they have something to hide, that's all.
Re: Re: Re:
2. See number 1.
3. If they open the door, the floodgates will open. They will get thousands of requests from all over the U.S., not counting other countries. There are literally thousands of iPhones sitting in evidence rooms waiting for this very thing. They are already lining up to take advantage of this. This could cost Apple billions in PR losses. Even the FBI admits that this case could set a legal precedent. Once this hacked OS gets out into the wild, there is no telling what damage it could/will do to Apple and the public. Apple damn well better stand its ground. No one will ever trust that company again if they do this.
Why exactly do you want to see Apple destroyed so badly? Is the data on that phone really worth putting millions of people at risk?
http://abcnews.go.com/Technology/york-da-access-175-iphones-criminal-cases-due/story?id=37029693
http://www.theguardian.com/technology/2016/feb/25/fbi-director-james-comey-apple-encryption-case-legal-precedent
Re: Re: Re: Re:
Because iTunes caused the labels to start to lose control over music distribution. (Whatever always supports the labels' point of view while crying "think of the artists".)
I don't want Congress to decide...
I want the people to decide.
Re: Re: I don't want Congress to decide...
Congress has been largely ducking their responsibilities until manufactured public outrage by the press forces their hand.
Re: I don't want Congress to decide...
Something about being secure in one's property, papers, effects. Search requiring a warrant. A warrant requiring actual suspicion.
The US is becoming the very thing its founders were trying to protect us from.
It's just that the FBI found the backdoor.
Now, who has the ability to change that firmware? It's not just Apple. Apple uses the standard method the chipmaker designed, through the standard authorization channels that the carrier and protocols enable.
Castles on foundations of sand.
Dig deeper.
Re: It's just that the FBI found the backdoor.
It's a house of cards, black-box security setup, based on "trust us".
Re: Re: It's just that the FBI found the backdoor.
Like I had mentioned in a past article... the ONLY reason Apple was fighting is because they had something to hide from the public. Were they able to give the FBI what they wanted and keep it under the rug...??? Well... Apple would have sucked FBI dick so fast they would have thrown their neck out.
Let's never pretend that any business has the interest of the public or its customers in mind... therefore we, as a public, must make it clear that the public interest also matches business interest. Sadly, there is fat little chance of that, because the public itself is a fucking tool ripe for abuse and ignorance!
Under Seal
Actually, I think we need to consider the judge-shopping that the FBI did to get a judge who would approve the order.
speaking of samsung...
Re: apple can impose baseband standards.
They could segregate the system from the baseband, to limit what control and access it has over the phone. As far as I'm aware they have not done that.
As far as I'm aware (and I have done a fair amount of research into this - admittedly mostly on Android-based systems) NO phone is currently available that segregates the baseband co-processor. The baseband has full access to every aspect of the phone, and the main CPU/system/OS has no oversight or control of this.
The FBI will also ask Apple to sign a firmware image, or perhaps just an app that runs on an iPhone in the background, to either capture the user's passcode and send it to the government or, if Apple builds a secure path from the user's fingers to the passcode/key-hash facility, to brute-force the passcode (using the phone's own CPU and battery life) and then send it to the FBI and NSA, plus all the other governments and well-funded crooks in the world.
Re: Brute Force
Possibly we can secure our border, where 2.5 million have crossed in the last 7 years (http://www.washingtontimes.com/news/2015/jul/20/number-of-illegals-levels-off-fewer-crossing-mexic/?page=all), to secure this country before assaulting the Fourth Amendment, encryption technology, and American ingenuity in the name of figuring out what's on the phone of two people who never should have passed screening to get into the USA in the first place. What happened in San Bernardino is being used as hype to get backdoors.
Re:
You are falling for the narrative, not reality. There is nothing in the law that would allow them to do that. You would have to get past all sorts of issues, such as CALEA (as Mike has pointed out) as well as the need for a warrant, etc.
Don't fall for the scare tactics being used by Apple. Pay attention to the whole story and you will quickly realize Apple has something to hide. They want you to look away while they deal with it.
Re: Re: Re: Re: Re: Re: It's just that the FBI found the backdoor.
It's not a fabrication, at least not entirely... it's just not exactly what it purports to be about. The (known by both parties) pretext about backdoors is classified information, so the public argument can't be about what it's really about...
Everyone agrees the "debate" needs to happen, and this is the best way they can do that. Debate the structural integrity of the building's foundation, ignore the classified sinkhole below, and at the end of the day we're all still sort of discussing whether the building is going to collapse... and who will bear responsibility if it does. IMO this is likely what Apple is more concerned with.
The industry standard is that big corp quietly cowers to big gov and enables surveillance and data collection. Good for the goose, good for the gander; back scratches and a grotesquely suspicious lack of antitrust litigation, consumer protection, and tax enforcement all around.
Win10 is only distinguishable from Malware/Trojan/Spyware by fallacious categorical error, and OEM's Android is really not much better for non-technical users.
Apple is an outlier with their privacy stance - at the very least, in their marketing; I would hope more, but I wouldn't place much faith in that notion at all. A legally informed read of their EULA would likely reveal the standard loophole-riddled Swiss cheese that amounts to "you have no rights or recourse, and we can do whatever we want".
A proprietary, closed-source walled garden with "trust us" security being the last bastion of hope for security/privacy - that shouldn't sit right with anyone.
If they really cared about privacy/security they'd segregate the baseband and go open source - then I'd sing their praises from the treetops after making a substantial investment in company stock.
Re: Re:
That's a good question... The better one would be why you'd assume we have lousy spies. We have the best spies in the world, and they have more and better capabilities than at any time in history.
Apple had over 650 vulnerabilities in 2015 alone (incidentally, worse than any other vendor).
I'd say there's not even a snowball's chance in hell they don't have that key.
Re: the things we thought were secure.
If your intent is to covertly collect valuable intel, your target's belief in their privacy/security is paramount. If they don't have faith in the security/privacy of the device/method, they're not going to trust it with any of their secrets. It's not outlandish to consider that this fight may have been picked to lose: lose the battle, win the war.