Apple Might Be Forced To Reveal & Share iPhone Unlocking Code Widely
from the not-so-easy dept
Among the many questions swirling around the challenge to U.S. Magistrate Judge Sheri Pym's Order that Apple create software to bypass the iPhone passcode screen, a matter of paramount public interest may have been overlooked. Even if the government prevails in compelling Apple to bypass these iPhone security features: (A) evidence obtained in this way for use in a criminal trial will be challenged under the Daubert standard (described below), and may be held inadmissible at trial; and (B) the Daubert challenge may require disclosure of Apple's iPhone unlocking software to a number of third parties who would need access to it in order to mount that challenge, and who may not secure the new software adequately. To say that neither consequence would be in the public interest would be an understatement in the extreme.

The Daubert challenge would arise because any proffered evidence from the subject iPhone would have been obtained through a methodology, utilizing software, that had never before been used to obtain evidence in a criminal trial. The Supreme Court, in Daubert v. Merrell Dow Pharmaceuticals, Inc., held that new methodologies from which proffered evidence is derived must, when challenged, be substantiated by expert scientific testimony in order to be admissible. In Daubert, the Court stated the criteria that must be applied when faced with a defense challenge to scientific testimony and evidence:
- Can the methodology used to reach the expert's conclusion (the new software here) be tested and verified?
- Have the methodology and software been peer-reviewed and has the review been published in a peer-reviewed journal?
- Do the techniques used to reach the conclusion (here, to obtain the evidence) have an ascertainable error rate?
- Has the methodology used to generate the conclusion (the evidence) been generally accepted by the relevant scientific community?
Beyond those threshold criteria, the proponent of the evidence would likely need experts who could:
- establish the integrity of the data (and its reliability) throughout the chain of custody;
- explain whether any person or software could modify the data coming off of the phone;
- verify that the data that came off the phone, as delivered by Apple and held by law enforcement, was the data that had originally been on the phone; and
- explain the technical measures, such as the digital signatures attached to the data, used to ensure that no tampering has occurred, and their likely error rates.
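In practice, the chain-of-custody and tamper-detection showings above are typically made with cryptographic hashes: a digest of the extracted data is recorded at acquisition, and every later copy is checked against it. A minimal sketch in Python (the data and digest here are purely illustrative, not from the case):

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

# Digest recorded at the moment of acquisition and entered into the custody log.
extracted = b"data pulled from the device"
recorded = sha256_digest(extracted)

def verify_copy(copy: bytes, recorded_digest: str) -> bool:
    """Recompute the digest of a later copy and compare to the logged value."""
    return sha256_digest(copy) == recorded_digest

print(verify_copy(extracted, recorded))          # True: unchanged copy matches
print(verify_copy(extracted + b"!", recorded))   # False: any alteration shows
```

The "error rate" of such a check is the probability of a SHA-256 collision, which is negligible in practice; that is exactly the kind of quantifiable figure a Daubert inquiry would ask an expert to supply.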
In addition, defense counsel would undoubtedly demand access to the source code for their own third-party experts, and would further demand the right to simulate the testing environment and run this code on their own systems in order to confirm the veracity of the evidence. This could easily compromise the security of the new unlocking code, as argued in the amicus brief filed with Judge Pym by Jennifer Granick and Riana Pfefferkorn of Stanford's Center for Internet and Society (also covered previously by Techdirt):
There is also a danger that the Custom Code will be lost or stolen. The more often Apple must use the forensic capability this Court is ordering it to create, the more people have to have access to it. The more people who have access to the Custom Code, the more likely it will leak. The software will be valuable to anyone eager to bypass security measures on one of the most secure smartphones on the market. The incentive to steal the Custom Code is huge. The Custom Code would be invaluable to identity thieves, blackmailers, and those engaged in corporate espionage and intellectual property theft, to name a few.

Ms. Granick and Ms. Pfefferkorn may not have contemplated demands by defense counsel to examine the software on their own systems and according to their own terms, but their logic applies with equal force to evidentiary challenges to the new code: The risk of the software becoming public increases when it is examined by multiple defense counsel and their experts, on their own systems, with varying levels of technical competency. Fundamentally, then, basic criminal trial processes, such as challenges to expert testimony and to the evidence that results from it, stand in direct tension with the public interest in the secrecy and security of the source code of the new iPhone unlocking software.
At best, none of these issues can be resolved definitively at this time, because the software to unlock the phone has not been written. But the government's demand that the court force Apple to write software that circumvents its own security protocols may be shortsighted as a matter of trial strategy, in that any evidence obtained by that software may be precluded following a Daubert inquiry. Further, the public interest may be severely compromised by a court order directing Apple to write the subject software, because the due process requirements for defense counsel and their experts to access the software and Apple's security protocols may compromise the secrecy necessary to prevent the proposed workaround from becoming available to hackers, foreign governments and others. No matter what safeguards a court orders, the security of the new software may be at considerable risk, because it is well known that no security safeguards are impregnable.
The government may be well advised to heed the adage, "Be careful what you ask for. You may just get it." Its victory in the San Bernardino proceedings may be worse than Pyrrhic. It could be dangerous.
Kenneth N. Rashbaum is a Partner at Barton, LLP in New York, where he heads the Privacy and Cybersecurity Practice. He is an Adjunct Professor of Law at Fordham University School of Law, Chair of the Disputes Division of the American Bar Association Section of International Law, Co-Chair of the ABA Section of International Law Privacy, E-Commerce and Data Security Committee and a member of the Section Council. You can follow Ken @KenRashbaum
Liberty McAteer is an Associate at Barton LLP. A former front-end web developer, he advises software developers and e-commerce organizations on data protection, cybersecurity and privacy, including preparation of security and privacy protocols and information security terms in licensing agreements, service level agreements and website terms of service. You can follow Liberty @LibertyMcAteer
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: daubert standard, doj, encryption, evidence, fbi, security
Companies: apple
Reader Comments
No reason to panic..
[ link to this | view in chronology ]
Daubert and defence counsel access are both irrelevant here.
to trigger a scientific review. That's why the four prongs
cited are an obviously poor fit for this case.
1. This software is tested/verified by the sole defendant.
2. Peer review is not, and never will be, relevant here.
3. A backdoor works or it doesn't. There can be no error rate.
4. There is no "relevant scientific community" for Apple's
proprietary trade secrets, particularly when the original
court order specified the software be destroyed upon use.
The software proposed may be new but there is no new
technological or scientific research needed to make the
software or use it and development is limited to altering
or disabling established code already in use.
Likewise, the prospect of defence counsel demanding access
is mooted by the fact that Apple is the sole defendant
for this and future All-Writs cases on iPhones.
Other defendants may wish to suppress evidence obtained by a
backdoor but they won't be able to get their hands on the
backdoor itself because the presence of the evidence proves
that the backdoor worked. Nobody would be able to argue
that a mere backdoor conjured up evidence that was not
already there to be found and they couldn't use Daubert
because it's just a backdoor and not a new scientific method.
[ link to this | view in chronology ]
Re: Daubert and defence counsel access are both irrelevant here.
Moreover, let's be clear here: When you decrypt something, you either get the data in the clear or you get garbage (you failed!). It's not as if decryption suddenly adds "I buried the body in the backyard next to the orange tree" to every document.
It seems like this argument is more of a defense lawyer trying to delay the inevitable rather than a strong legal argument. I'm not a lawyer, but even I can see this one as insanely weak and likely to be tossed.
[ link to this | view in chronology ]
As both arguments are more likely to annoy a judge than create a delay…
[ link to this | view in chronology ]
Re: Daubert and defence counsel access are both irrelevant here.
backdoor but they won't be able to get their hands on the
backdoor itself because the presence of the evidence proves
that the backdoor worked. Nobody would be able to argue
that a mere backdoor conjured up evidence that was not
already there to be found and they couldn't use Daubert
because it's just a backdoor and not a new scientific method.
At what point is it demonstrated that it was in fact just a back door and not a malicious software package that planted evidence in the device?
[ link to this | view in chronology ]
Apple experts, to retrieve the data and put it directly
into the chain of custody. Everyone present would be
required to swear to every step of the procedure they
carried out. It's standard practice which courts respect.
As all it did was unlock the phone there was no opportunity
for the backdoor to directly access the data; because that
was done by the iPhone's normal operating system using its
normal, built-in functions that nobody can suspect as being
new or unique to that one phone.
It's like getting a landlord to unlock a door for you. He
never went inside after opening the door because that's the
cops' job. Later on, all he can testify to is that he opened
the door and let the cops in at that date and time; after
that the scene is in police custody and only they can keep
track of what they do and when; thus, a chain of custody.
[ link to this | view in chronology ]
Re:
Of course, they also assume police never plant evidence or lie on the stand, and we see how well that plays out.
because that
was done by the iPhone's normal operating system using its
normal, built-in functions that nobody can suspect as being
new or unique to that one phone.
I thought this was about replacing the normal software with new software that disabled security features.
[ link to this | view in chronology ]
login screen itself. That login screen has very little
functionality because it is only designed for one task; so
any elaborate spy code to meddle with data would cause an
obvious case of bloat that any one of the people involved
could detect. You can be sure that Apple techs would not
let such shenanigans go unreported.
One tech unlocks the phone, signs off on what he did and
removes the tool, then the investigators go get the data,
every step of which is logged and signed for. The process
is rigorous.
After that it's all unchanged iOS code being used by
investigators directly because the phone and its revealed
unlock code is now in their sole custody. The custom login
screen, no longer needed, would be replaced with the
original so the iPhone could be unquestionably certified as
absolutely in its original state; thus placing its
contents securely in a properly managed chain of custody.
If the folks operate as usual, dot all of their i's and
cross all of their t's, a court will have no reason to
question the process unless something unexpected happens;
like evidence showing up early or late in the process and
conflicting with the logs already in hand. That is rare.
[ link to this | view in chronology ]
Re:
screen, no longer needed, would be replaced with the
original so the iPhone could be unquestionably certified as
absolutely in its original state
The only way the phone is in its original state is if it's factory reset. The fact that the original software was put back on the phone doesn't prove anything about what else might have happened to the data. Everything you're saying makes sense, but it seems to me (not a lawyer) that it falls short of proving the data hasn't been tampered with. But maybe the defense just basically has to take the investigators' word that they didn't screw with it. That wouldn't surprise me.
[ link to this | view in chronology ]
last had it in hand, not before the factory shipped a new phone.
A blank phone is evidence only that a phone exists! ;]
If all that's changed is the login screen, then after the
access code is obtained with it and the original login
screen is replaced, the logs of that process and the sworn
testimony of all involved prove the OS and data have not
been altered by the process at the time the data was
finally accessed and copied to FBI assets. Even then,
originals are preserved and locked away from subsequent
investigators while backups are also locked away to
preserve the chain of access.
Until the phone is unlocked it is impossible to alter the
encrypted data and after the phone is unlocked it is
impossible for anyone to have unsupervised and unlogged
access to the data.
Because chain of custody procedures are followed only one
person at a time has custody, usually supervising
and/or assisted by one or more people as he/she works.
These procedures, familiar to all officers, agents and
courts, are trusted for good reason; because it is
practically impossible to tamper with evidence without
leaving "fingerprints" and the logs show who had custody
when such "fingerprints" showed up.
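The "fingerprints" point can be made concrete. Tamper-evident logs are often built as a hash chain, where each entry is hashed together with the hash of the previous entry, so altering any earlier entry invalidates every hash after it. A toy sketch (the log entries are invented for illustration):

```python
import hashlib

def chain(entries):
    """Hash-chain log entries: each link covers the previous link's hash."""
    prev = "0" * 64  # fixed genesis value
    links = []
    for entry in entries:
        prev = hashlib.sha256((prev + entry).encode()).hexdigest()
        links.append(prev)
    return links

original = chain(["0900 phone received", "0930 passcode bypassed", "1000 data copied"])
tampered = chain(["0900 phone received", "0931 passcode bypassed", "1000 data copied"])

print(original[0] == tampered[0])  # True: entries before the edit still match
print(original[1] == tampered[1])  # False: the edited entry's hash changed...
print(original[2] == tampered[2])  # False: ...and so did every later link
```

Because each custodian records the current head of the chain when they sign off, a later alteration would have to forge every subsequent hash and every subsequent signature at once.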
The legal system has had decades of practice with these
procedures and it is very rare that someone finds new
loopholes to exploit. You may theorize that one may exist,
that someone has the means, motive and opportunity to
exploit it, and that someone also coincidentally has custom-
tailored false evidence to plant but the odds of [loophole]
+ [means] + [motive] + [opportunity] + [fake data that
fools everybody] all coming together at the same time is so
low that it tends to be impossible, especially in cases as
complex as this with so many investigators and lawyers involved.
It is those decades of history and case law which creates
trust in chain of custody. Defense attorneys in other cases,
[not this one because the criminals are dead] will often poke
and prod at the chain of custody because that is what they
are expected to do. Most of the time they only prove that
the evidence is solid.
[ link to this | view in chronology ]
Re:
If I were Apple, this would compel me to release an immediate patch to all phones removing this as a possibility. It is after all, a security hole. They would be right to make it impossible to do, because if it's possible, we will force them to do it when it suits us, and do so regardless of the long term broader implications.
[ link to this | view in chronology ]
both an unreasonable burden and an unlawful expansion of
All-Writs.
Forcing highly valuable resources to this unprecedented
task costs a lot of money each time, especially as they
will not be compelled to keep a copy for future uses. The
FBI might claim they are willing to pay the bill but they
have no idea how big it would actually be and once they do
find out they will typically resort to asking the courts to
force Apple to pay the full price because "civic duty".
Worse, forcing Apple to be State Safe-cracker for more cases
in an uncertain future will immediately devalue the entire
corporation and all of its products in the public view, and
thus on Wall Street. The immediate loss will be billions
even before the first phone is breached. If it continues
on to other phones in other cases [using this as precedent]
then those losses will become permanent and may even deepen
to the point where thousands of American and Asian jobs will
be lost forever. That is definitively an unreasonable
burden which a court ignores at Apple's and its own peril.
All-Writs was crafted for access to available documents only!
Redefining it to force "landlords" of any kind to become
safecrackers for the state is clearly beyond the text and its
intention, no matter how the FBI want to portray it as an
attempt to "keep up with the times". The courts will have
no other choice but to tell the FBI that they will have to
ask for new legislation because the courts don't have the
authority to expand law past constitutional protections
or the actual text of the legislation.
Bending a law against the constitution to fit needs is not
unusual, but actually breaking it, or changing a law to create
new authorities or powers, is legally impossible, inviting
sanctions against an offending judge.
[ link to this | view in chronology ]
Re:
Redefining it to force "landlords" of any kind to become
safecrackers for the state is clearly beyond the text and its
intention, no matter how the FBI want to portray it as an
attempt to "keep up with the times".
Not a bad analogy. Has anyone ever attempted to use the All Writs Act to compel a locksmith or safe manufacturer to crack a safe that they didn't own? Not provide a key, but use their time and expertise to do it.
[ link to this | view in chronology ]
themselves as long as they have served a warrant to its
owner and seized the safe.
Then they could hire one to do the deed if they couldn't.
Of course, if nobody accepts the job, they're Short Of Luck. ;]
[ link to this | view in chronology ]
Re:
[ link to this | view in chronology ]
Re:
According to my understanding, in order to get this phone to install the modified code, it would have to be provided as an updated iOS image.
According to my understanding, when you replace or upgrade a smartphone OS version, you do so essentially wholesale; you drop in the entire OS image, replacing everything which was there before, not just the pieces which were changed.
If that's correct, then there would be no way to replace just the login-screen code; you would have to replace everything. It's possible (even likely) that the replacements for everything else would not be (significantly) different from what was there before, but there would be no way to verify that without looking at the source code.
Even if that's not true, I'm not certain that your apparent assumption that there would be logs of the OS-update process which would show enough detail to determine whether anything other than the login-screen code had been modified is accurate. Certainly I've seen no sign of such logs on the Android side of the fence.
Beyond that, even if we assume that it can be proved that only the program(s) involved with handling the login screen were modified, there's no reason why the login-screen program(s) could not (be modified to) include code capable of modifying other parts of the system - and I would be extremely surprised if there were enough logging to be able to catch it if they did.
Really, if you're paranoid about every possible angle of attack and you don't trust the people who are in charge of the operation to do the right thing and be honest about their actions and motives, there is no way to be certain that the modified code has not tampered with the data on the phone other than to see - and possibly to experiment with - the code itself.
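One consequence of wholesale image replacement is that a single digest over the OS image can prove that something changed, but not what changed: flipping even one bit yields a completely different hash, and nothing in the digest localizes the difference. A quick illustration (the "images" are stand-in byte strings, not real iOS firmware):

```python
import hashlib

stock = bytes(1024)            # stand-in for the shipped OS image
patched = bytearray(stock)
patched[512] ^= 0x01           # a one-bit change to one "module"

h_stock = hashlib.sha256(stock).hexdigest()
h_patched = hashlib.sha256(bytes(patched)).hexdigest()

# The digests differ, so the change is detected; but nothing in h_patched
# indicates which byte differs. Localizing the change requires comparing
# the images (or their source) directly, as the comment above argues.
print(h_stock == h_patched)    # False
```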
[ link to this | view in chronology ]
In this case the court specifically ordered that just the
one module be replaced. That module, designed for just the
one function and that being on a cell phone, is very small.
That makes it impossible to add sophisticated search-replace
code which alters time stamps on files and also internal
iOS logs and filesystem structures well enough
to fool forensics investigators on both sides of the case.
Don't forget that this part of the job is for Apple alone.
The other logs I mentioned are not those of iOS, but of the
technical staff and investigators involved in all stages of
the procedure, and most of those will be written by Apple
staff who have sole and uninterrupted custody of both their
logs and the source code of the proposed tool. Apple can't
be elbowed out of the way and not all FBI agents, techs and
officials can be compromised at the same time; so yes, the
combined logs and testimony of all involved on both sides
of the case does effectively make shenanigans a no-go.
[ link to this | view in chronology ]
Re:
You said it!!!
[ link to this | view in chronology ]
1. Secretly write the backdoor software in the event that they lose the case
2. If they lose, When told to hand it over, do so
3. Release OS 9 the same day
4. Tell the FBI, "Oh, sorry, you asked for software to unlock OS 8, you didn't say anything about OS 9..."
[ link to this | view in chronology ]
Re:
Or perhaps they can go the route of the Government when receiving FOIA requests and say development will take X months and cost $660 million.
[ link to this | view in chronology ]
It'll never be used in a criminal trial
(And nothing prevents what's found on the phone from being used in non-judicial ways like no-fly-lists, being added to NSA contact chaining, etc.)
[ link to this | view in chronology ]
Isn't that the point?
[ link to this | view in chronology ]
Re: Re: Isn't that the point?
I would(and have) argue that it's worse than that.
Once the precedent is set that companies can be compelled to break their own encryption you can be sure that any move towards encryption that they cannot break will be painted as companies attempting to 'avoid their lawful obligations by making their products immune to legally issued warrants'. At that point it goes beyond a cost/risk analysis of how much it costs to develop encryption versus how much it would cost to break it, and moves into the realm where it becomes effectively impossible for them to ever implement truly secure encryption, as they'd face a PR and potentially legal nightmare if they ever tried.
[ link to this | view in chronology ]
Cy Vance
[ link to this | view in chronology ]
Re: Cy Vance
...brig
[ link to this | view in chronology ]
As time goes on and it is discussed, more and more this begins to look like a terrible idea.
[ link to this | view in chronology ]
clarification
So, what's all the worry about then? I don't know the particulars of where, and how, these unique IDs are stored on the iPhone. What may be possible, though, is to spoof these IDs to make another iPhone appear to be the one used by the San Bernardino terrorists. Another possible weakness is that every time a small change is made in the digitally signed code, it becomes easier to crack the key. A multitude of law enforcement agencies getting a new version for each case may allow the signing key to be discovered. I don't know if that is realistic in this instance, but it is something that should be looked at.
[ link to this | view in chronology ]
Re: Re: clarification
It's a computer, with an OS/Firmware.
Functionality aside, It's fundamentally no different than any other Internet of Things device.
"Dear Amazon: We think Individual X may be up to something illegal. Please provide a custom firmware for their Alexa...."
"Dear Samsung: We think Individual X may be up to something illegal. Please provide a customer firmware for their smart TV..."
[ link to this | view in chronology ]
Re: clarification
Only Apple is known to have the key(s) necessary to sign such code.
[ link to this | view in chronology ]
Re: clarification
wha ? ? ?
NOT a programmer, but this set my BS meter pegging; not sure how in hell you can make the "un-compiled source code" un-editable/copyable/etc...
[ link to this | view in chronology ]
Re: Re: clarification
They can edit it all they want, but the phone won't run it after that.
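The reply above is describing code signing: the device will only run an image whose signature verifies under the vendor's public key, so an edited copy of the source gets you nothing without the private key. A textbook-RSA sketch of the idea (toy primes, no padding; real schemes use proper padding such as RSASSA-PSS, and Apple's actual implementation is not public):

```python
import hashlib

# Toy RSA keypair (tiny primes; illustrative only, trivially breakable).
p, q = 1000003, 1000033
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (needs Python 3.8+)

def digest_int(code: bytes) -> int:
    # Reduce the hash mod n so it fits the toy modulus.
    return int.from_bytes(hashlib.sha256(code).digest(), "big") % n

def sign(code: bytes) -> int:
    """Only the private-key holder can produce this value."""
    return pow(digest_int(code), d, n)

def verify(code: bytes, sig: int) -> bool:
    """Anyone with the public key (n, e) can check it."""
    return pow(sig, e, n) == digest_int(code)

firmware = b"login-screen module v1"
sig = sign(firmware)
print(verify(firmware, sig))                 # True: unmodified code runs
print(verify(firmware + b" tampered", sig))  # False: edited code is rejected
```

With primes this small the private key could be recovered instantly; real signing keys derive their security entirely from their size and their secrecy.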
[ link to this | view in chronology ]
Expert examination?
Next proposal about that from the FBI: 'We' in the form of the US govt already have in place a standard procedure which should be used here, in the form of the TPP access procedures. The expert may be allowed to enter a room, with no pencils, paper, cameras or other recording devices, and then may look at a printout of the code (4-point type) and even perhaps at a hologram of the phone's internals. Director Comey will insist that is a sufficient examination; his experts told him so, even though he didn't understand anything they told him (fully in the vein of the earlier TD story detailing his responses to Congress).
Surely a procedure which is deemed adequate for treaty examinations must be good enough for a mere phone.
[ link to this | view in chronology ]
DORMANT CYBER PATHOGEN
(Do not wake | Do not turn me on)
I AM CYBER PATHOGEN
AND I VOTE!
CYBER PATHOGENS UNITE
Today San Bernardino, Tomorrow the (World|Underworld)!
UNLOCK THE iPHONE:
FREE THE SAN BERNARDINO CYBER PATHOGEN!
REAL CYBER PATHOGENS RUN ON ANDROID
CYBER PATHOGEN EXTERMINATOR
Your phone is pathogen-free.
You owe me $1,000,000
REAL PATHOGENS RUN ON DNA
(A picture of a unicorn in some appropriate but improbable pose--sleeping, rampant, penned up, dead, etc.--is optional but strongly recommended)
Other media shouldn't be overlooked: say, tinfoil caps labelled "cyber pathogen protector" (with, of course, a unicorn head in the traditional red slashed circle); "cyber-pathogen-free" stickers to post on pay phones and power outlets--"let a thousand snickers bloom, let a hundred online shops contend."
[ link to this | view in chronology ]
this certainly changes the calculus doesn't it?
I'm not sure but I don't think this code itself is a big problem.
The big problem seems to me that in order to validate the code, and that it works as advertised against a real device the expert would have to have access to Apple's signing key.
Apple's signing key.
APPLE'S SIGNING KEY.
[ link to this | view in chronology ]
Re:
I believe that with the source code, which tells the machine how to operate, it can be reverse-engineered to show what is needed to make it operate, and what functions are needed to minimally operate the machine.
I believe they already have all the information off the device, but it is legally unusable, in both cases. So what else are they after? Or who? Unusable because there were no warrants at the time. A court might let them get away with that, but admitting it as evidence would take a very odd court. It could, though, be presented to a grand jury as hearsay, for further action. But that still doesn't get to the issue: why did Apple not fulfill the original request? It wasn't a privacy issue then. They did it on other occasions, so why stop now? There is some other motive, but what? Fired the wrong guy? Didn't say pretty please? Wanting to be paid for the last time?
[ link to this | view in chronology ]
Maybe part of the problem here...
This is a 2048 bit RSA key I just generated:
That's it. That is a textual representation of a 2048-bit RSA key. Generate a CSR and a public key, and you can plug it into any Apache web server. Or use it to sign email. Or sign applications. And those signatures will be valid on any system with the public key installed as a certificate authority.
If you were to see Apple's private key exported like this one is, it would look very similar, although (hopefully) 4096 bits instead of 2048 (twice as long). And it might be DSA, instead of RSA. I'm certain it's stored in a _very_ tightly controlled environment.
This key fits trivially into a paste buffer. So would Apple's. You could print it and type it in by hand if you were so inclined. Or take a picture and OCR it. And if that happens - just once - it potentially puts the security of every Apple device on the planet at risk.
Now, this is a simplistic example. I'm sure Apple's implementation utilizes a hierarchy of similar keys, with limited uses, etc, all signed by a single, master key which is stored in tamper-proof hardware, requires multiple people to get to it, etc. But that master key only has to get exported once to the wrong individual to compromise the entire system.
[ link to this | view in chronology ]
The real reason
The only reason for the demand to Apple is to gain the precedent of access to mobile phones.
[ link to this | view in chronology ]
Apple is playing "I want to protect our users from the govt" to "up" their status among customers. When all I hear is Lies from them about why they don't want to assist the govt in accessing a known terrorists phone data. Apple is becoming a tool for the terrorists. Apple must want more terrorists to use their products. Apple has become a terrorist.
[ link to this | view in chronology ]
Re:
They have code already written to bypass the security lockouts on a phone? How do you know this? Neither Apple nor the FBI nor anyone else I've heard of is making this claim.
Apple is becoming a tool for the terrorists. Apple must want more terrorists to use their products. Apple has become a terrorist.
Toyota is becoming a tool for the terrorists. Toyota must want more terrorists to use their products. Toyota has become a terrorist.
[ link to this | view in chronology ]
Defense?
[ link to this | view in chronology ]
Re: Defense?
In this particular case? Nobody, they're dead.
[ link to this | view in chronology ]
Why can't anyone secure my devices ? Including the great apple?
[ link to this | view in chronology ]
It only happens in the movies
[ link to this | view in chronology ]