We Read All 20 Filings In Support Of Apple Against The FBI; Here Are The Most Interesting Points
from the it's-a-crypto-party dept
Amicus Briefs

Not entirely sure why, but Apple (at least as I'm writing this) did not include one other amicus brief on its side, which is from Lavabit, the one company out there that should have tremendous perspective on Apple's situation, given that it shut down its entire business to avoid having to hand over its encryption keys to the government. At this point, we just have Lavabit's motion for leave to file its brief, rather than the full brief. If and when that brief is released, we may address it separately.
- 32 Law Professors
- Access Now and Wickr Foundation | Press Release
- ACT/The App Association | Medium Post
- Airbnb, Atlassian, Automattic, CloudFlare, eBay, GitHub, Kickstarter, LinkedIn, Mapbox, Medium, Meetup, Reddit, Square, Squarespace, Twilio, Twitter and Wickr | Automattic & WordPress.com Blog Post | Tweet from Twitter
- Amazon, Box, Cisco, Dropbox, Evernote, Facebook, Google, Microsoft, Mozilla, Nest, Pinterest, Slack, Snapchat, WhatsApp, and Yahoo | Tweet from Box | Evernote Blog Post | Facebook Statement | Microsoft Blog Post | Mozilla Blog Post | Snapchat Blog Post | WhatsApp Facebook Post | Yahoo Tumblr Post
- American Civil Liberties Union, ACLU of Northern California, ACLU of Southern California, and ACLU of San Diego and Imperial Counties | Blog Post
- AT&T | Public Policy Blog Post
- AVG Technologies, Data Foundry, Golden Frog, the Computer & Communications Industry Association (CCIA), the Internet Association, and the Internet Infrastructure Coalition | Golden Frog Blog | CCIA News
- BSA|The Software Alliance, the Consumer Technology Association, the Information Technology Industry Council, and TechNet | Press Release
- Center for Democracy & Technology | Insight
- Electronic Frontier Foundation and 46 technologists, researchers, and cryptographers | Blog Post | Press Release
- Electronic Privacy Information Center (EPIC) and eight consumer privacy organizations | EPIC Top News
- Intel | Blog Post
- iPhone security and applied cryptography experts including Dino Dai Zovi, Dan Boneh (Stanford), Charlie Miller, Dr. Hovav Shacham (UC San Diego), Bruce Schneier (Harvard), Dan Wallach (Rice) and Jonathan Zdziarski | Blog Post
- The Media Institute | Press Release
- Privacy International and Human Rights Watch
Letters to the Court
- Beats, Rhymes & Relief, Center for Media Justice, The Gathering for Justice, Justice League NYC, Opal Tometi and Shaun King
- David Kaye, United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression | Supporting Document
- Salihin Kondoker, San Bernardino, CA
I've also put all of these together in a collection, which is embedded below (and which also lets you search through the text of them all).
Having read through many of them (and skimmed the rest), there are certainly some common themes. Frankly, an awful lot of them do really feel like they're just a rehashing of Apple's motion to vacate, repeating the claim that the All Writs Act should not apply (and that CALEA already covers this issue and doesn't allow such a solution). Obviously, all the filings from the various tech companies are getting a lot of the press attention, and they're worth reading to perhaps better understand a few of the key legal points that Apple is raising. Similarly, the filings from civil liberties/human rights groups aren't all that surprising either -- focusing more on some of the human rights questions. The EFF brief is a bit different in focusing on the First Amendment issue, which many expect won't even be looked at by the courts. It's really the only brief that digs deeply into that issue and is worth a read for that reason (I've seen legal experts in fairly strong disagreement about the applicability of the First Amendment argument here).
Separately, I thought that the filing from EPIC was worthwhile as well, focusing on the importance of encryption on phones these days, both because so much personal information is accessible from phones and because unencrypted phones are such a big target for criminals. That's a slightly different perspective that isn't covered nearly as much by the other filings, and it actually could be a useful one for the court.
However, the most interesting filing of all may be the one filed by a group of iPhone Security and Applied Cryptography Experts, and put together by Jennifer Granick and Riana Pfefferkorn from Stanford's Center for Internet and Society. That brief is super educational in getting down into the weeds of just how dangerous it would be for Apple to create this code. Since so many people -- including the DOJ -- seem to think that Apple can just create this code and then make it disappear, this filing is super useful in debunking that claim. It first notes that the Court did try to put in some "safeguards" against abuse of the code, but then explains why those won't help:
These rules are not meaningful barriers to misuse and abuse of the forensic capabilities this Court is ordering Apple to create. First, the Order assumes that Apple will create the Custom Code without any vulnerabilities in its implementation. Vulnerabilities are common in software code, including Apple’s iOS, and despite Apple’s best efforts. The government dismissively downplays the effort required to develop and update the Custom Code, stating that Apple “writes software code as part of its regular business,” including “routinely patch[ing] security or functionality issues” in iOS and “releas[ing] new versions of [iOS] to address issues.” Application at 15. But creating software (especially secure software) is complex, and software development requires rigorous testing.

This is a point that so few people outside of the tech world seem to get. Security vulnerabilities exist, and software (and hardware) is incredibly complex. Purposefully creating vulnerabilities likely creates even more vulnerabilities, and those can be pretty dangerous. And that's especially true in a situation -- such as this one -- where Apple will have to create the code entirely by itself, without any outside security audit possible.
Vulnerabilities are common in software code, despite vendors’ best efforts. To address this problem, vendors employ extensive pre-release testing; after-the-fact audits, including by independent security researchers; and regular updates. Yet none of these practices, alone or in concert, can ensure that software will not be vulnerable and subject to misuse. Yet, given the circumstances of this case, this code is unlikely to go through this lifecycle, increasing the risk that it will introduce vulnerabilities into the iPhone ecosystem.
For example, since first introducing the earliest iPhone, Apple has waged a cat-and-mouse battle with “jailbreakers,” software developers who identified and exploited vulnerabilities in the devices in order to run software other than that signed by Apple and to defeat carrier locks that tied handsets to particular cellular providers. Apple warns its users against jailbreaking and the practice is contrary to Apple’s terms of service. Jailbreaking involves modifying the iPhone firmware so that it will run software code without checking to see if the code has been signed by Apple. When Apple releases a new iOS, scores of independent programmers study the code, successfully finding ways to circumvent Apple’s imposed restrictions. In response, Apple issues software updates to defeat these jailbreaks. Eventually, the jailbreaking community finds new ways to circumvent controls built into Apple’s increasingly secure iOSes. Today, all iOS versions through 9.2 are jailbroken, even though doing so is increasingly harder due to Apple’s efforts.
In other words, vulnerabilities in Apple’s software have persisted for years even though Apple very much does not want them to. This is a lesson for this case. Apple can try its very hardest to create this Custom Code as the Court directs. Nevertheless, it may well fail, as it has acknowledged to the Court.... Even with time and extensive testing, which the government’s sense of urgency seems designed to deny Apple, it is extremely difficult to write bug-free code. Software bugs can interact with existing code in complex ways, creating unanticipated new paths for bypassing iPhone security and exploiting the phone.
Furthermore, the brief notes that, contrary to what many keep insisting, there's a high likelihood that any such code will leak out:
This Court also allowed the Custom Code to stay with Apple, rather than go to the FBI. Simply put, if no one else has the Custom Code, no one else should be able to use it, at least not without Apple’s knowledge. However, once created, this software is going to be very valuable to law enforcement, intelligence agencies, corporate spies, identity thieves, hackers, and other attackers who will want to steal or buy the Custom Code. Keeping the Custom Code secret is essential to ensuring that this forensic software not pose a broader security threat to iOS users. But the high demand poses a serious risk that the Custom Code will leak outside of Apple’s facilities.
Other governments, or ours, may eventually compel Apple to turn the Custom Code over so that law enforcement officials can unlock phones without delay or Apple oversight. Authoritarian governments will likely be the most enthusiastic customers for the Custom Code this Court is contemplating ordering Apple to create and sign. The software will be used in China, Russia, Turkey, the United Arab Emirates, and other governments with poor human-rights records where iPhones are sold.
Inadequate security practices by those governments increase the risk that attackers will acquire and use the Custom Code. Given the Custom Code’s value, unscrupulous government officials in corruption-plagued jurisdictions could foreseeably sell the Custom Code to third parties. For example, if the Russian government compelled Apple to hand over the Custom Code, it could end up being sold by a corrupt agent to a Russian identity-theft ring. Even without selling it, corrupt officials could also use the code for their own agendas, such as to target political or personal enemies who had broken no law. Journalists, human-rights advocates, religious and sexual minorities, and others in those countries are at much greater risk if software that can bypass passcode limitations exists.
There is also a danger that the Custom Code will be lost or stolen. The more often Apple must use the forensic capability this Court is ordering it to create, the more people have to have access to it. The more people who have access to the Custom Code, the more likely it will leak. The software will be valuable to anyone eager to bypass security measures on one of the most secure smartphones on the market. The incentive to steal the Custom Code is huge. The Custom Code would be invaluable to identity thieves, blackmailers, and those engaged in corporate espionage and intellectual property theft, to name a few.
Those technicians responsible for using the Custom Code to comply with access demands will likely be targeted by phishing attacks—emails carefully designed to seem legitimate but which contain malware—that seek to steal the Custom Code. The same technicians will be approached with offers to buy the software. The price offered could be irresistibly high, as the Custom Code will be worth a lot to foreign national security officials and organized crime syndicates, and can be sold to multiple customers. Or Apple technicians may be blackmailed to the same end. In short, the Custom Code will be exceedingly valuable and in danger of leaking or being stolen.

Finally, that brief also notes a key point being raised increasingly by security experts: that if a company can be forced to remove security features in an automatic update, it creates incentives for people not to update their devices, which will open up many more serious security problems:
Automatic updates are an important way that software companies ensure their users are as protected as possible from attackers, without inconvenience, significant effort, or technical savvy on the part of the user (who is more likely to install security updates when there is little or nothing she needs to do). These auto-updates are one of the reasons why the millions of iPhones currently in use worldwide are very secure.
Consumers (and their devices) trust these auto-updates because they are signed by the vendor. A cryptographic signature from Microsoft or Apple assures the user that the software she is about to install legitimately comes from the company she trusts. It is akin to Apple saying, “This is Apple, and we stand behind this software.” Here, however, the Court is contemplating ordering Apple to sign software it does not stand behind and in fact considers “too dangerous to build.” .... And the next logical step if this Court enforces its Order is for the FBI to ask to compel other vendors, in addition to Apple, to sign other software that bypasses other customer security measures, creating new and different risks.
This is why compelling cryptographic signatures is extremely risky. Automatic software updates are a crucial vehicle for maintaining the security of iOS devices and other computers, but they can be effective only so long as users continue to trust them. If the Court compels Apple to create and sign the Custom Code in this high-profile case, then all computer users, especially those for whom smartphone privacy may already be a concern, could become suspicious of all software updates going forward. That is because a member of the public could reasonably fear that in the future, even a signed software update from a trusted vendor will bypass passcode limitations, convert her iPhone into an audio or video recording device, or otherwise interfere with her property, privacy, or security interests. Users will know that these updates could be software designed to extract private data from the user’s machine, but which a company was forced to sign at the behest of some court, law enforcement, or other government official. The code would be indistinguishable from a genuine update created, signed, and transmitted of the vendor’s own free will.
This distrust would have serious ramifications for computer security at large. In response, some users would likely stop accepting iOS updates (which users must choose to install), in which case their machines will remain unprotected against vulnerabilities that legitimate automatic updates would have patched. Importantly, the impact of unpatched devices is not limited to those devices. Vulnerable software that has not been updated can become a vector for spreading malware, potentially compromising other machines on the network. The more users who turn off automatic updates, the more devices, the more information, the more people put at risk. Just as herd immunity to a disease is lost if enough members of the group are not vaccinated against the disease, if enough users stop auto-updating their devices, it will weaken the entire device security ecosystem. Indeed, one computer security expert has likened automatic updates to “a public health system for the Internet.” It is this whole system which the Court ultimately threatens to put at risk should it enforce its Order to Apple.

Good stuff. One hopes that the judge will read this brief, in particular, carefully.
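To make the signing point concrete, here's a minimal sketch of the trust model the brief is describing. It's written in Python using the third-party cryptography package; the key names and the update payload are hypothetical, and real iOS code signing is considerably more involved, but the basic idea is the same: a device installs whatever verifies against the vendor's public key, which is exactly why a vendor-signed forensic build would be indistinguishable from a legitimate update.

# Hypothetical sketch of vendor-signed update verification, using Ed25519
# from the third-party "cryptography" package. Real iOS code signing is far
# more elaborate, but the trust model is the same.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The vendor signs each update with a private key it keeps secret...
vendor_private_key = Ed25519PrivateKey.generate()
vendor_public_key = vendor_private_key.public_key()  # ...and every device ships with the public key.

update_payload = b"os-update-9.3: patches kernel vulnerability"
signature = vendor_private_key.sign(update_payload)

def device_will_install(payload: bytes, sig: bytes) -> bool:
    """A device accepts any payload whose signature verifies against the vendor key.
    It cannot tell a security patch from a compelled forensic build."""
    try:
        vendor_public_key.verify(sig, payload)
        return True
    except InvalidSignature:
        return False

print(device_will_install(update_payload, signature))         # True: signed by the vendor
print(device_will_install(b"unsigned tampering", signature))  # False: signature doesn't match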
The other two things that were most interesting were not amicus briefs, but merely two of the letters to the judge. First was the letter from David Kaye, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. He adds some additional perspective about how the right to secure communications and encryption is a fundamental part of freedom of expression, including quoting from his own report on the benefits of encryption for the public, along with the challenges it presents for law enforcement. As he concludes:
My concern is that the subject order implicates the security, and thus the freedom of expression, of unknown but likely vast numbers of people, those who rely on secure communications for the reasons identified above and explicated in more detail in the report. This is fundamentally a problem of technology, one where compromising security for one and only one time and purpose seems exceedingly difficult if not impossible. Again from the report, “States must show, publicly and transparently, that other less intrusive means are unavailable or have failed and that only broadly intrusive measures, such as backdoors, would achieve the legitimate aim.” ... It is not clear that the Government has tried other means short of compelling Apple’s code-writing in this case, such as enlisting the technical expertise of other agencies to access this particular phone.

Finally, there's the letter from Salihin Kondoker, whose wife, Anies, was one of the people shot, but not killed, in the attack. That letter is truly worth reading. He notes that he has been "frustrated that there isn't more information" as to why the attack occurred, and that he was initially frustrated that Apple might be a roadblock. But as he came to understand more of the details, he came to support Apple's decision.
When I first learned Apple was opposing the order I was frustrated that it would be yet another roadblock. But as I read more about their case, I have come to understand their fight is for something much bigger than one phone. They are worried that this software the government wants them to use will be used against millions of other innocent people. I share their fear.
I support Apple and the decision they have made. I don’t believe Tim Cook or any Apple employee believes in supporting terrorism any more than I do. I think the vicious attacks I’ve read in the media against one of America’s greatest companies are terrible.
In my opinion it is unlikely there is any valuable information on this phone. This was a work phone. My wife also had an iPhone issued by the County and she did not use it for any personal communication. San Bernardino is one of the largest Counties in the country. They can track the phone on GPS in case they needed to determine where people were. Second, both the iCloud account and carrier account were controlled by the county so they could track any communications. This was common knowledge among my wife and other employees. Why then would someone store vital contacts related to an attack on a phone they knew the county had access to? They destroyed their personal phones after the attack. And I believe they did that for a reason.
[....]
Finally, and the reason for my letter to the court, I believe privacy is important and Apple should stay firm in their decision. Neither I, nor my wife, want to raise our children in a world where privacy is the tradeoff for security. I believe this case will have a huge impact all over the world. You will have agencies coming from all over the world to get access to the software the FBI is asking Apple for. It will be abused all over to spy on innocent people.

You never know how much weight a court will put on these things. Amicus briefs often are little more than a chance for some publicity for those filing them. But many of these briefs (and letters) are detailed, thoughtful and actually do add compelling and important perspectives on the larger issues involved in this case. Hopefully the judge pays attention.
Filed Under: all writs act, amicus briefs, backdoors, calea, doj, encryption, fbi
Companies: apple
Reader Comments
A tricky balance
And on the other hand you've got various companies, security experts, human rights and privacy groups all pointing out that forcing Apple to decrypt the phone would open a Pandora's box of problems, both directly and indirectly.
Between the two you'd think the decision would be an easy one; hopefully the judge in this case, and whichever judges are involved in the appeal(s), rule on the right side.
Both sides of the story
It's important to look at both sides of the issue, to eliminate any perceived bias.
The Other Question
What if Apple hints, since they can't outright say, that participation would be career-limiting? And no engineer wants to do this work? Can the court force a specific engineer to do something like that? Can they force Apple to turn over the proprietary guts of their core business so an outsider can take a crack?
This isn't "turn over the key"; this is "here's a lock, you figure out how to pick it". It requires creative new input.
Amicus briefs in opposition to Apple
However, yesterday, Courthouse News Service published a story by Matt Reynolds, which contains links to two of the amicus briefs filed in opposition to Apple.
“Battle Lines Drawn in Apple's Fight With FBI”
The sound of desperation...
Re: Both sides of the story
Also, one would want to eliminate bias, not perceived bias. Which is all the one side has in either of these things.
Point of Order...
Not quite correct, as the request to submit describes:
So they did comply with the court order, in the end. They just salted the fields as they retreated.
Re: The Other Question
So updates have come to a halt.
Re: Re: The Other Question
Re: Re: Both sides of the story
Re: Amicus briefs in opposition to Apple
https://www.techdirt.com/articles/20160304/00273733803/law-enforcement-groups-file-amicus-brief-favor-fbi-which-undermines-dojs-claim-that-this-is-just-about-one-phone.shtml
Re: Re: Both sides of the story
Was I wrong?
Re: Both sides of the story
At least climate change deniers have "if we ignore it we can make more money in entrenched markets in the short term!" on their side, as well as the fact that there's been some shoddy science (some, not most) on the climate change side of things. The FBI doesn't even have THAT.
Re: Re: The Other Question
Hey Apple, do you have any positions open for this? You don't even have to pay me!
I'm just gonna say what everyone else is thinking here. He meant the US government, not Russia.
Re: Re: The Other Question
Actual Brief
• San Bernardino County District Attorney Amicus Curiae Brief In Support of the United States Government
13th Amendment.
Section 1. Neither slavery nor involuntary servitude, except as a punishment for crime whereof the party shall have been duly convicted, shall exist within the United States, or any place subject to their jurisdiction.
Section 2. Congress shall have power to enforce this article by appropriate legislation.
...
I wonder why this angle hasn't been raised yet, specifically the 'involuntary servitude' bit.
Re: Actual Brief
Re: Re: Both sides of the story
Oh, did you think your side was inherently right? Sorry, science doesn't work that way. That would be religion.
Re: A tricky balance
France had a "debate" after January last year. It was about pushing for new and quite extreme laws extending the government's ability to spy on basically everyone.
A handful of politicians were all for it; all security, legal and even counterterrorism experts were against it.
Can you guess what the result was?
Re: Re: Re: Both sides of the story
The court's order does not force Apple to update everyone's phone. It doesn't force them to make secret updates that will make your phone less secure. It doesn't order Apple to change the current OS that is on your phone or make your phone less secure.
Many of the briefs seem to be stuck on this very notion. Much of the scaremongering going on runs exactly down that road. I personally think that Amazon pulled the "removing encryption" thing to scare people (and oh yeah, they walked it back so fast they probably have corporate whiplash).
It's really sad to see how quickly this has gone away from the subject at hand and been turned into something really scary and mean.
Re: Re: Re: Re: Both sides of the story
No need to go that far. I can stop at "it is, always has been and always will be, a dynamic system which is only poorly understood." Look, but touch sparingly, because you really have no idea what anything you do will cause it to do.
Its behavior has been surprising us since we came to exist and that's pretty much the nature of the beast.
Re:
Re: Re:
If you want your phone to be more secure, move to an 8 or 10 digit pincode - or use an even longer pincode for your encrypted documents. Then even if Apple let this update out into the wild and supported it being applied to phones without bricking them, you would still be many times MORE secure than you are today.
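(A rough back-of-the-envelope illustration of that point: the sketch below assumes roughly 80 milliseconds per passcode attempt, which is the hardware-enforced key-derivation cost Apple describes in its published iOS security documentation, and assumes the custom code strips out every other delay and the auto-wipe. The figures are purely illustrative.)

# Worst-case time to brute-force a numeric passcode, assuming ~80 ms per
# attempt (an assumption based on Apple's published key-derivation cost,
# not a figure from the court filings) and no other delays or wipe limits.
SECONDS_PER_ATTEMPT = 0.08

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every possible passcode of the given length."""
    return (10 ** digits) * SECONDS_PER_ATTEMPT / 3600

for digits in (4, 6, 8, 10):
    print(f"{digits}-digit passcode: about {worst_case_hours(digits):,.1f} hours worst case")

# 4 digits:  ~0.2 hours (minutes)
# 6 digits:  ~22 hours
# 8 digits:  ~2,222 hours (about three months)
# 10 digits: ~222,222 hours (roughly 25 years)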
Re: Re: Re:
I haven't heard that many people making this argument, which is good since, as you say, it's a misleading one.
The actual argument is this: the government is using this case to set a legal precedent that would allow them to force any tech company to go to extraordinary lengths to break your security.
If this is successful, it simply means that this is another aspect where you are prevented by law from being able to trust the tech that you use and the companies that provide it.