from the it's-a-crypto-party dept
In the last week or so, it became quite clear that a fair number of tech companies and organizations in the civil liberties community would file amicus (friend of the court) briefs urging Magistrate Judge Sheri Pym to side with Apple over the Justice Department and the FBI. However, now that the briefs are in, it's
fairly staggering just how many companies, organizations and individuals signed onto briefs supporting Apple. Yes, many of them teamed up and filed briefs together, but it's still a ton. And that's especially true for an issue at the district court level in front of a magistrate judge. Here's the big list put together by Apple, including links to various blog posts and press releases about the filings:
Amicus Briefs
- 32 Law Professors
- Access Now and Wickr Foundation | Press Release
- ACT/The App Association | Medium Post
- Airbnb, Atlassian, Automattic, CloudFlare, eBay, GitHub, Kickstarter, LinkedIn, Mapbox, Medium, Meetup, Reddit, Square, Squarespace, Twilio, Twitter and Wickr | Automattic & WordPress.com Blog Post | Tweet from Twitter
- Amazon, Box, Cisco, Dropbox, Evernote, Facebook, Google, Microsoft, Mozilla, Nest, Pinterest, Slack, Snapchat, WhatsApp, and Yahoo | Tweet from Box | Evernote Blog Post | Facebook Statement | Microsoft Blog Post | Mozilla Blog Post | Snapchat Blog Post | WhatsApp Facebook Post | Yahoo Tumblr Post
- American Civil Liberties Union, ACLU of Northern California, ACLU of Southern California, and ACLU of San Diego and Imperial Counties | Blog Post
- AT&T | Public Policy Blog Post
- AVG Technologies, Data Foundry, Golden Frog, the Computer & Communications Industry Association (CCIA), the Internet Association, and the Internet Infrastructure Coalition | Golden Frog Blog | CCIA News
- BSA|The Software Alliance, the Consumer Technology Association, the Information Technology Industry Council, and TechNet | Press Release
- Center for Democracy & Technology | Insight
- Electronic Frontier Foundation and 46 technologists, researchers, and cryptographers | Blog Post | Press Release
- Electronic Privacy Information Center (EPIC) and eight consumer privacy organizations | EPIC Top News
- Intel | Blog Post
- iPhone security and applied cryptography experts including Dino Dai Zovi, Dan Boneh (Stanford), Charlie Miller, Dr. Hovav Shacham (UC San Diego), Bruce Schneier (Harvard), Dan Wallach (Rice) and Jonathan Zdziarski | Blog Post
- The Media Institute | Press Release
- Privacy International and Human Rights Watch
Letters to the Court
- David Kaye, UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression
- Salihin Kondoker
Not entirely sure why, but Apple (at least as I'm writing this) did not include one other amicus brief on its side: the one
from Lavabit, the one company out there that should have
tremendous perspective on Apple's situation, given that it shut down its entire business to avoid having to hand over its encryption keys to the government. At this point, we just have Lavabit's motion for leave to file its brief, rather than the full brief. If and when that brief is released, we may address it separately.
I've also put all of these together in a collection, which is embedded below (and which lets you search through the text of them all).
Having read through many of them (and skimmed the rest), I can say there are certainly some common themes. Frankly, an awful lot of them really do feel like they're just a rehashing of
Apple's motion to vacate, repeating the claim that the All Writs Act should not apply (and that CALEA already covers this issue and doesn't allow such a solution). Obviously, all the filings from the various tech companies are getting a lot of the press attention, and they're worth reading to perhaps better understand a few of the key legal points that Apple is raising. Similarly, the filings from civil liberties/human rights groups aren't all that surprising either -- focusing more on some of the human rights questions. The
EFF brief is a bit different in focusing on the First Amendment issue, which many expect won't even be looked at by the courts. It's really the only brief that digs deeply into that issue and is worth a read for that reason (I've seen legal experts in fairly strong disagreement about the applicability of the First Amendment argument here).
Separately, I thought that
the filing from EPIC was worthwhile as well, focusing on the importance of encryption on phones these days, because so much personal information is accessible from phones and because unencrypted phones are such a big target for criminals. That's a slightly different angle that isn't covered nearly as much by the other filings, and it actually could be a useful perspective for the court.
However, the most interesting filing of all may be the one filed by a group of
iPhone Security and Applied Cryptography Experts, and put together by Jennifer Granick and Riana Pfefferkorn from Stanford's Center for Internet and Society. That brief is
super educational in getting down into the weeds of just how dangerous it would be for Apple to create this code. Since so many people -- including the DOJ -- seem to think that Apple can just create this code and then make it disappear, the filing is
especially useful in debunking that claim. It first notes that the Court did try to put in some "safeguards" against abuse of the code, but then explains why those won't help:
These rules are not meaningful barriers to misuse and abuse of the forensic
capabilities this Court is ordering Apple to create. First, the Order assumes that
Apple will create the Custom Code without any vulnerabilities in its
implementation. Vulnerabilities are common in software code, including Apple’s
iOS, and despite Apple’s best efforts. The government dismissively downplays the
effort required to develop and update the Custom Code, stating that Apple “writes
software code as part of its regular business,” including “routinely patch[ing]
security or functionality issues” in iOS and “releas[ing] new versions of [iOS] to
address issues.” Application at 15. But creating software (especially secure
software) is complex, and software development requires rigorous testing.
Vulnerabilities are common in software code, despite vendors’ best efforts.
To address this problem, vendors employ extensive pre-release testing; after-the-fact
audits, including by independent security researchers; and regular updates. Yet
none of these practices, alone or in concert, can ensure that software will not be
vulnerable and subject to misuse. Yet, given the circumstances of this case, this
code is unlikely to go through this lifecycle, increasing the risk that it will
introduce vulnerabilities into the iPhone ecosystem.
For example, since first introducing the earliest iPhone, Apple has waged a
cat-and-mouse battle with “jailbreakers,” software developers who identified and
exploited vulnerabilities in the devices in order to run software other than that
signed by Apple and to defeat carrier locks that tied handsets to particular cellular
providers. Apple warns its users against jailbreaking and the practice is contrary to
Apple’s terms of service. Jailbreaking involves modifying the iPhone firmware so
that it will run software code without checking to see if the code has been signed
by Apple. When Apple releases a new iOS, scores of independent programmers
study the code, successfully finding ways to circumvent Apple’s imposed
restrictions. In response, Apple issues software updates to defeat these jailbreaks.
Eventually, the jailbreaking community finds new ways to circumvent controls
built into Apple’s increasingly secure iOSes. Today, all iOS versions through 9.2
are jailbroken, even though doing so is increasingly harder due to Apple’s efforts.
In other words, vulnerabilities in Apple’s software have persisted for years
even though Apple very much does not want them to. This is a lesson for this case. Apple can try its very hardest to create this Custom Code as the Court directs.
Nevertheless, it may well fail, as it has acknowledged to the Court.... Even with time and extensive testing, which the government’s
sense of urgency seems designed to deny Apple, it is extremely difficult to write
bug-free code. Software bugs can interact with existing code in complex ways,
creating unanticipated new paths for bypassing iPhone security and exploiting the
phone.
This is a point that so few people outside of the tech world seem to get. Security vulnerabilities exist, and software (and hardware) is incredibly complex.
Purposely creating a vulnerability likely
creates even more vulnerabilities, and those can be quite dangerous. That's especially true in a situation -- such as this one -- where Apple would have to create the code entirely by itself, with no outside security audit possible.
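This isn't hypothetical, either: Apple's own 2014 "goto fail" bug (CVE-2014-1266) was a single duplicated line in its TLS code that quietly skipped a signature check on iOS and OS X, and it shipped in production software for over a year before being caught. As a rough illustration of that class of bug -- a toy Python sketch of ours, not Apple's actual code -- here's how one stray line can turn a security check into dead code:

```python
def verify_exchange(hash_ok: bool, signature_ok: bool) -> int:
    """Toy analogue of Apple's 2014 'goto fail' bug (CVE-2014-1266).

    Returns 0 for "verified," nonzero for failure, mirroring the
    error-code convention of the original C code.
    """
    err = 0
    if not hash_ok:
        err = -1
    if err != 0:
        return err
    return err  # BUG: one stray duplicated line; err still reads 0 ("success")

    # The real signature check below is now unreachable dead code.
    if not signature_ok:
        err = -9
    return err

# A forged exchange "verifies" because the signature check never runs:
assert verify_exchange(hash_ok=True, signature_ok=False) == 0
```

If a mistake that small can slip past Apple's normal review and testing, rushed one-off forensic code written under a court's deadline has even worse odds.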
Furthermore, the brief notes that, contrary to what many keep insisting, there's a high likelihood that any such code will leak out:
This Court also allowed the Custom Code to stay with Apple, rather than go
to the FBI. Simply put, if no one else has the Custom Code, no one else should be
able to use it, at least not without Apple’s knowledge. However, once created, this
software is going to be very valuable to law enforcement, intelligence agencies,
corporate spies, identity thieves, hackers, and other attackers who will want to steal
or buy the Custom Code. Keeping the Custom Code secret is essential to ensuring
that this forensic software not pose a broader security threat to iOS users. But the
high demand poses a serious risk that the Custom Code will leak outside of
Apple’s facilities.
Other governments, or ours, may eventually compel Apple to turn the
Custom Code over so that law enforcement officials can unlock phones without
delay or Apple oversight. Authoritarian governments will likely be the most
enthusiastic customers for the Custom Code this Court is contemplating ordering
Apple to create and sign. The software will be used in China, Russia, Turkey, the
United Arab Emirates, and other governments with poor human-rights records
where iPhones are sold.
Inadequate security practices by those governments increase the risk that
attackers will acquire and use the Custom Code. Given the Custom Code’s value,
unscrupulous government officials in corruption-plagued jurisdictions could
foreseeably sell the Custom Code to third parties. For example, if the Russian
government compelled Apple to hand over the Custom Code, it could end up being
sold by a corrupt agent to a Russian identity-theft ring. Even without selling it,
corrupt officials could also use the code for their own agendas, such as to target
political or personal enemies who had broken no law. Journalists, human-rights
advocates, religious and sexual minorities, and others in those countries are at
much greater risk if software that can bypass passcode limitations exists.
There is also a danger that the Custom Code will be lost or stolen. The more
often Apple must use the forensic capability this Court is ordering it to create, the
more people have to have access to it. The more people who have access to the
Custom Code, the more likely it will leak. The software will be valuable to anyone
eager to bypass security measures on one of the most secure smartphones on the
market. The incentive to steal the Custom Code is huge. The Custom Code would
be invaluable to identity thieves, blackmailers, and those engaged in corporate
espionage and intellectual property theft, to name a few.
Those technicians responsible for using the Custom Code to comply with
access demands will likely be targeted by phishing attacks—emails carefully
designed to seem legitimate but which contain malware—that seek to steal the
Custom Code. The same technicians will be approached with offers to buy the
software. The price offered could be irresistibly high, as the Custom Code will be
worth a lot to foreign national security officials and organized crime syndicates,
and can be sold to multiple customers. Or Apple technicians may be blackmailed
to the same end. In short, the Custom Code will be exceedingly valuable and in
danger of leaking or being stolen.
Finally, that brief also notes a key point increasingly raised by security experts: if a company can be forced to remove security features via an automatic update, it creates incentives for people not to update their devices, which will open up many more serious security problems:
Automatic updates are an important way that software companies ensure
their users are as protected as possible from attackers, without inconvenience,
significant effort, or technical savvy on the part of the user (who is more likely to
install security updates when there is little or nothing she needs to do). These auto-updates
are one of the reasons why the millions of iPhones currently in use
worldwide are very secure.
Consumers (and their devices) trust these auto-updates because they are
signed by the vendor. A cryptographic signature from Microsoft or Apple assures
the user that the software she is about to install legitimately comes from the
company she trusts. It is akin to Apple saying, “This is Apple, and we stand behind
this software.” Here, however, the Court is contemplating ordering Apple to sign
software it does not stand behind and in fact considers “too dangerous to build.”
.... And the next logical step if this Court enforces its Order is
for the FBI to ask to compel other vendors, in addition to Apple, to sign other
software that bypasses other customer security measures, creating new and
different risks.
This is why compelling cryptographic signatures is extremely risky.
Automatic software updates are a crucial vehicle for maintaining the security of
iOS devices and other computers, but they can be effective only so long as users
continue to trust them. If the Court compels Apple to create and sign the Custom
Code in this high-profile case, then all computer users, especially those for whom
smartphone privacy may already be a concern, could become suspicious of all
software updates going forward. That is because a member of the public could
reasonably fear that in the future, even a signed software update from a trusted
vendor will bypass passcode limitations, convert her iPhone into an audio or video
recording device, or otherwise interfere with her property, privacy, or security
interests. Users will know that these updates could be software designed to
extract private data from the user’s machine, but which a company was forced to
sign at the behest of some court, law enforcement, or other government official.
The code would be indistinguishable from a genuine update created, signed, and
transmitted of the vendor’s own free will.
This distrust would have serious ramifications for computer security at large.
In response, some users would likely stop accepting iOS updates (which users must
choose to install), in which case their machines will remain unprotected against
vulnerabilities that legitimate automatic updates would have patched. Importantly,
the impact of unpatched devices is not limited to those devices. Vulnerable
software that has not been updated can become a vector for spreading malware,
potentially compromising other machines on the network. The more users who turn
off automatic updates, the more devices, the more information, the more people put
at risk. Just as herd immunity to a disease is lost if enough members of the group
are not vaccinated against the disease, if enough users stop auto-updating their
devices, it will weaken the entire device security ecosystem. Indeed, one computer
security expert has likened automatic updates to “a public health system for the
Internet.” It is this whole system which the Court ultimately threatens to put at
risk should it enforce its Order to Apple.
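To make that point concrete: when a device verifies a signed update, the cryptography only proves who signed the bytes, not what the code does. A compelled signature on software Apple "does not stand behind" checks out exactly like a legitimate one. Here's a minimal sketch (in Python, using the `cryptography` library's Ed25519 support; the payloads and names are made up for illustration):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: the private signing key (in reality, closely guarded by Apple).
vendor_key = Ed25519PrivateKey.generate()

legitimate_update = b"patch that fixes a security hole"
compelled_update = b"build that disables passcode retry limits"

# Device side: all the device holds is the vendor's public key.
device_trusted_key = vendor_key.public_key()

def device_will_install(payload: bytes, signature: bytes) -> bool:
    """The device's install check: trust anything the vendor key signed."""
    try:
        device_trusted_key.verify(signature, payload)
        return True  # a valid signature only says "this came from the vendor"
    except InvalidSignature:
        return False

# Both updates verify identically -- the signature says nothing about intent.
for update in (legitimate_update, compelled_update):
    print(device_will_install(update, vendor_key.sign(update)))  # True, twice
```

From the device's (and the user's) point of view, the two updates are indistinguishable, which is precisely the trust problem the brief describes.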
Good stuff. One hopes that the judge will read this brief, in particular, carefully.
The other two things that were most interesting were not amicus briefs, but merely two of the letters to the judge. First was
the letter from David Kaye, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. He adds some additional perspective on how the right to secure communications and encryption is a fundamental part of freedom of expression, including quoting from his own report on the benefits of encryption for the public, along with the challenges it presents for law enforcement. As he concludes:
My concern is that the
subject order implicates the security, and thus the freedom of expression, of unknown
but likely vast numbers of people, those who rely on secure communications for the
reasons identified above and explicated in more detail in the report. This is
fundamentally a problem of technology, one where compromising security for one and
only one time and purpose seems exceedingly difficult if not impossible. Again from
the report, “States must show, publicly and transparently, that other less intrusive means
are unavailable or have failed and that only broadly intrusive measures, such as
backdoors, would achieve the legitimate aim.” ... It is not clear that the
Government has tried other means short of compelling Apple’s code-writing in this
case, such as enlisting the technical expertise of other agencies to access this particular
phone.
Finally, there's the
letter from Salihin Kondoker, whose wife, Anies, was one of the people shot, but not killed, in the attack. That letter is truly worth reading. He notes that he has been "frustrated that there isn't more information" as to why the attack occurred, and that he was
initially frustrated that Apple might be yet another roadblock. But as he came to understand more of the details, he decided to support Apple's decision.
When I first learned Apple was opposing the order I was frustrated that it would be yet another
roadblock. But as I read more about their case, I have come to understand their fight is for
something much bigger than one phone. They are worried that this software the government
wants them to use will be used against millions of other innocent people. I share their fear.
I support Apple and the decision they have made. I don’t believe Tim Cook or any Apple
employee believes in supporting terrorism any more than I do. I think the vicious attacks I’ve
read in the media against one of America’s greatest companies are terrible.
In my opinion it is unlikely there is any valuable information on this phone. This was a work
phone. My wife also had an iPhone issued by the County and she did not use it for any personal
communication. San Bernardino is one of the largest Counties in the country. They can track the
phone on GPS in case they needed to determine where people were. Second, both the iCloud
account and carrier account were controlled by the county so they could track any
communications. This was common knowledge among my wife and other employees. Why then
would someone store vital contacts related to an attack on a phone they knew the county had
access to? They destroyed their personal phones after the attack. And I believe they did that for a
reason.
[....]
Finally, and the reason for my letter to the court, I believe privacy is important and Apple should
stay firm in their decision. Neither I, nor my wife, want to raise our children in a world where
privacy is the tradeoff for security. I believe this case will have a huge impact all over the world.
You will have agencies coming from all over the world to get access to the software the FBI is
asking Apple for. It will be abused all over to spy on innocent people.
You never know how much weight a court will put on these things. Amicus briefs often are little more than a chance for some publicity for those filing them. But many of these briefs (and letters) are detailed, thoughtful and actually do add compelling and important perspectives on the larger issues involved in this case. Hopefully the judge pays attention.
Filed Under: all writs act, amicus briefs, backdoors, calea, doj, encryption, fbi
Companies: apple