One of the more ridiculous claims in the DOJ's filing against Apple last week was its decision to pick up on former NSA lawyer Stewart Baker's conspiracy theory that Apple had built backdoors into its products for China (side note: I met Stewart in person for the first time recently, and he mocked me about this, saying that I should agree with him on this point). However, as we noted in our post last week, there doesn't seem to be much evidence to support Baker's claims. The two key issues were Apple's use of the Chinese wireless standard WAPI -- which some have claimed includes some sort of backdoor, but which was also the only real local area wireless tech in China for a while -- and its decision to store iCloud data in China. However, as we noted, there have been reports that the Chinese government then tried to conduct a man-in-the-middle attack against the iCloud servers. If Apple had actually given the government a backdoor, why would it need to do that?
Apple uses the same security protocols everywhere in the world.
Apple has never made user data, whether stored on the iPhone or in iCloud, more technologically accessible to any country's government. We believe any such access is too dangerous to allow. Apple has also not provided any government with its proprietary iOS source code. While governmental agencies in various countries, including the United States, perform regulatory reviews of new iPhone releases, all that Apple provides in those circumstances is an unmodified iPhone device.
It is my understanding that Apple has never worked with any government agency from any country to create a "backdoor" in any of our products or services.
Now, some may push back on the point about WAPI, but again, making use of a third-party technology that potentially has backdoors (some of which could be protected against) and being told by the government to build special backdoors just for that government are vastly different scenarios.
We already covered Apple's reply brief in the fight over getting into Syed Farook's encrypted work iPhone, highlighting a number of lies in the DOJ's filing. But I wanted to focus on a few more points highlighted in the additional declarations filed by Apple as well. The DOJ kept insisting that Apple built this feature specifically to keep law enforcement out, which is ridiculous. Apple notes repeatedly that it built the feature to keep its customers safer from malicious attacks, most of which do not come from law enforcement. But the DOJ keeps pretending that it was a deliberate attempt to mock law enforcement. From the DOJ's filing:
Here, Apple has deliberately used its control over its software to block law-enforcement requests for access to the contents of its devices, and it has advertised that feature to sell its products.
Since the introduction of iOS 8 in October 2014, Apple has placed approximately 1,793 advertisements worldwide—627 in the United States alone—of different types, including print ads, television ads, online ads, cinema ads, radio ads and billboards. Those advertisements have generated an estimated 253 billion impressions worldwide and 99 billion impressions in the United States alone (an impression is an estimate of the number of times an ad is viewed or displayed online).

Of those advertisements, not a single one has ever advertised or promoted the ability of Apple’s software to block law enforcement requests for access to the contents of Apple devices.
Indeed, only three of those advertisements reference security at all, and all three relate to the Apple Pay service, and then only to say that Apple Pay is "safer than a credit card, and keeps your info yours."
I'm assuming the DOJ, if it decides to push this point, will argue that it wasn't talking about those kinds of advertisements, but rather about Apple's statements to the press. Still, there's a strong point here. Contrary to what the DOJ is saying, no, the company does not proactively advertise its encryption as a way to keep law enforcement out. Or, in short: no, FBI, strong encryption on the iPhone just isn't about you.
As expected, Apple has now responded to the DOJ in the case about whether or not it can be forced to write code to break its own security features to help the FBI access the encrypted work iPhone of Syed Farook, one of the San Bernardino attackers. As we noted, the DOJ's filing was chock-full of blatantly misleading claims, and Apple was flabbergasted by just how ridiculous that filing was. That comes through in this response.
The government attempts to rewrite history by portraying the Act as an all-powerful magic wand rather than the limited procedural tool it is. As theorized by the government, the Act can authorize any and all relief except in two situations: (1) where Congress enacts a specific statute prohibiting the precise action (i.e., says a court may not “order a smartphone manufacturer to remove barriers to accessing stored data on a particular smartphone,” ... or (2) where the government seeks to “arbitrarily dragoon[]” or “forcibly deputize[]” “random citizens” off the street.... Thus, according to the government, short of kidnapping or breaking an express law, the courts can order private parties to do virtually anything the Justice Department and FBI can dream up. The Founders would be appalled.
The Founders would be appalled. That's quite a statement.
Apple also slams the DOJ for insisting that this really is all about the one iPhone and that the court should ignore the wider precedent, citing FBI Director James Comey's own statements:
It has become crystal clear that this case is not about a “modest” order and a “single iPhone,” Opp. 1, as the FBI Director himself admitted when testifying before Congress two weeks ago. Ex. EE at 35 [FBI Director James Comey, Encryption Hr’g] (“[T]he broader question we’re talking about here goes far beyond phones or far beyond any case. This collision between public safety and privacy—the courts cannot resolve that.”). Instead, this case hinges on a contentious policy issue about how society should weigh what law enforcement officials want against the widespread repercussions and serious risks their demands would create. “Democracies resolve such tensions through robust debate” among the people and their elected representatives, Dkt. 16-8 [Comey, Going Dark], not through an unprecedented All Writs Act proceeding.
Apple then, repeatedly, points out where the DOJ selectively quoted, misquoted or misleadingly quoted arguments in its favor. For example:
The government misquotes Bank of the United States v. Halstead,..., for the proposition that “‘[t]he operation of [the Act]’” should not be limited “‘to that which it would have had in the year 1789.’” ... (misquoting Halstead, 23 U.S. (10 Wheat.) at 62) (alterations are the government’s). But what the Court actually said was that the “operation of an execution”—the ancient common law writ of “venditioni exponas”—is not limited to that “which it would have had in the year 1789.” ... see also... (“That executions are among the writs hereby authorized to be issued, cannot admit of a doubt . . . .”). The narrow holding of Halstead was that the Act (and the Process Act of 1792) allowed courts “to alter the form of the process of execution.” ... (courts are not limited to the form of the writ of execution “in use in the Supreme Courts of the several States in the year 1789”). The limited “power given to the Courts over their process is no more than authorizing them to regulate and direct the conduct of the Marshal, in the execution of the process.”
The authority to alter the process by which courts issue traditional common law writs is not authority to invent entirely new writs with no common law analog. But that is precisely what the government is asking this Court to do: The Order requiring Apple to create software so that the FBI can hack into the iPhone has no common law analog.
The filing then goes step by step in pointing out how the government is wrong about almost everything. The DOJ, for example, kept insisting that CALEA doesn't apply at all to Apple, but Apple points out that the DOJ just seems to be totally misreading the law:
Contrary to the government’s assertion that its request merely “brush[es] up against similar issues” to CALEA..., CALEA, in fact, has three critical limitations—two of which the government ignores entirely—that preclude the relief the government seeks.... First, CALEA prohibits law enforcement agencies from requiring “electronic communication service” providers to adopt “any specific design of equipment, facilities, services, features, or system configurations . . . .” The term “electronic communication service” provider is broadly defined to encompass Apple. ... (“any service which provides to users thereof the ability to send or receive wire or electronic communications”). Apple is an “electronic communication services” provider for purposes of the very services at issue here because Apple’s software allows users to “send or receive . . . communications” between iPhones through features such as iMessage and Mail....
The government acknowledges that FaceTime and iMessage are electronic communication services, but asserts that this fact is irrelevant because “the Court’s order does not bear at all upon the operation of those programs.” ... Not so. The passcode Apple is being asked to circumvent is a feature of the same Apple iOS that runs FaceTime, iMessage, and Mail, because an integral part of providing those services is enabling the phone’s owner to password-protect the private information contained within those communications. More importantly, the very communications to which law enforcement seeks access are the iMessage communications stored on the phone.... And, only a few pages after asserting that “the Court’s order does not bear at all upon the operation of” FaceTime and iMessage for purposes of the CALEA analysis..., the government spends several pages seeking to justify the Court’s order based on those very same programs, arguing that they render Apple “intimately close” to the crime for purposes of the New York Telephone analysis.
Second, the government does not dispute, or even discuss, that CALEA excludes “information services” providers from the scope of its mandatory assistance provisions.... Apple is indisputably an information services provider given the features of iOS, including Facetime, iMessage, and Mail....
Finally, CALEA makes clear that even telecommunications carriers (a category of providers subject to more intrusive requirements under CALEA, but which Apple is not) cannot be required to “ensure the government’s ability” to decrypt or to create decryption programs the company does not already “possess.”... If companies subject to CALEA’s obligations cannot be required to bear this burden, Congress surely did not intend to allow parties specifically exempted by CALEA (such as Apple) to be subjected to it. The government fails to address this truism.
Next, Apple rebuts the DOJ's argument that, since CALEA doesn't address this specific situation, Congress must have left it up to the courts to use the All Writs Act. As Apple points out, in some cases Congress not doing something doesn't mean it rejected certain positions, but in this case the legislative history is quite clear that Congress did not intend for companies to be forced to help in this manner.
Here, Congress chose to require limited third-party assistance in certain statutes designed to aid law enforcement in gathering electronic evidence (although none as expansive as what the government seeks here), but it has declined to include similar provisions in other statutes, despite vigorous lobbying by law enforcement and notwithstanding its “prolonged and acute awareness of so important an issue” as the one presented here.... Accordingly, the lack of statutory authorization in CALEA or any of the complementary statutes in the “comprehensive federal scheme” of surveillance and telecommunications law speaks volumes.... To that end, Congress chose to “greatly narrow[]” the “scope of [CALEA],” which ran contrary to the FBI’s interests but was “important from a privacy standpoint.” ... Indeed, CALEA’s provisions were drafted to “limit[] the scope of [industry’s] assistance requirements in several important ways.”....
That the Executive Branch recently abandoned plans to seek legislation expanding CALEA’s reach... provides renewed confirmation that Congress has not acceded to the FBI’s wishes, and belies the government’s view that it has possessed such authority under the All Writs Act since 1789.
In fact, in a footnote, Apple goes even further, not just blasting the DOJ's suggestion that Congress never really considered a legislative proposal to update CALEA to sweep in requirements for internet communications companies, but also highlighting the infamous quote from top intelligence community lawyer Robert Litt about how they'd just wait for the next terrorist attack and get the law passed in their favor at that point.
The government attempts to minimize CALEA II, saying its plans consisted of “mere[] vague discussions” that never developed into a formal legislative submission ..., but federal officials familiar with that failed lobbying effort confirmed that the FBI had in fact developed a “draft proposal” containing a web of detailed provisions, including specific fines and compliance timelines, and had floated that proposal with the White House..... As The Washington Post reported, advocates of the proposal within the government dropped the effort, because they determined they could not get what they wanted from Congress at that time: “Although ‘the legislative environment is very hostile today,’ the intelligence community’s top lawyer, Robert S. Litt, said to colleagues in an August [2015] e-mail, which was obtained by The Post, ‘it could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement.’ There is value, he said, in ‘keeping our options open for such a situation.’”
Next, Apple goes through the arguments showing that, even if the All Writs Act does apply, and even if the court accepts the DOJ's made-up three-factor test, Apple should still prevail. It notes, again, that it is "far removed" from the issue and reminds the court that the order sought here is very different from past cases in which Apple has cooperated:
The government argues that “courts have already issued AWA orders” requiring manufacturers to “unlock” phones ... but those cases involved orders requiring “unlocking” assistance to provide access through existing means, not the extraordinary remedy sought here, i.e., an order that requires creating new software to undermine the phones’ (or in the Blake case, the iPad’s) security safeguards.
It also mocks the DOJ's weird argument that because Apple "licenses" rather than "sells" its software, Apple is somehow more closely tied to the case:
The government discusses Apple’s software licensing and data policies at length, equating Apple to a feudal lord demanding fealty from its customers (“suzerainty”). ... But the government does not cite any authority, and none exists, suggesting that the design features and software that exist on every iPhone somehow link Apple to the subject phone and the crime. Likewise, the government has cited no case holding that a license to use a product constituted a sufficient connection under New York Telephone. Indeed, under the government’s theory, any ongoing postpurchase connection between a manufacturer or service provider and a consumer suffices to connect the two in perpetuity—even where, as here, the data on the iPhone is inaccessible to Apple.
From there, Apple dives in on the question of how much of a "burden" this would be. This is the issue that Judge Pym has indicated she's most interested in, and Apple goes deep here -- again and again focusing on how the DOJ was blatantly misleading in its motion:
Forcing Apple to create new software that degrades its security features is unprecedented and unlike any burden ever imposed under the All Writs Act. The government’s assertion that the phone companies in Mountain Bell and In re Application of the U.S. for an Order Authorizing the Installation of a Pen Register or Touch-Tone Decoder and a Terminating Trap ..., were conscripted to “write” code, akin to the request here... mischaracterizes the actual assistance required in those cases. The government seizes on the word “programmed” in those cases and superficially equates it to the process of creating new software..... But the “programming” in those cases—back in 1979 and 1980—consisted of a “technician” using a “teletypewriter” in Mountain Bell ..., and “t[ook] less than one minute” in Penn Bell... Indeed, in Mountain Bell, the government itself stated that the only burden imposed “was a large number of print-outs on the teletype machine”—not creating new code..... More importantly, the phone companies already had and themselves used the tracing capabilities the government wanted to access.... And although relying heavily on Mountain Bell, the government neglects to point out the court’s explicit warning that “[t]his holding is a narrow one, and our decision today should not be read to authorize the wholesale imposition upon private, third parties of duties pursuant to search warrants.” ...This case stands light years from Mountain Bell. The government seeks to commandeer Apple to design, create, test, and validate a new operating system that does not exist, and that Apple believes—with overwhelming support from the technology community and security experts—is too dangerous to create.
Seeking to belittle this widely accepted policy position, the government grossly mischaracterizes Apple’s objection to the requested Order as a concern that “compliance will tarnish its brand”..., a mischaracterization that both the FBI Director and the courts have flatly rejected. [See Comey] (“I don’t question [Apple’s] motive”);... (disagreeing “with the government’s contention that Apple’s objection [to being compelled to decrypt an iPhone] is not ‘conscientious’ but merely a matter of ‘its concern with public relations’”). As Apple explained in its Motion, Apple prioritizes the security and privacy of its users, and that priority is reflected in Apple’s increasingly secure operating systems, in which Apple has chosen not to create a back door.
Apple also calls out the DOJ's technical ignorance.
The government’s assertion that “there is no reason to think that the code Apple writes in compliance with the Order will ever leave Apple’s possession” ... simply shows the government misunderstands the technology and the nature of the cyber-threat landscape. As Apple engineer Erik Neuenschwander states:
I believe that Apple’s iOS platform is the most-attacked software platform in existence. Each time Apple closes one vulnerability, attackers work to find another. This is a constant and never-ending battle. Mr. Perino’s description of third-party efforts to circumvent Apple’s security demonstrates this point. And the protections that the government now asks Apple to compromise are the most security-critical software component of the iPhone—any vulnerability or back door, whether introduced intentionally or unintentionally, can represent a risk to all users of Apple devices simultaneously.
The government is also mistaken in claiming that the crippled iOS it wants Apple to build can be used on only one iPhone:
Mr. Perino’s characterization of Apple’s process . . . is inaccurate. Apple does not create hundreds of millions of operating systems each tailored to an individual device. Each time Apple releases a new operating system, that operating system is the same for every device of a given model. The operating system then gets a personalized signature specific to each device. This personalization occurs as part of the installation process after the iOS is created.
Once GovtOS is created, personalizing it to a new device becomes a simple process. If Apple were forced to create GovtOS for installation on the device at issue in this case, it would likely take only minutes for Apple, or a malicious actor with sufficient access, to perform the necessary engineering work to install it on another device of the same model.
. . . [T]he initial creation of GovtOS itself creates serious ongoing burdens and risks. This includes the risk that if the ability to install GovtOS got into the wrong hands, it would open a significant new avenue of attack, undermining the security protections that Apple has spent years developing to protect its customers.
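The personalization process Neuenschwander describes can be sketched conceptually. To be clear, this is a hypothetical illustration, not Apple's actual signing scheme: the key name, the use of HMAC, and the device-ID binding are all assumptions made purely to show why one signed image plus a per-device signature means re-targeting "GovtOS" to another phone is trivial for whoever holds the key.

```python
import hashlib
import hmac

# Hypothetical sketch of per-device OS personalization. None of these
# names or mechanisms reflect Apple's real signing infrastructure.

SIGNING_KEY = b"manufacturer-secret-key"  # held only by the OS vendor

def personalize(os_image: bytes, device_id: str) -> bytes:
    """Bind one generic OS image to one specific device at install time."""
    return hmac.new(SIGNING_KEY, os_image + device_id.encode(), hashlib.sha256).digest()

def device_accepts(os_image: bytes, device_id: str, signature: bytes) -> bool:
    """A device's boot loader only runs an image personalized for it."""
    expected = personalize(os_image, device_id)
    return hmac.compare_digest(expected, signature)

# One operating system image is built per model...
generic_ios = b"iOS build for a given iPhone model"

# ...then personalized per device during installation.
sig_for_phone_a = personalize(generic_ios, "serial-A")

assert device_accepts(generic_ios, "serial-A", sig_for_phone_a)      # runs on phone A
assert not device_accepts(generic_ios, "serial-B", sig_for_phone_a)  # rejected by phone B

# But whoever controls SIGNING_KEY can re-personalize the very same
# image for any other device of that model in moments -- Apple's point.
sig_for_phone_b = personalize(generic_ios, "serial-B")
assert device_accepts(generic_ios, "serial-B", sig_for_phone_b)
```

The signature gates which device will run the image, not what the image does, which is why "it's only for one phone" is a property of the signing step, not of the compromised OS itself.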
And, not surprisingly, Apple angrily attacks the DOJ's misleading use of Apple's transparency report statements about responding to lawful government requests for information in China, pointing out how that's quite different from this situation:
Finally, the government attempts to disclaim the obvious international implications of its demand, asserting that any pressure to hand over the same software to foreign agents “flows from [Apple’s] decision to do business in foreign countries . . . .”. Contrary to the government’s misleading statistics ..., which had to do with lawful process and did not compel the creation of software that undermines the security of its users, Apple has never built a back door of any kind into iOS, or otherwise made data stored on the iPhone or in iCloud more technically accessible to any country’s government.... The government is wrong in asserting that Apple made “special accommodations” for China, as Apple uses the same security protocols everywhere in the world and follows the same standards for responding to law enforcement requests.
Apple also points out that the FBI appears to be contradicting itself as well:
Moreover, while they now argue that the FBI’s changing of the iCloud passcode—which ended any hope of backing up the phone’s data and accessing it via iCloud—“was the reasoned decision of experienced FBI agents”, the FBI Director himself admitted to Congress under oath that the decision was a “mistake”.... The Justice Department’s shifting, contradictory positions on this issue—first blaming the passcode change on the County, then admitting that the FBI told the County to change the passcode after the County objected to being blamed for doing so, and now trying to justify the decision in the face of Director Comey’s admission that it was a mistake—discredits any notion that the government properly exhausted all viable investigative alternatives before seeking this extraordinary order from this Court.
On the Constitutional questions, again Apple points out that the DOJ doesn't appear to understand what it's talking about:
The government begins its First Amendment analysis by suggesting that “[t]here is reason to doubt that functional programming is even entitled to traditional speech protections” ... , evincing its confusion over the technology it demands Apple create. Even assuming there is such a thing as purely functional code, creating the type of software demanded here, an operating system that has never existed before, would necessarily involve precisely the kind of expression of ideas and concepts protected by the First Amendment. Because writing code requires a choice of (1) language, (2) audience, and (3) syntax and vocabulary, as well as the creation of (4) data structures, (5) algorithms to manipulate and transform data, (6) detailed textual descriptions explaining what code is doing, and (7) methods of communicating information to the user, “[t]here are a number of ways to write code to accomplish a given task.”... As such, code falls squarely within the First Amendment’s protection, as even the cases cited by the government acknowledge...
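The filing's point that "[t]here are a number of ways to write code to accomplish a given task" is easy to see with a toy example (Python here purely for illustration): three functionally identical implementations that reflect different expressive choices of structure and vocabulary.

```python
# Three functionally identical ways to sum the squares of a list --
# a toy illustration that the same task admits many distinct expressions.

from functools import reduce

def sum_squares_loop(xs):
    # Imperative style: explicit accumulator and iteration.
    total = 0
    for x in xs:
        total += x * x
    return total

def sum_squares_comprehension(xs):
    # Declarative style: a generator expression fed to sum().
    return sum(x * x for x in xs)

def sum_squares_reduce(xs):
    # Functional style: folding the list with an anonymous function.
    return reduce(lambda acc, x: acc + x * x, xs, 0)

data = [1, 2, 3, 4]
assert sum_squares_loop(data) == sum_squares_comprehension(data) == sum_squares_reduce(data) == 30
```

Each version embodies a different set of choices about structure, idiom and audience, which is exactly the expressive latitude Apple's brief is describing.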
Later, in response to the DOJ's claim that this isn't compelled speech because Apple can write the code however it wants, Apple points out that this argument shows the exact opposite:
The government attempts to evade this unavoidable conclusion by insisting that, “[t]o the extent [that] Apple’s software includes expressive elements . . . the Order permits Apple to express whatever it wants, so long as the software functions” by allowing it to hack into iPhones.... This serves only to illuminate the broader speech implications of the government’s request. The code that the government is asking the Court to force Apple to write contains an extra layer of expression unique to this case. When Apple designed iOS 8, it consciously took a position on an issue of public importance.... The government disagrees with Apple’s position and asks this Court to compel Apple to write new code that reflects its own viewpoint—a viewpoint that is deeply offensive to Apple.
The filing is basically Apple, over and over again, saying, "uh, what the DOJ said was wrong, clueless, technically ignorant, or purposely misleading." Hell, they even attack the DOJ's claim that the All Writs Act was used back in 1807 to force Aaron Burr's secretary to decrypt one of Burr's cipher-protected letters. Apple points out that the DOJ is lying.
The government contends that Chief Justice Marshall once ordered a third party to “provide decryption services” to the government.... He did nothing of the sort, and the All Writs Act was not even at issue in Burr. In that case, Aaron Burr’s secretary declined to state whether he “understood” the contents of a certain letter written in cipher, on the ground that he might incriminate himself.... The Court held that the clerk’s answer as to whether he understood the cipher could not incriminate him, and the Court thus held that “the witness may answer the question now propounded”—i.e., whether he understood the letter.... The Court did not require the clerk to decipher the letter.
If anything, to be honest, I'm surprised that Apple didn't go even harder on the DOJ for misrepresenting things. Either way, Apple is pretty clearly highlighting just how desperate the DOJ seems in this case.
Our nation is at war and this iPhone was used to kill Americans. We need to protect our homeland, not terrorists. To Tim Cook and Apple, cooperate with the FBI.
As surprised as we were to learn it was an iPhone that killed 14 people in San Bernardino, rather than the attackers and the weapons they wielded, Graham had yet another surprise in store for us.
Sen. Lindsey Graham (R-S.C.), who last December called on Silicon Valley to stop selling encrypted devices, expressed serious concern on Wednesday about the precedent the Department of Justice would set if it successfully compels Apple to break iPhone security features.
“I was all with you until I actually started getting briefed by the people in the Intel Community,” Graham told Attorney General Loretta Lynch during an oversight hearing in the Senate Judiciary Committee. “I will say that I’m a person that’s been moved by the arguments about the precedent we set and the damage we might be doing to our own national security.”
This is what happens when legislators stop following their gut instincts on subjects they know little about and actually seek input from those who do know what's involved and what's at stake. Graham -- without speaking to "people in the Intel Community" -- originally presented terrorism as Apple's problem. With the benefit of technically informed hindsight, Graham is now seeing this for what it is: a push for a dangerous precedent that won't end with this one iPhone and Apple. It will move on to other manufacturers, service providers and communications platforms. Because this one iPhone (which is actually twelve iPhones) is just the foot in the door. Apple does not hold a monopoly on encrypted communications.
“One of the arguments Apple makes is that there are other companies that make encryption,” Graham said to Lynch during the hearing. “So from a terrorist point of view, you’re not limited to Apple’s iPhone to communicate are you?”
“I think the terrorists use any device they can to communicate,” the Attorney General responded.
“So this encryption issue, if you require Apple to unlock that phone that doesn’t deny terrorist the ability to communicate privately does it, there are others ways they can do this,” Graham noted.
The FBI -- which sees any communications it can't access as nothing more than a collection of smoking guns comprised of 0s and 1s -- will not stop with Apple. It already has its eyes on WhatsApp, one of the biggest messaging apps in the world -- one that also features end-to-end encryption.
The underlying point Graham is making -- having now spoken with those with the most at stake -- is that a successful push to force American companies to provide unprecedented access to law enforcement does little to stop global terrorism, while causing tremendous damage to those forced into complicity. If the FBI manages to pry open the front door, every other nation in the world is going to expect Apple to hold the door open for them as well. And if they can't find a way to force Apple to do that, they may block it from selling its products in their countries. Or Apple may decide the market isn't worth the security hit. Either way, it hurts Apple, and terrorists will just move on to the next service/platform/manufacturer.
It's heartening to see Graham come around on this, especially considering he's spent the last few months coming down harshly on phone manufacturers for refusing to immediately comply with every ridiculous government demand.
One of the key lines that various supporters of backdooring encryption have repeated in the last year is that they "just want to have a discussion" about the proper way to... put backdoors into encryption. Over and over again you had the likes of James Comey insisting that he wasn't demanding backdoors, but really just wanted a "national conversation" on the issue (despite the fact that we had just such a conversation in the 90s and concluded: backdoors bad, let's move on):
My goal today isn’t to tell people what to do. My goal is to urge our fellow citizens to participate in a conversation as a country about where we are, and where we want to be, with respect to the authority of law enforcement.
And, yet, now we're having that conversation. Very loudly. And while the conversation really has been going on for almost two years, in the last month it moved from a conversation among tech geeks and policy wonks into the mainstream, thanks to the DOJ's decision to force Apple to write some code that would undermine security features on the work iPhone of Syed Farook, one of the San Bernardino attackers. According to some reports, the DOJ and FBI purposely chose this case in the belief that it was a perfect "test" case for its side: one that appeared to involve "domestic terrorists" who murdered 14 people. There were reports claiming that Apple was fine fighting this case under seal, but that the DOJ purposely chose to make this request public.
However, now that this has resulted in just such a "national conversation" on the issue, the DOJ, FBI and others in the White House are suddenly realizing that perhaps the public isn't quite as with them as they had hoped. And now there are reports that some in the White House are regretting the decision to move forward and are experiencing that well-known feeling of regret.
According to the NY Times:
Officials had hoped the Apple case involving a terrorist’s iPhone would rally the public behind what they see as the need to have some access to information on smartphones. But many in the administration have begun to suspect that the F.B.I. and the Justice Department may have made a major strategic error by pushing the case into the public consciousness.
Many senior officials say an open conflict between Silicon Valley and Washington is exactly what they have been trying to avoid, especially when the Pentagon and intelligence agencies are trying to woo technology companies to come back into the government’s fold, and join the fight against the Islamic State. But it appears it is too late to confine the discussion to the back rooms in Washington or Silicon Valley.
While the various public polling on the issue has led to very mixed results, it's pretty clear that the public did not universally swing to the government's position on this. In fact, it appears that the more accurately the situation is described to the public, the more likely they are to side with Apple over the FBI. Given that, John Oliver's recent video on the subject certainly isn't good news for the DOJ.
Either way, the DOJ and FBI insisted they wanted a conversation on this, and now they're getting it. Perhaps they should have been more careful what they wished for.
Not surprisingly, Oliver's take is much clearer and much more accurate than many mainstream press reports on the issues in the case, appropriately mocking the many law enforcement officials who seem to think that, just because Apple employs smart engineers, they can somehow do the impossible and "safely" create a backdoor into an encrypted iPhone that won't have dangerous consequences. He even spends a bit of time reviewing the original Crypto Wars over the Clipper Chip and highlights cryptographer Matt Blaze's contribution in ending those wars by showing that the Clipper Chip could be hacked.
But the biggest contribution to the debate -- which I hope that people pay most attention to -- is the point that Oliver made in the end with his faux Apple commercial. Earlier in the piece, Oliver noted that this belief among law enforcement that Apple engineers can somehow magically do what they want is at least partially Apple's own fault, with its somewhat overstated marketing. So, Oliver's team made a "more realistic" Apple commercial which noted that Apple is constantly fighting security cracks and vulnerabilities and is consistently just half a step ahead of hackers with malicious intent (and, in many cases, half a step behind them).
This is the key point: Building secure products is very, very difficult, and even the most secure products have security vulnerabilities in them that need to be constantly watched and patched. And what the government is doing here is not only asking Apple to not patch a security vulnerability that it has found, but actively forcing Apple to make a new vulnerability and then effectively forcing Apple to keep it open. For all the talk of how Apple can just create the backdoor just this once and throw it away, this is more like asking Apple to set off a bomb that blows the backs off all the houses in a city, and then saying, "okay, just throw away the bomb after you set it off."
Hopefully, as in cases like net neutrality, Oliver's piece does its job in informing the public what's really going on.
We've written quite a few times about Polk County, Florida, Sheriff Grady Judd. You may recall him from the time he arrested two teenagers because they admitted to "bullying" another teen who committed suicide. Judd also promised to arrest the parents of both girls as well, stretching an already ridiculous understanding of the law to its absolute breaking point (in fact, all of the charges against the girls were dropped, because the talk of bullying was basically not true).
Judd also has made news for falsely arresting and then publicly shaming men, saying that they're "sexual predators" and parading them in front of the press, seizing their money and possessions and then "negotiating" to only give them back some of what they seized. Oh, and then there was the time that Judd used Craigslist to help arrest prostitutes... but then blamed Craigslist for the problem.
"Let me tell you, the first time we do have trouble getting into a cell phone, we're going to seek a court order from Apple and when they deny us I'm going to go lock the CEO of Apple up," Judd said in a press conference Wednesday.
Another report of the press conference said that Judd followed this up, for emphasis, with: "I'll lock the rascal up."
Yeah, you see, that's not how the law actually works. And you'd think, as Sheriff, Judd should know that. But he doesn't. Or he does and he doesn't care. Neither of which is a good sign in a sheriff.
"You cannot create a business model to go, 'we're not paying attention to the federal judge or to the state judge, because we're above the law,'" Judd said.
Of course, that's not the issue at all. It's not about ignoring a judge, it's about building a secure product, and what kinds of things a court can or cannot force a company to do to the security of its products. No one is saying they're "above the law." Except, it seems, Sheriff Grady Judd, who thinks that he can put Apple's CEO in jail based on his own desires, rather than what the law actually says.
This is not all that surprising, but President Obama, during his SXSW keynote interview, appears to have joined the crew of politicians making misleading statements pretending to be "balanced" on the question of encryption. The interview (the link above should start at the very beginning) talks about a variety of issues related to tech and government, but eventually the President zeroes in on the encryption issue. The embed below should start at that point (if not, it's at the 1 hour, 16 minute mark in the video). Unfortunately, the interviewer, Evan Smith of the Texas Tribune, falsely frames the issue as one of "security v. privacy" rather than what it actually is -- which is "security v. security."
In case you can't watch that, the President says he won't comment directly on the Apple legal fights, but then launches into the standard politician talking point of "yes, we want strong encryption, but bad people will use it so we need to figure out some way to break in."
If you watch that, the President is basically doing the same thing as all the Presidential candidates, stating that there's some sort of equivalency on both sides of the debate and that we need to find some sort of "balanced" solution short of strong encryption that will somehow let in law enforcement in some cases.
This is wrong. This is ignorant.
To his at least marginal credit, the President (unlike basically all of the Presidential candidates) did seem to acknowledge the arguments of the crypto community, but then tells them all that they're wrong. In some ways, this may be slightly better than those who don't even understand the actual issues at all, but it's still problematic.
Let's go through this line by line.
All of us value our privacy. And this is a society that is built on a Constitution and a Bill of Rights and a healthy skepticism about overreaching government power. Before smartphones were invented, and to this day, if there is probable cause to think that you have abducted a child, or that you are engaging in a terrorist plot, or you are guilty of some serious crime, law enforcement can appear at your doorstep and say 'we have a warrant to search your home' and they can go into your bedroom to rifle through your underwear to see if there's any evidence of wrongdoing.
Again, this is overstating the past and understating today's reality. Yes, you could always get a warrant to go "rifle through" someone's underwear, if you could present probable cause that such a search was reasonable to a judge. But that does not mean that the invention of smartphones really changed things so dramatically as President Obama presents here. For one, there has always been information that was inaccessible -- such as information that came from an in-person conversation or information in our brains or information that has been destroyed.
In fact, as lots of people have noted, today law enforcement has much more recorded evidence that it can obtain, totally unrelated to the encryption issue. This includes things like location information or information on people you called. That information used to not be available at all. So it's hellishly misleading to pretend that we've entered some new world of darkness for law enforcement when the reality is that the world is much, much brighter.
And we agree on that. Because we recognize that just like all our other rights, freedom of speech, freedom of religion, etc. there are going to be some constraints that we impose in order to make sure that we are safe, secure and living in a civilized society. Now technology is evolving so rapidly that new questions are being asked. And I am of the view that there are very real reasons why we want to make sure that government cannot just willy nilly get into everyone's iPhones, or smartphones, that are full of very personal information and very personal data. And, let's face it, the whole Snowden disclosure episode elevated people's suspicions of this.
[...]
That was a real issue. I will say, by the way, that -- and I don't want to go too far afield -- but the Snowden issue, vastly overstated the dangers to US citizens in terms of spying. Because the fact of the matter is that actually that our intelligence agencies are pretty scrupulous about US persons -- people on US soil. What those disclosures did identify were excesses overseas with respect to people who are not in this country. A lot of those have been fixed. Don't take my word for it -- there was a panel that was constituted that just graded all the reforms that we set up to avoid those charges. But I understand that that raised suspicions.
Again, at least some marginal kudos for admitting that this latest round was brought on by "excesses" (though we'd argue that it was actually unconstitutional, rather than mere overreach). And nice of him to admit that Snowden actually did reveal such "excesses." Of course, that raises a separate question: Why is Obama still trying to prosecute Snowden when he's just admitted that what Snowden did was clearly whistleblowing, in revealing questionable spying?
Also, the President is simply wrong that it was just about issues involving non-US persons. The major reform that has taken place wasn't about US persons at all, but rather about Section 215 of the PATRIOT Act, which was used almost entirely on US persons to collect all their phone records. So it's unclear why the President is pretending otherwise. The stuff outside of the US is governed by Executive Order 12333, and there's no evidence at all that the President has changed that. I do agree, to some extent, that many do believe in an exaggerated view of NSA surveillance, and that's distracting. But the underlying issues about legality and constitutionality -- and the possibilities for abuse -- absolutely remain.
But none of that actually has to do with the encryption fight, beyond the recognition -- accurately -- that the government's actions, revealed by Snowden, caused many to take these issues more seriously. And, on that note, it would have been at least a little more accurate for the President to recognize that it wasn't Snowden who brought this on the government, but the government itself by doing what it was doing.
So we're concerned about privacy. We don't want government to be looking through everybody's phones willy-nilly, without any kind of oversight or probable cause or a clear sense that it's targeted who might be a wrongdoer.
What makes it even more complicated is that we also want really strong encryption. Because part of us preventing terrorism or preventing people from disrupting the financial system or our air traffic control system or a whole other set of systems that are increasingly digitalized is that hackers, state or non-state, can just get in there and mess them up.
So we've got two values. Both of which are important.... And the question we now have to ask is, if technologically it is possible to make an impenetrable device or system where the encryption is so strong that there's no key. There's no door at all. Then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? What mechanisms do we have available to even do simple things like tax enforcement? Because if, in fact, you can't crack that at all, government can't get in, then everybody's walking around with a Swiss bank account in their pocket. So there has to be some concession to the need to be able to get into that information somehow.
The answer to those questions in that final paragraph is good old fashioned detective work. In a time before smartphones, detectives were still able to catch child pornographers or disrupt terrorist plots. And, in some cases, the government failed to stop either of those things. But it wasn't because strong encryption stymied them, but because there are always going to be some plots that people are able to get away with. We shouldn't undermine our entire security setup just because there are some bad people out there. In fact, that makes us less safe.
Also: tax enforcement? Tax enforcement? Are we really getting to the point that the government wants to argue that we need to break strong encryption to better enforce taxes? Really? Again, there are lots of ways to go after tax evasion. And, yes, there are lots of ways that people and companies try to hide money from the IRS. And sometimes they get away with it. To suddenly say that we should weaken encryption because the IRS isn't good enough at its job just seems... crazy.
Now, what folks who are on the encryption side will argue, is that any key, whatsoever, even if it starts off as just being directed at one device, could end up being used on every device. That's just the nature of these systems. That is a technical question. I am not a software engineer. It is, I think, technically true, but I think it can be overstated.
This is the part that's most maddening of all. He almost gets the point right. He almost understands. The crypto community has been screaming from the hills for ages that introducing any kind of third party access to encryption weakens it for all, introducing vulnerabilities that ensure that those with malicious intent will get in much sooner than they would otherwise. The President is mixing up that argument with one of the other arguments in the Apple/FBI case, about whether it's about "one phone" or "all the phones."
But even assuming this slight mixup is a mistake, and that he does recognize the basics of the arguments from the tech community, to have him then say that this "can be overstated" is crazy. A bunch of cryptography experts -- including some who used to work for Obama -- laid out in a detailed paper the risks of undermining encryption. To brush that aside as some sort of rhetorical hyperbole -- to brush aside the realities of cryptography and math -- is just crazy.
Encryption expert Matt Blaze (whose research basically helped win Crypto War 1.0) responded to this argument by noting that the "nerd harder, nerds" argument fundamentally misunderstands the issue:
Figuring out how to build the reliable, secure systems required to "compromise" on crypto has long been a central problem in CS.
If you can't read that, Blaze is basically saying that all crypto includes backdoors -- they're known as vulnerabilities. And the key focus in crypto is closing those backdoors, because leaving them open is disastrous. And yet the government is now demanding that tech folks purposely put in more backdoors and not close them, without recognizing the simple fact that vulnerabilities in crypto always lead to disastrous results.
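The "one phone vs. all phones" problem that Blaze and others keep pointing to can be illustrated with a toy sketch. This is a hypothetical, deliberately simplified model (the names and scheme are invented for illustration and are not Apple's actual signing system): imagine devices accept any payload carrying a valid vendor signature, where the signature covers only the payload and not the device it was intended for. A bypass signed "just once, for one phone" then works on every phone:

```python
import hmac
import hashlib

# Hypothetical vendor signing key (stands in for a real private key).
VENDOR_KEY = b"vendor-signing-key"

def sign(payload: bytes) -> bytes:
    # The signature covers only the payload -- nothing binds it
    # to a particular device.
    return hmac.new(VENDOR_KEY, payload, hashlib.sha256).digest()

def device_accepts(device_id: str, payload: bytes, signature: bytes) -> bool:
    # The device verifies authenticity, but device_id plays no part
    # in the check, so a valid signature is valid everywhere.
    return hmac.compare_digest(sign(payload), signature)

bypass = b"disable-passcode-retry-limit"
sig = sign(bypass)  # produced once, supposedly for "one phone"

print(device_accepts("phone-A", bypass, sig))  # True -- the intended target
print(device_accepts("phone-B", bypass, sig))  # True -- and every other phone
```

The point of the sketch is that once the signed artifact (or the ability to produce it) exists, the constraint to "one device" is a policy promise, not a technical property, unless the scheme is redesigned so the device identity is part of what gets signed, and even then the signing key itself becomes the vulnerability everyone has to protect.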
So the question now becomes that, we as a society, setting aside the specific case between the FBI and Apple, setting aside the commercial interests, the concerns about what could the Chinese government do with this, even if we trust the US government. Setting aside all those questions, we're going to have to make some decisions about how do we balance these respective risks. And I've got a bunch of smart people, sitting there, talking about it, thinking about it. We have engaged the tech community, aggressively, to help solve this problem. My conclusions so far is that you cannot take an absolutist view on this. So if your argument is "strong encryption no matter what, and we can and should in fact create black boxes," that, I think, does not strike the kind of balance that we have lived with for 200, 300 years. And it's fetishizing our phones above every other value. And that can't be the right answer.
This is not an absolutist view. It is not an absolutist view to say that anything you do to weaken the security of phones creates disastrous consequences for overall security, far beyond the privacy of individuals holding those phones. And, as Julian Sanchez rightly notes, it's ridiculous that the status quo, the previous compromise, is now being framed as an "absolutist" position:
CALEA--with obligations on telecoms to assist, but user-side encryption protected--WAS the compromise. Now that's "absolutism".
Also, the idea that this is about "fetishizing our phones" is ridiculous. No one is even remotely suggesting that. No one is even suggesting -- as Obama hints -- that this is about making phones "above and beyond" what other situations are. It's entirely about the nature of computer security and how it works. It's about the risks to our security in creating deliberate vulnerabilities in our technologies. To frame that as "fetishizing our phones" is insulting.
There's a reason why the NSA didn't want President Obama to carry a Blackberry when he first became President. And there's a reason the President wanted a secure Blackberry. And it's not because of fetishism in any way, shape or form. It's because securing data on phones is freaking hard and it's a constant battle. And anything that weakens the security puts people in harm's way.
I suspect that the answer is going to come down to how do we create a system where the encryption is as strong as possible. The key is as secure as possible. It is accessible by the smallest number of people possible for a subset of issues that we agree are important. How we design that is not something that I have the expertise to do. I am way on the civil liberties side of this thing. Bill McRaven will tell you that I anguish a lot over the decisions we make over how to keep this country safe. And I am not interested in overthrowing the values that have made us an exceptional and great nation, simply for expediency. But the dangers are real. Maintaining law and order and a civilized society is important. Protecting our kids is important.
You suspect wrong. Because while your position sounds reasonable and "balanced" (and I've seen some in the press describe President Obama's position here as "realist"), it's actually dangerous. This is the problem. The President is discussing this like it's a political issue rather than a technological/math issue. People aren't angry about this because they're "extremists" or "absolutists" or people who "don't want to compromise." They're screaming about this because "the compromise" solution is dangerous. If there really were a way to have strong encryption with a secure key where only a small number of people could get in on key issues, then that would be great.
But the key point that all of the experts keep stressing is: that's not reality. So, no the President's not being a "realist." He's being the opposite.
So I would just caution against taking an absolutist perspective on this. Because we make compromises all the time. I haven't flown commercial in a while, but my understanding is that it's not great fun going through security. But we make the concession because -- it's a big intrusion on our privacy -- but we recognize that it is important. We have stops for drunk drivers. It's an intrusion. But we think it's the right thing to do. And this notion that somehow our data is different and can be walled off from those other trade-offs we make, I believe is incorrect.
Again, this is not about "making compromises" or some sort of political perspective. And the people arguing for strong encryption aren't being "absolutist" about it because they're unwilling to compromise. They're saying that the "compromise" solution means undermining the very basis of how we do security and putting everyone at much greater risk. That's ethically horrific.
And, also, no one is saying that "data is different." There has always been information that is "walled off." What people are saying is that one consequence of strong encryption is that it has to mean that law enforcement is kept out of that information too. That does not mean they can't solve crimes in other ways. It does not mean that they don't get access to lots and lots of other information. It just means that this kind of content is harder to access, because we need it to be harder to access to protect everyone.
It's not security v. privacy. It's security v. security, where the security the FBI is fighting for is to stop the 1 in a billion attack and the security everyone else wants is to prevent much more likely and potentially much more devastating attacks.
Meanwhile, of all the things for the President to cite as an analogy, TSA security theater may be the worst. Very few people think it's okay, especially since it's been shown to be a joke. Setting that up as the precedent for breaking strong encryption is... crazy. And, on top of that, using the combination of TSA security and DUI checkpoints as evidence for why we should break strong encryption with backdoors again fails to recognize the issue at hand. Neither of those undermine an entire security setup.
We do have to make sure, given the power of the internet and how much our lives are digitalized, that it is narrow and that it is constrained and that there's oversight. And I'm confident this is something that we can solve, but we're going to need the tech community, software designers, people who care deeply about this stuff, to help us solve it. Because what will happen is, if everybody goes to their respective corners, and the tech community says "you know what, either we have strong perfect encryption, or else it's Big Brother and Orwellian world," what you'll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties, because the people who understand this best, and who care most about privacy and civil liberties have disengaged, or have taken a position that is not sustainable for the general public as a whole over time.
I have a lot of trouble with the President's line about everyone going to "their respective corners," as it suggests a ridiculous sort of tribalism in which the natural state is the tech industry against the government and even suggests that the tech industry doesn't care about stopping terrorism or child pornographers. That, of course, is ridiculous. It's got nothing to do with "our team." It has to do with the simple realities of encryption and the fact that what the President is suggesting is dangerous.
Furthermore, it's not necessarily the "Orwellian/big brother" issue that people are afraid of. That's a red herring from the "privacy v. security" mindset. People are afraid of this making everyone a lot less safe. No doubt, the President is right that if there's "something really bad" happening then the politics moves in one way -- but it's pretty ridiculous for him to be saying that, seeing as the latest skirmish in this battle is being fought by his very own Justice Department; he's the one who jumped on the San Bernardino attacks as an excuse to push this line of argument.
If the President is truly worried about stupid knee-jerk reactions following "something bad" happening, rather than trying to talk about "balance" and "compromise," he could and should be doing more to fairly educate the American public, and to make public statements about this issue and how important strong encryption is. Enough of this bogus "strong encryption is important, but... the children" crap. The children need strong encryption. The victims of crimes need encryption. The victims of terrorists need encryption. Undermining all that because just a tiny bit of information is inaccessible to law enforcement is crazy. It's giving up the entire ballgame to those with malicious intent, just so that we can have a bit more information in a few narrow cases.
President Obama keeps mentioning trade-offs, but it appears that he refuses to actually understand the trade-offs at issue here. Giving up on strong encryption is not about finding a happy middle compromise. Giving up on strong encryption is putting everyone at serious risk.
A couple of months ago, I wrote a long post trying to dig into the details of David Lowery's class action lawsuit against Spotify. In the end, while there was some question over whether or not streaming music services really need to get compulsory mechanical licenses for producing reproductions of songs, given that such licenses are compulsory and can be obtained easily by having the Harry Fox Agency issue a "Notice of Intention" (NOI) under Section 115, it seemed crazy to think that the various music services had not done that. In fact, we noted that the only way the lawsuits made any sense was if the various music services and HFA ignored this and didn't send out such NOIs. At the time, I noted that this would be a surprise, and it could mean the services were in deep trouble.
Or perhaps not a surprise... and, yes, some folks may be in deep trouble. Beyond Lowery's lawsuit, a few other similar lawsuits have been filed. Earlier this month, Tim Geigner wrote about a very similar lawsuit filed by Yesh Music against Tidal. Of course, what didn't get as much attention is that Yesh filed very similar lawsuits against a bunch of other music services as well, including Google Music, Slacker, Line Corporation (which runs Mix Radio) and Guerva (which I think is a misspelling of the music site Guvera). Yesh also sued Deezer a few months ago.
One of the key questions that came up following the reporting on all of these cases is the Harry Fox Agency's role in all of this. HFA, an organization that was set up by the publishers themselves, is supposed to be responsible for managing compulsory licensing for the vast majority (though not all) of popular songwriters (remember, HFA is about compositions/publishing, not sound recordings). But it's beginning to look seriously like HFA just fell asleep on the job and didn't bother to do the one key thing it was supposed to do for all these music services: file Section 115 NOIs.
Both David Lowery and another songwriter, Ari Herstand, have recently posted examples of HFA suddenly sending them NOIs that appear to be rushed and are showing up way after they're supposed to. I rarely agree with Lowery about anything, but it's seriously looking like HFA totally fucked up here. Big time. Here's the notice Lowery received:
As Lowery notes, this NOI was sent on February 16th, 2016, but was signed by a Spotify exec who left the company in 2015, for a song that showed up on Spotify in 2011 and using an HFA address that didn't exist until 2012. Basically... it looks like HFA is rushing around trying to send out NOIs that it failed to do properly, and doing a pretty half-assed job about it. And that seems especially stupid when it comes to issuing those NOIs to the guy who is already suing over those missing NOIs.
Herstand just received a similarly late NOI from HFA for his music appearing on Apple Music. As he notes, his notice says the music should appear on Apple Music as of March 10th of 2016, but it's actually been there since Apple Music launched last summer. He also notes this is the first NOI he's ever received from HFA, while he has received plenty of NOIs from the much smaller HFA competitor Music Reports "on a regular basis."
So, given all that, it sure looks like HFA didn't do the one thing that it was supposed to be doing all along, and that's... going to be bad news for someone. The big question is who? All of the lawsuits have been against the various music services, but without being privy to the contracts between HFA and the music services themselves, I'd be shocked if they didn't include some sort of indemnity clauses, basically saying that if music isn't licensed because of HFA's own failure to do its job, any liability falls back on HFA.
And, if that's the case, HFA could be on the hook for a ton of copyright infringement. If it's true that it's basically been ignoring the fairly simple NOI process for a lot of artists, then that's going to be a major scandal -- but one that seems a lot harder to pin on the music services themselves. They went to HFA and hired the company to handle mechanical licenses. There may be more going on behind the scenes here, but at a first pass, based on what appears to be happening, HFA may be in some seriously deep trouble.
It must be admitted that the Apple/FBI fight over iPhone encryption has had much more "outside the courtroom" drama than most cases -- what with both sides putting out their own blog posts and commenting publicly at length on various aspects. But things have been taken up a notch, it seems, with the latest. We wrote about the DOJ's crazy filing in the case, which is just chock full of incredibly misleading claims. Most of the time, when we call out misleading claims in lawsuits, the various parties stay quiet about it. But this one was apparently so crazy that Apple's General Counsel Bruce Sewell called a press conference where he just blasted the DOJ through and through. It's worth looking at his whole statement (highlights by me):
First, the tone of the brief reads like an indictment. We've all heard Director Comey and Attorney General Lynch thank Apple for its consistent help in working with law enforcement. Director Comey's own statement that "there are no demons here." Well, you certainly wouldn't conclude it from this brief. In 30 years of practice I don't think I've seen a legal brief that was more intended to smear the other side with false accusations and innuendo, and less intended to focus on the real merits of the case.
For the first time we see an allegation that Apple has deliberately made changes to block law enforcement requests for access. This should be deeply offensive to everyone that reads it. An unsupported, unsubstantiated effort to vilify Apple rather than confront the issues in the case.
Or the ridiculous section on China where an AUSA, an officer of the court, uses unidentified Internet sources to raise the spectre that Apple has a different and sinister relationship with China. Of course that is not true, and the speculation is based on no substance at all.
To do this in a brief before a magistrate judge just shows the desperation that the Department of Justice now feels. We would never respond in kind, but imagine Apple asking a court if the FBI could be trusted "because there is this real question about whether J. Edgar Hoover ordered the assassination of Kennedy — see ConspiracyTheory.com as our supporting evidence."
We add security features to protect our customers from hackers and criminals. And the FBI should be supporting us in this because it keeps everyone safe. To suggest otherwise is demeaning. It cheapens the debate and it tries to mask the real and serious issues. I can only conclude that the DoJ is so desperate at this point that it has thrown all decorum to the winds....
We know there are great people in the DoJ and the FBI. We work shoulder to shoulder with them all the time. That's why this cheap shot brief surprises us so much. We help when we're asked to. We're honest about what we can and cannot do. Let's at least treat one another with respect and get this case before the American people in a responsible way. We are going before court to exercise our legal rights. Everyone should beware because it seems like disagreeing with the Department of Justice means you must be evil and anti-American. Nothing could be further from the truth.
Somehow, I don't think Apple and the DOJ will be exchanging holiday cards this year. Apple's reply brief is due on Tuesday. I imagine it'll be an interesting weekend in Cupertino.