One of the key lines that various supporters of backdooring encryption have repeated over the last year is that they "just want to have a discussion" about the proper way to... put backdoors into encryption. Over and over again you had the likes of James Comey insisting that he wasn't demanding backdoors, but really just wanted a "national conversation" on the issue (despite the fact that we had just such a conversation in the '90s and concluded: backdoors bad, let's move on):
My goal today isn’t to tell people what to do. My goal is to urge our fellow citizens to participate in a conversation as a country about where we are, and where we want to be, with respect to the authority of law enforcement.
And, yet, now we're having that conversation. Very loudly. And while the conversation really has been going on for almost two years, in the last month it moved from a conversation among tech geeks and policy wonks into the mainstream, thanks to the DOJ's decision to force Apple to write some code that would undermine security features on the work iPhone of Syed Farook, one of the San Bernardino attackers. According to some reports, the DOJ and FBI purposely chose this case in the belief that it was a perfect "test" case for its side: one that appeared to involve "domestic terrorists" who murdered 14 people. There were reports claiming that Apple was fine fighting this case under seal, but that the DOJ purposely chose to make this request public.
However, now that this has resulted in just such a "national conversation" on the issue, the DOJ, FBI and others in the White House are suddenly realizing that perhaps the public isn't quite as with them as they had hoped. And now there are reports that some in the White House are regretting the decision to move forward and are experiencing this well-known feeling:
According to the NY Times:
Officials had hoped the Apple case involving a terrorist’s iPhone would rally the public behind what they see as the need to have some access to information on smartphones. But many in the administration have begun to suspect that the F.B.I. and the Justice Department may have made a major strategic error by pushing the case into the public consciousness.
Many senior officials say an open conflict between Silicon Valley and Washington is exactly what they have been trying to avoid, especially when the Pentagon and intelligence agencies are trying to woo technology companies to come back into the government’s fold, and join the fight against the Islamic State. But it appears it is too late to confine the discussion to the back rooms in Washington or Silicon Valley.
While the various public polling on the issue has led to very mixed results, it's pretty clear that the public did not universally swing to the government's position on this. In fact, it appears that the more accurately the situation is described to the public, the more likely they are to side with Apple over the FBI. Given that, John Oliver's recent video on the subject certainly isn't good news for the DOJ.
Either way, the DOJ and FBI insisted they wanted a conversation on this, and now they're getting it. Perhaps they should have been more careful what they wished for.
Not surprisingly, Oliver's take is much clearer and much more accurate than many mainstream press reports on the issues in the case, appropriately mocking the many law enforcement officials who seem to think that, just because Apple employs smart engineers, they can somehow do the impossible and "safely" create a backdoor into an encrypted iPhone that won't have dangerous consequences. He even spends a bit of time reviewing the original Crypto Wars over the Clipper Chip and highlights cryptographer Matt Blaze's contribution in ending those wars by showing that the Clipper Chip could be hacked.
But the biggest contribution to the debate -- which I hope that people pay most attention to -- is the point that Oliver made in the end with his faux Apple commercial. Earlier in the piece, Oliver noted that this belief among law enforcement that Apple engineers can somehow magically do what they want is at least partially Apple's own fault, with its somewhat overstated marketing. So, Oliver's team made a "more realistic" Apple commercial which noted that Apple is constantly fighting security cracks and vulnerabilities and is consistently just half a step ahead of hackers with malicious intent (and, in many cases, half a step behind them).
This is the key point: Building secure products is very, very difficult, and even the most secure products have security vulnerabilities in them that need to be constantly watched and patched. And what the government is doing here is not only asking Apple to not patch a security vulnerability that it has found, but actively forcing Apple to make a new vulnerability and then effectively forcing Apple to keep it open. For all the talk of how Apple can just create the backdoor "just this once" and throw it away, this is more like asking Apple to set off a bomb that blows the back off every house in a city, and then saying, "okay, just throw away the bomb after you set it off."
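The "just this once" framing also glosses over why a backdoor can't be contained: any escrowed or "master" access key is a single point of failure for every user at once. Here's a minimal sketch of that structural problem. The cipher below is a toy XOR construction built from `hashlib` purely for illustration (it is not real cryptography), and the key names are invented for the example:

```python
import hashlib
import os

def keystream_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher -- for illustration only, NOT real crypto."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        # Derive keystream blocks from the key plus a counter.
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        stream.extend(block)
        counter += 1
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, stream))

# End-to-end model: each user holds their own random key.
alice_key = os.urandom(32)
msg = b"meet at noon"
ct = keystream_cipher(alice_key, msg)
assert keystream_cipher(alice_key, ct) == msg  # only Alice can read it

# "Escrow" model: every message is ALSO wrapped under one master key,
# so a single compromise of that key exposes every user's traffic.
escrow_key = b"master-key-held-by-a-third-party"
escrowed_copy = keystream_cipher(escrow_key, msg)

# Whoever obtains (or steals) the one escrow key reads everything:
assert keystream_cipher(escrow_key, escrowed_copy) == msg
```

The point of the sketch is structural, not cryptographic: once the escrow key exists, its theft or misuse decrypts everyone, which is why "throw it away afterward" isn't a real option.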
Hopefully, as in cases like net neutrality, Oliver's piece does its job in informing the public what's really going on.
As recently as this past week, officials said, the Justice Department was discussing how to proceed in a continuing criminal investigation in which a federal judge had approved a wiretap, but investigators were stymied by WhatsApp’s encryption.
The Justice Department and WhatsApp declined to comment. The government officials and others who discussed the dispute did so on condition of anonymity because the wiretap order and all the information associated with it were under seal. The nature of the case was not clear, except that officials said it was not a terrorism investigation. The location of the investigation was also unclear.
And, as long as we're operating on hearsay and conjecture, there's also this:
“You’re getting useless data,” said Joseph DeMarco, a former federal prosecutor who now represents law enforcement agencies that filed briefs supporting the Justice Department in its fight with Apple. “The only way to make this not gibberish is if the company helps.”
“As we know from intercepted prisoner wiretaps,” he added, “criminals think that advanced encryption is great.”
You'd think that access to prisoner wiretaps would somewhat negate the need to break encryption, but maybe these mouthy inmates spend more time chatting about encryption than the allegations against them. And while I understand law enforcement's complaint that they used to be able to get all of this data with a warrant, they also used to have to run license plates by hand and perform stakeouts in person. So, it's not as though advances in technology have delivered no concurrent benefits.
Make no mistake about it: given the multitude of choices, the DOJ would rather have unfettered access to phones and all they contain. WhatsApp may have a billion or so users -- all protected by end-to-end encryption -- but if the FBI can crack open a phone, it can likely get to the content of the messages.
In the case of Amy Fletcher’s son Justin Bloxom, privacy advocates question whether phone evidence was critical to the cases. But Ms. Fletcher said: “Everything that was done was done through texts from a damn cell phone.”
“Had we not had that information, you wouldn’t realize how evil this man was,” said Ms. Fletcher, who didn’t know her son’s 2010 murder in Mansfield, La., had become part of the national debate until contacted by The Wall Street Journal.
There's no mention of WhatsApp in the Wall Street Journal's article, so it may be that all the recovered texts were of the SMS variety. But WhatsApp is supplanting SMS and the DOJ is definitely interested in the heavily-used messaging app. Last year, its requests to Facebook (which owns WhatsApp) for the contents of these messages jumped astronomically.
In the first six months of 2015, US law enforcement agencies sent Facebook 201 wiretap requests (referred to as “Title III” in the report) for 279 users or accounts. In all of 2014, on the other hand, Facebook only received 9 requests for 16 users or accounts.
Motherboard notes that this number, while still seemingly small, represents a 2133% increase. Not only that, but the total number of requests to Facebook for this data dwarfs similar requests from Google, which only saw 30 total for 2013-2014 combined.
The FBI and DOJ have yet to say much publicly about this particular case, probably feeling it's better to fight only one heavily-opposed battle at a time. But whatever the result of the Apple case, it will hardly be the end of the DOJ's efforts to force service providers to assist them in undermining their own protective efforts.
This is not all that surprising, but President Obama, during his SXSW keynote interview, appears to have joined the crew of politicians making misleading statements pretending to be "balanced" on the question of encryption. The interview (the link above should start at the very beginning) talks about a variety of issues related to tech and government, but eventually the President zeroes in on the encryption issue. The embed below should start at that point (if not, it's at the 1 hour, 16 minute mark in the video). Unfortunately, the interviewer, Evan Smith of the Texas Tribune, falsely frames the issue as one of "security v. privacy" rather than what it actually is -- which is "security v. security."
In case you can't watch that, the President says he won't comment directly on the Apple legal fights, but then launches into the standard politician talking point of "yes, we want strong encryption, but bad people will use it so we need to figure out some way to break in."
If you watch that, the President is basically doing the same thing as all the Presidential candidates, stating that there's some sort of equivalency on both sides of the debate and that we need to find some sort of "balanced" solution short of strong encryption that will somehow let in law enforcement in some cases.
This is wrong. This is ignorant.
To his at least marginal credit, the President (unlike basically all of the Presidential candidates) did seem to acknowledge the arguments of the crypto community, but then told them all that they're wrong. In some ways, this may be slightly better than those who don't understand the actual issues at all, but it's still problematic.
Let's go through this line by line.
All of us value our privacy. And this is a society that is built on a Constitution and a Bill of Rights and a healthy skepticism about overreaching government power. Before smartphones were invented, and to this day, if there is probable cause to think that you have abducted a child, or that you are engaging in a terrorist plot, or you are guilty of some serious crime, law enforcement can appear at your doorstep and say 'we have a warrant to search your home' and they can go into your bedroom to rifle through your underwear to see if there's any evidence of wrongdoing.
Again, this is overstating the past and understating today's reality. Yes, you could always get a warrant to go "rifle through" someone's underwear, if you could present probable cause that such a search was reasonable to a judge. But that does not mean that the invention of smartphones really changed things so dramatically as President Obama presents here. For one, there has always been information that was inaccessible -- such as information that came from an in-person conversation or information in our brains or information that has been destroyed.
In fact, as lots of people have noted, today law enforcement has much more recorded evidence that it can obtain, totally unrelated to the encryption issue. This includes things like location information or information on people you called. That information used to not be available at all. So it's hellishly misleading to pretend that we've entered some new world of darkness for law enforcement when the reality is that the world is much, much brighter.
And we agree on that. Because we recognize that just like all our other rights, freedom of speech, freedom of religion, etc. there are going to be some constraints that we impose in order to make sure that we are safe, secure and living in a civilized society. Now technology is evolving so rapidly that new questions are being asked. And I am of the view that there are very real reasons why we want to make sure that government cannot just willy nilly get into everyone's iPhones, or smartphones, that are full of very personal information and very personal data. And, let's face it, the whole Snowden disclosure episode elevated people's suspicions of this.
[...]
That was a real issue. I will say, by the way, that -- and I don't want to go too far afield -- but the Snowden issue vastly overstated the dangers to US citizens in terms of spying. Because the fact of the matter is that actually our intelligence agencies are pretty scrupulous about US persons -- people on US soil. What those disclosures did identify were excesses overseas with respect to people who are not in this country. A lot of those have been fixed. Don't take my word for it -- there was a panel that was constituted that just graded all the reforms that we set up to avoid those charges. But I understand that that raised suspicions.
Again, at least some marginal kudos for admitting that this latest round was brought on by "excesses" (though we'd argue that it was actually unconstitutional, rather than mere overreach). And nice of him to admit that Snowden actually did reveal such "excesses." Of course, that raises a separate question: Why is Obama still trying to prosecute Snowden when he's just admitted that what Snowden did was clearly whistleblowing, in revealing questionable spying?
Also, the President is simply wrong that it was just about issues involving non-US persons. The major reform that has taken place wasn't about non-US persons at all, but rather about Section 215 of the PATRIOT Act, which was used almost entirely on US persons to collect all their phone records. So it's unclear why the President is pretending otherwise. The stuff outside of the US is governed by Executive Order 12333, and there's been no evidence at all that the President has changed that. I do agree, to some extent, that many believe in an exaggerated view of NSA surveillance, and that's distracting. But the underlying issues about legality and constitutionality -- and the possibilities for abuse -- absolutely remain.
But none of that actually has to do with the encryption fight, beyond the recognition -- accurately -- that the government's actions, revealed by Snowden, caused many to take these issues more seriously. And, on that note, it would have been at least a little more accurate for the President to recognize that it wasn't Snowden who brought this on the government, but the government itself by doing what it was doing.
So we're concerned about privacy. We don't want government to be looking through everybody's phones willy-nilly, without any kind of oversight or probable cause or a clear sense that it's targeted who might be a wrongdoer.
What makes it even more complicated is that we also want really strong encryption. Because part of us preventing terrorism or preventing people from disrupting the financial system or our air traffic control system or a whole other set of systems that are increasingly digitalized is that hackers, state or non-state, can just get in there and mess them up.
So we've got two values. Both of which are important.... And the question we now have to ask is, if technologically it is possible to make an impenetrable device or system where the encryption is so strong that there's no key. There's no door at all. Then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? What mechanisms do we have available to even do simple things like tax enforcement? Because if, in fact, you can't crack that at all, government can't get in, then everybody's walking around with a Swiss bank account in their pocket. So there has to be some concession to the need to be able to get into that information somehow.
The answers to the questions in that final paragraph come from good old-fashioned detective work. In a time before smartphones, detectives were still able to catch child pornographers and disrupt terrorist plots. And, in some cases, the government failed to stop either of those things. But it wasn't because strong encryption stymied them; it's because there are always going to be some plots that people get away with. We shouldn't undermine our entire security setup just because there are some bad people out there. In fact, that makes us less safe.
Also: tax enforcement? Tax enforcement? Are we really getting to the point that the government wants to argue that we need to break strong encryption to better enforce taxes? Really? Again, there are lots of ways to go after tax evasion. And, yes, there are lots of ways that people and companies try to hide money from the IRS. And sometimes they get away with it. To suddenly say that we should weaken encryption because the IRS isn't good enough at its job just seems... crazy.
Now, what folks who are on the encryption side will argue, is that any key, whatsoever, even if it starts off as just being directed at one device, could end up being used on every device. That's just the nature of these systems. That is a technical question. I am not a software engineer. It is, I think, technically true, but I think it can be overstated.
This is the part that's most maddening of all. He almost gets the point right. He almost understands. The crypto community has been screaming from the hills for ages that introducing any kind of third party access to encryption weakens it for all, introducing vulnerabilities that ensure that those with malicious intent will get in much sooner than they would otherwise. The President is mixing up that argument with one of the other arguments in the Apple/FBI case, about whether it's about "one phone" or "all the phones."
But even assuming this slight mixup is a mistake, and that he does recognize the basics of the arguments from the tech community, to have him then say that this "can be overstated" is crazy. A bunch of cryptography experts -- including some who used to work for Obama -- laid out in a detailed paper the risks of undermining encryption. To brush that aside as some sort of rhetorical hyperbole -- to brush aside the realities of cryptography and math -- is just crazy.
Encryption expert Matt Blaze (whose research basically helped win Crypto War 1.0) responded to this argument by noting that the "nerd harder, nerds" argument fundamentally misunderstands the issue:
Figuring out how to build the reliable, secure systems required to "compromise" on crypto has long been a central problem in CS.
If you can't read that, Blaze is basically saying that all crypto includes backdoors -- they're known as vulnerabilities. And the key focus in crypto is closing those backdoors, because leaving them open is disastrous. And yet the government is now demanding that tech folks purposely put in more backdoors and not close them, without recognizing the simple fact that vulnerabilities in crypto always lead to disastrous results.
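There's a concrete historical lesson here from Crypto War 1.0: "export-grade" crypto deliberately shrank the usable key space, and a small key space falls to simple exhaustive search. The toy below makes that tangible. The cipher is an illustrative XOR construction (not real cryptography), and the 2-byte "weakened" key stands in for any deliberately reduced key space:

```python
import hashlib
import itertools

def encrypt(key: bytes, msg: bytes) -> bytes:
    """Toy cipher: XOR the message with a hash-derived pad. Illustration only."""
    pad = hashlib.sha256(key).digest()
    return bytes(a ^ b for a, b in zip(msg, pad))

# A deliberately weakened key space: just 2 bytes = 65,536 possibilities.
# (A full-strength random 256-bit key would be hopeless to brute-force.)
weak_key = b"\x13\x37"
msg = b"hello"
ct = encrypt(weak_key, msg)

def brute_force(ct: bytes, known_plaintext: bytes):
    """Try every possible 2-byte key until one decrypts to the known text."""
    for cand in itertools.product(range(256), repeat=2):
        k = bytes(cand)
        if encrypt(k, ct) == known_plaintext:
            return k
    return None

recovered = brute_force(ct, msg)
assert recovered is not None
assert encrypt(recovered, ct) == msg  # the attacker now reads the message
```

The asymmetry is the whole argument: 65,536 guesses finish in milliseconds, while 2^256 is out of reach for everyone, forever. A mandated weakness doesn't distinguish between the FBI and anyone else willing to run the loop.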
So the question now becomes that, we as a society, setting aside the specific case between the FBI and Apple, setting aside the commercial interests, the concerns about what could the Chinese government do with this, even if we trust the US government. Setting aside all those questions, we're going to have to make some decisions about how do we balance these respective risks. And I've got a bunch of smart people, sitting there, talking about it, thinking about it. We have engaged the tech community, aggressively, to help solve this problem. My conclusions so far is that you cannot take an absolutist view on this. So if your argument is "strong encryption no matter what, and we can and should in fact create black boxes," that, I think, does not strike the kind of balance that we have lived with for 200, 300 years. And it's fetishizing our phones above every other value. And that can't be the right answer.
This is not an absolutist view. It is not an absolutist view to say that anything you do to weaken the security of phones creates disastrous consequences for overall security, far beyond the privacy of individuals holding those phones. And, as Julian Sanchez rightly notes, it's ridiculous that it's the status quo on the previous compromise that is now being framed as an "absolutist" position:
CALEA--with obligations on telecoms to assist, but user-side encryption protected--WAS the compromise. Now that's "absolutism".
Also, the idea that this is about "fetishizing our phones" is ridiculous. No one is even remotely suggesting that. No one is even suggesting -- as Obama hints -- that this is about making phones "above and beyond" what other situations are. It's entirely about the nature of computer security and how it works. It's about the risks to our security in creating deliberate vulnerabilities in our technologies. To frame that as "fetishizing our phones" is insulting.
There's a reason why the NSA didn't want President Obama to carry a Blackberry when he first became President. And there's a reason the President wanted a secure Blackberry. And it's not because of fetishism in any way, shape or form. It's because securing data on phones is freaking hard and it's a constant battle. And anything that weakens the security puts people in harm's way.
I suspect that the answer is going to come down to how do we create a system where the encryption is as strong as possible. The key is as secure as possible. It is accessible by the smallest number of people possible for a subset of issues that we agree are important. How we design that is not something that I have the expertise to do. I am way on the civil liberties side of this thing. Bill McRaven will tell you that I anguish a lot over the decisions we make over how to keep this country safe. And I am not interested in overthrowing the values that have made us an exceptional and great nation, simply for expediency. But the dangers are real. Maintaining law and order and a civilized society is important. Protecting our kids is important.
You suspect wrong. Because while your position sounds reasonable and "balanced" (and I've seen some in the press describe President Obama's position here as "realist"), it's actually dangerous. This is the problem. The President is discussing this like it's a political issue rather than a technological/math issue. People aren't angry about this because they're "extremists" or "absolutists" or people who "don't want to compromise." They're screaming about this because "the compromise" solution is dangerous. If there really were a way to have strong encryption with a secure key where only a small number of people could get in on key issues, then that would be great.
But the key point that all of the experts keep stressing is: that's not reality. So, no, the President is not being a "realist." He's being the opposite.
So I would just caution against taking an absolutist perspective on this. Because we make compromises all the time. I haven't flown commercial in a while, but my understanding is that it's not great fun going through security. But we make the concession because -- it's a big intrusion on our privacy -- but we recognize that it is important. We have stops for drunk drivers. It's an intrusion. But we think it's the right thing to do. And this notion that somehow our data is different and can be walled off from those other trade-offs we make, I believe is incorrect.
Again, this is not about "making compromises" or some sort of political perspective. And the people arguing for strong encryption aren't being "absolutist" about it because they're unwilling to compromise. They're saying that the "compromise" solution means undermining the very basis of how we do security and putting everyone at much greater risk. That's ethically horrific.
And, also, no one is saying that "data is different." There has always been information that is "walled off." What people are saying is that one consequence of strong encryption is that it has to mean that law enforcement is kept out of that information too. That does not mean they can't solve crimes in other ways. It does not mean that they don't get access to lots and lots of other information. It just means that this kind of content is harder to access, because we need it to be harder to access to protect everyone.
It's not security v. privacy. It's security v. security, where the security the FBI is fighting for is to stop the 1 in a billion attack and the security everyone else wants is to prevent much more likely and potentially much more devastating attacks.
Meanwhile, of all the things for the President to cite as an analogy, TSA security theater may be the worst. Very few people think it's okay, especially since it's been shown to be a joke. Setting that up as the precedent for breaking strong encryption is... crazy. And, on top of that, using the combination of TSA security and DUI checkpoints as evidence for why we should break strong encryption with backdoors again fails to recognize the issue at hand. Neither of those undermine an entire security setup.
We do have to make sure, given the power of the internet and how much our lives are digitalized, that it is narrow and that it is constrained and that there's oversight. And I'm confident this is something that we can solve, but we're going to need the tech community, software designers, people who care deeply about this stuff, to help us solve it. Because what will happen is, if everybody goes to their respective corners, and the tech community says "you know what, either we have strong perfect encryption, or else it's Big Brother and Orwellian world," what you'll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties, because the people who understand this best, and who care most about privacy and civil liberties have disengaged, or have taken a position that is not sustainable for the general public as a whole over time.
I have a lot of trouble with the President's line about everyone going to "their respective corners," as it suggests a ridiculous sort of tribalism in which the natural state is the tech industry against the government and even suggests that the tech industry doesn't care about stopping terrorism or child pornographers. That, of course, is ridiculous. It's got nothing to do with "our team." It has to do with the simple realities of encryption and the fact that what the President is suggesting is dangerous.
Furthermore, it's not necessarily the "Orwellian/Big Brother" issue that people are afraid of. That's a red herring from the "privacy v. security" mindset. People are afraid of this making everyone a lot less safe. No doubt, the President is right that if "something really bad" happens then the politics moves in one way -- but it's pretty ridiculous for him to be saying that, seeing as the latest skirmish in this battle is being fought by his very own Justice Department; he's the one who jumped on the San Bernardino attacks as an excuse to push this line of argument.
If the President is truly worried about stupid knee-jerk reactions following "something bad" happening, rather than trying to talk about "balance" and "compromise," he could and should be doing more to fairly educate the American public, and to make public statements about this issue and how important strong encryption is. Enough of this bogus "strong encryption is important, but... the children" crap. The children need strong encryption. The victims of crimes need encryption. The victims of terrorists need encryption. Undermining all that because just a tiny bit of information is inaccessible to law enforcement is crazy. It's giving up the entire ballgame to those with malicious intent, just so that we can have a bit more information in a few narrow cases.
President Obama keeps mentioning trade-offs, but it appears that he refuses to actually understand the trade-offs at issue here. Giving up on strong encryption is not about finding a happy middle compromise. Giving up on strong encryption is putting everyone at serious risk.
It must be admitted that the Apple/FBI fight over iPhone encryption has had much more "outside the courtroom" drama than most cases -- what with both sides putting out their own blog posts and commenting publicly at length on various aspects. But things have been taken up a notch, it seems, with the latest. We wrote about the DOJ's crazy filing in the case, which is just chock full of incredibly misleading claims. Most of the time, when we call out misleading claims in lawsuits, the various parties stay quiet about it. But this one was apparently so crazy that Apple's General Counsel Bruce Sewell called a press conference where he just blasted the DOJ through and through. It's worth looking at his whole statement (highlights by me):
First, the tone of the brief reads like an indictment. We've all heard Director Comey and Attorney General Lynch thank Apple for its consistent help in working with law enforcement. Director Comey's own statement that "there are no demons here." Well, you certainly wouldn't conclude it from this brief. In 30 years of practice I don't think I've seen a legal brief that was more intended to smear the other side with false accusations and innuendo, and less intended to focus on the real merits of the case.
For the first time we see an allegation that Apple has deliberately made changes to block law enforcement requests for access. This should be deeply offensive to everyone that reads it. An unsupported, unsubstantiated effort to vilify Apple rather than confront the issues in the case.
Or the ridiculous section on China where an AUSA, an officer of the court, uses unidentified Internet sources to raise the spectre that Apple has a different and sinister relationship with China. Of course that is not true, and the speculation is based on no substance at all.
To do this in a brief before a magistrate judge just shows the desperation that the Department of Justice now feels. We would never respond in kind, but imagine Apple asking a court if the FBI could be trusted "because there is this real question about whether J. Edgar Hoover ordered the assassination of Kennedy — see ConspiracyTheory.com as our supporting evidence."
We add security features to protect our customers from hackers and criminals. And the FBI should be supporting us in this because it keeps everyone safe. To suggest otherwise is demeaning. It cheapens the debate and it tries to mask the real and serious issues. I can only conclude that the DoJ is so desperate at this point that it has thrown all decorum to the winds....
We know there are great people in the DoJ and the FBI. We work shoulder to shoulder with them all the time. That's why this cheap shot brief surprises us so much. We help when we're asked to. We're honest about what we can and cannot do. Let's at least treat one another with respect and get this case before the American people in a responsible way. We are going before court to exercise our legal rights. Everyone should beware because it seems like disagreeing with the Department of Justice means you must be evil and anti-American. Nothing could be further from the truth.
Somehow, I don't think Apple and the DOJ will be exchanging holiday cards this year. Apple's reply brief is due on Tuesday. I imagine it'll be an interesting weekend in Cupertino.
The Justice Department has now filed its response to Apple's motion to vacate the order forcing it to undermine the security features of Syed Farook's work iPhone. It's... quite a piece of work. The DOJ is pulling out all the stops in this one, and it seems to be going deeper and deeper into the ridiculous as it does so. Of course, it repeats many of the arguments in its earlier filings (both its original application for the All Writs Act order and its Motion to Compel -- which even the judge told the DOJ she didn't think it should file). For example, it continues to assert that this should be judged on the "three factor test" that it made up from a Supreme Court decision that doesn't actually contain a three factor test.
But the crux of the DOJ's argument is basically "how dare Apple make a warrant-proof phone" and thus it's Apple's fault that they haven't made it easy for the FBI to get what it wants. This argument is bonkers on many levels. Let's dig in:
By Apple’s own reckoning, the corporation—which grosses hundreds of billions of dollars a year—would need to set aside as few as six of its 100,000 employees for perhaps as little as two weeks. This burden, which is not unreasonable, is the direct result of Apple’s deliberate marketing decision to engineer its products so that the government cannot search them, even with a warrant.
This is a purposeful misrepresentation. The judge has made clear that the key question she's concerned with is whether the DOJ's request represents an "unreasonable burden" on Apple -- as the "burden" is the only actual test laid out in the US v. NY Telephone case the DOJ keeps pointing to. But Apple didn't present the time and manpower estimates to argue that those resources are the unreasonable burden; the burden it identified is the potential impact on the safety and security of its customers. The time involved is not the issue, but of course, the DOJ pretends it is.
Second, the DOJ continued its ridiculous insistence that making your products safe and secure is a "deliberate marketing decision" -- which somehow makes it offensive in some way. Apple didn't engineer its products "so that the government cannot search them"; it engineered them so that your information is safe and secure from anyone, including criminals. You would think that law enforcement people in the FBI and DOJ would appreciate more secure devices that reduce crime. There was a time when they did. To sneeringly suggest that better protecting the public is nothing more than a "marketing decision" is ridiculous. Hell, even if it were a "marketing decision," a big part of the reason that "the market" wanted such features so badly was because the US government itself overstepped its bounds with mass surveillance.
The Court’s Order is modest. It applies to a single iPhone, and it allows Apple to decide the least burdensome means of complying. As Apple well knows, the Order does not compel it to unlock other iPhones or to give the government a universal “master key” or “back door.” It is a narrow, targeted order that will produce a narrow, targeted piece of software capable of running on just one iPhone, in the security of Apple’s corporate headquarters.
It has been explained -- at length -- by both Apple and various amicus briefs how ridiculous this is. Everyone -- including the FBI -- has now admitted that this case is almost entirely about the precedent, and that a win for the DOJ will inevitably mean a long line of local and federal law enforcement lining up outside Apple's headquarters in Cupertino with court orders in their hands, demanding that Apple help them crack into iPhones. That's a big deal. It also sets a precedent, even beyond Apple, that companies can be forced to deliberately (1) weaken security on their devices and services and (2) lie to the public about it by "signing" the devices as legit.
The government and the community need to know what is on the terrorist’s phone, and the government needs Apple’s assistance to find out.
Instead of complying, Apple attacked the All Writs Act as archaic, the Court’s Order as leading to a “police state,” and the FBI’s investigation as shoddy, while extolling itself as the primary guardian of Americans’ privacy.... Apple’s rhetoric is not only false, but also corrosive of the very institutions that are best able to safeguard our liberty and our rights: the courts, the Fourth Amendment, longstanding precedent and venerable laws, and the democratically elected branches of government.
Apple didn't attack the AWA as "archaic" so much as inapplicable in this situation. Once again, the DOJ is doing some serious misrepresentation in this filing (and we're just three paragraphs in).
This case—like the three-factor Supreme Court test on which it must be decided—is about specific facts, not broad generalities. Here, Apple deliberately raised technological barriers that now stand between a lawful warrant and an iPhone containing evidence related to the terrorist mass murder of 14 Americans. Apple alone can remove those barriers so that the FBI can search the phone, and it can do so without undue burden. Under those specific circumstances, Apple can be compelled to give aid. That is not lawless tyranny. Rather, it is ordered liberty vindicating the rule of law. This Court can, and should, stand by the Order. Apple can, and should, comply with it.
Three factors! Drink! And, yes, Apple put in place these "barriers" -- but not as barriers to the government, rather as security for everyone. And there's a very big question, which the DOJ so desperately wishes to avoid with the mumbo jumbo above: whether or not a company can be forced to purposely write and sign code that deliberately undermines security features.
In deciding New York Telephone, the Supreme Court directly confronted and expressly rejected the policy arguments Apple raises now. Like Apple, the telephone company argued: that Congress had not given courts the power to issue such an order in its prior legislation; that the AWA could not be read so broadly; that it was for Congress to decide whether to provide such authority; and that relying on the AWA was a dangerous step down a slippery slope ending in arbitrary police powers.
Once again, the DOJ is misrepresenting the issues at play both in this case and in NY Telephone. In that case, a key part of the SCOTUS decision was based on the fact that NY Telephone was a public utility and therefore had certain responsibilities. That's not true of Apple. The DOJ also misrepresents the Congressional situation, which is different here, in that Congress did pass a specific law in this area, CALEA, which explicitly says that Apple need not help in this situation. The All Writs Act is a "gap filling" law, for when Congress has not spoken. But on this issue, it has.
The Supreme Court’s approach to the AWA does not create an unlimited source of judicial power, as Apple contends. The Act is self-limiting because it can only be invoked in aid of a court’s jurisdiction. Here, that jurisdiction rests on a lawful warrant, issued by a neutral magistrate pursuant to Rule 41. And New York Telephone provides a further safeguard, not through bright-line rules but rather through three factors courts must consider before exercising their discretion: (1) how far removed a party is from the investigative need; (2) how unreasonable a burden would be placed on that party; and (3) how necessary the party’s assistance is to the government. This three-factor analysis respects Congress’s mandate that the Act be flexible and adaptable, while eliminating the concern that random citizens will be forcibly deputized.
The DOJ insists that CALEA's silence doesn't matter, because CALEA is all about what companies can be forced to do prior to a warrant, not after one is issued.
CALEA, passed in 1994, does not “meticulously,” “intricately,” or “specifically” address when a court may order a smartphone manufacturer to remove barriers to accessing stored data on a particular smartphone. Rather, it governs what steps telecommunications carriers involved in transmission and switching must take in advance of court orders to ensure their systems can isolate information to allow for the real-time interception of network communications
But of course, under that interpretation, the All Writs Act grants tremendous powers -- exactly the kinds of powers the DOJ insists elsewhere in this brief aren't at issue in this case. I don't see how the DOJ can have it both ways.
As Apple recognizes, this Court must consider three equitable factors: (1) how “far removed” Apple is “from the underlying controversy”; (2) how “unreasonable [a] burden” the Order would place on Apple; and (3) how “necessary” its assistance is to searching Farook’s iPhone.
Apple is not so far removed from the underlying controversy that it should be excused from assisting in the execution of the search warrant. In New York Telephone, the phone company was sufficiently close to the controversy because the criminals used its phone lines. See 434 U.S. at 174. The Court did not require that the phone company know criminals were using its phone lines, or that it be involved in the crime. See id. Here, as a neutral magistrate found, there is probable cause to believe that Farook’s iPhone contains evidence related to his crimes. That alone would be sufficient proximity under the AWA and New York Telephone, even if Apple did not also own and control the software on Farook’s iPhone.
But again, under such an interpretation, the AWA can be used to force basically any tech company to figure out ways to spy on users if the FBI comes calling and gets a magistrate judge to rubber-stamp an order. That's... crazy. Just because people use your technology does not mean that you're somehow legally on the hook for helping the FBI investigate their usage.
As Apple’s business model and its representations to its investors and customers make clear, Apple intentionally and for commercial advantage retains exclusive control over the software that can be used on iPhones, giving it monopoly-like control over the means of distributing software to the phones. As detailed below, Apple does so by: (1) firmly controlling iPhones’ operating systems and first-party software; (2) carefully managing and vetting third-party software before authenticating it for use on iPhones; and (3) continually receiving information from devices running its licensed software and its proprietary services, and retaining continued access to data from those devices about how its customers are using them. Having established suzerainty over its users’ phones—and control over the precise features of the phones necessary for unlocking them—Apple cannot now pretend to be a bystander, watching this investigation from afar.
This is kind of an incredible argument when you think about it: because Apple makes sure that its devices have updated software to keep them safe from vulnerabilities, Apple is somehow connected to any use of the phone and responsible for helping the FBI crack into it. Does the FBI really want to encourage companies to stop offering any follow-on support for software? Because that's the argument it's making here.
Thus, by its own design, Apple remains close to its iPhones through careful management and constant vigil over what software is on an iPhone and how that software is used. Indeed, Apple is much less “removed from the controversy”—in this case, the government’s inability to search Farook’s iPhone—than was the New York Telephone company because that company did not deliberately place its phone lines to prevent inconspicuous government access.... Here, Apple has deliberately used its control over its software to block law-enforcement requests for access to the contents of its devices, and it has advertised that feature to sell its products.
This argument is particularly maddening: basically continuing the ridiculous line of thinking that protecting user privacy is some sort of deliberate marketing strategy against the government, rather than in favor of protecting customers' own security and privacy.
And then we get even more maddening. In discussing the "burden" the DOJ literally tries to argue that if there is a burden, it's Apple's fault for designing a system so secure.
Apple is one of the richest and most tech-savvy companies in the world, and it is more than able to comply with the AWA order. Indeed, it concedes it can do so with relatively little effort. Even this modest burden is largely a result of Apple’s own decision to design and market a nearly warrant-proof phone.
This is monumentally misleading. The whole DOJ premise is that Apple is deliberately trying to interfere with legal investigations. But that's bonkers. Apple is just trying to build a secure phone for its users -- and a natural and unavoidable consequence is that it becomes more difficult for law enforcement to get access to that info. That's because the whole point of such security is to make it harder for everyone who is not the phone's owner to get in, because that's how you protect the owner.
The DOJ is so vain it thinks Apple's security is all about them.
Then we get back to the lying:
Apple’s primary argument regarding undue burden appears to be that it should not be required to write any amount of code to assist the government.
Not really. Its primary argument is that the burden is in writing any amount of code that undermines the safety and security of its customers. That last part is kind of the important part. No wonder the DOJ ignores it.
Apple asserts that it would take six to ten employees two to four weeks to develop new code in order to carry out the Court’s Order.... Even taking Apple at its word, this is not an undue burden, especially given Apple’s vast resources and the government’s willingness to find reasonable compromises and provide reasonable reimbursement.
Apple is a Fortune 5 corporation with tremendous power and means: it has more than 100,000 full-time-equivalent employees and had an annual income of over $200 billion dollars in fiscal year 2015—more than the operating budget for California.... Indeed, Apple’s revenues exceed the nominal GDPs of two thirds of the world’s nations. To build the ordered software, no more than ten employees would be required to work for no more than four weeks, perhaps as little as two weeks.
Again, this is misleading (sense a theme?). First, as noted above, the "burden" is not so much in the time or engineers allotted to this issue. Second, even if we accept the DOJ's assertions here, it's misleading. Apple's filing noted that it would take that much effort just to create the initial code and test it, and then noted -- quite rightly -- that if any problems arose in testing, as they almost certainly would, it would need to basically redo the process. Part of the point, which can slip by non-technical people with no experience developing and deploying code, is that this process could take a long, long time, and involve a lot of effort, before it's actually safe to use on the actual phone.
Next up, the DOJ continues to insist that there can't possibly be any danger in creating this code, because Apple surely knows how to guard it, and further, that even if the code got out, that it wouldn't matter because it's asking for code that will only run on the Farook phone.
Next, contrary to Apple’s stated fears, there is no reason to think that the code Apple writes in compliance with the Order will ever leave Apple’s possession. Nothing in the Order requires Apple to provide that code to the government or to explain to the government how it works. And Apple has shown it is amply capable of protecting code that could compromise its security. For example, Apple currently protects (1) the source code to iOS and other core Apple software and (2) Apple’s electronic signature, which as described above allows software to be run on Apple hardware.... Those —which the government has not requested—are the keys to the kingdom. If Apple can guard them, it can guard this.
But, again, that leaves out the reality of testing this particular code and how that makes it much more likely the code will get out. This argument was presented in the amicus brief filed by iPhone forensics and security experts.
Next up, the DOJ totally misrepresents how Apple currently handles information requests from the Chinese government. The DOJ is trying to argue, misleadingly, that Apple has no problem doing the same stuff for China, so its worry that this case will create a precedent for authoritarian regimes is nonsense. But it's the DOJ's argument that's truly nonsense:
Apple suggests that, as a practical matter, it will cease to resist foreign governments’ efforts to obtain information on iPhone users if this Court rules against it. It offers no evidence for this proposition, and the evidence in the public record raises questions whether it is even resisting foreign governments now. For example, according to Apple’s own data, China demanded information from Apple regarding over 4,000 iPhones in the first half of 2015, and Apple produced data 74% of the time.... Apple appears to have made special accommodations in China as well: for example, moving Chinese user data to Chinese government servers, and installing a different WiFi protocol for Chinese iPhones.... Such accommodations provide Apple with access to a huge, and growing, market.... This Court’s Order changes neither the carrots nor the sticks that foreign governments can use on Apple. Thus, it does not follow that if America forgoes Apple’s assistance in this terrorism investigation, Apple will refuse to comply with the demands of foreign governments. Nor does it follow that if the Court stands by its Order, Apple must yield to foreign demands, made in different circumstances without the safeguards of American law.
What the DOJ is referring to here is Apple's latest transparency report, in which it notes that it complied with 74% of government requests for information from China.
But again, Apple has always been willing to respond to legitimate government requests for information that it actually has access to. That's why the same report shows that it complied with 81% of US requests as well. But that says absolutely nothing about a requirement to build a special system to hack in and access data that it does not currently have access to.
The rest of the China stuff, about servers and WAPI, is just the DOJ picking up on Stewart Baker's conspiracy theory from a few weeks back. Lots of countries (stupidly) demand local data storage, not necessarily for surveillance reasons, but because they think it's good for their economy. And the reason Apple used WAPI was that it was the standard used in China for WiFi-like wireless networking. As for the idea that Apple magically gave the Chinese government access: that makes no sense, given that Apple then had to fight off a man-in-the-middle attack against iCloud in China that was claimed to have originated from the Chinese government. If Apple had given it access, why would the government need to run a MiTM attack? The whole argument falls apart.
In the first half of 2015 alone, Apple handled 27,000 “device requests”—often covering multiple devices—and provided data approximately 60% of the time.... If Apple can provide data from thousands of iPhones and Apple users to China and other countries, it can comply with the AWA in America. (Id.) This is not speculation because, in fact, Apple complied for years with American court orders to extract data from passcode-locked iPhones, dedicating infrastructure and personnel in order to do so.
Again that's different. That's about supplying the info that Apple had access to and not about writing code to undermine security features. Apples and oranges.
Finally, the DOJ mocks Apple's constitutional arguments on the First and Fifth Amendments.
Apple’s claim is particularly weak because it does not involve a person being compelled to speak publicly, but a for-profit corporation being asked to modify commercial software that will be seen only by Apple. There is reason to doubt that functional programming is even entitled to traditional speech protections....
To the extent Apple’s software includes expressive elements—such as variable names and comments—the Order permits Apple to express whatever it wants, so long as the software functions.
We're not "compelling" you to say this exactly, we're letting you say whatever you want... so long as it does what we want it to. That still seems like compelled speech, no?
Apple lastly asserts that the Order violates its Fifth Amendment right to due process. Apple is currently availing itself of the considerable process our legal system provides, and it is ludicrous to describe the government’s actions here as “arbitrary.”
Once again, it appears that many of the DOJ's arguments here are misleading in the extreme. Apple's response is due next week, and I imagine it will be quite a read as well.
There is a (reasonable) tendency to argue that in this big fight over encryption backdoors and "going dark" and "should Apple help the FBI" to assume that the various DOJ/FBI efforts to force backdoors into encryption are the official position of the Obama administration. After all, the Justice Department is a part of the administration and the head of the DOJ, Attorney General Loretta Lynch, reports to President Obama. And the FBI is a part of the DOJ. But it's also been quite clear for some time that there are a variety of opinions within the White House on these issues, with many outside of the DOJ not supporting backdooring encryption at all. In fact, many are actively opposed to such ideas. And now it's reaching the stage where people are starting to push stories that the White House is not at all happy with FBI Director James Comey and his crusade on this issue.
With regards to the Apple standoff, "It's just not clear [Comey] is speaking for the administration," said Richard Clarke, a former White House counterterrorism and cybersecurity chief. "We know there have been administration meetings on this for months. The proposal that Comey had made on encryption was rejected by the administration."
[....]
"I have been very surprised at how public and inflammatory, frankly, the FBI and the Justice Department’s approach has been on this," said Chris Finan, a former National Security Council cybersecurity adviser.
"That doesn't tend to be the administration’s preferred approach to handling things."
There are a lot more quotes in the article suggesting similar things (and also discussing FBI issues beyond just the Apple/encryption debate).
Indeed, back last fall, we noted that leaked documents showed that many in the White House did not agree with Comey or the FBI on this issue -- and some pushed for a public statement opposing backdooring encryption. Unfortunately, the administration later took the cowardly approach of agreeing not to push for legislation, but refusing to take a strong public stance on the issue, because they didn't want to anger the law enforcement community. So, instead, you have the DOJ and FBI -- representatives of the administration -- now running wild, pushing dangerous legal theories that will undermine key elements of computer security, and lots of people think that's the administration's official position.
The White House failed badly in not taking a public stance on this months ago, and it should fix that now.
The prosecutions tied to this investigation have been interesting, to say the least. The FBI's short run as child porn site hosts received a judicial shrug -- something courts have done in the past when confronted with disturbing government behavior in service of combating crime. These have also led to the government arguing -- and the court echoing -- that Tor users have no expectation of privacy, as sooner or later, everything comes down to an IP address.
The warrant itself is slightly redacted, but that's hardly a surprise. More surprising is the fact that it has been released at all, as the FBI usually argues for the sealing of documents related to its investigations, especially in cases where law enforcement tech and methods are discussed.
As far as the details contained within, most of what's known about the FBI's NIT has already been discussed. As Motherboard's Joseph Cox points out, there are a few interesting aspects to the warrant request. For one, it makes it clear the FBI will be running a child porn site for the duration of the "search."
“While the TARGET WEBSITE operates at a government facility, such request data associated with a user's actions on the TARGET WEBSITE will be collected,” the affidavit, signed by Douglas Macfarlane, an FBI special agent, reads.
While the document claims the FBI has no other way to ascertain the IP addresses and locations of users connecting to the website, it goes light on the details of what the agency actually plans to do. The NIT is discussed in terms of what it's capable of gathering, but the affidavit says very little about how it works. Nowhere in the document does the FBI refer to its NIT in terms more applicable to its function, like "malware," "spyware" or "hacking." The FBI describes its NIT this way:
In the normal course of operation, websites send content to visitors. A user's computer downloads that content and uses it to display web pages on the user's computer. Under the NIT authorized by this warrant, the TARGET WEBSITE, which will be located in Newington, Virginia, in the Eastern District of Virginia, would augment that content with additional computer instructions. When a user's computer successfully downloads those instructions from the TARGET WEB SITE..., the instructions, which comprise the NIT, are designed to cause the user's "activating" computer to transmit certain information to a computer controlled by or known to the government. That information is described with particularity on the warrant (in Attachment B of this affidavit), and the warrant authorizes obtaining no other information. The NIT will not deny the user of the "activating" computer access to any data or functionality of the user's computer.
This lack of details could be problematic.
Critics are worried that the language of NIT applications is too vague for judges to grasp what exactly it is they are authorizing; the words "malware" or "hacking" are never used, for example. (Magistrate Judge Theresa C. Buchanan, who signed off on the NIT, has repeatedly declined to answer questions from Motherboard.) The NIT was used to access computers in the US, Greece, Chile, and likely elsewhere.
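Stripped of the affidavit's careful phrasing, what's being described is ordinary content injection: a server appending extra instructions to the pages it returns, which the visitor's browser then executes alongside the page's own content. Here's a minimal, hypothetical sketch of that mechanism (the function name, beacon URL, and page content are invented for illustration; this is the generic technique, not the actual NIT):

```python
# Hypothetical illustration of "augmenting content with additional
# computer instructions." The beacon URL and names are placeholders.

BEACON_JS = (
    '<script>'
    'fetch("https://logging-server.example/beacon", {method: "POST"});'
    '</script>'
)

def inject_beacon(html: str) -> str:
    """Append extra instructions to an otherwise normal page.

    A server doing this modifies the response before it is sent; the
    visitor's browser runs the injected script along with the page,
    causing the visitor's machine to contact a logging server.
    """
    if "</body>" in html:
        # Insert just before the closing body tag so the page renders normally.
        return html.replace("</body>", BEACON_JS + "</body>", 1)
    return html + BEACON_JS

page = "<html><body><p>normal content</p></body></html>"
augmented = inject_beacon(page)
assert BEACON_JS in augmented  # the page now carries the extra instructions
```

The same pattern powers benign things like analytics beacons, which is precisely why critics argue the affidavit's neutral language obscures what is, functionally, remotely delivered surveillance code.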
Speaking of foreign nations, the FBI apparently had some outside assistance in this case.
In December of 2014, a foreign law enforcement agency advised the FBI that it suspected IP address 192.198.81.106 , which is a US-based IP address, to be associated with the TARGET WEBSITE. A publicly available website provided information that the IP Address 192.198.81.106 was owned by [REDACTED] a server hosting company headquartered at [REDACTED] Through further investigation, FBI verified that the TARGET WEBSITE was hosted from the previously referenced IP address. [...] Further investigation has identified a resident of Naples, FL, as the suspected administrator of the TARGET WEBSITE, who has administrative control over the computer server in Lenoir, NC, that hosts the TARGET WEBSITE.
The fact that documents from sealed cases related to the FBI's Playpen investigation are being released publicly shows that even opposing forces can sometimes arrive at the same plan of action, even if their motivations are completely different.
In Washington, the lawyer for a defendant captured with the assistance of the FBI's NIT is hoping to put the FBI's apparent overreach on display by requesting the unsealing of documents. The FBI, on the other hand, isn't putting up much of a fight to keep these sealed. The affidavit in this related case contains graphic descriptions of child porn images found on the site. People who generally don't believe the ends justify the means often make exceptions for more heinous criminal activity like this. The public outing of sealed docs could persuade fence-sitters to come down on the side of the FBI, even though the agency's use of NITs is hardly limited to cases involving crimes the public overwhelmingly finds repugnant.
Among the many questions swirling around the challenge to U.S. Magistrate Judge Sheri Pym's Order that Apple create software to bypass the iPhone passcode screen, a matter of paramount public interest may have been overlooked: Even if the government prevails in compelling Apple to bypass these iPhone security features: (A) evidence for use in a criminal trial obtained in this way will be challenged under the Daubert standard (described below) and the evidence may be held to be inadmissible at trial; and (B) the Daubert challenge may require disclosure of Apple's iPhone unlocking software to a number of third parties who would require access to it in order to bring the Daubert challenge and who may not secure the new software adequately. To state that neither consequence would be in the public interest would be an understatement in the extreme.
The Daubert challenge would arise because any proffered evidence from the subject iPhone would have been obtained by methodology utilizing software that had never been used before to obtain evidence in a criminal trial. The Supreme Court, in Daubert v. Merrell Dow Pharmaceuticals, Inc., held that new methodologies from which proffered evidence is derived must, when challenged, be substantiated by expert scientific testimony in order to be admissible. In Daubert, the court stated that the criteria that must be utilized when faced with a defense challenge to scientific testimony and evidence are:
1. Can the methodology used to reach the expert's conclusion (the new software here) be tested and verified?
2. Have the methodology and software been peer-reviewed, and has the review been published in a peer-reviewed journal?
3. Do the techniques used to reach the conclusion (here, to obtain the evidence) have an ascertainable error rate?
4. Has the methodology used to generate the conclusion (the evidence) been generally accepted by the relevant scientific community?
Under the Daubert standards, introduction of evidence from the iPhone, electronic communications and data stored in the phone, would require the testimony of an expert witness to, among other things:
- establish the integrity of the data (and its reliability) throughout the chain of custody;
- explain whether any person or software could modify the data coming off of the phone;
- verify that the data that came off the phone as delivered by Apple and held by law enforcement was the data that had originally been on the phone; and
- explain the technical measures, such as the digital signatures attached to the data, used to ensure that no tampering has occurred, and their likely error rates.
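Mechanically, the tamper-evidence in that last point usually comes down to a cryptographic tag computed over the extracted data and re-verified at each handoff in the chain of custody. A minimal sketch using Python's standard library (the key, data, and function names are placeholders for illustration, not anything from Apple's actual tooling):

```python
import hashlib
import hmac

# Placeholder key: in practice this would be a signing key managed
# under strict access controls, not a hardcoded secret.
CUSTODY_KEY = b"example-shared-secret"

def sign(data: bytes) -> str:
    """Compute a keyed tag binding the data to the signing key."""
    return hmac.new(CUSTODY_KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, tag: str) -> bool:
    """Constant-time check that the data still matches its recorded tag."""
    return hmac.compare_digest(sign(data), tag)

extracted = b"contents pulled from the device"  # placeholder payload
tag = sign(extracted)  # recorded at extraction time

assert verify(extracted, tag)            # untouched data verifies
assert not verify(extracted + b"x", tag)  # any modification is detected
```

An expert witness would need to testify to exactly this kind of scheme: how the tag was generated, who held the key, and why a matching tag establishes that the data delivered to the court is the data that came off the phone.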
Such an expert would, in preparation for his or her testimony, require access to and examination of the software, as it is inconceivable that defense counsel would simply accept the testimony of the Apple personnel without also demanding that their own third-party experts have access to the code.
In addition, defense counsel would undoubtedly demand the right for their own third-party experts to have access not only to the source code, but also to simulate the testing environment and run this code on their own systems in order to confirm the veracity of the evidence. This could easily compromise the security of the new unlocking code, as argued in the amicus brief filed with Judge Pym by Jennifer Granick and Riana Pfefferkorn from Stanford's Center for Internet and Society (also covered previously by Techdirt):
There is also a danger that the Custom Code will be lost or stolen. The more often Apple must use
the forensic capability this Court is ordering it to create, the more people have to have access to
it. The more people who have access to the Custom Code, the more likely it will leak. The software
will be valuable to anyone eager to bypass security measures on one of the most secure
smartphones on the market. The incentive to steal the Custom Code is huge. The Custom Code
would be invaluable to identity thieves, blackmailers, and those engaged in corporate espionage
and intellectual property theft, to name a few.
Ms. Granick and Ms. Pfefferkorn may not have contemplated demands by defense counsel to
examine the software on their own systems and according to their own terms, but their logic
applies with equal force to evidentiary challenges to the new code: The risk of the software
becoming public increases when it is examined by multiple defense counsel and their experts, on
their own systems, with varying levels of technical competency. Fundamentally, then, basic
criminal trial processes, such as challenges to expert testimony and to evidence derived
from this new software, stand in direct tension with the public interest in the
secrecy and security of the source code of the new iPhone unlocking software.
To be sure, none of these issues can be resolved definitively at this time, because the software to
unlock the phone has not been written. But the government's demand that the court force Apple
to write software that circumvents its own security protocols may be shortsighted as a matter of
trial strategy, in that any evidence obtained by that software may be precluded following a
Daubert inquiry. Further, the public interest may be severely compromised by a court order
directing Apple to write the subject software, because the due process requirements for
defense counsel and their experts to access the software and Apple's security protocols may
compromise the secrecy necessary to prevent the proposed workaround from becoming available
to hackers, foreign governments and others. No matter what safeguards are ordered by a court,
security of the new software may be at considerable risk because it is well known that no
security safeguards are impregnable.
The government may be well advised to heed the adage, "Be careful what you ask for. You may
just get it." Its victory in the San Bernardino proceedings may be worse than Pyrrhic. It could be
dangerous.
Kenneth N. Rashbaum is a Partner at Barton, LLP in New York, where he heads the Privacy and
Cybersecurity Practice. He is an Adjunct Professor of Law at Fordham University School of
Law, Chair of the Disputes Division of the American Bar Association Section of International
Law, Co-Chair of the ABA Section of International Law Privacy, E-Commerce and Data
Security Committee and a member of the Section Council. You can follow Ken @KenRashbaum
Liberty McAteer is an Associate at Barton LLP. A former front-end web developer, he advises
software developers and e-commerce organizations on data protection, cybersecurity and
privacy, including preparation of security and privacy protocols and information security terms
in licensing agreements, service level agreements and website terms of service. You can follow
Liberty @LibertyMcAteer
The deeper you dive into the various DOJ filings seeking to use the All Writs Act to force Apple to hack into encrypted iPhones, the more dishonest they seem. We already covered some of the misleading claims in the DOJ's latest filing in NY, including pointing to a 2012 case as evidence that the All Writs Act can be used to force Apple to break into a phone, when the actual ruling in that case said only that Apple had standing to oppose an All Writs Act order.
However, buried in an excellent article by Sarah Jeong at Vice's Motherboard
about that same filing, there's another interesting tidbit that seems worth exploring: in both the NY and California cases, the DOJ has repeatedly pointed to a so-called "three factor test" under United States v. New York Telephone Co., which is the key case that established that it's acceptable, under the All Writs Act, for the FBI to force a telephone company to install and use a "pen register" device on telephone lines (to track who they call). In the original motion for the order in the San Bernardino case, here's the DOJ's argument on page 14 and 15 of that document:
In New York Telephone Co., the Supreme Court considered three
factors in concluding that the issuance of the All Writs Act order
to the phone company was appropriate. First, it found that the
phone company was not "so far removed from the underlying
controversy that its assistance could not be permissibly compelled."
... Second, it concluded that the order did not place an
undue burden on the phone company.... Third, it
determined that the assistance of the company was necessary to
achieve the purpose of the warrant.... Each of these factors
supports issuance of the order directed to Apple in this case.
The DOJ repeated this paragraph verbatim in its motion to compel (on page 15), and verbatim once again in the filing in NY (page 38).
The magistrate judge in NY, James Orenstein, accepted this three-factor test, and used it to find that the DOJ's application actually failed to meet the requisite factors (starting right on page 1):
In addition, applicable case law requires me to consider three factors in deciding whether to issue an order under the AWA: the closeness of Apple's relationship to the underlying criminal conduct and government investigation; the burden the requested order would impose on Apple; and the necessity of imposing such a burden on Apple. As explained below, after reviewing the facts in the record and the parties' arguments, I conclude that none of those factors justifies imposing on Apple the obligation to assist the government's investigation against its will.
There's just one problem in all of this -- as highlighted in Jeong's article linked above, and discussed in more detail by Orin Kerr last month: there is no three factors test in the US v. NY Telephone case. As Jeong summarizes:
You could argue it’s three factors, or maybe four, or even five. The point is, NY Telephone isn’t as easy to apply as the government makes it out to be. Everyone in the Apple case is playing in uncharted waters.
Kerr notes that the relevant paragraph in the Supreme Court's ruling, far from laying out a "three factor test," appears to be "frustratingly murky."
The tricky part of New York Telephone is that the Court left the actual test for what the AWA allows frustratingly murky. The Court was comparatively clear about one essential limit on a Court’s power under the AWA: “We agree that the power of federal courts to impose duties upon third parties is not without limits; unreasonable burdens may not be imposed.” Okay. But the rest of what the Court says is really unclear.
Here are the key paragraphs from the Supreme Court ruling:
Turning to the facts of this case, we do not think that the Company was a third party so far removed from the underlying controversy that its assistance could not be permissibly compelled. A United States District Court found that there was probable cause to believe that the Company’s facilities were being employed to facilitate a criminal enterprise on a continuing basis. For the Company, with this knowledge, to refuse to supply the meager assistance required by the FBI in its efforts to put an end to this venture threatened obstruction of an investigation which would determine whether the Company’s facilities were being lawfully used. Moreover, it can hardly be contended that the Company, a highly regulated public utility with a duty to serve the public, had a substantial interest in not providing assistance. Certainly the use of pen registers is by no means offensive to it. The Company concedes that it regularly employs such devices without court order for the purposes of checking billing operations, detecting fraud, and preventing violations of law. It also agreed to supply the FBI with all the information required to install its own pen registers. Nor was the District Court’s order in any way burdensome. The order provided that the Company be fully reimbursed at prevailing rates, and compliance with it required minimal effort on the part of the Company and no disruption to its operations.
Finally, we note, as the Court of Appeals recognized, that without the Company's assistance there is no conceivable way in which the surveillance authorized by the District Court could have been successfully accomplished. The FBI, after an exhaustive search, was unable to find a location where it could install its own pen registers without tipping off the targets of the investigation. The provision of a leased line by the Company was essential to the fulfillment of the purpose— to learn the identities of those connected with the gambling operation—for which the pen register order had been issued.
So yeah, it mentions the three things the DOJ keeps insisting are the "three factor test" (if the third party is not "so far removed," if there is no "undue burden" and if the assistance was deemed "necessary"). But there's a hell of a lot of other stuff in there as well, including the fact that in that case, NY Telephone was "a highly regulated public utility." So that seems like a relevant "fourth" factor that weighs against the DOJ (and they conveniently skip over).
And, as Kerr notes, unlike basically any judicial "test," this one fails to lay out any of the ground rules:
The paragraph above is pretty confusing. It begins with the idea that the AWA doesn’t apply to someone “so far removed” from the controversy; then turns to the need for the phone company’s help; then talks about what is “offensive” to the company; and then covers the burden to the phone company, focusing on how much it cost the business and interfered with it. But the paragraph doesn’t link these ideas or say how they relate to one another. It doesn’t say what the standard is for each idea or how much weight to give it.
In other words, this was just a way for the Court to get the case off the docket, not to set a "test" that would be applied 40 years later. Kerr later expands:
We’re mostly left with the uncertainty of the New York Telephone case itself. Beyond the “unreasonable burden” test, it’s not clear what to make of the other matters that the court mentions. Are they all just factors in a grand multi-factor test? Are they actually parts of the undue burden standard, just not explicitly labeled that way? Are they parts of what makes the order “appropriate”?
Once you figure that out — if you can — there’s the uncertainty about what each mentioned standard means.
Kerr then spends a lot of time noting that even for the "unreasonable burden" test the Court did establish, there are no details or explanations to go with it, and you can come up with a wide variety of possible interpretations that would lead to very different results. In short, no grand "test" was developed here, and in the nearly four decades since, there's no evidence that anyone has settled on how such a "test" should actually be applied.
Either way, it seems clear that the DOJ keeps making some pretty direct claims in its filings that are based either on outright misreadings of the case law, or on deliberate attempts to mislead the courts. That seems like a dangerous game to play.