from the this-is-wrong dept
A few weeks ago, we pointed out that Senator Sheldon Whitehouse led the way with perhaps the most ridiculous statement of any Senator (and there were a lot of crazy statements) in the debate over encryption and the FBI's exaggerated fear of "going dark." He argued that if the police couldn't find a missing girl (a hypothetical that made no sense and was entirely unlikely to ever happen), then perhaps Apple could face civil liability for not allowing the government to spy on your data. Here's what he said:
It strikes me that one of the balances that we have in these circumstances, where a company may wish to privatize value -- by saying "gosh, we're secure now, we got a really good product, you're gonna love it" -- that's to their benefit. But for the family of the girl that disappeared in the van, that's a pretty big cost. And, when we see corporations privatizing value and socializing costs, so that other people have to bear the cost, one of the ways that we get back to that and try to put some balance into it, is through the civil courts. Through the liability system. If you're a polluter and you're dumping poisonous waste into the water rather than treating it properly somebody downstream can bring an action and can get damages for the harm they sustained, can get an order telling you to knock it off.
You can read our longer analysis of how wrong this is, but in short: encryption is not pollution. Pollution is a negative externality. Encryption is the opposite of that: a tool that better protects the public in the vast majority of cases. That's why Apple is making it standard.
The suggestion was so ridiculous and so wrong that we were surprised that famed NSA apologist Ben Wittes of the Brookings Institution found Whitehouse's nonsensical rant "interesting" and worthy of consideration. While we disagree with Wittes on nearly everything, we thought common sense would eventually catch up with him, leading him to recognize that absolutely nothing Whitehouse said made any sense (then again, this is the same Wittes who seems to have joined the magic unicorn/golden key brigade -- so I'm beginning to doubt my initial assessment that Wittes is well-informed but just comes to bad conclusions).
However, even with Wittes finding Whitehouse's insane suggestion "interesting," it's still rather surprising to see him find it worthy of a detailed, multi-part legal analysis, for which he brought in a Harvard Law student, Zoe Bedell, to help. In the first analysis, they take a modified form of Whitehouse's hypothetical (even after admitting that his version doesn't actually make any sense), but still come to the conclusion that the company "could" face civil liability. Though, at least, they admit plaintiffs would "not have an easy case."
The first challenge for plaintiffs will be to establish that Apple even had a duty, or an obligation, to take steps to prevent their products from being used in an attack in the first place. Plaintiffs might first argue that Apple actually already has a statutory duty to provide communications to government under a variety of laws. While Apple has no express statutory obligation to maintain the ability to provide decrypted information to the FBI, plaintiffs could argue that legal obligations it clearly does have would be meaningless if the communications remained encrypted.
To make this case, Bedell and Wittes try to read into various wiretapping and surveillance laws a non-existent duty to decrypt information from your mobile phone. But no such duty exists. If it did, we wouldn't be having this debate in the first place, and FBI Director James Comey wouldn't be asking Congress to change the law to require such things. Still, they hope that maybe, just maybe, a court would create such a duty out of thin air based on things like "the foreseeability of the harm." Except that argument falls flat on its face, because the likelihood of harm here runs the other way: not encrypting your information leads to a much, much, much greater probability of harm than encrypting your data and keeping law enforcement from seeing it.
Going to even more ridiculous levels than the "pollution" argument, the analysis compares Apple encrypting your data to the potential liability of the retailer who sold the Columbine shooters a shotgun and taught them how to saw down its barrel:
For example, after the Columbine shooting, the parents of a victim sued the retailer who sold the shooters one of their shotguns and even taught the shooters how to saw down the gun’s barrel. In refusing to dismiss the case, the court stated that “[t]he intervening or superseding act of a third party, . . . including a third-party's intentionally tortious or criminal conduct[,] does not absolve a defendant from responsibility if the third-party's conduct is reasonably and generally foreseeable.” The facts were different here in some respects—the Columbine shooters were under-age, and notably, they bought their supplies in person, rather than online. But that does not explain how two federal district courts in Colorado ended up selecting and applying two different standards for evaluating the defendant's duty.
But the differences run deeper than that. Even under this standard -- which many disagree with -- there still needs to be "conduct" that is "reasonably and generally foreseeable." And it is simply not "reasonably and generally foreseeable" that encrypting data will put people at greater risk. In all these years, the FBI still can't come up with a single example where such encryption was a real problem. It would be basically impossible to argue that this is a foreseeable "problem," especially when weighed against the very real and very present problem of people trying to hack into your device and get at your data.
In the second post in the series, Bedell and Wittes go even further, looking at whether or not Apple could be found to have provided material support to terrorists thanks to encryption. If this sounds vaguely familiar, remember the similarly ridiculous claim not too long ago from a music industry lawyer and a DOJ official that YouTube and Twitter could be charged with material support for terrorism because ISIS used both platforms.
Bedell and Wittes concoct a scenario in which a court might find that providing a phone that can encrypt a terrorist's data opens the company up to liability:
In our scenario, a plaintiff might argue that the material support was either the provision of the cell phone itself, or the provision of the encrypted messaging services that are native on it. Thus, if a jury could find that providing terrorists with encrypted communications services is just asking for trouble, then plaintiffs would have satisfied the first element of the definition of international terrorism in § 2331, a necessary step for making a case for liability under § 2333.
Of course, this is wiped out pretty quickly because that law requires intent. The authors note that this would "pose a challenge" to any plaintiff "as it would appear to be difficult, if not impossible, to prove that Apple intended to intimidate civilians or threaten governments by selling someone an iPhone..."
You think?
But, our intrepid NSA apologists still dig deeper to see if they can come up with a legal theory that will actually work:
But again, courts have handled this question in ways that make it feasible for a plaintiff to succeed on this point against Apple. For example, when the judge presiding over the Arab Bank case considered and denied the bank’s motion to dismiss, he shifted the analysis of intimidation and coercion (as well as the question of the violent act and the broken criminal law) from the defendant in the case to the group receiving the assistance. The question for the jury was thus whether the bank was secondarily, rather than primarily, liable for the injuries. The issue was not whether Arab Bank was trying to intimidate civilians or threaten governments. It was whether Hamas was trying to do this, and whether Arab Bank was knowingly helping Hamas.
Judge Posner’s opinion in Boim takes a different route to the same result. Instead of requiring a demonstration of actual intent to coerce or intimidate civilians or a government, Judge Posner essentially permits the inference that when terrorist attacks are a “foreseeable consequence” of providing support, an organization or individual knowingly providing that support can be understood to have intended those consequences. Because Judge Posner concludes that Congress created an intentional tort, § 2333 in his reading requires the plaintiff to prove that the defendant knew it was supporting a terrorist or terrorist organization, or at least that it was deliberately indifferent to that fact. In other words, the terrorist attack must be a foreseeable consequence of the specific act of support, rather than just a general risk of providing a good or service.
But even under those standards, it's hard to see how Apple could possibly be liable for material support. It's just selling an iPhone, and doing so in a way that -- for the vast majority of its customers -- better protects their privacy and data. It would take an extremely twisted mind and argument to turn that into somehow "knowingly" helping terrorists or creating a "foreseeable consequence." At least the authors admit that much.
But why stop there? They then say that Apple could still be liable after the government asks it to decrypt messages. If Apple doesn't magically stop that particular user from encrypting messages, then, they claim, Apple could be shown to be "knowingly" supporting terrorism.
The trouble for Apple is that our story does not end with the sale of the phone to the person who turns out later to be an ISIS recruit. There is an intermediate step in the story, a step at which Apple’s knowledge dramatically increases, and its conduct arguably comes to look much more like that of someone who—as Posner explains—is recklessly indifferent to the consequences of his actions and thus carries liability for the foreseeable consequences of the aid he gives a bad guy.
That is the point at which the government serves Apple with a warrant—either a Title III warrant or a FISA warrant. In either case, the warrant is issued by a judge and puts Apple on notice that there is probable cause to believe the individual under investigation is engaged in criminal activity or activity of interest for national security reasons and is using Apple’s services and products to help further his aims. Apple, quite reasonably given its technical architecture, informs the FBI at this point that it cannot comply in any useful way with the warrant as to communications content. It can only provide the metadata associated with the communications. But it continues to provide service to the individual in question.
But all of this, once again, assumes an impossibility: that, once the phone is out of its hands, Apple can somehow stop the end user from using the encryption on it.
This is the mother of all stretches among legal theories. And, throughout it all, neither Bedell nor Wittes even seems to recognize that stronger encryption protects the end user. It apparently doesn't enter their minds that Apple's reason for providing encryption isn't "to help people hide from the government." It's not about government snooping; it's about anyone snooping. The other cases they cite are nothing like that. These arguments, thin as they are, only make sense if Apple's move to encryption doesn't have widespread value for basically the entire population. You don't sue Toyota for "material support for terrorism" just because a terrorist uses a Toyota to make a car bomb. Yet Wittes and Bedell are somehow trying to argue that Apple is liable for better protecting you, just because in some instances that might also help "bad" people. That's a ridiculous legal theory that barely deserves to be laughed at, let alone given a multi-part analysis of how it "might work."
Filed Under: ben wittes, encryption, isis, liability, material support, mobile encryption, pollution, sheldon whitehouse, terrorism, zoe bedell
Companies: apple