from the this-is-ridiculous dept
Senator Richard Burr, head of the Senate Intelligence Committee and longtime friend to the intelligence community, has now penned a ridiculous, misleading, fear-mongering opinion piece for the Wall Street Journal, entitled Stopping Terrorists From "Going Dark." It's pretty much exactly what you'd expect if you've paid any attention to the absurd
"going dark" debate in the US. But let's dig in and show just how bad this one is:
While the terrorist attacks in Paris, San Bernardino, Calif., and Garland, Texas, have brought discussions about encryption to the front pages, criminals in the U.S. have been using this technology for years to cover their tracks. The time has come for Congress and technology companies to discuss how encryption—encoding messages to protect their content—is enabling murderers, pedophiles, drug dealers and, increasingly, terrorists.
Right, except so far officials haven't been able to show evidence of any of those cases actually using encryption. Similarly, law enforcement has failed to show that criminals using encryption have really been that much of a problem either. And that's because it's
not a problem. Even in the (still mostly rare) cases where encryption is being used, criminals
still reveal plenty of information that would allow law enforcement to track them down. It's called doing basic detective work.
Consumer information should be protected, and the development of stronger and more robust levels of encryption is necessary. Unfortunately, the protection that encryption provides law-abiding citizens is also available to criminals and terrorists. Today’s messaging systems are often designed so that companies’ own developers cannot gain access to encrypted content—and, alarmingly, not even when compelled by a court order. This allows criminals and terrorists, as the law enforcement community says, to “go dark” and plot with abandon.
Yes, criminals and terrorists can use encryption just like law-abiding citizens. But that's true of
any technology. There's no way to build technology that "only the good people can use." Criminals use cars and computers and guns. And they eat food and drink, too. Some of them talk to each other in person. Yet we don't freak out about any of that other stuff. And, again, it's simply incorrect to say they can "plot with abandon." They cannot. Even when using encryption, many people either mess it up or still leave other clues. Most encrypted communication still reveals metadata -- who contacted whom, and when -- as the sketch below illustrates.
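To make that concrete, here's a minimal sketch in Python (using the third-party cryptography package; the names, addresses, and message are invented for illustration) of why encrypting a message's body does nothing to hide who sent it, to whom, and when:

```python
# Toy example: the body is encrypted, but the "envelope" that routes
# the message is readable by anyone who can observe the traffic.
from cryptography.fernet import Fernet  # third-party 'cryptography' package

key = Fernet.generate_key()   # secret shared only by the two parties
cipher = Fernet(key)

envelope = {
    "from": "alice@example.com",                 # metadata: visible
    "to": "bob@example.com",                     # metadata: visible
    "sent": "2015-12-22T10:30:00Z",              # metadata: visible
    "body": cipher.encrypt(b"the actual plot"),  # content: unreadable without the key
}

# An observer without the key still learns who talked to whom, and when.
print(envelope["from"], "->", envelope["to"], "at", envelope["sent"])
```

This is a simplification, of course -- real messaging systems vary -- but the basic point holds: strong content encryption and visible routing metadata routinely coexist.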
Leaving aside the terrorism challenges, encryption is affecting the investigations of kidnapping, child pornography, gang activity and other crimes. Federal, state, local and tribal law-enforcement officers can obtain legal authority to conduct electronic communications surveillance on terrorists and criminals. But encrypted devices and applications sometimes block access to the data. This means that even when the government has shown probable cause under the Fourth Amendment, it cannot acquire the evidence it seeks.
Yes, yes, the FBI and folks like the Manhattan DA's office keep making this claim, but every time they're asked to provide
actual evidence of investigations stymied because of encryption, they
come up empty. Official stats on lawful interception orders show that
encryption is almost never a problem. They just don't run into it.
Technology has outpaced the law. The core statute, the Communications Assistance for Law Enforcement Act, was enacted in 1994, more than a decade before the iPhone existed. The law requires telecommunications carriers—for instance, phone companies—to build into their equipment the capability for law enforcement to intercept communications in real time. The problem is that it doesn’t apply to other providers of electronic communications, including those supporting encrypted applications.
This is wrong. Technology has not outpaced the law -- quite the opposite. Thanks to technology, law enforcement has
more access to more information about every person alive than ever before in history. Technology now lets police know where basically everyone has been at any moment of the day, who they spoke with, who they called, and who they contacted via email. The fact that
one small slice of data might be encrypted hardly shows that technology has somehow outpaced the law.
Separately, yes, it's true that CALEA (the wiretapping statute) requires that phone calls can be tapped, but that's entirely different from undermining encryption. In fact, as we noted last week,
the law already makes clear that phone companies
are not required to backdoor encryption.
Federal Bureau of Investigation Director James Comey has said that one of the two Garland, Texas, shooters who died carrying out an attack on a Muhammad art exhibit in May exchanged 109 messages with an operative overseas. “We have no idea what he said,” Mr. Comey told the Senate this month, “because those messages were encrypted.” He described this as a “big problem”—and I couldn’t agree more.
Yes, yes, this is the example it took Comey
over a year to finally come up with, but again it's an incredibly weak one. Note: the encryption did not stop the FBI from knowing who the shooter was communicating with, because encryption does not impact the metadata. Yes, it may limit the ability to read the exact content of the messages, but the same would be true if the two had simply communicated via a phone call on an untapped line, or via a simple code that they knew and the FBI did not. This is really no different from any other criminal investigation, and it's not the encryption that's the problem.
Last month Manhattan District Attorney Cyrus R. Vance Jr. released an in-depth report specifically on “smartphone encryption and public safety.” Many cellphones, including those designed by Apple and Google, now encrypt by default all the data they store, which is accessible only with a passcode.
Yeah, and we talked about how
ridiculously wrong that report was at the time. And, again, default mobile encryption only applies to data stored on the phone itself, not to metadata. Apple would still hold the keys to most data backed up in the cloud, and the same goes for information shared with others, where encryption may not be used at all. The amount of data that is truly "unobtainable" is minimal -- which is why no one has any really good examples of it being a problem.
The challenges presented by encryption extend to financial transactions. In August Sen. Elizabeth Warren wrote letters to six federal agencies voicing concerns that banks were using Symphony, an encrypted messaging system that could prevent regulators from detecting illegal activities. The letter came shortly after New York’s top banking regulator, the New York State Department of Financial Services, raised the same concern with several major banks and Symphony’s developer.
In response, the banks agreed to store decryption keys with independent custodians, and Symphony agreed to retain electronic communications for seven years. All parties also agreed to a periodic review process to make sure that oversight keeps in sync with new technologies.
It would seem to me that daily financial flows shouldn’t command more attention than terrorist or criminal communications, yet here we are. Although the agreement described above may not be the solution for all encrypted communications, it does show that cooperative solutions are possible.
That is
not an apples-to-apples comparison by any stretch of the imagination. The concern with the banks exists because banking is a
highly regulated industry in which firms are legally required to keep records of certain communications. That's not true of
the general public, and unless Senator Burr is looking to wipe out the Fourth Amendment, he shouldn't even pretend these things have anything in common.
Second, what a cheap politician's trick to pull out the "daily financial flows shouldn’t command more attention than terrorist or criminal communications" line. This is blatant fear-mongering, because the issue is not about terrorists or criminals, but about you, me, and everyone reading this who has an expectation of privacy. The only way to break encryption for "terrorists and criminals" is
to make everyone less safe by putting in dangerous backdoors.
And every time we put backdoors into encryption, we see how they're abused -- such as with the recent
Juniper vulnerability.
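To see why a backdoor can't be scoped to "terrorists and criminals," here's a toy key-escrow sketch in Python (a hypothetical scheme invented for illustration, again using the third-party cryptography package -- not a description of any real proposal, or of the Juniper flaw itself):

```python
# Toy key-escrow scheme (hypothetical): every message is encrypted
# under one master key so "authorized" parties can decrypt on demand.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()   # the backdoor: one key for ALL users
escrow = Fernet(escrow_key)

# Two unrelated, perfectly lawful users of the system:
alice_msg = escrow.encrypt(b"a private, entirely legal message")
bob_msg = escrow.encrypt(b"someone else's unrelated conversation")

# The flaw: the key cannot tell a court order from a thief. Anyone who
# obtains it -- an insider, a breach, an implanted flaw like Juniper's --
# can read every user's traffic at once.
stolen = Fernet(escrow_key)
print(stolen.decrypt(alice_msg))
print(stolen.decrypt(bob_msg))
```

That's the core design problem: a single decryption capability that covers everyone is a single point of failure, whoever holds it.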
Finally, the "cooperative solution" in the case of the financial industry is an entirely different animal as well. Again, that's a limited use case in a specific, highly regulated industry. To suggest that, because of that specific use case, there must be some sort of "cooperative solution" for all encrypted communications once again highlights a near-total ignorance of how encryption works.
I and other lawmakers in Washington would like to work with America’s leading tech companies to solve this problem, but we fear they may balk. When Apple objected to a recent court order in a New York criminal case requiring it to unlock an iPhone running iOS 7—an operating system that Apple can unlock—the company refused, arguing: “This is a matter for Congress to decide.” On that point, Apple and I agree. It’s time to update the law.
You fear they may balk? You want to know why? Perhaps because
your friends in the intelligence community spent the last fifteen years breaking into their systems at every opportunity, undermining the trust and security of all of their users. You think
that might have something to do with it? Maybe?
Senator Burr is doing something incredibly dangerous here. He's misleading the American public in a totally ignorant way that will put our security at risk. He is making the world a more dangerous place, on purpose, because of a misunderstanding of how technology works. He has no place regulating technology issues at all.
Filed Under: backdoors, congress, encryption, going dark, richard burr, senate intelligence committee