from the you're-wrong dept
This is perhaps not surprising, but still disappointing. Former NYC mayor and current billionaire media/tech company boss Michael Bloomberg has
come down on the wrong side of the "going dark" encryption fight. In a Wall Street Journal op-ed (possible paywall link), he scolds tech execs for daring to side with Apple over the FBI and the Justice Department on the question of backdooring encryption. Bloomberg does not appear to actually understand the issues at play.
The fireworks and parades this weekend will give Americans a chance to celebrate the nation’s independence from England and show their love of country. But true patriotism involves more than flying the flag—and more than paying taxes and casting ballots. It requires putting America’s needs above individual interests when national security and public safety are at stake.
Generations of Americans have honored that principle, risking their lives to preserve a nation “conceived in Liberty,” as Lincoln remarked in his Gettysburg Address, “and dedicated to the proposition that all men are created equal.” Today, 1.3 million men and women serve in the military on active duty, often in dangerous situations overseas. Yet here at home, some executives in an industry that thrives on freedom—technology—are resisting government efforts to safeguard it. They are dangerously wrong.
Note the false framing here. Bloomberg is setting up the argument that backdooring encryption for the sake of the FBI/DOJ is "good for national security and public safety." He's wrong. It's not. It's not even close. It actually puts many more people at risk, because the only way to backdoor encryption effectively is to weaken that encryption for everyone who uses it. Yes, it means that the FBI/NSA won't be able to track some people, but it's a very small number of people, and they have other ways to track them without undermining the security of everyone else.
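To make the single-point-of-failure problem concrete, here's a toy sketch of generic key escrow (assuming Python and the third-party cryptography package; this illustrates the general idea, not any specific government proposal): if every message is also encrypted to an escrowed "backdoor" key, then whoever gets hold of that one key can read every user's traffic, not just a suspect's.

```python
# Toy illustration of why an escrowed "backdoor" key weakens everyone's security.
# Requires the third-party "cryptography" package (pip install cryptography).
# This is a sketch of generic key escrow, not any real product or proposal.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()   # the "backdoor" key law enforcement would hold

def encrypt_for_user(user_key: bytes, plaintext: bytes) -> dict:
    """Encrypt a message under the user's own key AND the escrow key."""
    return {
        "for_user": Fernet(user_key).encrypt(plaintext),
        "for_escrow": Fernet(escrow_key).encrypt(plaintext),  # extra copy = extra risk
    }

# Two unrelated users with their own keys:
alice_key, bob_key = Fernet.generate_key(), Fernet.generate_key()
msg_a = encrypt_for_user(alice_key, b"alice's private message")
msg_b = encrypt_for_user(bob_key, b"bob's private message")

# Anyone who obtains the single escrow key -- by court order, leak, or theft --
# can now read BOTH users' messages, not just one suspect's:
for msg in (msg_a, msg_b):
    print(Fernet(escrow_key).decrypt(msg["for_escrow"]))
```

The design problem is baked in: the escrow key is a master key, so the security of every user now depends on one secret never leaking.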
The freedom that Americans enjoy requires shared sacrifices, and not only by soldiers. “We the people” impose limits on our personal liberty to protect ourselves and those around us. We are free to speak our minds, but we cannot yell “Fire!” in a crowded theater. We are free to travel without restriction, but driving a vehicle requires a license, and boarding a plane requires official identification. We are free to smoke tobacco, but today in most states we cannot do so indoors.
This is also dangerously wrong. We've discussed this many times before, but the lame "we cannot yell fire in a crowded theater" line is simply incorrect. It's based on ignorance of the actual law in this space: the case that produced the line, Schenck v. United States, is no longer the accepted standard under the law. It's also a telltale sign of someone who doesn't have a strong argument for why they want to strip away rights, so they reach for a misleading and incorrect claim that you can't yell fire in a crowded theater (even though, in most cases, you actually can).
We also limit our right to privacy. The Fourth Amendment protects against “unreasonable searches and seizures,” but it also explicitly authorizes warrants based on probable cause. Every day, judges approve warrants authorizing searches of homes, cars and computers. Even our bodies can be subject to search warrants, as drunken-driving suspects learn when they attempt to refuse a blood test. Those suspected of other crimes may have their calls tapped and mail opened—all with the safeguard of an independent judiciary certifying the public need, to protect both our liberty and safety.
This is an especially intellectually dishonest move. He goes from "fire in a theater" to arguing that we already "limit our right to privacy" because judges issue warrants. But this is different. The Fourth Amendment directly includes the warrant exception, unlike the First Amendment, which includes no such exception. And, really, many people question plenty of the privacy violations that Bloomberg finds acceptable. That's not a huge surprise, though. After all, as mayor, Bloomberg was a major supporter of the unconstitutional stop-and-frisk program that the police used under his watch... until a court threw it out. Of course, he was also against any transparency regarding his own administration or the NYPD.
When Apple refused to unlock a cellphone used by one of the San Bernardino terrorists (and owned by his public employer), many in the tech industry came to the company’s defense. They argued, in effect, that they shouldn’t be forced to cooperate with a search warrant for one of their products, even though failure to comply could put more innocent lives at risk. Thankfully, the government was able to unlock the phone on its own. But next time, the public may not be so lucky. Imagine if the government is in possession of a cellphone that it has reason to believe contains information about an imminent hijacking—or an effort to detonate a dirty bomb. Should we allow the manufacturer to refuse a court order to unlock it?
This, again, totally misrepresents the situation. The issue was not the court order to unlock, but the fact that, given the encryption used on the phone, the only way to "unlock" it was to create a revamped operating system that undermined a number of key security features -- which would likely create much greater risk for everyone. This was not a question of just turning a key, as Bloomberg implies. The issue was not whether the government could force a company to "unlock" an encrypted phone, but whether it could force a company to build technology that deliberately undermines security.
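For a rough sense of why those security features matter, here's a back-of-the-envelope sketch (the per-guess timing and passcode lengths are illustrative assumptions, not Apple's actual specifications): once software protections like escalating delays and the wipe-after-ten-failures option are stripped out, a short numeric passcode falls to brute force almost immediately.

```python
# Back-of-the-envelope sketch: why stripping retry limits from the OS matters.
# The per-guess time and passcode lengths are illustrative assumptions,
# not Apple's actual specifications.
SECONDS_PER_GUESS = 0.08  # rough key-derivation time per passcode attempt

def worst_case_hours(digits: int) -> float:
    """Time to try every numeric passcode of the given length, in hours."""
    return (10 ** digits) * SECONDS_PER_GUESS / 3600

for digits in (4, 6, 10):
    print(f"{digits}-digit passcode: ~{worst_case_hours(digits):,.1f} hours worst case")

# With the shipped firmware, none of this matters: escalating delays and an
# optional wipe after repeated failed attempts make exhaustive guessing
# impractical. The demand was for new firmware with those protections removed --
# i.e., for Apple to build the tool that makes the brute force above possible.
```

Which is the point: the dispute was never about whether a warrant is valid, but about whether the company can be compelled to engineer that tool in the first place.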
And, really, this stupid game of "but what about the next time..." is ridiculous. THERE IS ALWAYS SECRET INFO THAT LAW ENFORCEMENT DOESN'T KNOW ABOUT. And it's not the end of the world. There is information in people's heads. There is information that has already been destroyed. And somehow, law enforcement survives. Information locked up by encryption is just more of that kind of info. It's not the end of the world. In fact, we deliberately designed our legal system with the recognition that law enforcement does not have an automatic right to every bit of information.
Of course not. We are a nation of laws, and no industry is above them. The Constitution doesn’t carve out an exception for tech companies.
This is the most frustrating line in the article, because it's complete bullshit. No one is arguing that tech companies are above the law. Exactly the opposite. They're questioning whether a valid warrant can require a company to construct a new tool that deliberately undermines security. Being "above the law" is doing things like
deliberately hiding documents and
emails from the public despite being legally required to reveal them. Being above the law is doing things like
refusing to accept a decision of the City Council against your unconstitutional policies. Those were things that Mayor Bloomberg did. What the tech companies are doing is not above the law at all.
Yet Apple responded to the investigation with a troubling announcement: In the future, phones will be designed to prevent even Apple from opening them, just as the makers of some messaging services have already done. Such a move would be an unprecedented rejection of public authority and a potentially catastrophic blow to public safety. The prospect of criminals and terrorists communicating with phones beyond the reach of government search warrants should send a shiver down the spine of every citizen.
This is again a misrepresentation of reality. Apple had already said that the phones would be designed that way well before this investigation. And the reason they did so was not to "reject public authority," but rather
because it's a better way to protect public safety. Don't ask me; ask the NYPD, which used to go around telling people to encrypt their phones to make them less appealing to thieves, who steal thousands upon thousands of those devices all the time, often looking to get at the info stored on them.
Google, Facebook, Snapchat and WhatsApp are all working to increase encryption in ways that will make it impossible for the courts and law-enforcement officials to obtain their users’ data. They argue that if they are forced to comply with government requests for data, terrorists will simply choose open-source encryption apps instead. But lone wolves are not always that sophisticated. Those that are may have no regard for investigations following their death. And for those that do want to cover their tracks: Why should we help them?
Again, the efforts they're making are not to keep law enforcement out; they're to keep criminals out. It's about protecting the public. And, as for "lone wolves" not always being "that sophisticated," that's exactly why this shouldn't be a huge concern. As we've noted time and time again, unless you're really good, you're probably going to make a mistake when you try to encrypt things, and you'll leak plenty of info for law enforcement. So if they're really not that sophisticated, none of this is actually a problem.
It’s true that encryption may make it harder for repressive regimes to crack down on dissent, but it also makes it harder for democratic societies to protect themselves against terrorists and criminals. We can work to undermine repressive regimes in ways that do not compromise our own safety, and we should expect tech leaders to help lead the way.
Again, this is misrepresenting why encryption is so important. It DOES NOT make it harder for democratic societies to protect themselves against terrorists or criminals. It does the exact opposite, by providing the tools for people to better protect themselves against both of those things.
It’s worth remembering that the U.S. taxpayers, beginning under President Eisenhower, funded the R&D that led to the development of the internet and other technological advances without which these companies would not exist. And while many of them claim to be concerned about their customers’ privacy, let’s also remember that many turn around and sell their customers’ personal information to advertisers. If anyone thinks they are more concerned about privacy than profits, I have a bridge to sell you.
Again, this is not the actual argument. Bloomberg is throwing out so many strawmen, you'd think they were on clearance. The fact that the government helped fund the development of some of these technologies is kind of irrelevant. It also helped fund plenty of research into encryption. So what? That doesn't mean we should undermine it all and make us all less safe. And the whole "but companies sell your info to advertisers" point is a complete red herring. Having stronger encryption actually gives individuals more ability to take control of their own info and prevent that kind of thing. Sure, companies are mostly interested in their own profits, as, I assume, is Bloomberg's own company. But that's also why they're building better encryption: they know that making their customers safer, and keeping their devices from being stolen or hacked, makes for a better product.
When America entered World War I, Thomas Edison devoted most of his time to naval research. His work led to various military inventions and improvements, including instruments to detect torpedoes. It is perhaps too idealistic to expect Silicon Valley’s best minds to give up their jobs to serve the government in the fight against terrorism. But a little cooperation shouldn’t be too much to ask.
Lots of people in Silicon Valley are proactively helping in the fight against terrorism. It's complete bullshit to argue that, because we think working encryption is necessary, people are refusing to help the government. The whole point is that breaking encryption doesn't help stop terrorism. It doesn't help stop crime. It puts us all at more risk.
It's too bad Bloomberg doesn't have anyone working for him who could have explained this to him before he sounded off so ignorantly in the WSJ. But, really, if Bloomberg thinks that tech companies need to build compromised, backdoored encryption, here's a solution: he can put some of his billions towards building just such a product, and see what the market thinks of it. If he's right and this is what the public needs to be safe, won't he be making tons of profits? After all, where else is he getting that bridge he wants to sell us...
Filed Under: backdoors, crypto wars, encryption, michael bloomberg, security, silicon valley, technology