One of the points that seems to be widely misunderstood by people who don't spend much time in the computer security world is that building secure encryption systems is really hard, and almost everything has some sort of vulnerability somewhere. This is why security researchers, cryptographers and security engineers are in a constant struggle to poke holes in encryption and then patch the systems back up. It's also why the demand for backdoors is idiotic: backdoors, in the form of undiscovered vulnerabilities, probably already exist. But purposely building in certain kinds of backdoors that the law forbids closing almost certainly blasts open much larger holes for those with nefarious intent to get in.
Case in point: over the weekend, computer science professor Matthew Green and some other researchers announced that they'd discovered a serious hole in the encryption used for Apple's iMessage platform, allowing a sophisticated hacker to access encrypted messages and pictures. And Green, who has been vocal about the ridiculousness of the DOJ's request against Apple, notes that this is yet more evidence that the DOJ's request is a bad idea:
“Even Apple, with all their skills — and they have terrific cryptographers — wasn’t able to quite get this right,” said Green, whose team of graduate students will publish a paper describing the attack as soon as Apple issues a patch. “So it scares me that we’re having this conversation about adding back doors to encryption when we can’t even get basic encryption right.”
It's worth noting that the flaw that he and his team found would not have helped the FBI get what it wants off of Syed Farook's iPhone, but it's still a reminder of just how complex cryptography currently is, at a time when people are trying to keep everyone out. Offer up any potential backdoor, and you're almost certainly blasting major holes throughout the facade.
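Whatever the specifics of the iMessage flaw turn out to be when the paper is published, one timeless lesson applies: ciphertext that isn't authenticated can be tampered with, and the recipient has no way to notice. Here's a minimal sketch of that weakness class (this is not Apple's iMessage protocol, just an illustration), using AES in CTR mode via the Python cryptography library:

```python
# Demonstration of why unauthenticated encryption is fragile:
# CTR mode is malleable, so an attacker can flip plaintext bits
# by flipping the corresponding ciphertext bits -- no key needed.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key, nonce = os.urandom(32), os.urandom(16)

def ctr(data: bytes) -> bytes:
    # CTR is a stream cipher: the same operation encrypts and decrypts.
    c = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return c.update(data) + c.finalize()

ct = ctr(b"pay alice $100")

# The attacker never learns the key, but rewrites "alice" to "mallo"
# by XORing the difference into bytes 4..8 of the ciphertext.
tampered = bytearray(ct)
for i, (old, new) in enumerate(zip(b"alice", b"mallo")):
    tampered[4 + i] ^= old ^ new

print(ctr(bytes(tampered)))  # b'pay mallo $100' -- decrypts "successfully"
```

An authenticated mode like AES-GCM would reject the tampered message with an error instead of cheerfully decrypting it. That's part of what "getting basic encryption right" means, and part of why it's so hard.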
Apple is getting ready to push out a software update shortly that will fix the flaw. And this, alone, is yet another reason why the DOJ's case is so dangerous -- since the method it wants to use to get into Farook's phone is to exploit Apple's ability to push software updates. Patching software holes is a major reason to accept regular software updates, but the FBI is now trying to co-opt that process to install unsafe code. That, in turn, may prompt people to avoid software updates altogether, which in most cases will make them less safe.
This is not all that surprising, but President Obama, during his SXSW keynote interview, appears to have joined the crew of politicians making misleading statements pretending to be "balanced" on the question of encryption. The interview (the link above should start at the very beginning) talks about a variety of issues related to tech and government, but eventually the President zeroes in on the encryption issue. The embed below should start at that point (if not, it's at the 1 hour, 16 minute mark in the video). Unfortunately, the interviewer, Evan Smith of the Texas Tribune, falsely frames the issue as one of "security v. privacy" rather than what it actually is -- which is "security v. security."
In case you can't watch that, the President says he won't comment directly on the Apple legal fights, but then launches into the standard politician talking point of "yes, we want strong encryption, but bad people will use it so we need to figure out some way to break in."
If you watch that, the President is basically doing the same thing as all the Presidential candidates, stating that there's some sort of equivalency on both sides of the debate and that we need to find some sort of "balanced" solution short of strong encryption that will somehow let in law enforcement in some cases.
This is wrong. This is ignorant.
To his at least marginal credit, the President (unlike basically all of the Presidential candidates) did seem to acknowledge the arguments of the crypto community, but then told them all that they're wrong. In some ways, this may be slightly better than those who don't even understand the actual issues at all, but it's still problematic.
Let's go through this line by line.
All of us value our privacy. And this is a society that is built on a Constitution and a Bill of Rights and a healthy skepticism about overreaching government power. Before smartphones were invented, and to this day, if there is probable cause to think that you have abducted a child, or that you are engaging in a terrorist plot, or you are guilty of some serious crime, law enforcement can appear at your doorstep and say 'we have a warrant to search your home' and they can go into your bedroom to rifle through your underwear to see if there's any evidence of wrongdoing.
Again, this is overstating the past and understating today's reality. Yes, you could always get a warrant to go "rifle through" someone's underwear, if you could present to a judge probable cause that such a search was reasonable. But that does not mean that the invention of smartphones really changed things as dramatically as President Obama presents here. For one, there has always been information that was inaccessible -- such as information that came from an in-person conversation, or information in our brains, or information that has been destroyed.
In fact, as lots of people have noted, today law enforcement has much more recorded evidence that it can obtain, totally unrelated to the encryption issue. This includes things like location information or information on people you called. That information used to not be available at all. So it's hellishly misleading to pretend that we've entered some new world of darkness for law enforcement when the reality is that the world is much, much brighter.
And we agree on that. Because we recognize that just like all our other rights, freedom of speech, freedom of religion, etc. there are going to be some constraints that we impose in order to make sure that we are safe, secure and living in a civilized society. Now technology is evolving so rapidly that new questions are being asked. And I am of the view that there are very real reasons why we want to make sure that government cannot just willy nilly get into everyone's iPhones, or smartphones, that are full of very personal information and very personal data. And, let's face it, the whole Snowden disclosure episode elevated people's suspicions of this.
[...]
That was a real issue. I will say, by the way, that -- and I don't want to go too far afield -- but the Snowden issue, vastly overstated the dangers to US citizens in terms of spying. Because the fact of the matter is that actually that our intelligence agencies are pretty scrupulous about US persons -- people on US soil. What those disclosures did identify were excesses overseas with respect to people who are not in this country. A lot of those have been fixed. Don't take my word for it -- there was a panel that was constituted that just graded all the reforms that we set up to avoid those charges. But I understand that that raised suspicions.
Again, at least some marginal kudos for admitting that this latest round was brought on by "excesses" (though we'd argue that it was actually unconstitutional, rather than mere overreach). And nice of him to admit that Snowden actually did reveal such "excesses." Of course, that raises a separate question: Why is Obama still trying to prosecute Snowden when he's just admitted that what Snowden did was clearly whistleblowing, in revealing questionable spying?
Also, the President is simply wrong that it was just about issues involving non-US persons. The major reform that has taken place wasn't about non-US persons at all, but rather about Section 215 of the PATRIOT Act, which was used almost entirely on US persons to collect all their phone records. So it's unclear why the President is pretending otherwise. The stuff outside of the US is governed by Executive Order 12333, and there's been no evidence at all that the President has changed that. I do agree, to some extent, that many do believe in an exaggerated view of NSA surveillance, and that's distracting. But the underlying issues about legality and constitutionality -- and the possibilities for abuse -- absolutely remain.
But none of that actually has to do with the encryption fight, beyond the accurate recognition that the government's actions, revealed by Snowden, caused many to take these issues more seriously. And, on that note, it would have been at least a little more accurate for the President to recognize that it wasn't Snowden who brought this on the government, but the government itself, by doing what it was doing.
So we're concerned about privacy. We don't want government to be looking through everybody's phones willy-nilly, without any kind of oversight or probable cause or a clear sense that it's targeted who might be a wrongdoer.
What makes it even more complicated is that we also want really strong encryption. Because part of us preventing terrorism or preventing people from disrupting the financial system or our air traffic control system or a whole other set of systems that are increasingly digitalized is that hackers, state or non-state, can just get in there and mess them up.
So we've got two values. Both of which are important.... And the question we now have to ask is, if technologically it is possible to make an impenetrable device or system where the encryption is so strong that there's no key. There's no door at all. Then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? What mechanisms do we have available to even do simple things like tax enforcement? Because if, in fact, you can't crack that at all, government can't get in, then everybody's walking around with a Swiss bank account in their pocket. So there has to be some concession to the need to be able to get into that information somehow.
The answer to those questions in that final paragraph is good old-fashioned detective work. In a time before smartphones, detectives were still able to catch child pornographers or disrupt terrorist plots. And, in some cases, the government failed to stop either of those things. But that wasn't because strong encryption stymied them; it's because there are always going to be some plots that people get away with. We shouldn't undermine our entire security setup just because there are some bad people out there. In fact, that makes us less safe.
Also: tax enforcement? Tax enforcement? Are we really getting to the point that the government wants to argue that we need to break strong encryption to better enforce taxes? Really? Again, there are lots of ways to go after tax evasion. And, yes, there are lots of ways that people and companies try to hide money from the IRS. And sometimes they get away with it. To suddenly say that we should weaken encryption because the IRS isn't good enough at its job just seems... crazy.
Now, what folks who are on the encryption side will argue, is that any key, whatsoever, even if it starts off as just being directed at one device, could end up being used on every device. That's just the nature of these systems. That is a technical question. I am not a software engineer. It is, I think, technically true, but I think it can be overstated.
This is the part that's most maddening of all. He almost gets the point right. He almost understands. The crypto community has been screaming from the hills for ages that introducing any kind of third party access to encryption weakens it for all, introducing vulnerabilities that ensure that those with malicious intent will get in much sooner than they would otherwise. The President is mixing up that argument with one of the other arguments in the Apple/FBI case, about whether it's about "one phone" or "all the phones."
But even assuming this slight mixup is a mistake, and that he does recognize the basics of the arguments from the tech community, to have him then say that this "can be overstated" is crazy. A bunch of cryptography experts -- including some who used to work for Obama -- laid out in a detailed paper the risks of undermining encryption. To brush that aside as some sort of rhetorical hyperbole -- to brush aside the realities of cryptography and math -- is just crazy.
Encryption expert Matt Blaze (whose research basically helped win Crypto War 1.0) responded to this argument by noting that the "nerd harder, nerds" argument fundamentally misunderstands the issue:
Figuring out how to build the reliable, secure systems required to "compromise" on crypto has long been a central problem in CS.
If you can't read that, Blaze is basically saying that all crypto includes backdoors -- they're known as vulnerabilities. And the key focus in crypto is closing those backdoors, because leaving them open is disastrous. And yet the government is now demanding that tech folks purposely put in more backdoors and not close them, without recognizing the simple fact that vulnerabilities in crypto always lead to disastrous results.
So the question now becomes that, we as a society, setting aside the specific case between the FBI and Apple, setting aside the commercial interests, the concerns about what could the Chinese government do with this, even if we trust the US government. Setting aside all those questions, we're going to have to make some decisions about how do we balance these respective risks. And I've got a bunch of smart people, sitting there, talking about it, thinking about it. We have engaged the tech community, aggressively, to help solve this problem. My conclusions so far is that you cannot take an absolutist view on this. So if your argument is "strong encryption no matter what, and we can and should in fact create black boxes," that, I think, does not strike the kind of balance that we have lived with for 200, 300 years. And it's fetishizing our phones above every other value. And that can't be the right answer.
This is not an absolutist view. It is not an absolutist view to say that anything you do to weaken the security of phones creates disastrous consequences for overall security, far beyond the privacy of the individuals holding those phones. And, as Julian Sanchez rightly notes, it's ridiculous that the status quo -- the previous compromise -- is now being framed as an "absolutist" position:
CALEA--with obligations on telecoms to assist, but user-side encryption protected--WAS the compromise. Now that's "absolutism".
Also, the idea that this is about "fetishizing our phones" is ridiculous. No one is even remotely suggesting that. No one is even suggesting -- as Obama hints -- that this is about placing phones "above and beyond" every other situation. It's entirely about the nature of computer security and how it works. It's about the risks to our security in creating deliberate vulnerabilities in our technologies. To frame that as "fetishizing our phones" is insulting.
There's a reason why the NSA didn't want President Obama to carry a Blackberry when he first became President. And there's a reason the President wanted a secure Blackberry. And it's not because of fetishism in any way, shape or form. It's because securing data on phones is freaking hard and it's a constant battle. And anything that weakens the security puts people in harm's way.
I suspect that the answer is going to come down to how do we create a system where the encryption is as strong as possible. The key is as secure as possible. It is accessible by the smallest number of people possible for a subset of issues that we agree are important. How we design that is not something that I have the expertise to do. I am way on the civil liberties side of this thing. Bill McRaven will tell you that I anguish a lot over the decisions we make over how to keep this country safe. And I am not interested in overthrowing the values that have made us an exceptional and great nation, simply for expediency. But the dangers are real. Maintaining law and order and a civilized society is important. Protecting our kids is important.
You suspect wrong. Because while your position sounds reasonable and "balanced" (and I've seen some in the press describe President Obama's position here as "realist"), it's actually dangerous. This is the problem. The President is discussing this like it's a political issue rather than a technological/math issue. People aren't angry about this because they're "extremists" or "absolutists" or people who "don't want to compromise." They're screaming about this because "the compromise" solution is dangerous. If there really were a way to have strong encryption with a secure key where only a small number of people could get in on key issues, then that would be great.
But the key point that all of the experts keep stressing is: that's not reality. So, no, the President's not being a "realist." He's being the opposite.
So I would just caution against taking an absolutist perspective on this. Because we make compromises all the time. I haven't flown commercial in a while, but my understanding is that it's not great fun going through security. But we make the concession because -- it's a big intrusion on our privacy -- but we recognize that it is important. We have stops for drunk drivers. It's an intrusion. But we think it's the right thing to do. And this notion that somehow our data is different and can be walled off from those other trade-offs we make, I believe is incorrect.
Again, this is not about "making compromises" or some sort of political perspective. And the people arguing for strong encryption aren't being "absolutist" about it because they're unwilling to compromise. They're saying that the "compromise" solution means undermining the very basis of how we do security and putting everyone at much greater risk. That's ethically horrific.
And, also, no one is saying that "data is different." There has always been information that is "walled off." What people are saying is that one consequence of strong encryption is that it has to mean that law enforcement is kept out of that information too. That does not mean they can't solve crimes in other ways. It does not mean that they don't get access to lots and lots of other information. It just means that this kind of content is harder to access, because we need it to be harder to access to protect everyone.
It's not security v. privacy. It's security v. security, where the security the FBI is fighting for is to stop the 1 in a billion attack and the security everyone else wants is to prevent much more likely and potentially much more devastating attacks.
Meanwhile, of all the things for the President to cite as an analogy, TSA security theater may be the worst. Very few people think it's okay, especially since it's been shown to be a joke. Setting that up as the precedent for breaking strong encryption is... crazy. And, on top of that, using the combination of TSA security and DUI checkpoints as evidence for why we should break strong encryption with backdoors again fails to recognize the issue at hand. Neither of those undermine an entire security setup.
We do have to make sure, given the power of the internet and how much our lives are digitalized, that it is narrow and that it is constrained and that there's oversight. And I'm confident this is something that we can solve, but we're going to need the tech community, software designers, people who care deeply about this stuff, to help us solve it. Because what will happen is, if everybody goes to their respective corners, and the tech community says "you know what, either we have strong perfect encryption, or else it's Big Brother and Orwellian world," what you'll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties, because the people who understand this best, and who care most about privacy and civil liberties have disengaged, or have taken a position that is not sustainable for the general public as a whole over time.
I have a lot of trouble with the President's line about everyone going to "their respective corners," as it suggests a ridiculous sort of tribalism in which the natural state is the tech industry against the government and even suggests that the tech industry doesn't care about stopping terrorism or child pornographers. That, of course, is ridiculous. It's got nothing to do with "our team." It has to do with the simple realities of encryption and the fact that what the President is suggesting is dangerous.
Furthermore, it's not necessarily the "Orwellian/Big Brother" issue that people are afraid of. That's a red herring from the "privacy v. security" mindset. People are afraid of this making everyone a lot less safe. No doubt, the President is right that if "something really bad" happens, the politics will swing one way -- but it's pretty ridiculous for him to be saying that, seeing as the latest skirmish in this battle is being fought by his very own Justice Department, and he's the one whose administration jumped on the San Bernardino attacks as an excuse to push this line of argument.
If the President is truly worried about stupid knee-jerk reactions following "something bad" happening, rather than trying to talk about "balance" and "compromise," he could and should be doing more to fairly educate the American public, and to make public statements about this issue and how important strong encryption is. Enough of this bogus "strong encryption is important, but... the children" crap. The children need strong encryption. The victims of crimes need encryption. The victims of terrorists need encryption. Undermining all that because just a tiny bit of information is inaccessible to law enforcement is crazy. It's giving up the entire ballgame to those with malicious intent, just so that we can have a bit more information in a few narrow cases.
President Obama keeps mentioning trade-offs, but it appears that he refuses to actually understand the trade-offs at issue here. Giving up on strong encryption is not about finding a happy middle compromise. Giving up on strong encryption is putting everyone at serious risk.
Who knew that Senator John McCain understood encryption better than actual cryptographers? Late last week, he wrote an op-ed for Bloomberg View, in which he trots out all the usual talking points on how Silicon Valley just needs to nerd harder to solve the "Going Dark" problem. There's lots of cluelessness in the piece, but let's focus on the big one:
Top cryptologists have reasonably cautioned that “new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws,” but this is not the end of the analysis. We recognize there may be risks to requiring such access, but we know there are risks to doing nothing.
Actually, it kind of is "the end of the analysis" because the core element of that analysis is the fact that any attempt to backdoor encryption doesn't just make security weaker, it puts basically everyone at much greater risk. It introduces cataclysmic problems for any system that stores information that needs to be kept secure and private.
The following sentence is equally inane, in which he tries to place the "risks" of backdooring encryption on the same plane as the risk of ISIS using encryption. Let's be clear here: the risk of backdooring encryption isn't just significantly larger than the risk of ISIS using encryption, they're not even in the same universe. Even worse, by backdooring encryption, you are almost certainly increasing the risk of ISIS as well, by giving them a massive vulnerability to attack and exploit. Trying to suggest that this is an "on the one hand, on the other hand" situation is so ridiculously ignorant, one wonders who the hell is advising Senator McCain on this topic.
The fact is that there are always some risks. Tens of thousands of people die in car accidents in the US every year, yet you don't hear Senator McCain weighing the risks of driving versus the risks of banning cars. And that's a much more reasonable position to stake out, because banning cars would actually reduce automobile deaths — but it would also cripple the economy. But here's the thing: backdooring encryption has the potential to do much more damage to the economy than banning automobiles, because it would create vulnerabilities that could completely shut down our economy. So, for McCain to pretend that there are somewhat equal risks on either side isn't just ignorant and meaningless, it's dangerous.
Some technologists and Silicon Valley executives argue that any efforts by the government to ensure law-enforcement access to encrypted information will undermine users’ privacy and make them less secure. This position is ideologically motivated and profit-driven, though not without merit. But, by speaking in absolute terms about privacy rights, they bring the discussion to a halt, while the security threat evolves.
Honestly, this is not true. I know that Comey's favorite line these days is that using strong encryption is a "business model decision," but Silicon Valley's interest in strong encryption doesn't appear to be driven by their own bottom lines, frankly. If it was, they would have adopted it much earlier. Strong encryption actually undermines some companies' business models, in that it makes it more difficult for them to collect the data that many of them rely on. The move towards stronger encryption has mostly been the result of a few things: (1) the fact that the NSA broke into their data centers and put their legitimate users at risk, (2) a better understanding of the wider risks from malicious attackers when you have weak encryption, and (3) user demands for privacy. The last one may have indirect business model benefits in that it keeps users happier, but to argue that keeping users happy is somehow a purely money-driven decision, and to frame it as somehow a bad thing, is pretty damn ridiculous.
And, honestly, while there are some activists who speak in absolute terms about "privacy rights," you rarely hear that from Silicon Valley companies. In fact, those who have absolute views on privacy tend to be the most critical of Silicon Valley companies for taking a much less principled view on "privacy rights." McCain pretending that this is driven by some sort of "privacy rights" advocacy suggests he's (again) woefully misinformed on this issue.
To be clear, encryption is often a very good thing. It increases the security of our online activities, provides the confidence necessary for economic growth through the Internet, and protects our privacy by securing some of our most important personal information, such as financial data and health records. Yet as with many technological tools, terrorist organizations are using encryption with alarming success.
Actually, they're not using encryption with "alarming success." There are very, very, very, very few examples of terrorists using encryption successfully. The Paris attackers? Unencrypted SMS. San Bernardino? Unencrypted social media communication.
The jihadists' followers and adherents use encryption to hide their communications within the U.S. FBI Director James Comey recently testified that the attackers in last year's Garland, Texas, shootings exchanged more than 100 text messages with an overseas terrorist, but law enforcement is still blinded to the content of those texts because they were encrypted.
Notice that this is the only example that comes up in these discussions. That's because it's the only example. And it's not even a very good one. Because, as with most encrypted communication, the metadata was still perfectly accessible. That's why they know that the attackers exchanged messages with a terrorist. Sure, they may not be able to read the direct contents of the messages, but the same thing would have been true if the attacker and the people he communicated with had worked out a code beforehand. Or, you know, if they had met and talked in person. Is McCain going to ban talking in person too?
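To make the metadata point concrete, here's a schematic of what an eavesdropper (or the FBI, with a court order) gets when only the content of a message is encrypted. The field names below are invented for illustration; they aren't any real protocol's wire format:

```python
# Hypothetical intercepted record: only the body is ciphertext.
intercepted = {
    "from":      "+1-555-0100",           # visible: who sent it
    "to":        "+90-555-0123",          # visible: who received it
    "timestamp": "2015-05-03T14:02:11Z",  # visible: when
    "body":      b"\x8f\x02\xa1L...",     # opaque: what was said
}

for field, value in intercepted.items():
    shown = "<ciphertext>" if field == "body" else value
    print(f"{field}: {shown}")
```

Who talked to whom, when, and how often stays fully visible -- which is exactly how investigators were able to count those 100-plus Garland text messages in the first place.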
Finally, McCain's "solution" to all of this is to make a law telling Silicon Valley to nerd harder and solve the problem... or else:
As part of this effort, Congress should consider legislation that would require U.S. telecommunications companies to adopt technological alternatives that allow them to comply with lawful requests for access to content, but that would not prescribe what those systems should look like. This would allow companies to retain flexibility to design their technologies to meet both their business needs and our national security interests.
In other words, despite the fact that all of the best cryptographers in the world have said that what you're asking for is basically impossible and would make everyone less safe, just do it anyway -- and do it in a way that when it falls apart and everyone is made more vulnerable, Congressional leaders like John McCain can spin around and blame the companies rather than themselves.
We have to encourage companies and individuals who rely on encryption to recognize that our security is threatened, not encouraged, by technologies that place vital information outside the reach of law enforcement. Developing technologies that aid terrorists like Islamic State is not only harmful to our security, but it is ultimately an unwise business model.
Does John McCain seriously not employ a single knowledgeable staffer who could point out to him that basically every encrypted technology that ISIS uses is not made by an American company? Seriously, look at the list of ISIS's preferred encryption technologies -- almost none of them come from the US.
So who, exactly, is developing technologies that "aid terrorists like Islamic State" and need their encryption undermined?
Meanwhile, we haven't even touched on the biggest issue, as was highlighted in that big paper from Harvard last week. And it's this: the whole Going Dark thing is a total myth, because for the tiny, tiny, tiny bit of information that is now blocked out by strong encryption, there's a mountain of other data that is now accessible to law enforcement and the intelligence community. Things have been getting lighter and lighter and lighter for decades.
Shouldn't a sitting Senator understand these basic facts?
We'll know things are really going wrong when government authorities are trying to innovate their way around math. (And maybe we're already headed that way with backdoors to encryption.) Hopefully, though, we'll be able to trust in math for the foreseeable future, and nevermind about the Banach-Tarski paradox. Math is hard.
Update: While the article in question claimed that Dr. Wertheimer was the Director of Research for the NSA, an email from the NSA alerts us that Wertheimer left the NSA before writing the article.
As you may recall, one of the big Snowden revelations was the fact that the NSA "took control" over a key security standard allowing backdoors to be inserted (or, at least, a weakness that made it easy to crack). It didn't take long for people to realize that the standard in question was Dual_EC_DRBG, or the Dual Elliptic Curve Deterministic Random Bit Generator. It also came out that the NSA had given RSA $10 million to push this compromised random bit generator as the default. That said, as we noted, many had already suspected something was up and had refused to use Dual_EC_DRBG. In fact, all the way back in 2007, there was a widespread discussion about the possibility of the NSA putting a backdoor in Dual_EC_DRBG, which is why so few actually trusted it.
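To see what a trapdoor in a random number generator buys its designer, here's a toy analogue of the Dual_EC_DRBG construction. This is a heavily simplified sketch: it uses modular exponentiation instead of the standard's elliptic-curve point multiplication, it skips the output truncation (which, in the real attack, adds a modest brute-force over the dropped bits), and all of the constants are made up:

```python
# Toy Dual_EC-style generator with a designer's trapdoor (illustration
# only -- NOT the real algorithm, curve, or constants).
p = 2**127 - 1            # a Mersenne prime; our toy group is Z_p*
Q = 3                     # public element Q
e = 0xC0FFEE              # the designer's SECRET trapdoor exponent
P = pow(Q, e, p)          # public element P; its relation to Q is hidden

def step(state):
    """Emit one output r = Q^s and advance the state to s' = P^s."""
    return pow(Q, state, p), pow(P, state, p)

s0 = 123456789            # honest user's secret seed
r1, s1 = step(s0)         # first "random" output
r2, _ = step(s1)          # second "random" output

# The designer sees only r1 on the wire, but knows e:
#   r1^e = Q^(s0*e) = (Q^e)^s0 = P^s0 = s1  ... the NEXT internal state.
recovered_s1 = pow(r1, e, p)
assert recovered_s1 == s1
predicted_r2, _ = step(recovered_s1)
assert predicted_r2 == r2   # every future output is now predictable
```

In other words, whoever generated the standard's P and Q -- and may have kept the relationship between them -- can turn one glimpse of "random" output into every key the generator will ever produce. That's why the question of where those curve points came from mattered so much.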
Still, to have the details come out in public was a pretty big deal, so it also seemed like a fairly big deal to see that the Director of Research at the NSA, Dr. Michael Wertheimer (also former Assistant Deputy Director and CTO in the Office of the Director of National Intelligence), had apparently written something of an apology in the latest Notices of the American Mathematical Society. In a piece entitled, "The Mathematics Community and the NSA," Wertheimer sort of apologizes, admitting that mistakes were made. After admitting that concerns were raised by Microsoft researchers in 2007, and again with the Snowden documents (though without saying why they were raised the second time), here's Wertheimer's "apology."
With hindsight, NSA should have ceased supporting the Dual_EC_DRBG algorithm immediately after security researchers discovered the potential for a trapdoor. In truth, I can think of no better way to describe our failure to drop support for the Dual_EC_DRBG algorithm as anything other than regrettable. The costs to the Defense Department to deploy a new algorithm were not an adequate reason to sustain our support for a questionable algorithm. Indeed, we support NIST's April 2014 decision to remove the algorithm. Furthermore, we realize that our advocacy for the Dual_EC_DRBG casts suspicion on the broader body of work NSA has done to promote secure standards. Indeed, some colleagues have extrapolated this single action to allege that NSA has a broader agenda to "undermine Internet encryption." A fair reading of our track record speaks otherwise. Nevertheless, we understand that NSA must be much more transparent in its standards work and act according to that transparency. That effort can begin with the AMS now.
However, as security researcher/professor Matthew Green quickly shot back, this is a bullshit apology, because he's really only apologizing for not dropping the standard when they got caught red-handed back in 2007.
The trouble is that on closer examination, the letter doesn't express regret for the inclusion of Dual EC DRBG in national standards. The transgression Dr. Wertheimer identifies is simply the fact that NSA continued to support the algorithm after major questions were raised. That's bizarre.
Green also takes on Wertheimer's weak attempt to still defend pushing the compromised Dual_EC_DRBG as ridiculous. Here were Wertheimer's arguments for why it was still okay:
The Dual_EC_DRBG was one of four random number generators in the NIST standard; it is neither required nor the default.
The NSA-generated elliptic curve points were necessary for accreditation of the Dual_EC_DRBG but only had to be implemented for actual use in certain DoD applications.
The trapdoor concerns were openly studied by ANSI X9F1, NIST, and by the public in 2007.
But, again, those don't make much sense and actually make Wertheimer's non-apology that much worse. As Green notes, even though there were other random number generators, the now infamous RSA deal did lead some to use it since it was the "default" in a popular software library and because NIST had declared the standard safe, meaning that people trusted it. Green also goes into great detail describing how the second point is also incredibly misleading. It's worth reading his full explanation, but the short version is that despite some people fearing the NSA's plan would have a backdoor, the details and the possible "alternatives" to avoid that were completely hidden away and more or less dropped.
And that final point, well... really? Again, that's basically saying, "Well, people thought we might have put in a backdoor, but couldn't prove it, but there, you guys had your chance to debate it." Nevermind the fact that there actually was a backdoor and it wasn't confirmed until years later. And, as Green notes, many of the concerns were actually raised earlier and swept under the rug. Also, the standard was pushed and adopted by RSA as a default long before some of these concerns were raised as well.
This might all be academic, but keep this in mind: we now know that RSA Security began using the Dual EC DRBG random number generator in BSAFE -- as the default, I remind you -- in 2004. That's three years during which concerns were not openly studied by the public.
To state that the trapdoor concerns were 'openly' studied in 2007 is absolutely true. It's just completely irrelevant.
In other words, this isn't an apology. It's an apology that the NSA got caught (and didn't stop pushing things the first time it got caught), and then a weak defense of why they still went ahead with a compromised offering.
Wertheimer complains that this one instance has resulted in distrust from the mathematics and cryptography community. If so, his weak response isn't going to help very much.
They can promise strong encryption. They just need to figure out how they can provide us plain text. - FBI General Counsel Valerie Caproni, September 27, 2010
[W]e're in favor of strong encryption, robust encryption. The country needs it, industry needs it. We just want to make sure we have a trap door and key under some judge's authority where we can get there if somebody is planning a crime. - FBI Director Louis Freeh, May 11, 1995
Here we go again. Apple has enabled (and Google has long announced it will enable) basic encryption on mobile devices. And predictably, law enforcement has responded with howls of alarm.
We've seen this movie before. Below is a slightly adapted blog post from one we posted in 2010, the last time the FBI was seriously hinting that it was going to try to require that all communications systems be easily wiretappable, by mandating "back doors" into any encryption systems. We marshaled eight "epic failures" of regulating crypto at that time, all of which are still salient today. And in honor of the current debate, we've added a ninth:
. . .
If the government's howls of protest at the idea that people will be using encryption sound familiar, it's because regulating and controlling consumer use of encryption was a monstrous proposal officially declared dead in 2001, after threatening Americans' privacy, free speech rights, and innovation for nearly a decade. But like a zombie, it's now rising from the grave, bringing the same disastrous flaws with it.
For those who weren't following digital civil liberties issues in 1995, or for those who have forgotten, here's a refresher list of why forcing companies to break their own privacy and security measures by installing a back door was a bad idea 15 years ago:
It will create security risks. Don't take our word for it. Computer security expert Steven Bellovin has explained some of the problems. First, it's hard to secure communications properly even between two parties. Cryptography with a back door adds a third party, requiring a more complex protocol (a toy sketch of this structural problem follows this item), and as Bellovin puts it: "Many previous attempts to add such features have resulted in new, easily exploited security flaws rather than better law enforcement access." It doesn't end there. Bellovin notes:
Complexity in the protocols isn't the only problem; protocols require computer programs to implement them, and more complex code generally creates more exploitable bugs. In the most notorious incident of this type, a cell phone switch in Greece was hacked by an unknown party. The so-called 'lawful intercept' mechanisms in the switch — that is, the features designed to permit the police to wiretap calls easily — was abused by the attacker to monitor at least a hundred cell phones, up to and including the prime minister's. This attack would not have been possible if the vendor hadn't written the lawful intercept code.
More recently, as security researcher Susan Landau explains, "an IBM researcher found that a Cisco wiretapping architecture designed to accommodate law-enforcement requirements — a system already in use by major carriers — had numerous security holes in its design. This would have made it easy to break into the communications network and surreptitiously wiretap private communications."
This isn't just a problem for you and me and millions of companies that need secure communications. What will the government itself use for secure communications? The FBI and other government agencies currently use many commercial products — the same ones they want to force to have a back door. How will the FBI stop people from un-backdooring their deployments? Or does the government plan to stop using commercial communications technologies altogether?
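Here's the toy sketch promised above of Bellovin's structural point: the moment a protocol wraps the session key for a third party, every message is only as secret as the escrow key. (This is deliberately simplified -- real systems would use hybrid public-key encryption, not XOR -- but the architecture is the point.)

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

recipient_key = secrets.token_bytes(32)  # recipient's long-term key
escrow_key    = secrets.token_bytes(32)  # the government's escrow key

def send(plaintext: str) -> dict:
    session = secrets.token_bytes(32)    # fresh key for this message
    return {
        # Without escrow, only this wrapped copy would exist:
        "for_recipient": xor(session, recipient_key),
        # The mandate adds a second copy -- a standing skeleton key:
        "for_escrow":    xor(session, escrow_key),
        "ciphertext":    xor(plaintext.encode(), session),
    }

msg = send("meet at noon")
# Anyone who steals the ONE escrow key reads EVERY user's messages:
stolen_session = xor(msg["for_escrow"], escrow_key)
print(xor(msg["ciphertext"], stolen_session).decode())  # meet at noon
```

Compromising recipient_key exposes one user; compromising escrow_key exposes everyone at once. That's the "new, easily exploited security flaw" in miniature.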
It won't stop the bad guys. Users who want strong encryption will be able to get it — from Germany, Finland, Israel, and many other places in the world where it's offered for sale and for free. In 1996, the National Research Council did a study called "Cryptography's Role in Securing the Information Society," nicknamed CRISIS. Here's what they said:
Products using unescrowed encryption are in use today by millions of users, and such products are available from many difficult-to-censor Internet sites abroad. Users could pre-encrypt their data, using whatever means were available, before their data were accepted by an escrowed encryption device or system. Users could store their data on remote computers, accessible through the click of a mouse but otherwise unknown to anyone but the data owner, such practices could occur quite legally even with a ban on the use of unescrowed encryption. Knowledge of strong encryption techniques is available from official U.S. government publications and other sources worldwide, and experts understanding how to use such knowledge might well be in high demand from criminal elements. — CRISIS Report at 303
None of that has changed. And of course, more encryption technology is more readily available today than it was in 1996. So unless the government wants to mandate that you are forbidden to run anything that is not U.S. government approved on your devices, they won't stop bad guys from getting access to strong encryption.
It will harm innovation. In order to ensure that no "untappable" technology exists, we'll likely see a technology mandate and a draconian regulatory framework. The implications of this for America's leadership in innovation are dire. Could Mark Zuckerberg have built Facebook in his dorm room if he'd had to build in surveillance capabilities before launch in order to avoid government fines? Would Skype have ever happened if it had been forced to include an artificial bottleneck to allow government easy access to all of your peer-to-peer communications? This has especially serious implications for the open source community and small innovators. Some open source developers have already taken a stand against building back doors into software.
It will harm US business. If, thanks to this proposal, US businesses cannot innovate and cannot offer truly secure products, we're just handing business over to foreign companies who don't have such limitations. Nokia, Siemens, and Ericsson would all be happy to take a heaping share of the communications technology business from US companies. And it's not just telecom carriers and VOIP providers at risk. Many game consoles that people can use to play over the Internet, such as the Xbox, allow gamers to chat with each other while they play. They'd have to be tappable, too.
It will cost consumers. Any additional mandates on service providers will require them to spend millions of dollars making their technologies compliant with the new rules. And there's no real question about who will foot the bill: the providers will pass those costs onto their customers. (And of course, if the government were to pay for it, they would be using taxpayer dollars.)
It will be unconstitutional. Of course, we wouldn't be EFF if we didn't point out the myriad constitutional problems. The details of how a cryptography regulation or mandate would be unconstitutional may vary, but there are serious problems with nearly every iteration of a "no encryption allowed" proposal that we've seen so far. Some likely problems:
The First Amendment would likely be violated by a ban on all fully encrypted speech.
The First Amendment would likely not allow a ban of any software that can allow untappable secrecy. Software is speech, after all, and this is one of the key ways we defeated this bad idea last time.
The Fourth Amendment would not allow requiring disclosure of a key to the backdoor into our houses so the government can read our "papers" in advance of a showing of probable cause, and our digital communications shouldn't be treated any differently.
The Fifth Amendment would be implicated by required disclosure of private papers and the forced utterance of incriminating testimony.
Right to privacy. Both the right to be left alone and informational privacy rights would be implicated.
It will be a huge outlay of tax dollars. As noted below, wiretapping is still a relatively rare tool of government (at least for the FBI in domestic investigations -- the NSA is another matter as we now all know). Yet the extra tax dollars needed to create a huge regulatory infrastructure staffed with government bureaucrats who can enforce the mandates will be very high. So, the taxpayers would end up paying for more expensive technology, higher taxes, and lost privacy, all for the relatively rare chance that motivated criminals will act "in the clear" by not using encryption readily available from a German or Israeli company or for free online.
The government hasn't shown that encryption is a problem. How many investigations have been thwarted or significantly harmed by encryption that could not be broken? In 2009, the government reported only one instance of encryption that they needed to break out of 2,376 court-approved wiretaps, and it ultimately didn't prevent investigators from obtaining the communications they were after. This truth was made manifest in a recent Washington Post article written by an ex-FBI agent. While he came up with a scary kidnapping story to start his screed, device encryption simply had nothing to do with the investigation. The case involved an ordinary wiretap. In 2010, the New York Times reported that the government officials pushing for this have only come up with a few examples (and it's not clear that all of the examples actually involve encryption) and no real facts that would allow independent investigation or confirmation. More examples will undoubtedly surface in the FBI's PR campaign, but we'll be watching closely to see if underneath all the scary hype there's actually a real problem demanding this expensive, intrusive solution.
Mobile devices are just catching up with laptops and other devices. Disk encryption just isn't that new. Laptops and desktop computers have long had disk encryption features that the manufacturers have absolutely no way to unlock. Even for simple screen locks with a user password, the device maker or software developer doesn't automatically know your password or have a way to bypass it or unlock the screen remotely.
Although many law enforcement folks don't really like disk encryption on laptops and have never really liked it, and we understand that some lobbied against it in private, we haven't typically heard them suggest in public that it was somehow improper for these vendors not to have a backdoor to their security measures.
That makes us think that the difference here is really just that some law enforcement folks think that phones are just too popular and too useful to have strong security. But strong security is something we all should have. The idea that basic data security is just a niche product and that ordinary people don't deserve it is, frankly, insulting. Ordinary people deserve security just as much as elite hackers, sophisticated criminals, cops and government agents, all of whom have ready access to locks for their data.
The real issue with encryption may simply be that the FBI has to use more resources when they encounter it than when they don't. Indeed, Bellovin argues: "Time has also shown that the government has almost always managed to go around encryption." (One circumvention that's worked before: keyloggers.) But if the FBI's burden is the real issue here, then the words of the CRISIS Report are even truer today than they were in 1996:
It is true that the spread of encryption technologies will add to the burden of those in government who are charged with carrying out certain law enforcement and intelligence activities. But the many benefits to society of widespread commercial and private use of cryptography outweigh the disadvantages.
The mere fact that law enforcement's job may become a bit more difficult is not a sufficient reason for undermining the privacy and security of hundreds of millions of innocent people around the world who will be helped by mobile disk encryption. Or as Chief Justice John Roberts recently observed in another case rejecting law enforcement's broad demands for access to the information available on our mobile phones: "Privacy comes at a cost."
Last night, while the mainstream press was yammering on about the security implications of Microsoft ending support of Windows XP (it's already vulnerable, this won't really change anything), a much bigger issue was concerning security folks. A massive vulnerability in OpenSSL, called Heartbleed, was revealed. As Matt Blaze notes, the bug actually leaks data beyond what it's protecting, which makes it worse than no crypto at all. The vulnerability likely impacts a huge number of servers -- including Yahoo's (many other major sites, including Google, Facebook, Twitter, Dropbox and Microsoft are apparently not impacted by this). Oh, and the vulnerability has been there for years. Over at the Tor Project, they made the most succinct statement of how serious this is:
If you need strong anonymity or privacy on the Internet, you might want to stay away from the Internet entirely for the next few days while things settle.
Of course, that also means that if you needed strong anonymity or privacy on the internet, there's a good chance some of the services you use left you vulnerable for quite some time until now.
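The bug itself is brutally simple. Schematically (in Python rather than OpenSSL's actual C, and with a made-up memory layout), the heartbeat handler echoed back however many bytes the request claimed to contain, without checking that claim against the payload that actually arrived:

```python
# Schematic server memory: the echo buffer sits right next to whatever
# secrets the process happens to be holding (keys, passwords, cookies).
memory = bytearray(b"PAYLOAD" b"...private-key-bits...user:hunter2...")

def heartbeat(payload: bytes, claimed_len: int) -> bytes:
    # BUG: trusts the length field in the request rather than the
    # length of the payload that was actually received.
    return bytes(memory[:claimed_len])

print(heartbeat(b"PAYLOAD", 7))    # honest request: echoes 7 bytes back
print(heartbeat(b"PAYLOAD", 44))   # attack: "echoes" the secrets, too
```

The real attack could pull up to 64KB of adjacent memory per heartbeat, over and over, reportedly without leaving a trace in server logs. The fix was essentially a bounds check; the cost of its absence was years of exploitable servers.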
Among the many problems with President Obama's weak statement concerning NSA surveillance was the fact that he didn't even address the serious issue of the NSA undermining cryptography with backdoors. The White House's task force had included a recommendation to end this practice, and the President appeared to ignore it entirely. Now, a large group of US computer security and cryptography researchers have sent a strongly worded open letter to the President condemning these efforts (and his failure to stop the program).
Indiscriminate collection, storage, and processing of unprecedented amounts of personal information chill free speech and invite many types of abuse, ranging from mission creep to identity theft. These are not hypothetical problems; they have occurred many times in the past. Inserting backdoors, sabotaging standards, and tapping commercial data-center links provide bad actors, foreign and domestic, opportunities to exploit the resulting vulnerabilities.
The value of society-wide surveillance in preventing terrorism is unclear, but the threat that such surveillance poses to privacy, democracy, and the US technology sector is readily apparent. Because transparency and public consent are at the core of our democracy, we call upon the US government to subject all mass-surveillance activities to public scrutiny and to resist the deployment of mass-surveillance programs in advance of sound technical and social controls. In finding a way forward, the five principles promulgated at http://reformgovernmentsurveillance.com/ provide a good starting point.
The choice is not whether to allow the NSA to spy. The choice is between a communications infrastructure that is vulnerable to attack at its core and one that, by default, is intrinsically secure for its users. Every country, including our own, must give intelligence and law-enforcement authorities the means to pursue terrorists and criminals, but we can do so without fundamentally undermining the security that enables commerce, entertainment, personal communication, and other aspects of 21st-century life. We urge the US government to reject society-wide surveillance and the subversion of security technology, to adopt state-of-the-art, privacy-preserving technology, and to ensure that new policies, guided by enunciated principles, support human rights, trustworthy commerce, and technical innovation.
That ReformGovernmentSurveillance.com site is the one launched by a bunch of the biggest internet companies, so it's good to see these researchers and technologists lining up behind that effort as well.
One of the things that's been glaring about all of the investigations and panels and research into these programs is that they almost always leave out actual technologists, and especially leave out security experts. That seems like a big weakness, and now those security researchers are speaking out anyway. At some point, the politicians backing these programs are going to have to realize that almost no one who actually understands this stuff thinks what they're doing is the right way to go about this.
Alan Turing is a name you're required to know if you have any interest in computers, cryptology or artificial intelligence. The famed "Turing Test" is still used as one way to test the functionality of artificial intelligence, he's considered the father of modern computing, and his work in decrypting the Nazis' Enigma code quite possibly shortened the war by years, saving who knows how many lives from an even further prolonged conflict. The word hero gets tossed around a lot these days, too often applied to athletes and entertainers when it should probably be reserved for people like Turing. He was an amazing person, smart as hell, and dedicated to a craft that unarguably moved humanity forward and simultaneously saved lives.
And, in 1952, he was convicted of being a homosexual and sentenced to chemical castration by hormone injection, leading to his suicide two years later, in 1954. And, though it sadly took sixty years, the Queen has officially pardoned Turing for his non-crime.
Announcing the pardon, Grayling said: "Dr. Alan Turing was an exceptional man with a brilliant mind. His brilliance was put into practice at Bletchley Park during the second world war, where he was pivotal to breaking the Enigma code, helping to end the war and save thousands of lives.
"His later life was overshadowed by his conviction for homosexual activity, a sentence we would now consider unjust and discriminatory and which has now been repealed. Dr. Turing deserves to be remembered and recognised for his fantastic contribution to the war effort and his legacy to science. A pardon from the Queen is a fitting tribute to an exceptional man."
It is undoubtedly a good thing that Turing has been pardoned, though the need for such a pardon should never have arisen. For a government to have chemically castrated one of their very best was a crime for which I issue no pardon of my own. And that's important, because the very same queen that was queening over the UK when Turing was convicted, sentenced, and killed himself was the same queen that queeningly issued this pardon. And, amazingly, it took Elizabeth the Second four years to do so after then-Prime Minister Gordon Brown issued an "unequivocal apology" to Turing and his family. There is a very firm lesson here for all of us in how we treat one another, even those who are different from us.
Writer David Leavitt, professor of English at the University of Florida and author of The Man Who Knew Too Much: Alan Turing and the Invention of the Computer (2006), said it was "great news". The conviction had had "a profound and devastating" effect on Turing, Leavitt said, as the mathematician felt he was being "followed and hounded" by the police "because he was considered a security risk".
"There was this paranoid idea in 1950s England of the homosexual traitor, that he would be seduced by a Russian agent and go over to the other side," Leavitt said. "It was such a misjudgment of Alan Turing because he was so honest, and was so patriotic."
More importantly, it was a misjudgment of Alan Turing as a human being. To use our fear and dislike of those that are different from us to completely negate the possible benefits our fellow human Alan Turing could have brought us had he not been so abused shows the very worst in all of us. So, while it may feel warm and fuzzy that Turing has been officially pardoned, I'd suggest we all keep our eye on the ball this holiday season and make an effort to judge each other not on old and antiquated biases, but on our character and actions. Had humanity done so, who knows where Alan Turing could have brought us? And I would hope that any fear some of us might have of those that are different from us would be outweighed by the fear of losing the contributions of those very same people.
There's a reason why patent trolls love east Texas -- and a big part of that is that the juries there have a long history of favoring patent holders, no matter how ridiculous or trollish the claims. That was on display last night, when the jury in Marshall, Texas sided with patent troll Erich Spangenberg and his TQP shell company over Newegg. As we've been describing, Newegg brought out the big guns to prove pretty damn thoroughly that this guy Mike Jones' encryption patent was both not new at the time it was granted and, more importantly, totally unrelated to the encryption that Newegg and other ecommerce providers rely on. Having Whit Diffie (who invented public key cryptography) and Ron Rivest (who basically made it practical in real life) present on your behalf, showing that they did everything prior to Jones' patent, while further showing that what Newegg was doing relied on their work, not Jones', should have ended the case.
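For a sense of why Diffie and Rivest were the ultimate prior-art witnesses: the core idea Diffie published with Martin Hellman back in 1976 -- two parties agreeing on a secret over a public channel -- fits in a few lines. Here's a toy sketch (toy-sized numbers for readability; real deployments use 2048-bit-plus groups or elliptic curves):

```python
import secrets

# Public parameters, agreed on in the open (toy-sized for illustration).
p = 2**127 - 1   # a prime modulus
g = 3            # a public base

a = secrets.randbelow(p - 2) + 2   # Alice's private exponent
b = secrets.randbelow(p - 2) + 2   # Bob's private exponent

A = pow(g, a, p)   # Alice sends A over the wire, in the clear
B = pow(g, b, p)   # Bob sends B over the wire, in the clear

# Each side raises the other's public value to its own secret exponent;
# both arrive at g^(a*b) mod p without ever transmitting a secret.
assert pow(B, a, p) == pow(A, b, p)
print("shared key agreed over a public channel")
```

The encryption protecting every ecommerce checkout, Newegg's included, descends from that idea and from the RSA work that made it practical -- which is the work the jury was asked to disbelieve.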
But, apparently TQP's lawyers' technique of attacking Diffie's credibility somehow worked. The jury said both that the patent was valid and that Newegg infringed -- and they awarded TQP $2.3 million -- a little less than half of what TQP wanted, but still a lot more than TQP settled with many other companies (including those with much bigger ecommerce operations than Newegg). In other words, yet another travesty of justice from a jury in east Texas. Newegg will appeal, as it did in its last big patent troll lawsuit (which was much bigger), against Soverain Software. Again, Newegg had lost in East Texas, but prevailed big time on appeal. Hopefully history repeats itself.
Joe Mullin's coverage (linked above) has a bunch of little tidbits about how everyone responded to the verdict, but I think Diffie's response is the most honest. Asked how he was feeling:
"Distressed," he said. "I was hoping to be rid of this business."
Yeah, he's not the only one. The sheer ridiculousness of a jury simply not believing the very people who created the very building blocks of modern encryption, and instead buying the story of someone who did nothing special either with the concept behind his patent or with that patent once it existed, is just distressing. It shows how arbitrary jury trials can be, especially when you have jurors who simply don't understand the technology or the history at play. Blech. I think I may have to go buy an anti-patent troll t-shirt from Newegg.