Former NSA Official Argues The Real Problem With Undisclosed Exploits Is Careless End Users
from the sorry-about-all-the-ransomware dept
As leaked NSA software exploits have been redeployed to cause computer-based misery all over the world, the discussion about vulnerability disclosures has become louder. The argument for secrecy is based on the assumption that fighting an existential threat (terrorism, but likely also a variety of normal criminal behavior) outweighs concerns the general public might have about the security of their software/data/personal information. Plenty of recent real-world examples (hospital systems ransomed! etc.) do the arguing for those seeking expanded disclosure of vulnerabilities and exploits.
Former Deputy Director of the NSA Rick Ledgett appears on the pages of Lawfare to argue against disclosure, just as one would have gathered by reading his brief author bio. Ledgett's arguments, however, feel more like dodges. First off, Ledgett says the NSA shouldn't have to disclose every vulnerability/exploit it has in its arsenal, an argument very few on the other side of the issue are actually making. Then he says arguments against exploit hoarding "oversimplify" the issue.
The WannaCry and Petya malware, both of which are partially based on hacking tools allegedly developed by the National Security Agency, have revived calls for the U.S. government to release all vulnerabilities that it holds. Proponents argue that this would allow patches to be developed, which in turn would help ensure that networks are secure. On its face, this argument might seem to make sense—but it is a gross oversimplification of the problem, one that not only would not have the desired effect but that also would be dangerous.
At this point, you'd expect Ledgett to perform some de-simplification. Instead, the post detours for a bit to do some victim-blaming. It's not the NSA's fault if undisclosed exploits wreak worldwide havoc. It's the end users who are the problem -- the ones who (for various reasons) use outdated system software or don't keep current with patches. This isn't a good argument to make for the very reasons outlined in Ledgett's opening paragraph: software vendors can't patch flaws they're unaware of. This is where disclosure would help protect more users, even if it meant the loss of some surveillance intercepts.
Then Ledgett argues the NSA's leaked exploits weren't really the problem. If they hadn't been available, the malware purveyors just would have used something else.
The actors behind WannaCry and Petya, believed by some to be from North Korea and Russia, respectively, had specific goals when they unleashed their attacks. WannaCry seemed to be straightforward but poorly executed ransomware, while Petya appeared to have a more sinister, destructive purpose, especially in the early Ukraine-based infection vector. Those actors probably would have used whatever tools were available to achieve their goals; had those specific vulnerabilities not been known, they would have used others. The primary damage caused by Petya resulted from credential theft, not an exploit.
This is undoubtedly true. Bad actors use whatever tools help them achieve their ends. It's just that these specific cases -- the cases used by Ledgett to argue against increased disclosure -- were based on NSA exploits vendors hadn't been informed of yet. The patches that addressed more current vulnerabilities weren't issued until after the NSA told Microsoft about them, and it only did that because its toolset was no longer under its control.
Ledgett also points out that the NSA does better than most state entities in terms of disclosure:
Most of the vulnerabilities discovered by the U.S. government are disclosed, and at the National Security Agency the percentage of vulnerabilities disclosed to relevant companies has historically been over 90 percent. This is atypical, as most world governments do not disclose the vulnerabilities they find.
Maybe so, but there's not much honor in merely being better than the worst governments. Ledgett only says the NSA is better than "most." This doesn't turn the NSA into a beacon of surveillance state forthrightness. All it does is place it above governments less concerned about the security and wellbeing of their citizens.
Ledgett then goes back to the well, claiming a) the two recent attacks had nothing to do with the NSA, and b) disclosing vulnerabilities would make the NSA less effective.
WannaCry and Petya exploited flaws in software that had either been corrected or superseded, on networks that had not been patched or updated, by actors operating illegally. The idea that these problems would be solved by the U.S. government disclosing any vulnerabilities in its possession is at best naive and at worst dangerous. Such disclosure would be tantamount to unilateral disarmament in an area where the U.S. cannot afford to be unarmed… Neither our allies nor our adversaries would give away the vulnerabilities in their possession, and our doing so would probably cause those allies to seriously question our ability to be trusted with sensitive sources and methods.
The problem here is that Ledgett ignores the obvious: leaked NSA tools helped create the problem. The NSA never disclosed these vulnerabilities to affected software vendors -- at least not until it became obvious it could no longer keep these tools secret.
I'm guessing the NSA is already living through the last part of Ledgett's paragraph. A set of effective, still-undisclosed vulnerabilities being digitally spirited away and dumped into the public's lap probably makes it less likely foreign surveillance partners will be sharing their malware toolkits with the NSA.
This leads right into another argument against vulnerability hoarding: it has been shown with complete clarity that the NSA can't guarantee its exploits will never be used by criminals and malicious governments. The leak of its toolkit shows any suggestion that only the "good guys" will have access to undisclosed vulnerabilities is both ignorant and arrogant. The NSA isn't untouchable. Neither are all the surveillance partners the NSA has shared its tools with.
In the end, it's the private sector's fault, according to Ledgett. The solution is for vendors to write better software and end users to patch more frequently. This is good advice, but not an absolution of the NSA's vulnerability secrecy.
The NSA needs to do a better job of balancing its needs against the security of the general public. Very few people are arguing the NSA should have zero undisclosed exploits. But the exploits dumped by the Shadow Brokers affected older versions of Microsoft system software dating back to Windows XP, and they still weren't patched until the exploits had already been made public. These were exploits some in the NSA thought were too powerful, and yet the NSA did nothing until the malware offspring of its secret exploit stash were taking down systems all over the world.
Filed Under: disclosure, exploits, malware, nsa, rick ledgett, vep, vulnerabilities
Reader Comments
And if companies manage to write secure software, the NSA would agitate for backdoors to be implemented, just like they want backdoors in encryption. Such backdoors will also be exploitable by the bad guys, when like those exploits, somebody accidentally or explicitly leaks them.
Raise Your Hand
Adobe Flash, the same..
iTunes? DOES report on you.
MS's music player had a "phone home" feature until a few years back that reported all the music played and used in the program.
There is one consideration: a TOTALLY secure system is a BITCH. If you forget your password, or can't get at it, you ERASE EVERYTHING AND START OVER.
Liar, Liar, Pants on Fire
Former NSA Official Argues The Real Problem With Undisclosed Exploits Is Careless End Users
The undisclosed exploits have nothing to do with surveillance and everything to do with gaining access to a person's machine/device.
The unconstitutional surveillance the criminals at the NSA (et al.) are carrying out is accomplished by intercepting network traffic at key hubs (e.g., where undersea fiber optic cables make landfall) and then storing the data in various repositories (e.g., Bluffdale, Utah).
Gaining access to a machine/device in order to trespass on a person's private property is what the criminals at the NSA use the undisclosed exploits to accomplish.
Some of the exploits in the NSA's toolkit are likely in part a collaborative effort, with software/hardware manufacturers and the criminals of the NSA working together in the development stages to create Easter eggs (i.e., back doors) for gaining surreptitious access to a person's private property.
The italicized/bold text below was excerpted from a www.propublica.org report titled Revealed: The NSA’s Secret Campaign to Crack, Undermine Internet Security:
Beginning in 2000, as encryption tools were gradually blanketing the Web, the N.S.A. invested billions of dollars in a clandestine campaign to preserve its ability to eavesdrop. Having lost a public battle in the 1990s to insert its own “back door” in all encryption, it set out to accomplish the same goal by stealth.
The agency, according to the documents and interviews with industry officials, deployed custom-built, superfast computers to break codes, and began collaborating with technology companies in the United States and abroad to build entry points into their products. The documents do not identify which companies have participated.
https://www.propublica.org/article/the-nsas-secret-campaign-to-crack-undermine-internet-encryption
How many terrorists or terror plots have been stopped using the NSA's (i.e., Five Eyes) criminal global surveillance regime?
The italicized/bold text below was excerpted from a theintercept.com report titled U.S. Mass Surveillance Has No Record of Thwarting Large Terror Attacks, Regardless of Snowden Leaks:
A White House panel concluded in December 2013 that the NSA’s bulk collection of Americans’ telephone information was “not essential in preventing attacks.” A member of the panel took it one step further, when he told NBC News that there were no examples of the NSA stopping “any [terror attacks] that might have been really big” using the program.
https://theintercept.com/2015/11/17/u-s-mass-surveillance-has-no-record-of-thwarting-large-terror-attacks-regardless-of-snowden-leaks/
The answer is zero.
What the unconstitutional and criminal surveillance is good for is blackmail, industrial espionage, insider stock trading tips and keeping tabs on personal relationships.
0 undisclosed exploits
What group of end users is it, of which the aforementioned end user is a member, who have notoriously insecure, unpatched, and poorly configured systems which have repeatedly exposed metric craptons of data?
Who? Who are these PEBKAC monkeys? I. Just. Don't. Know...
One is regular updating by the companies. Microsoft does it and I believe they are doing a good security job (much better than, say, 10 years ago actually).
Then there's the end user. How the fucking fucks do they expect anybody to protect themselves against undisclosed exploits that don't even need real user input to get in? I mean, even if you could prevent them by being completely paranoid about security, not everybody would have the expertise to take those added steps, even if you disregard the added hassle of operating the system that comes with them.
No, the problem is you shouldn't be hoarding exploits. If you must, just use them, gather some intel and disclose as soon as possible.
Exploit This!
Italicized/bold text below was excerpted from the web page medium.com/insurge-intelligence, a report titled:
How the CIA made Google, Inside the secret network behind mass surveillance, endless war, and Skynet
INSURGE INTELLIGENCE, a new crowd-funded investigative journalism project, breaks the exclusive story of how the United States intelligence community funded, nurtured and incubated Google as part of a drive to dominate the world through control of information. Seed-funded by the NSA and CIA, Google was merely the first among a plethora of private sector start-ups co-opted by US intelligence to retain ‘information superiority.’
The origins of this ingenious strategy trace back to a secret Pentagon-sponsored group, that for the last two decades has functioned as a bridge between the US government and elites across the business, industry, finance, corporate, and media sectors. The group has allowed some of the most powerful special interests in corporate America to systematically circumvent democratic accountability and the rule of law to influence government policies, as well as public opinion in the US and around the world. The results have been catastrophic: NSA mass surveillance, a permanent state of global war, and a new initiative to transform the US military into Skynet.
https://medium.com/insurge-intelligence/how-the-cia-made-google-e836451a959e
Hat tip: www.lewrockwell.com
Careless end users?
What possible care is an end user expected to take against an undisclosed and therefore unknown vulnerability?
Even if you know that the rebel alliance has the death star plans, what measures can you take if you don't know what the vulnerability is and how it will be exploited? Once you try to analyze how the X wing fighters are attacking, it is probably too late.
They're right. No matter how much you try to secure something the end-user is always the most vulnerable part of an IT ecosystem. As long as you have hired at least one person dumb enough to open an email attachment called "sexyladies.exe" you have more liabilities than security assets.
Re: 0 undisclosed exploits
No, we shouldn't have to buy our rights back from our own security agencies, but over the past decades the government has worn through my hard shell of idealism to the soft nougat of pragmatism underneath.
The PEBKAC users
And Ledgett wants to blame the end users, when the end user is dealing with a locked-down machine and updates to fix problems never arrive?
Cry me a river.
HIIIII
Before it got... unexploited...
The backdoor for all this was clever and required some neat compiling, but...
Now ask yourself how four of us never had problems till we let one jerk get hold of the binaries... and why Sony had such a hard time solving what they put into practice on their products...
Ask yourself if it's careless users, or careless security on my compatriots' part to let an idiot get hold of the code, who then peddled it to Sony like he was some smartass...
Perhaps this former unemployed bum of a govt agent might do well to really start thinking with his brain rather than his politics...
Ever see pirated apps? I was warned 20 years ago that nearly 95% of all the applications that can be pirated are exploitable as well... what does that tell you when you see Windows 10... ohhhh, all that spyware built right in, ohhhhhhhhhhh.
Yeah, good f-in idea, right?
hahahahaa
Now no one is gonna see it coming.
People will get so peeved they might start doing the unthinkable:
leaving this so-called internet until it smartens back up.
I doubt that happens for a while yet...
Back to my video games... that are all hackable, but I bought them anyways.
Oh, and do keep raising prices so more and more people can't afford to use the net and the stuff on it...
The goal is no longer about saving the net or the freedoms we can have; it's ripping to pieces this shit that has taken over it all.
The FBI makes malware... don't you know, back in 2000 they had 65 million honeypot servers on the net, and it's far more now... how do you think all those IPv4 addresses became rare...
I could go on and on about what I've seen, but... maybe I'll write a book and sell it on Amazon, ROFL.
First off, a factual issue.
This isn't true, or is at least misleading. The NSA never needed to inform Microsoft because Microsoft patched the vulnerability the day after it became public. This was 7 months before WannaCry hit, and 8 months before Petya.
8 months is a long time. If the NSA had informed Microsoft, the same machines would have been vulnerable. The problem is IT negligence on the part of affected businesses.
Second, I don't think the NSA should be doing Microsoft's job. The NSA discovered this exploit for the purpose of intelligence operations, and it was in fact being used for intelligence operations. This is the whole point of the NSA. Disclosing the vulnerability would have compromised those intelligence operations. You're really asking the NSA to do something that isn't their job. Your problem isn't the NSA's actions, it's the NSA's mission.
Now I'm gonna let you in on a little hacker secret. Virtually all exploits are found by analyzing Microsoft patches and patch notes. The NSA leak did not matter. If the NSA had earlier informed Microsoft, and Microsoft had earlier released a patch, the attacks would have just happened earlier. Microsoft isn't the defense here, it's just another vector by which the vulnerability could be discovered.
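The "analyzing Microsoft patches" technique mentioned above is usually called patch diffing: compare the pre-patch and post-patch binaries, and the changed regions point you at the code the vendor fixed, and therefore at the underlying vulnerability. Here is a minimal byte-level sketch of the idea; the byte strings and function name are hypothetical, and real patch diffing works on disassembled code with dedicated tooling, not raw bytes.

```python
def changed_regions(old: bytes, new: bytes):
    """Return (offset, old_run, new_run) for each run of bytes that differ.

    A toy stand-in for patch diffing: in practice an analyst diffs the
    patched and unpatched binaries, then inspects each changed region to
    reconstruct what flaw the patch was fixing.
    """
    regions, i, n = [], 0, min(len(old), len(new))
    while i < n:
        if old[i] != new[i]:
            start = i
            while i < n and old[i] != new[i]:
                i += 1
            regions.append((start, old[start:i], new[start:i]))
        else:
            i += 1
    return regions

# Hypothetical "unpatched" vs. "patched" function bodies:
unpatched = bytes([0x55, 0x8B, 0xEC, 0x83, 0xEC, 0x10, 0x8B, 0x45, 0x08])
patched   = bytes([0x55, 0x8B, 0xEC, 0x83, 0xEC, 0x20, 0x8B, 0x45, 0x08])
print(changed_regions(unpatched, patched))  # a single changed region at offset 5
```

The point of the sketch is the commenter's: the patch itself is a map to the vulnerability, so releasing a fix starts the same race whether or not an agency's exploit ever leaks.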
I will grant you that if the NSA had not researched the bug, then the vulnerability may never have been discovered, and WannaCry/Petya would not have happened. Maybe they shouldn't be in the business of cybersecurity research at all, because every discovered vulnerability is a time bomb simply waiting to go off.
If you want to prevent another Petya, I think there's only two approaches.
1) We need cultural changes so that we handle computers in a less exploitable way, by getting people to just take security more seriously (Mr. Ledgett's approach). Microsoft has the most sway here, I believe.
2) We need to hold the bad actors (NK and Russia allegedly) responsible.
Re:
In that case the solution to poverty, inequality, war and pretty much everything is for government to do a better job (or actually doing one) at protecting people.
Re: Re:
But what do those idiots do instead of patching the hole and alerting everyone of the hole? They keep the hole open and then "by their own ineptitude" they let others duplicate (exploit) that hole.
When a system has a problem you fix it!
Re: Re: 0 undisclosed exploits
But I agree with you, we should never pay for our rights. That is why VPNs suck (pay for your right to anonymity).
Re:
"If you must, just use them, gather some intel and disclose as soon as possible."
ASAP. I can see this being totally abused. ASAP means ASAP, like now, no time to gather intel. The rest of the ASAP definitions are abuses.
Re:
sexyladies.exe would not work if the exploit had already been disclosed by the NSA, immediately patched by the software vendor, and the patch immediately deployed to its customers.
Re:
Complete bullshit. Remember, intelligence is war, and there are no winners in war.
It is not the NSA's job to patch vulnerabilities, but it is their DUTY to make one public as soon as they "discover" it. Otherwise we end up with WannaCry problems. Again, not all infected machines were down to user negligence or IT negligence.
Re:
NSA: "we were still gathering SOME intel"..... "but we disclosed it as soon as possible" (8 months later)....
See the problem?