Given Spy Agencies' Love For Exploits And Malware, It's Never Been More Dangerous To Be A Security Researcher
from the only-acceptable-form-of-security-is-'national,'-apparently dept
There's probably never been a great time to be a security researcher, what with the attacks they suffer in response to their work, not to mention far too many corporations greeting the discovery of vulnerabilities with legal threats and criminal charges.
The worldwide adoption of the Wassenaar Arrangement threatens to treat the products of security research like weapons-grade plutonium, making every outbound flight with a laptop or USB drive akin to smuggling weapons out of the country.
But that's not the end of it. Security researchers work in the same shadowy areas as national security agencies, as well as other government entities focused on espionage. These agencies not only create their own malware and exploits, but also purchase them from companies in the vulnerabilities business.
Security researchers generally want to expose holes and dangerous software. Almost every other entity involved would prefer to keep these hidden. Any malicious software exposed and patched into irrelevance could carry national security implications. This puts researchers in the crosshairs of both governments and their exploit suppliers. And the system -- as it is -- has very little in the way of built-in protections for legitimate security research.
A paper published by Juan Andrés Guerrero-Saade of Kaspersky Lab points out the inherent dangers of security research in an age when one form of security (national) routinely takes precedence over another form (general computing). [h/t Slashdot]
Though private research teams and intelligence agencies will follow similar intelligence production cycles, we must not conflate their attributes.

On top of that, the "work cycles" are also inverted. Security researchers may not know they're treading on the cyber-toes of state operatives until long after the research has begun. Contrast that with the operations of intelligence agencies/exploit marketers, who are only seeking holes rather than fixes and know with certainty what they're deploying or who they're targeting. When security researchers stumble across vulnerabilities, they're not always aware of their origin and may find themselves facing prosecution or, at the very least, government interference and/or additional surveillance.

The paper lays out the advantages intelligence agencies enjoy:
(1) Intelligence agencies benefit from cover for action, meaning that other governmental institutions do not find the agencies’ intelligence production activities suspect.
(2) Agency employees enjoy legal protections, even those involved in network exploitation activities. And finally,
(3) their work is shielded from political blowback or geopolitical incongruousness.
Each point is inversely applicable to security researchers and thus sets the tone for the power asymmetry:
1. Security researchers enjoy no cover for action for their production of intelligence reports into what may or may not constitute legitimate intelligence operations…
2. Security researchers are afforded no explicit legal protections for the grey areas regularly visited throughout the course of an investigation…
3. The companies too lack a cover for action and are in no way insulated from the political blowback that arises from the public disclosure of sensitive operations. They suffer from a further dimension of ‘guilt by association’ as research into sensitive operations and subsequent reporting is misconstrued as an act of geopolitical aggression when the victim and perpetrator are involved in any form of international tension...
But those outcomes are the least worrying of the possibilities. Security researchers may find themselves involved with governments willing to deploy far more severe tactics.
The researcher as a private individual faces unique challenges when in the cross-hairs of a nation-state actor determined to enact some form of retribution. The operator of an espionage campaign is not a common criminal nor a simple citizen and his resources are truly manifold. As a special class of government insider responsible for a sensitive operation, the attacker can go so far as to legitimize special recourse in order to neutralize the threat posed by the meddling security researcher. The options available slide relative to the nature of the attacker, ranging from civilized to unscrupulous, and include: subtle pressure, patriotic enlistment, bribery, compromise and blackmail, legal repercussions, threat to livelihood, threat to viability of life in the actor’s area of influence, threat of force, or elimination.

With no built-in protections for researchers, the potential negative outcomes of their work may outweigh the intangible rewards of the work itself -- more secure computing.
With this in mind, a rather interesting contract award was posted to the Federal Business Opportunities (FBO) website. Kudu Dynamics -- which has secured previous DARPA/Dept. of Defense contracts related to cybersecurity -- apparently landed a $500,000 contract for a project that appears to give the company permission to spy on security researchers.
ACLU technologist Chris Soghoian phrased it this way:
DARPA awards 500k grant to spy on security vuln researchers. Seriously. (h/t/ @evacide) https://t.co/51cQHtJ4hY pic.twitter.com/w0Z5pW3hwo
— Christopher Soghoian (@csoghoian) October 26, 2015
Here's the synopsis of the project (emphasis added):
The goal of Kudu's proposed effort, named "Internet Cyber Early Warning of Adversary Research and Development (ICEWARD)", is to determine whether it is possible to gain actionable insight into the intent of a cyber adversary by observing specific behaviors. In particular, the proposers hypothesize that vulnerability researchers make use of public information and resources (such as search engines and websites) that are relevant to their missions, targets, and techniques in such a way that it is possible to glean part of their intent if only we could observe such use and differentiate it from noise (e.g., search engine crawlers). The basis for this hypothesis is both the proposers' own experience as vulnerability researchers and a little noticed incident in 2010-2011. The proposed approach will investigate the feasibility of creating highly tailored information resources whose access via the public network will be inherently highly correlated with vulnerability research. A second aspect of the proposed work entails the creation of an evaluation methodology for proving or disproving this hypothesis.

What it seems to suggest is that Kudu would be allowed to read over the shoulders of security researchers as they used public resources, separate that use from the "noise" of bot activity, and extrapolate the researchers' intended goals.
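To make that hypothesis a bit more concrete, here's a minimal sketch of what a "tailored information resource" could look like in practice: a bait page whose only plausible audience is someone researching a particular vulnerability, instrumented to log accesses and separate crawler noise from potentially human visits. This is purely illustrative and assumes nothing about Kudu's actual approach; the fictitious CVE page, log file name, port, and user-agent filter are all invented for the example.

# Illustrative sketch only -- not ICEWARD's actual design. A tiny web
# server that serves a "bait" vulnerability write-up, logs every access,
# and crudely separates known crawler traffic from everything else.
import re
import logging
from http.server import BaseHTTPRequestHandler, HTTPServer

logging.basicConfig(filename="iceward_sketch.log", level=logging.INFO)

# User-agent fragments that indicate automated crawlers rather than people.
CRAWLER_PATTERNS = re.compile(
    r"(googlebot|bingbot|yandex|baiduspider|duckduckbot|crawler|spider)",
    re.IGNORECASE,
)

# The bait content: plausible-looking notes about a fictitious flaw.
BAIT_PAGE = b"""<html><body>
<h1>CVE-XXXX-XXXXX: heap overflow in example_parser()</h1>
<p>Preliminary analysis notes (draft, do not distribute).</p>
</body></html>"""

class TailoredResourceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        if CRAWLER_PATTERNS.search(ua):
            # Indexing bots are expected background noise; record and move on.
            logging.info("crawler noise: %s %s", self.client_address[0], ua)
        else:
            # Anything else is, by the hypothesis, more likely a person whose
            # search terms led them here -- i.e., possible vulnerability
            # research activity worth a closer look.
            logging.warning("possible researcher: %s %s %s",
                            self.client_address[0], self.path, ua)
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(BAIT_PAGE)

    def log_message(self, fmt, *args):
        # Suppress the default stderr access log; we keep our own above.
        pass

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), TailoredResourceHandler).serve_forever()

In a real deployment, the interesting (and worrying) part wouldn't be a single bait page but the correlation of hits across many such resources to build a picture of who is researching what -- which is exactly the capability that has people concerned.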
I'm not as convinced as Soghoian that this means Kudu will be allowed to spy on/tap into security researchers' web browsing. It could mean that Kudu plans to use its own experience in the security research field to help refine processes and programs put in place to counter malicious activity -- a lot of which can seem almost indistinguishable from legitimate security research. This determination of intent would allow government agencies to focus on actual threats, rather than wasting resources chasing down legitimate security research.
Then again, certain government agencies would greatly benefit from this advance knowledge. It would give them a heads-up if researchers were probing exploits, vulnerabilities or malware they'd rather keep under wraps and in working condition.
What makes this murkier than it perhaps should be is the fact that the FBO scrubbed the synopsis from the listing a few hours after people began talking about it. (The EFF preserved the original post as a PDF.) There may be any number of non-nefarious reasons for it doing so, but considering most of the discussion centered around the theory that a government contractor was getting paid to spy on security researchers, the sudden burial of this contract info seems slightly suspicious.
Filed Under: computer security, darpa, fbi, malware, nsa, security research, surveillance, vulnerabilities, zero days
Reader Comments
It's not just spy agencies
Pretty much everyone who's making money off security failures is pushing hard to have their critics turned into criminals so that they can use the power of the state to silence them. And thanks to the combination of Congressional ignorance and stupidity with the power of lobbying and the reach of campaign contributions (i.e., bribes) it's working.
Re: It's not just spy agencies
Today the most effective way to change something is to cause a disruption to the money flow. Preferably one that cannot be stopped and does not depend on human interference to keep growing if the problem is not tackled.
Re: Re: It's not just spy agencies
BAM fixed.
/gov logic.
Boo Hoo.
OS vendors create the disease. Their subsidiary security companies create the cure. The state takes sloppy seconds.
Welcome to the 21st Century.