Advisory Panel Offers Suggestions To Strengthen US Cybersecurity, But Is The Government Capable Of Change?
from the is-government-too-big-to-learn? dept
The President's Council of Advisors on Science and Technology (abbreviated unfortunately as PCAST) has just released a report dealing with the nation's hottest topic since terrorism: cybersecurity. The report's writers include a host of professors from a variety of scientific pursuits, along with a few corporate figures from the tech world, including Google's Eric Schmidt and Microsoft's Craig Mundie.
The report's suggestions aren't half-bad.
Overarching Finding: Cybersecurity will not be achieved by a collection of static precautions that, if taken by Government and industry organizations, will make them secure. Rather, it requires a set of processes that continuously couple information about an evolving threat to defensive reactions and responses.

What's being suggested makes sense. But logic means nothing when confronted with bureaucratic processes. The government, as a whole, isn't a nimble beast. "Static precautions" are top speed for the behemoth. Turning it into a swift, reactive entity may be an impossibility.
Evidence of the government's inability to craft functioning and secure software exists everywhere. Currently, everyone's attention has been drawn to the government's healthcare site, which has been plagued with problems since it went live and weeks later, after an overhaul, still underperforms and plays fast and loose with personal data.
Entities where cybersecurity is even more crucial aren't much better. It took the FBI more than a decade and several hundred million dollars (spread across two contractors) to come up with functioning software. The DEA is still using Windows Server 2003, despite the NSA's warnings that the outdated software contains serious security flaws. The Pentagon's network of unrelated computers is even worse. According to a Reuters investigation, the Pentagon still relies on a variety of different computers, some dating back to the 1970s. Ancient file formats and arcane file management processes make searching for older records a nightmare.
So, nimble the government is not. PCAST's recommendations do use a lighter tone than the multiple damning GAO reports covering the same ground, but the underlying message is the same. The government may be able to improve, but it seldom shows the desire to, as the first finding points out.
Finding 1: The Federal Government rarely follows accepted best practices. It needs to lead by example and accelerate its efforts to make routine cyberattacks more difficult by implementing best practices for its own systems.

This is a non-starter, as years of failing grades from GAO investigators can attest. Problems that existed a half-decade ago still exist today. Each subsequent report says the same thing: recommendations were made but little evidence was uncovered that these suggestions were ever communicated to those responsible, much less deployed.
Finding 2: Many private-sector entities come under some form of Federal regulation for reasons not directly related to national security. In many such cases there is opportunity, fully consistent with the intent of the existing enabling legislation, for promoting and achieving best practices in cybersecurity.

This one has problems as well. What this looks like is an invitation for the government to use the heavy hand of regulation to force private entities to rise to a level of security the government itself is unwilling to attain.
The government should use its existing powers to ensure private entities protect the sensitive data they gather on Americans during the course of business (rather than use this as an opportunity to expand power, as the report points out), but it's highly hypocritical to hold businesses to a higher standard than it applies to itself.
Finding 3: Industry-driven, but third-party-audited, continuous-improvement processes are more likely to create an effective cybersecurity culture than are Government-mandated, static lists of security measures.

This goes back to the overarching finding.
Finding 4: To improve the capacity to respond in real time, cyberthreat data need to be shared more extensively among private-sector entities and—in appropriate circumstances and with publicly understood interfaces—between private-sector entities and Government.

For this to work, sharing needs to be voluntary (and encouraged by proper incentives), rather than presented as "mandatory" (or worse, "compelled") -- especially in terms of feeding info to the government. Private entities may also be reluctant to share with others in their own field for fear of exposing sources or methods. This, too, is problematic and cannot be solved simply by attempting to legislate the reluctance away.
Finding 5: Internet Service Providers are well-positioned to contribute to rapid improvements in cybersecurity through real-time action.

Of all the things I'm worried about in this list of suggestions, this is my chief concern. Everything said here is true. ISPs are in a better position to gain unique insight into attacks. The problem is, when faced with the daunting task of overhauling its own processes and practices, the government may instead decide to toss the problem to ISPs and let them do the work -- and shoulder the blame.
Once again, this needs to lean towards voluntary to have any chance at success. A utopian projection would see industry and the government working hand-in-hand to repel cyberattacks. But buck-passing and scapegoating usually fall heavily on the private sector in the event of a failure -- the sort of thing that doesn't engender cooperative relationships.
Finding 6: Future architectures will need to start with the premise that each part of a system must be designed to operate in a hostile environment. Research is needed to foster systems with dynamic, real-time defenses to complement hardening approaches.

This is solid advice as well, but doing so will mean more thoroughly vetting potential contractors, as well as carefully overseeing each step of the process. Again, history shows us that government agencies are willing to hire contractors despite their past (often massive) failures. If a responsive, secure system is going to be built, it needs to be done by the right people and tested thoroughly throughout development. It can't just be tossed to the lowest bidder and peeked in on occasionally. That's how you end up with a $500 million system that has to be scrapped as soon as it goes live.
The problem with recommendations like these is that it's almost guaranteed they will never be acted upon with any sincerity. They may get folded in with half-baked efforts aimed at cybersecurity, but what's being recommended is fundamental change.
Lawmakers have pushed various versions of cybersecurity legislation, almost all of which are aimed at gutting protections in the private sector and increasing government power. The biggest torchbearers for the "cyberwar" threat helm agencies that have vested interests in weakening private sector security. The government is largely unwilling to clean up its own backyard and this report, no matter how on point or well-written, won't change that.
Filed Under: cybersecurity, pcast