Sending Software Execs To Jail For Bad Security
from the that'll-wake-'em-up dept
It's pretty easy to make suggestions like this when you're sitting at a desk writing articles, and not writing code. Over at News.com, Charles Cooper is suggesting that the way to solve the cybersecurity issue is to put the fear of jail into software execs. Basically, he wants a Sarbanes-Oxley for cybersecurity: if a system is not secure, the executives of the company that makes the product can go to jail. While he is right that many security problems are due to sloppy programming, programming is not quite the same as accounting, and mistakes are going to slip through - not because of fraud (as in most accounting scandals), but because it's nearly impossible to find every potential security hole or imagine every possible intrusion scenario. While such a law likely would improve security in a number of products, it would also create a huge burden on software companies (and make many smart execs wary of becoming a software CEO). While it might make sense to increase the liability of software companies that do a bad job, there has to be a limit, or it will become absolutely impossible for any new software company to ever get started.
Reader Comments
I don't think so
Re: I don't think so
But it could go a long way if implemented properly. Of course, there would have to be due process, and the law would have to be written so that only a vendor who knows there is a bug in the software, purposefully ignores it, makes no attempt to warn customers of the bug, and has no intention of ever releasing a free patch to customers who have *bought* the software would go to jail.
There are plenty of companies out there that would get nabbed by this one (Oracle and Microsoft would both have to seriously revisit their support models), but many other security- (and customer-) conscious companies would not be in trouble. There is an awful lot of blatant ignoring of security issues. When I reported an SSL vulnerability in Netscape about four years ago, Netscape flat out ignored it (another group from Brazil was more successful at getting Netscape to fix other SSL vulnerabilities, unrelated to mine, at the time, but only after threatening them). Netscape finally, quietly, fixed the flaw a few weeks later, after my post to Bugtraq got them motivated enough to contact me and work with me.
The funny thing was, about eight months later I was in a room with several other security researchers listening to the head of the QA department at Netscape say, point blank, that they "don't care about any security issues unless it is posted on the front page of the Wall Street Journal." The whole room laughed at the guy. It was pretty tragic, especially when I asked him if he really wanted that much attention, since he was basically telling us we should publish the exploits in the Wall Street Journal. He never did recover...
A carefully crafted law would prevent this type of corporate ignorance and negligence towards customer security. But then again, since when did multi-billion dollar companies actually give a darn about the customer, beyond suing them or trying to pull the wool over their eyes about security and vulnerability disclosures?
Connecting Responsibility with Authority
The benefit of Sarbanes-Oxley is that it seeks to place responsibility and liability into the hands of those who already have the authority.
Not the Answer
As a software developer, I would only warrant a software product to be completely secure if I installed and configured it myself on a closed system, where the customer has no chance to install other software or reconfigure my software.
I would further limit my exposure by requiring that all of the customer's computers be behind a dedicated firewall. Who is going to accept all of that?
Re: Not the Answer
I hate to break it to you, but the civilian and contractor folks working on secret projects don't have much interest in using them either. The biggest problem with the "certified" software packages (which aren't secure either) is that they are locked in time: any time someone comes up with a new exploit for a certified OS, it must be recertified once the fix is applied. The cost is overwhelming, so most vendors who want a certification get it once and then use it as bragging rights.
I would only warrant a software product to be completely secure if I installed and configured it myself on a closed system
Ah yes, security through obscurity. I love systems that rely on security through obscurity to protect themselves (though I am not suggesting it won't help in your case, since you make sure that other security measures, like firewalls, are in place). I've been personally responsible for slaying security-through-obscurity mechanisms in a number of vendor products.
Who is going to accept all of that?
If I am buying a product from you, or am giving you sensitive information, I want to make sure that you've done as much as you can to prevent the unnecessary disclosure of that information to others. I certainly don't buy products from companies with questionable security policies, and I suspect more people are doing the same now, after so many black eyes out there have shown that security isn't something to fake.
Re: Not the Answer
I hate to break it to you, but the civilian and contractor folks working on secret projects don't have much interest in using them either. The biggest problem with the "certified" software packages (which aren't secure either,) is that it is locked in time...any time someone comes up with a new exploit for a certified OS, it must be recertified once the fix is applied. The cost is overwhelming, so most vendors who wish to get a certification, do it once and then use the certification as bragging rights.
The DoD COE has significant security extensions and is patched frequently (assuming a diligent admin who is actually concerned with keeping his systems' paperwork current). Most folks (including the admins) aren't privy to COE modifications, and that's the way it should be... security by obscurity, for all its weaknesses, is but an outer layer in the security onion. C2 certification is deprecated but still brandished by the uneducated as though it means something (even though it's 10 to 15 years past its "prime") and is primarily used as a budgetary constraint on projects.
Re: Not the Answer
Correct, but the DoD COE isn't certified at any level (nor is it meant to be). And there are some of us in security working with the DoD COE who find it not patched quickly enough for our tastes, though with IAVA there is much more of a push to get timely patches in, now that there is a limit of 30 days to bring a system up to IAVA compliance.
Most folks (including the admins) aren't privy to COE modifications and that's the way that it should be... security by obscurity
I agree that STO (Security Through Obscurity) is OK when it is backed up by stronger security mechanisms (as said above). Configuration Management is good, and I have a hard time believing that Configuration Management (i.e., the DoD COE) == STO. In many cases, Configuration Management reduces the security risk by making one person, or a group of people (who are security experts, and not lazy), responsible for the security of an entire network.
Except when a DoD COE machine is infected within three hours of a virus being released, because the DoD COE hasn't yet applied the patch for it. Several times within the past couple of years we've had to temporarily remove a ton of DoD COE machines from the network until the patches were applied. The excuse we get from the DoD COE folks every time is that with Configuration Management it takes time to implement the patches and push them out to everyone, yet some of the scariest worms and viruses of the last couple of years have come out weeks, or even days, after the vulnerability was discovered and patches were made available.
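The patch-window arithmetic behind that complaint can be sketched in a few lines of Python. This is only an illustration of the argument, not any actual DoD tool; the function name and dates are made up for the example:

```python
from datetime import date

def exposure_window(disclosed, patch_deployed):
    """Days a host stays exploitable: from public disclosure of the
    vulnerability until the patch is actually deployed on the host.
    The vendor merely releasing a fix does not close the window."""
    if patch_deployed <= disclosed:
        return 0  # patched before (or on the day of) disclosure
    return (patch_deployed - disclosed).days

# A worm appearing days after disclosure easily beats a 30-day
# compliance deadline: the fix existed, but the host was still open.
print(exposure_window(date(2003, 7, 16), date(2003, 8, 11)))  # prints 26
```

The point of the sketch is that any fixed compliance deadline longer than the time it takes attackers to weaponize a disclosure leaves a guaranteed exposure window.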
C2 certification is deprecated but still brandished by the uneducated as though it means something (even though it's 10 to 15 years past its "prime") and is primarily used as a budgetary constraint on projects.
Yes, the C2 cert is deprecated, but it is still something that vendors push as the silver bullet for security woes, except that the C2 certification for Windows NT 4.0 was frozen at Service Pack 3 plus a couple of hotfixes, and the evaluated machine wasn't even on a network. For Windows 2000 SP3, the Common Criteria certification is EAL4, but that too is frozen in time: any newly discovered vulnerabilities and any patches applied will invalidate the EAL4 certification. So much has changed since then that running an EAL4-certified Windows 2000 box is a death wish. Yet how many security professionals preach that Windows 2000 has an EAL4 cert?
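The "frozen in time" point can be made concrete with a tiny sketch: a certification describes one exact evaluated configuration, so any deviation from that baseline, including applying a later security patch, takes the machine outside the cert. The patch identifiers below are hypothetical stand-ins, not the actual evaluated configuration:

```python
def matches_certified_baseline(certified_baseline, installed_patches):
    """True only if the host runs exactly the evaluated configuration.
    Adding a post-certification patch (or missing a baseline one)
    means the certification no longer describes this machine."""
    return set(installed_patches) == set(certified_baseline)

baseline = {"SP3", "HOTFIX-1", "HOTFIX-2"}  # hypothetical frozen-at-evaluation set

print(matches_certified_baseline(baseline, {"SP3", "HOTFIX-1", "HOTFIX-2"}))  # prints True
# Patching the newest hole takes the box out of its certified state:
print(matches_certified_baseline(baseline, {"SP3", "HOTFIX-1", "HOTFIX-2", "MS-NEW"}))  # prints False
```

Which is exactly the bind described above: you can run the certified configuration, or you can run a patched one, but not both.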
Re: Not the Answer
Ah yes, security through obscurity. I love systems that rely on security through obscurity to protect themselves (though I am not suggesting that it won't help in your case, since you make sure that there are other security capabilities like firewalls, etc., in place...) I've been personally responsible for slaying security through obscurity mechanisms in a number of vendor products.
Actually, he's not arguing for security through obscurity. Just the opposite, in fact. The only way he could warrant any software is if he knew the EXACT configuration of the computer or computers, and if that configuration never changed. Once it changes, he cannot be sure his product won't be affected as well, and therefore would not warrant it.
Re: Not the Answer
I had to reread the post. You are right, that is exactly what he is saying. Oops.
wouldn't it be nice if...
Now, if, on the other hand, we found out that Billy G. had cut a deal with the NSA/US government to make his software insecure (inside or outside of the USA), I think that would be the perfect application of such a proposed law.
Still, I think the marketplace is in the process of sorting all of this out; so maybe such a law really is truly unimportant.
No Subject Given
However, the courts can't even put away CEOs who blatantly lie, cheat, and steal under our EXISTING laws. The only thing that would happen with this is that a lackey would take the fall and the fatcats would go on their merry way, consequence-free.
IAVA Patching Invalidates Common Criteria?
Many thanks
D Teed
DoN IA