Sending Software Execs To Jail For Bad Security

from the that'll-wake-'em-up dept

It's pretty easy to make suggestions like this when you're sitting at a desk writing articles, not writing code. Over at News.com, Charles Cooper is suggesting that the way to solve the cybersecurity issue is to put the fear of jail into software execs. Basically, he wants a Sarbanes-Oxley for cybersecurity: if a system is not secure, the executives of the company that makes the product can go to jail. While he is right that many security problems are due to sloppy programming, programming is not quite the same as accounting, and mistakes are going to slip through, not because of fraud (as in most accounting issues), but because it's nearly impossible to find every potential security hole or imagine every possible intrusion scenario. While such a law likely would improve security on a number of products, it would also create a huge burden on software companies (and make many smart execs wary of becoming a software CEO). While it might make sense to increase the liability of software companies that do a bad job, there has to be a limit, or it will become absolutely impossible for any new software company to ever get started.



Reader Comments



  • Anonymous Coward, 19 Dec 2003 @ 9:46am

    I don't think so

    I agree that this would in no way guarantee completely secure products. The only thing this would accomplish would be to drive up the price of software and lengthen development cycles.


    • LittleW0lf, 19 Dec 2003 @ 1:54pm

      Re: I don't think so

      I agree that this would in no way guarantee completely secure products.

      But it could go a long way if implemented properly. Of course, there would have to be due process, and the law would have to be written in such a way that a vendor who knows there is a bug in the software, purposefully ignores it, makes no attempt to warn customers about it, and has no intention of ever releasing a patch, for free, to customers who have *bought* the software is the one who goes to jail.

      There are plenty of companies out there who would get nabbed by this one (Oracle and Microsoft would both have to seriously revisit their support models), but many other security- (and customer-) conscious companies would not be in trouble. There is an awful lot of blatant ignoring of security issues. When I released an SSL vulnerability in Netscape about 4 years ago, Netscape flat out ignored it (even though another group from Brazil was more successful at getting Netscape to fix other SSL vulnerabilities not related to mine at the time, and then only after threatening them). They finally quietly fixed the flaw a few weeks later, after my post to bugtraq got them motivated enough to contact me and work with me.

      The funny thing was, about 8 months later I was in a room with several other security researchers listening to the head of the QA department at Netscape say, point blank, that they "don't care about any security issues unless it is posted on the front page of the Wall Street Journal." The whole room laughed at the guy; it was pretty tragic, especially when I asked him if he really wanted that much attention, since he was basically telling us that we should publish the exploits in the Wall Street Journal. He never did recover...

      A carefully crafted law would prevent this type of corporate ignorance and corporate negligence toward customer security. But then again, since when did multi-billion dollar companies actually give a darn about the customer beyond suing them or trying to pull the wool over their eyes about security and vulnerability disclosures?


  • FreeWine, 19 Dec 2003 @ 11:05am

    Connecting Responsibility with Authority

    I agree that the author's suggestion implies that corporate security issues are caused by unethical and/or lazy IT workers. The evidence I see leads me to believe that the cause is more often the choices of leadership.

    The benefit of Sarbanes-Oxley is that it seeks to place responsibility and liability into the hands of those who already have the authority.


  • Ed, 20 Dec 2003 @ 10:06am

    Not the Answer

    There are already software products that are certified to be secure, including versions of Windows and UNIX, but nobody outside of governments and contractors working on secret projects has the slightest interest in using them because the security comes at the expense of functionality and connectivity.

    As a software developer, I would only warrant a software product to be completely secure if I installed and configured it myself on a closed system, where the customer has no chance to install other software or reconfigure my software. I would further limit my exposure by requiring that all of the computers involved be behind a dedicated firewall. Who is going to accept all of that?


    • LittleW0lf, 20 Dec 2003 @ 3:41pm

      Re: Not the Answer

      There are already software products that are certified to be secure, including versions of Windows and UNIX, but nobody outside of governments and contractors working on secret projects has the slightest interest in using them because the security comes at the expense of functionality and connectivity.

      I hate to break it to you, but the civilian and contractor folks working on secret projects don't have much interest in using them either. The biggest problem with the "certified" software packages (which aren't secure either) is that they are locked in time...any time someone comes up with a new exploit for a certified OS, it must be recertified once the fix is applied. The cost is overwhelming, so most vendors who wish to get a certification do it once and then use the certification as bragging rights.

      I would only warrant a software product to be completely secure if I installed and configured it myself on a closed system

      Ah yes, security through obscurity. I love systems that rely on security through obscurity to protect themselves (though I am not suggesting that it won't help in your case, since you make sure that there are other security capabilities like firewalls, etc., in place...) I've been personally responsible for slaying security through obscurity mechanisms in a number of vendor products.
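
      To make the distinction concrete, here's a minimal, purely illustrative sketch (not taken from any real vendor product; the names and values are made up) of a security-through-obscurity check next to one backed by an actual secret and standard cryptography:

          import hmac
          import hashlib

          # Security through obscurity: the check only holds up as long as
          # nobody reads the code (or sniffs the traffic) and discovers the
          # hard-coded "magic" value.
          MAGIC_TOKEN = "letmein42"          # hypothetical hidden constant

          def obscurity_check(token: str) -> bool:
              return token == MAGIC_TOKEN    # trivially defeated once found

          # A stronger check: the secret key lives outside the code, and the
          # caller must prove knowledge of it with an HMAC over the message.
          def hmac_check(message: bytes, signature: str, key: bytes) -> bool:
              expected = hmac.new(key, message, hashlib.sha256).hexdigest()
              return hmac.compare_digest(expected, signature)

      Obscurity alone fails the moment someone reverse-engineers the product; it only has value as an extra layer on top of real mechanisms like the second check (plus firewalls and the rest).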

      Who is going to accept all of that?

      If I am buying a product from you, or am giving you sensitive information, I'd want to make sure that you've done as much as you can to prevent the unnecessary disclosure of information to others. I certainly don't buy products from companies with questionable security policies, and I suspect more people are doing the same now after so many "black-eyes" out there have shown that security isn't something to fake.


      • Anonymous Coward, 22 Dec 2003 @ 8:41am

        Re: Not the Answer

        There are already software products that are certified to be secure, including versions of Windows and UNIX, but nobody outside of governments and contractors working on secret projects has the slightest interest in using them because the security comes at the expense of functionality and connectivity.

        I hate to break it to you, but the civilian and contractor folks working on secret projects don't have much interest in using them either. The biggest problem with the "certified" software packages (which aren't secure either) is that they are locked in time...any time someone comes up with a new exploit for a certified OS, it must be recertified once the fix is applied. The cost is overwhelming, so most vendors who wish to get a certification do it once and then use the certification as bragging rights.

        The DoD COE has significant security extensions and is patched frequently (assuming a diligent admin who is actually concerned with keeping his systems' paperwork current). Most folks (including the admins) aren't privy to COE modifications and that's the way that it should be... security by obscurity, for all its weaknesses, is but an outer layer in the security onion. C2 cert. is deprecated but still brandished by the uneducated as though it means something (even though it's 10 to 15 years past its "prime") and is primarily used as a budgetary constraint on projects.


        • LittleW0lf, 22 Dec 2003 @ 1:02pm

          Re: Not the Answer

          The DoD COE has significant security extensions and is patched frequently (assuming a diligent admin who is actually concerned with keeping his systems' paperwork current).

          Correct, but the DoD COE isn't certified at any level (nor is it meant to be). And there are some of us in security working with DoD COE who find that it isn't patched quickly enough for our tastes, though with IAVA there is much more of a push to get timely patches in, now that they have a limit of 30 days to get the system up to IAVA compliance.
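
          To put a number on that 30-day window, here is a tiny, hypothetical sketch (the host names and dates are invented for illustration) of how an admin might flag systems that have fallen outside the IAVA compliance window for a pending patch:

              from datetime import date

              # Hypothetical inventory: for each host, the date the applicable
              # IAVA patch was released. A host is out of compliance once the
              # patch has been available for more than 30 days.
              IAVA_WINDOW_DAYS = 30

              pending_patches = {
                  "host-a": date(2003, 11, 15),
                  "host-b": date(2003, 12, 10),
              }

              def out_of_compliance(today: date) -> list[str]:
                  return [host for host, released in pending_patches.items()
                          if (today - released).days > IAVA_WINDOW_DAYS]

              print(out_of_compliance(date(2003, 12, 22)))   # ['host-a']

          Of course, a report like this only matters if the patches actually get pushed within the window, which is exactly the problem described below.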

          Most folks (including the admins) aren't privy to COE modifications and that's the way that it should be... security by obscurity

          I agree that STO (Security Through Obscurity) is OK when it is backed up by stronger security mechanisms (as I said above). Configuration Management is good, and I have a hard time believing that Configuration Management (i.e., DoD COE) == STO. In many cases, Configuration Management reduces the security risk by making one person or a group of persons (who are security experts, and not lazy) responsible for the security of an entire network.

          Except when a DoD COE machine is infected within three hours of a virus being released for which DoD COE hasn't applied the patch. Several times within the past couple of years we've had to temporarily remove a ton of DoD COE machines from the network until the patches were applied. The excuse we get from the DoD COE folks every time is that, with Configuration Management, it takes time to implement the patches and push them out to everyone, yet some of the scariest worms and viruses of the last couple of years have come out weeks or even days after the vulnerability was discovered and patches were made available.

          C2 cert. is deprecated but still brandished by the uneducated as though it means something (even though it's 10 to 15 years past its "prime") and is primarily used as a budgetary constraint on projects.

          Yes, the C2 cert is deprecated, but it is still something that vendors push as the silver bullet for security woes, except that the C2 certification for WNT 4.0 was frozen at Service Pack 3 plus a couple of hotfixes, and the evaluated configuration wasn't even on a network. For W2K SP3, the Common Criteria certification is EAL4, but that too is frozen in time: any new vulnerabilities discovered and any patches applied will invalidate the EAL4 certification. So much has changed since then that running an EAL4-certified Windows 2000 box is a death wish. Yet how many security professionals preach that Windows 2000 has an EAL4 cert?


      • Anonymous Coward, 22 Dec 2003 @ 8:58am

        Re: Not the Answer

        I would only warrant a software product to be completely secure if I installed and configured it myself on a closed system

        Ah yes, security through obscurity. I love systems that rely on security through obscurity to protect themselves (though I am not suggesting that it won't help in your case, since you make sure that there are other security capabilities like firewalls, etc., in place...) I've been personally responsible for slaying security through obscurity mechanisms in a number of vendor products.

          Actually, he's not arguing for security through obscurity. Just the opposite, in fact. The only way he could warrant any software would be if he knew the EXACT configuration of the computer or computers, and if that configuration never changed. Once it changes, he cannot be sure that his product would not be affected as well, and therefore would not warrant the product.


        • LittleW0lf, 22 Dec 2003 @ 1:06pm

          Re: Not the Answer

          Actually, he's not arguing for security through obscurity.

          I had to reread the post. You are right, that is exactly what he is saying. Oops.


  • Anonymous Coward, 20 Dec 2003 @ 9:12pm

    wouldn't it be nice if...

    ...the only people we sent to jail were those who had actually done something wrong.

    Now, if, on the other hand, we found out that Billy G. had cut a deal with the NSA/US Government to make his software insecure (inside or outside of the USA), I think this would be the perfect application of such a proposed law.

    Still, I think the marketplace is in the process of sorting all of this out, so maybe such a law really is truly unimportant.


  • thecaptain, 21 Dec 2003 @ 8:52am

    No Subject Given

    There's no way this would ever work. The intention is good because YES, corporate management and greed are a BIG part of the blame for insecure/faulty software. Planned obsolescence, anyone?

    However, courts can't even put away CEOs who blatantly lie, cheat, and steal with our EXISTING laws. The only thing that would happen with this is that a lackey would take the fall and the fatcats would go on their merry way, consequence free.


  • D Teed, 22 Jul 2008 @ 3:23pm

    IAVA Patching Invalidates Common Criteria?

    Anyone have any insight/experience on this one?

    Many thanks

    D Teed
    DoN IA


