Why Making Developers Liable For Security Vulnerabilities Won't Work

from the it's-a-problem dept

It seems you can't go a few months without someone, somewhere raising the question of whether developers of software products should be liable for the security vulnerabilities in those products. Frustration with buggy software runs high, and so out come the suggestions that lemon laws should apply and that execs at companies making buggy software should go to jail. The problem, though, is that software will remain insecure no matter what's done, and adding liability might actually make the problem worse.

Now along comes Howard Schmidt saying that instead of companies being held responsible, the actual developers of products should be held liable for security vulnerabilities. This is the same Howard Schmidt who announced ten months ago that technology would solve the phishing problem within a year. He's got two months to go, and last we checked, phishing was still a growing problem. Should we hold him liable for falsely claiming that phishing would be gone?

That's the crux of the problem: no matter how careful a developer is, there will always be holes he can't foresee. Making developers liable will only cause a few things to happen. Vastly fewer programmers will be willing to work on security issues, since it's just not worth the risk, and fewer companies will even try to make security products. Also, just about every product you buy will come wrapped in pages and pages of legalese telling you that the product isn't at all secure, as vendors try to contract their way out of liability. None of that will help anyone actually build more secure applications. Schmidt is right in saying that developers need to be better trained in computer security, but that doesn't mean adding liability without looking at the unintended consequences of such an action.

Update: There's an update to this story, in which Schmidt clarifies that he was talking about accountability, not liability -- and that the ZDNet article misconstrued it.



Reader Comments



  • Tom, 12 Oct 2005 @ 9:35am

    Why Making Developers Liable For Security Vulnerabilities

    Well, they should not be responsible for security, but they should be responsible for poorly written software that was not tested properly or was rushed out the door. Too many times in my day have I had to explain to a customer that they purchased sub-standard software to save a buck. The people who wrote it should be liable for the headaches caused by saving a couple of weeks or a few dollars.


    • AC, 12 Oct 2005 @ 9:46am

      Re: Why Making Developers Liable For Security Vulnerabilities

      Yeah, right, because it's the developers' idea to skip all QA and ship before the end of the quarter to boost profits. Devs usually want to do good work, but the suits only care about making money. There is bound to be a conflict, and who do you think usually wins? That's right: the folks who sign the checks.

      What really kills me about this is how biased it is. In effect, it says let's punish all the techno-serfs instead of holding the corporation's management responsible for their decisions.





    • John, 12 Oct 2005 @ 10:14am

      Re: Why Making Developers Liable For Security Vulnerabilities

      Sure, blame the developer... As a developer, I can say that this is BS. Bugs will live forever, but the number of bugs is always directly related to management. Some clients don't want to pay the extra dollars for QA and full testing, nor do they want to wait the amount of time it takes to properly test, fix, and retest software.

      And if you really want to get rid of bugs, you have to start at the operating system, and maybe as low as the hardware. All of this technology is built on top of itself, and if a piece below has a bug, it will show through to the programs on top.


  • Bill, 12 Oct 2005 @ 10:02am

    Read the EULA

    At best, you should be able to get back the cost of the software. This is the same for most manufactured products. If I sell you a $1.00 nut and bolt, and because it's defective you want me to buy you a new car, or whatever failed, you're crazy. I could build some liability into the price, but then each nut and bolt is going to cost you $10.


  • elvis, 12 Oct 2005 @ 10:38am

    What about the underlying code?

    What about the libraries that the security-vulnerable application depends on? What if they are vulnerable? Who is liable then? Does the maker of the original application accept liability for the underlying library code?


    • Happy user, 12 Oct 2005 @ 10:53am

      Re: What about the underlying code?

      Truth is, no one is holding a gun to your head and telling you that "you must use XX Software or you will die."

      It was your choice to use XX Software. By not doing your research on the company before purchasing the software, you remain the only one to blame here.

      And if there is no other software that will get the job done except the one with "bugs" or "security holes", then that is part of the price you pay for wanting to use that software.

      Humans are too quick to blame others when things don't work 100% as they expected. I am sure it was hardly the intent of the developer to create "flawed" software -- or maybe it was?

      Signed, Happy user.


  • S, 12 Oct 2005 @ 11:44am

    No Subject Given

    The above points about cutting out QA, and about security issues in underlying software and hardware, are absolutely real. We face the "we don't have money for QA" problem every year when the budget is outlined.

    Here's another example of the security-vs-bugginess debate. Our help system uses compiled HTML help files. Suddenly our help systems didn't display the HTML when accessed across the network. It turns out the problem was a security fix implemented in an OS patch. So now what? Do we hold the OS company liable? Their developers, for not foreseeing this problem years before we developed our help system? No.

    We will deal with it and make things work eventually; for now, we have a workaround. Times change, businesses change, software changes, security issues change. It's not as if the technology world is static, and nobody has a crystal ball. Instead of wasting time and money pointing fingers, put the effort and money into positive change, into solutions, and MOVE ON!

    (stepping down from soap box...)
    ;)


  • Y Pennog Coch, 12 Oct 2005 @ 12:54pm

    A second computer for all would be cheaper

    Fit a second network in your office, or use wireless, and buy everyone a small, quiet mini-ITX PC (they're fast enough for surfing) and a KVM switch.
    Make it clear that frivolous use of the work systems instead of the ITX box will be a sacking offence. For personal mail, get a webmail account and access it from the ITX box. Similarly, placing commercial data on the ITX boxes is a serious offence.
    Now you can lock down your original business network and apply some aggressive filtering.
    If anyone wants to check out a filtered site for business purposes, they do it from the ITX box and then put in an unblocking request. Requests could be granted automatically for established employees and for domains that aren't explicitly blacklisted.
    Now when a PC goes down to a security hole, 19 times out of 20 it'll be one of the ITX boxes: there's no data to rescue, and all the IT guys have to do is boot it from a recovery disc. And if you use Linux, you can leave the discs with the employees.
    If this sounds too expensive, well, as long as Moore's Law keeps going it can only get cheaper.


    • Jeff, 12 Oct 2005 @ 5:17pm

      Re: A second computer for all would be cheaper

      This made me chuckle... He he he he...


