Why Making Developers Liable For Security Vulnerabilities Won't Work
from the it's-a-problem dept
It seems you can't go more than a few months without someone, somewhere bringing up the question of whether developers of software products should be liable for the security vulnerabilities in those products. Frustration with buggy software runs high, and so out come the suggestions that lemon laws should apply and that execs at companies that ship buggy software should go to jail. The problem, though, is that software will remain insecure no matter what's done, and adding liability might actually make things worse.

Now along comes Howard Schmidt saying that, instead of companies being held responsible, the actual developers of products should be held liable for security vulnerabilities. This is the same Howard Schmidt who announced ten months ago that technology would solve the phishing problem within a year. He's got two months to go, and last we checked, phishing was still a growing problem. Should we hold him liable for falsely claiming that phishing would be gone? That's the crux of the problem: no matter how careful a developer is, there will always be holes he can't foresee.

Making developers personally liable will only cause a few things to happen. Vastly fewer programmers will be willing to work on security issues, since it's just not worth the risk, and fewer companies will even try to make security products. Also, just about every product you buy will come wrapped in pages and pages of legalese telling you the product isn't secure at all, as vendors try to lawyer their way out of liability. None of that helps anyone actually build more secure applications. Schmidt is right that developers need to be better trained in computer security, but that doesn't mean adding liability without looking at the unintended consequences of such a move.

Update: There's an update to this story, in which Schmidt clarifies that he was talking about accountability, not liability, and that the ZDNet article misconstrued his comments.
Reader Comments
Why Making Developers Liable For Security Vulnerab
Re: Why Making Developers Liable For Security Vuln
What really kills me about this is how biased it is. In effect, it says let's punish all the techno-serfs instead of holding the corporation's management responsible for its decisions.
Re: Why Making Developers Liable For Security Vuln
And if you really want to get rid of bugs, you have to start with the operating system, and maybe as low as the hardware. All of this technology is built on top of itself, and if a layer below has a bug, it will show through to the programs above it.
Read the EULA
What about the underlying code?
Re: What about the underlying code?
It was your choice to use XX Software. If you didn't research the company before purchasing the software, you have only yourself to blame here.
And if there is no other software that will get the job done except the one with "bugs" or "security holes", then that is part of the price you pay for wanting to use that software.
Humans are too quick to blame others when things don't work 100% as expected. I am sure it was hardly the developer's intent to create "flawed" software, or maybe it was?
Signed, Happy user.
No Subject Given
Here's another example of the security/bugginess debate. Our help system uses compiled HTML help files. Suddenly our help systems stopped displaying the HTML when accessed across the network. It turns out the problem is a security fix that was implemented in an OS patch. So now what? Do we hold the OS company liable? Their developers, for not seeing this problem years ago, before we developed our help system? No.
We will deal with it and make things work eventually; for now we have a workaround. Times change, businesses change, software changes, security issues change, etc. It's not as if the technology world is static, and nobody has a crystal ball. Instead of wasting time and money pointing fingers, put effort and money into positive change, into solutions, and MOVE ON!
(stepping down from soap box...)
;)
A second computer for all would be cheaper
Give every employee a second, cheap PC (an ITX box) for personal use. Make it clear that frivolous use of the work systems instead of the ITX box will be a sacking offence. For personal mail, get a webmail account and access it from the ITX box. Similarly, placing commercial data on the ITX boxes is a serious offence.
Now you can lock down your original business network and apply some aggressive filtering.
If anyone wants to check out a filtered site for business purposes, they do it from the ITX box then put in an unblocking request. Requests could be granted automatically for established employees and domains that aren't explicitly blacklisted.
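For illustration only, here is a minimal sketch of what that auto-grant rule might look like. All the names, thresholds, and the request record below are hypothetical assumptions, not anything specified in this comment; the point is just that the rule is simple enough to automate.

```python
# Hypothetical auto-approval rule for unblocking requests (illustrative sketch only).
from dataclasses import dataclass

# Assumed policy knobs -- not taken from the comment above.
MIN_TENURE_DAYS = 180                       # what counts as an "established" employee
BLACKLIST = {"badsite.example", "malware.example"}  # explicitly blocked domains

@dataclass
class UnblockRequest:
    employee_id: str
    tenure_days: int
    domain: str

def auto_grant(req: UnblockRequest) -> bool:
    """Grant automatically when the requester is an established employee
    and the domain is not explicitly blacklisted; everything else goes
    to a human reviewer."""
    established = req.tenure_days >= MIN_TENURE_DAYS
    blacklisted = req.domain.lower() in BLACKLIST
    return established and not blacklisted

# Example: a two-year employee asking for a non-blacklisted supplier site.
print(auto_grant(UnblockRequest("e123", 730, "supplier.example")))  # True
```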
Now when a PC goes down to a security hole, 19 times out of 20 it'll be one of the ITX boxes: there's no data to rescue, and all the IT guys have to do is boot it from a recovery disc. And if you use Linux, you can leave the discs with the employees.
If this sounds too expensive, well, as long as Moore's Law keeps going, it can only get cheaper.
Re: A second computer for all would be cheaper