
Cure For Bad Software? Legal Liability

satch89450 writes: "SecurityFocus had a column that I missed when it was first published a few days ago, titled 'Responsible Disclosure' Draft Could Have Legal Muscle, but I discovered it while researching an answer to a comment on the CYBERIA mailing list. In the article, Mark Rasch discusses how the draft would set the rules for reporting security vulnerabilities, and in particular define the boundaries of liability assumed by bug-disclosers. By adopting a "Best Practices" RFC, the IETF could help the reporters of security-related bugs do their job, and put the onus of fixing the bugs on the vendors who make the mistakes, where it belongs. (The RFC draft described in the article, 'Responsible Vulnerability Disclosure Process,' is here at the ISI repository.) This is, of course, in direct opposition to the process that Microsoft's Scott Culp, Manager of the Microsoft Security Response Center, would like to see. As Microsoft is more part of the problem than part of the solution, I believe that the path to a formal process would better serve the entire community - and that community includes Microsoft's customers. I'm taking this seriously because the mainstream press is talking about the issue, and about what it's going to take to fix it. Here is an example from BusinessWeek that scares me silly. I'm glad I'm looking to change careers from software development to something safe, like law."
This discussion has been archived. No new comments can be posted.

  • by BWS ( 104239 ) <swang@cs.dal.ca> on Wednesday March 13, 2002 @03:47PM (#3158457)
    If we have software liability, then we also open "Open Source" software to liability....

    It would be crazy to say that "Open Source" has no liability while "Closed Source" does...
  • by FortKnox ( 169099 ) on Wednesday March 13, 2002 @03:48PM (#3158464) Homepage Journal
    MS has bigtime lawyers.
    So does Oracle.
    So does Sun.

    But your average startup? They don't have guaranteed paychecks, sometimes.

    This only hurts the little guy.
  • by marian ( 127443 ) on Wednesday March 13, 2002 @03:50PM (#3158485)

    Just consider that Microsoft is trying to get a version of Windows used by car companies for the computers that control various functions in the cars they sell. Regardless of my opinion of Microsoft's business practices in the marketplace, the thought of a Windows version controlling something like the fuel/air mixture in my vehicle terrifies me. Software vulnerabilities and bugs are moving into the realm of causing deaths. Is this what it takes to force accountability onto the companies who market buggy code?

  • by tiltowait ( 306189 ) on Wednesday March 13, 2002 @03:54PM (#3158505) Homepage Journal
    Perfect, and finished.

    [Insert your own Daikatana lawsuit joke here]
  • Fallout (Score:5, Insightful)

    by Petersko ( 564140 ) on Wednesday March 13, 2002 @03:55PM (#3158525)
    Should such a situation come to pass, the fallout would include:

    1) Higher development costs
    2) Far fewer small companies in consulting
    3) Shrinking job market for new grad coders
    4) Larger legal costs on both sides of the fence

    On the brighter side, it would also include:

    1) Lessening of age discrimination - experience outweighs youth
    2) Alteration of programming education to focus on security
    3) Higher standard of programming excellence
    4) Self-policing. Companies who fail to adhere will run themselves right out of business in short order.

    Finally, legal liability for Open Source projects is not a bad idea at all.
  • by Anonymous Coward on Wednesday March 13, 2002 @03:56PM (#3158533)
    If I'm using a tool, component, or class library from a 3rd party, what happens if the vulnerability is in their code? As a contractor, would I have to spend $10,000 in legal fees just to prove it's Borland or MS or Sun's fault? Besides, how can you guarantee 100% that anything is safe? With the lawsuit-happy society we have today, the smallest mistake could put even a medium-sized company right out of business. And if you think this will help open source, it won't. Would you use "free" software that has no liability while commercial software does? Would you get a "free" operation from a doctor with no liability, or pay for one from someone who does?
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Wednesday March 13, 2002 @04:00PM (#3158568)
    Comment removed based on user account deletion
  • Blame Trend (Score:3, Insightful)

    by ksw2 ( 520093 ) <[obeyeater] [at] [gmail.com]> on Wednesday March 13, 2002 @04:01PM (#3158571) Homepage
    This topic, paired with the previous story about the USAF laying blame on Microsoft, seems to indicate a new trend... blame the software company! Does this mean that small companies will get sued out of existence? Yet another advantage for big business?

    What about considering alternatives and choosing wisely? If it's such a crucial point, why not take open source and audit it thoroughly, like the NSA?

    I think a lot of people will probably see this as a good thing, forcing M$ to take responsibility... but I think it has the potential to lay a huge disadvantage on the competitors as well.
  • Wrong (Score:4, Insightful)

    by The Cat ( 19816 ) on Wednesday March 13, 2002 @04:03PM (#3158603)
    Software companies don't spend enough time on design and testing the product before it's made public

    No. Managers at software companies don't spend enough time listening to their own #%*@$*)(#^& engineers who are ignored, ridiculed, shouted down, laid off, downsized, or outright fired when they point out repeatedly that the product is being developed WRONG.

    Of course, it's always better to be a team player. Just sign up for the donut list, keep your mouth shut and wear a big smile, BIG SMILE at all the meetings. That's how the job is kept.

    Competence, craftsmanship and professionalism are no longer of any value in the workplace, and until they are, it will be impossible to fix these problems.
  • by juliao ( 219156 ) on Wednesday March 13, 2002 @04:08PM (#3158644) Homepage
    How far can you take liability?

    If I give you a car, am I liable for the fact that it has no brakes? What if I sell you a car?

    What if I give you a tool? Am I liable if it breaks, and breaks whatever you were trying to fix with it, too? What if I sell you one? What if I sell you one and say that it's rated for the work you're trying to do, but it still breaks?

    See the differences?

    Now for software:

    What if I give you a binary? Am I liable if it doesn't work? Am I liable if it has flaws?
    What if I sell it to you? Am I liable then?

    Now for something completely different: source code. What if I give you source code? It's available for your inspection... Can we say that source code documents itself? If you are worried about what the code does, you can read it, compile it, debug it, step-trace it. Source code is NOT a program; it's closer to an algorithm than to a program. Can I be sued for giving you instructions on how to tell your computer to do something?

    If source code is just instructions, directions for a computer, then source code starts to look like something different, and precedent must come not from binary software but from things like legal advice.

    And you know how that goes... IANAL, so I can say anything, you take my word if you want to. So, if IANAP (not a programmer), can I give you whatever source code I want, and I won't be liable?

    And who defines what a programmer is? The ACM?

  • Merchantability (Score:4, Insightful)

    by mcrbids ( 148650 ) on Wednesday March 13, 2002 @04:10PM (#3158657) Journal
    At heart here, and often forgotten, is the issue of "merchantability". What is that? It's the assurance that something is saleable, that reasonable expectations of performance can be made, and that the product does, in fact, perform its intended function.

    Because of this, it can be SOLD. If I sell you a keyboard for $20, you now have the expectation of merchantability. It is expected to work, and both reasonable business sense and many local and federal laws require that if it does not, I either provide something that works, or give you your money back, within a reasonable period of time. (14 days in California)

    If we re-institute the concept of merchantability in software, all that would happen is that you could get your money back - thus little to no effect on OSS software.

    Red Hat may be impacted, but since they are already selling services rather than products (you can download all their stuff for free) even they would be minimally affected.

    So, as an advocate of open source and "free" software, I welcome the issues of product liability and the enforcement of merchantability. It would improve the industry, force it to get better, and would finally provide its customers what they've been promised all along - a better, easier life!

    What should happen? A date should be set as a software "merchantability horizon". Products released before that date would be exempt and could continue on their merry way; any product released or sold after that date would have to fit the definition of merchantability.

    Can you imagine how many people would upgrade their Windows if they knew that MS would be liable thereafter if it screwed up?

  • by tkrotchko ( 124118 ) on Wednesday March 13, 2002 @04:10PM (#3158671) Homepage
    In theory, this should help the little guy and open source, because they could be more responsible to their customers.

    But in fact, it will have the opposite effect. It means that software will have to be "certified" before it could be released.

    Little developers (guys in their basement) could never afford this. Big guys (Microsoft) could. Again, this favors big, established companies over upstarts.

    But more seriously, let's look at the worst issue with having liability for insecure software:

    If I have a Firestone tire (as mentioned in one of the links), I expect that it will be safe to put on my car and drive up to the speed rating on the side. But if I used the tire as a swing in my backyard and I fell off and broke my arm, should Firestone be liable? After all, a lot of people use tires for swings, and they didn't do anything to make them safer for this purpose.

    Silly? Maybe. But now apply it to something like a computer operating system. What is its intended purpose? Basically, its purpose is infinite: it allows a piece of hardware to have infinite possibilities. So now I have to make sure my software is safe in any possible circumstance, even ones I can't foresee!

    Mind you, I'm not excusing bad software, but I don't see how this proposal will do anything, because a new license will come out that people will simply have to accept, something like:

    "I accept that if I use this software it is completely insecure and will allow bad people to do bad things to me and my computer. I completely waive all rights to bring legal action again the makers of this software, even if they knew there is or was a problem. "

    This is a "good in theory, bad in practice" solution.

  • by Petersko ( 564140 ) on Wednesday March 13, 2002 @04:11PM (#3158675)
    Not reasonable. For a project of any complexity, verifying the integrity and correctness of the code is a financially gigantic undertaking. If you disagree, I have a favor to ask.

    I'm kind of concerned about using this Apache product. Would you mind trundling off and verifying the integrity and correctness of all the source code please? Oh yeah - and if it includes standard libraries I need those verified as well.

    Can you get that done before the weekend? I was hoping to install on Saturday.
  • by ip_vjl ( 410654 ) on Wednesday March 13, 2002 @04:13PM (#3158693) Homepage
    Unlike the 'real world' example of the tire mentioned in the BW article ... software developers have a much harder time controlling the environment in which their software is used.

    For example, if I buy a car tire from Firestone but instead use it on some home-built dune buggy that I drive over lava fields in Hawaii, and the tire blows (flipping me into the lava), should Firestone pay? I wasn't using the tire according to the specs they published for it.

    Imposing liability on software will only force software manufacturers to list hardware/software configurations on which they are willing to accept liability. If you use the software outside of that configuration, then you're on your own. My guess is that this would disqualify just about everybody, as they'll only be able to certify a limited amount of equipment (as it will entail actually owning that equipment to test).

    I mean, would you accept liability on a product that can be used on a multi-use computer that may have god-knows-what software/hardware config?

    So this will lead to something like:
    • the back of the software box listing the exact system requirements that the software is good for (and liable on), and if you use it outside of that environment, you're no longer using the software as it was intended.

      Which then just gives software companies even more reason to offer less support, as they'll then only need to offer support on their specific hardware, or risk the liability of condoning the use of their software on unsafe/untested environments.
    • more incentive to legislate the demise of the multi-use computer in favor of locked computing appliances ... which is exactly what a number of people would like (think DRM)


    Think about it.

  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Wednesday March 13, 2002 @04:14PM (#3158700)
    Comment removed based on user account deletion
  • by gregfortune ( 313889 ) on Wednesday March 13, 2002 @04:15PM (#3158712)
    OK, so I'm currently working on an auction system that is in use by at least one company. They ask for a change in the software so the commission percentages that are charged to their consignors are handled in a slightly different way. I make the change and, under certain conditions, it's now possible for the consignor to be charged half of what they should be. I can see there should possibly be some liability here, especially if I were "selling" the product.

    btw, none of the things I'm listing here ever happened, I'm just supposing...

    Now, they ask for a change that resizes the storage size for the Notes for each customer. I make the change, but my code does not also make the change to their database schema. I provide a separate script that does that. The customer installs the upgrade, but does not upgrade the db. Who is liable? Can I be held liable for not making my upgrade *easy* enough if the client forgets to run the db upgrade script and loses data?

    Let's go even further. I use MySQL for the db, python-mysql for the db module, Python for the language and Qt for the interface. ReportLab is being used for pdf generation, lpr for printing, X-windows for launching the program, KDE for the desktop manager, and Acrobat Reader to parse the pdf files into ps for printing. Without these things, the program will not run.

    Now, due to a bug in MySQL, the company finds that it is losing n*$50, where n is the number of items in the auction, for every auction. Perhaps the $50 entry fee is not getting stored correctly, and suppose that's a database problem. Who's liable? Me, for building on an existing system that isn't totally stable? The db? Maybe in this case it's clear the db maker would be held responsible.

    Now let's lose some data because MySQL was not *configured* correctly. Whose fault now? Customer, me, or MySQL?

    Lastly, let's lose some data due to a bug in the database that was caused by an ambiguity in the API of glibc that allows a function to be called in a way that was not intended, works as expected most of the time, but is clearly not a bug when it doesn't work the expected way. Who now? MySQL? The library they used? Me, for using MySQL? The customer, for being stupid enough to hire me when I'm not even competent enough to ensure the tools I use have absolutely no bugs in them? ARGH!

    I'll tell you one thing... I'm never associating my name with a general library if this kind of thing goes through. Blame would very often be passed back down the chain as far as possible in search of a scapegoat other than yourself.
  • by jc42 ( 318812 ) on Wednesday March 13, 2002 @04:17PM (#3158728) Homepage Journal
    As a programmer, I have often given a simple explanation of why I can't write reliable software. On most vendors' computers (Microsoft obviously, but also Sun, HP, IBM and most of the rest), the inner workings are totally hidden from me. I can't even in principle know what a lot of my code will do in all cases, because I must make calls to the underlying system and its libraries, and the code for these things is a proprietary secret.

    What I usually use as a parallel is: imagine that the people who built buildings or bridges were required to use commercial steel and concrete, but the specs for these materials were trade secrets. Imagine that construction firms had to use whatever material was delivered, and were not permitted to see its specs. There would be no way that anyone could calculate the effect of loads and stresses, and things would fall down under load.

    This is how software is built.

    On Open Source systems, it's somewhat different, because the source is available. But even there, you can only understand the system "in principle". You usually don't have the time it would take to thoroughly investigate all the components that you use. Open Source software does generally work better, true, but it's not because every programmer has examined every piece of the source. It's because a lot of them have examined a few pieces, and they can tell each other about problems (and fix them).

    This probably has significant legal impact. Consider the construction parallel again. If I design a structure and specify materials of a certain quality, those materials are used, and the structure collapses, I am probably liable. But if the material vendors substitute material with different properties (usually for cost reasons), all I need to do is show in court that the material didn't meet my specs. I'm not liable, and the vendors end up facing some serious fraud charges.

    With software, this sort of fraud happens routinely, with all sorts of system components that are delivered knowing that they don't do what the manuals say they do. Or the vendors don't even bother checking that things work right, because they know they can't be held liable. Then people hire programmers like me to write software using such shoddy systems, and expect us to write reliable software on top of it. Then it turns out that some parts of the system have "undocumented features", and the code doesn't work right.

    Until we find a way to force reliability on the Microsofts and Suns and IBMs of the world, the way we have with companies that sell steel and concrete, there's no way whatsoever that programmers can ever write reliable software.

  • by SomeoneGotMyNick ( 200685 ) on Wednesday March 13, 2002 @04:23PM (#3158769) Journal
    Maybe it should be looked at this way....

    If you purchase software in which the purchase amount benefits the commercial entity who develops the software, you are entitled to legal recourse in the event of failure due to the software. A guarantee of serviceability, if you will.

    On the other hand, if you wish to be absolved of legal liability for software you create, then offer it for free, like most GPL software is.

    I think this would be great for some of the excuses for shareware out there. If you charge a shareware fee, it had better work. I've found freeware that works better than the shareware alternatives.

  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Wednesday March 13, 2002 @04:23PM (#3158774)
    Comment removed based on user account deletion
  • by AdamBa ( 64128 ) on Wednesday March 13, 2002 @04:26PM (#3158795) Homepage
    First of all, I don't like these "soft" RFCs (aside from joke ones) that are not technical.

    Second of all, the RFC really has no force given the RFC language. The two key provisions, that companies SHOULD fix holes within 30 days, and that customers SHOULD apply patches in a timely manner, can both be ignored since "SHOULD" in RFC-speak is different from "MUST".

    Thirdly, this RFC is a bit too targeted at Microsoft:

    1) The Vendor SHOULD ensure that programmers, designers, and testers are knowledgeable about common flaws in the design and implementation of products.

    2) Customers SHOULD configure their products and systems in ways that eliminate latent flaws or reduce the impact of latent flaws, including (1) removing default services that are not necessary for the operation of the affected systems, (2) limiting necessary services only to networks or systems that require access, (3) using the minimal amount of access and privileges necessary for proper functioning of the products...

    This is too "ripped from today's Microsoft headlines". This stuff about removing default services is bogus. Something like UPNP in Windows (designed to makes things easy for novice users) is useful only if it is turned on by default. Anyway what does "not necessary for the operation of the affected systems" mean. You can run Linux without a GUI...so if an exploit is found in KDE or Gnome will someone jump up and say, "You enable the GUI by default and it wasn't necessary and you violated the RFC"? The solution to flaws in UPNP to not ship with them, not to disable everything in the box.

    Fourth, what the heck is this supposed to mean:

    7) The Customer SHOULD give preference to products whose Vendors follow responsible disclosure practices.

    Can we please keep the social engineering out of the RFC -- this is an absurd requirement to put in there. Why not just say "Customers SHOULD give preference to open source software because we think it's k3wL"?

    - adam

  • by dasmegabyte ( 267018 ) <das@OHNOWHATSTHISdasmegabyte.org> on Wednesday March 13, 2002 @04:28PM (#3158801) Homepage Journal
    I think a lot of software is released buggy as hell simply because investors and customers expect development houses to show results very quickly. Many contract jobs are six months or shorter, barely enough time to come up with a dog & pony slideshow of great software, let alone develop a secure product. Most developers depend on tools from other companies to cover the gaps in the process -- tools like IIS and Apache.

    The problem lies with the fallacy of internet time -- that software advances can keep up with hardware advances. The difficulty here is that Moore's law is based on years of research -- an advance in memory that doubles the speed next year will have begun five years or more ago with tons of R&D. Software doesn't really have that luxury -- it's all about the now.

    One might say that this sort of demand is a requirement in business -- but in many ways, it's a self maintaining fad. Look at biotech -- a biotech company might do research for dozens of years before they can release a new drug or procedure. They have amazingly tedious checks and balances. Why? Because human lives are at stake. Because a single slip up will cost them millions in malpractice.

    Holding software companies liable for security failures is a great idea in the respect that it will force dev houses to make better software. But in the process, something will have to be done about the expectation that software is a need-it-now sort of deal.

    As a side note: this sort of legislation would be a godsend for contract programmers. If company X has to wait years for a secure product to come out of Microsoft or hire somebody now to do the work cheap and sign off on the liability, they'll probably choose the latter. It'll also cut down on the feature blitz of new products that is driving the increased need for pay-for-play software licensing.
  • by chris_sawtell ( 10326 ) on Wednesday March 13, 2002 @04:49PM (#3158973) Journal
    A large proportion of the security problems would just go away if the subroutine return address were stored in a separate memory area from the data area. This would make the buffer overflow / stack-smashing type of attack impossible. It's such a simple idea that I am amazed it was not implemented long ago. There must therefore be something wrong in my thinking; what is it?
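
    A minimal C sketch of the stack-smashing pattern being described, assuming the usual layout where local buffers and the saved return address share one stack (an illustration, not something from the original post):

        #include <stdio.h>
        #include <string.h>

        /* Deliberately unsafe: buf and the saved return address share the stack. */
        void greet(const char *name)
        {
            char buf[16];
            strcpy(buf, name);          /* no bounds check: an argument longer than
                                           15 characters writes past buf and can
                                           clobber the saved return address */
            printf("Hello, %s\n", buf);
        }

        int main(int argc, char **argv)
        {
            if (argc > 1)
                greet(argv[1]);         /* e.g. a 40-character argument smashes the stack */
            return 0;
        }

    With return addresses kept on a separate stack, the same overflow could still trash buf and neighbouring locals, but it could no longer redirect control flow, which is the part attackers rely on.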
  • by debrain ( 29228 ) on Wednesday March 13, 2002 @05:25PM (#3159258) Journal
    Recall that an American destroyer was rendered dead in the water as a result of NT crashes, and space shuttle missions were rendered write-offs because of NT crashes. Not to pick on NT, but these are cases where lives did depend on software. Death is just an example of liability.
  • Re:Fallout (Score:3, Insightful)

    by lynx_user_abroad ( 323975 ) on Wednesday March 13, 2002 @05:39PM (#3159358) Homepage Journal
    Yes, it would be the end of Open Source. Who in their right mind would code for a project part time if it meant they were legally liable for anything that might go wrong with it?

    There is no software you can write which I cannot make faulty with the right (wrong) compiler. And it doesn't matter how good a programmer you are, or how simple the program you wrote.

    It makes no sense to hold the author of the software liable for faults, because the faults could be introduced by the compiler, or by the later stages of deployment and configuration. So there should be blanket immunity for anyone who vends software in source form, under the theory that anyone who has access to the source must exercise due diligence to ensure that the software is appropriate for the situation in which it is deployed.

    On the other hand, vendors who deliver software as a pre-compiled binary must assume some liability, as the consumer is no longer in a position to exercise due diligence.

    This would be a win for free software developers, as long as they only deliver code as source; no liability.

    This would be a win for companies like RedHat, who would be able to offer pre-compiled free software, and assume some of the liability for making sure it was compiled correctly.

    This would be a win for anyone who uses software, because vendors would ensure their products have fewer faults, under threat of liability.

    This would be the death blow for Microsoft, because (as a company which vends primarily pre-compiled binaries) they would be fully liable for the software they ship, and would be fully responsible for detecting and correcting their own faults.

    I say bring it on.

  • by neuroticia ( 557805 ) <neuroticia AT yahoo DOT com> on Wednesday March 13, 2002 @05:40PM (#3159363) Journal
    There should be multiple levels of liability for faulty source code, just as there are multiple levels of liability in other areas of the law.

    Open source should not automatically be excused from all liability. If a bug exists and a sizeable amount of time passes with no fix, while new users are downloading and using the product *without being warned*, then the maintainers of the source should be held liable. Open source vendors should be required to post an updated list of bugs as they appear and to fix them before releasing the next version of the software.

    Commercial software vendors should be given a certain amount of time to remedy a problem, based on its severity and spread, and should incur fines for each day/week/month until the issue is resolved. Registered users of the software should be notified both when the bug is discovered and when the fix is released. All users should be able to access the information via the Internet. A new version of the software cannot be released until known bugs in the last version are fully patched.

    The liability of vendors should be clearly outlined and have the same tiers and exceptions that current liability laws have. It should be clear that vendors are not responsible for misuse of intended features of their systems (i.e., Linux developers are not responsible for warning people that rm -rf / will trash their system), and vendors' liability will be determined on a set of criteria: a.) software version number -- it should mean something again; b.) intended impact of the software -- vendors of backup software will be held to a higher standard if their software fails than would the creators of games or graphics software.

    Vendors should not be allowed to attempt to silence those who make bugs public knowledge. There should be fines for companies that try to initiate lawsuits against third parties for publishing bug reports, examples of exploits, or other information. Perhaps there should be a certain set of guidelines as to the "release schedule" of those bug reports, however. Exploits can only be made publicly available after a patch is available, bug reports can be made as soon as the bug is discovered, etc.

    I think software liability is a good idea as long as it's not a loosely interpreted law that is applied equally to all vendors regardless of software genre and company size.

    -Sara
