Outlining a World Where Software Makers Are Liable For Flaws
CowboyRobot writes with this piece at the ACM Queue, in which "Poul-Henning Kamp makes the argument for software liability laws. 'We have to do something that actually works, as opposed to accepting a security circus in the form of virus or malware scanners and other mathematically proven insufficient and inefficient efforts. We are approaching the point where people and organizations are falling back to pen and paper for keeping important secrets, because they no longer trust their computers to keep them safe.'"
Re:You can't trust code ... (Score:5, Interesting)
That reminds me of an anecdote one of my CS professors mentioned. When fly-by-wire technology for passenger planes was starting to be rolled out, they polled people about their willingness to fly on a plane controlled by a computer. One of the groups with the most negative responses was programmers. For everyone else, the software is just magic.
All we need is Love (Score:4, Interesting)
... All we need is love and Free Software. And even the love is not strictly a requisite.
Let's say everyone owns Free software, so nobody (i.e. everybody) is liable for faulty Free software. Everybody (i.e. nobody) pays.
In other words, sure, let the proprietors of proprietary software pay for software behaving badly.
If the software is free it's everybody's and nobody's responsibility. It's like culture and language in general. We do it together.
Who's with me?
Don't trust applications, ever (Score:4, Interesting)
The responsibility for preventing security problems with PCs should fall in exactly two places, the user and the OS... however, not the way 99.99% of you are thinking about it.
The user should decide what resources a program NEEDS in order to do a task, such as which folders it can access, what network connections it can make, etc. This allows the user to decide ahead of time what they are willing to risk. Once that determination is made, the user would give that list, along with a pointer to the program, to the operating system.
The OS should then enforce the user's choices... if it's not in the list, the application shouldn't even be able to find it, let alone access it. If the OS fails to enforce the user's will, then the OS is at fault... if the user gave away the store, well... they gave away the store.
This requires a simple change to the base design of operating systems: instead of permitting everything and merely limiting a running program to the rights of the user's account, the OS should limit the program's actions to a short list of resources supplied by the user... and nothing else. Of course, refactoring everything to add this additional layer of logic is a massive undertaking.
There would still be the traditional user rights, access control lists, etc... but there would also be a level of control where the user decides which of the resources they have should be given to the application. This is called "capability-based security", or cabsec for short; a rough sketch of the idea follows.
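To make that concrete, here is a minimal user-space sketch of the deny-by-default idea in Python. It only illustrates the model, not a real OS mechanism; everything in it (the Manifest class, guarded_open/guarded_connect, the example paths and host) is made up for the purpose:

```python
# Hypothetical sketch of "user supplies a capability list, OS enforces it".
# Illustration only; these names are not a real OS API.
from pathlib import Path

class CapabilityError(PermissionError):
    """Raised when a program touches a resource the user never granted."""

class Manifest:
    """The short list of resources the user is willing to risk."""
    def __init__(self, allowed_dirs=(), allowed_hosts=()):
        self.allowed_dirs = [Path(d).resolve() for d in allowed_dirs]
        self.allowed_hosts = set(allowed_hosts)

    def permits_path(self, path):
        p = Path(path).resolve()
        return any(p == d or d in p.parents for d in self.allowed_dirs)

    def permits_host(self, host):
        return host in self.allowed_hosts

def guarded_open(manifest, path, mode="r"):
    # Deny by default: if it isn't in the user's list, the program can't have it.
    if not manifest.permits_path(path):
        raise CapabilityError(f"not granted: {path}")
    return open(path, mode)

def guarded_connect(manifest, host, port):
    if not manifest.permits_host(host):
        raise CapabilityError(f"not granted: {host}:{port}")
    import socket
    return socket.create_connection((host, port))

if __name__ == "__main__":
    # The user decides ahead of time what this program may touch...
    manifest = Manifest(allowed_dirs=["/tmp/photos"],
                        allowed_hosts=["updates.example.com"])
    # ...and anything outside that list simply fails.
    try:
        guarded_open(manifest, "/home/user/.ssh/id_rsa")
    except CapabilityError as e:
        print("blocked:", e)
```

The point of the design is that the default answer is "no": the program never gets to see or touch anything the user didn't list, which is exactly the inversion of today's "everything the user can do, the program can do" model.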
It's going to take somewhere between 10 and 15 years before people are fed up enough to make the switch.... but it will happen eventually.
Security isn't an application issue... it never was, and never will be.
Re:Sure (Score:2, Interesting)
He said none of what you allege.
Nothing says a mason isn't capable of designing a good bridge, either. Simply that without the credentials, there's no trust.
I think this has more to do with the fact that software is a relatively new area that grew rapidly. It began with computer science degrees, because it was more a science than engineering. Now, computer science degrees - from what I've seen - still focus more on abstract minutiae and theory, while software engineering has more classes in forward-looking design and sustainability and, as with all engineering degrees, ethics. Which sucks for the people who went to school before software engineering existed as a degree, now that there's that perception of difference.
Your story says the same thing. First you were hired as a hacker (not an engineer) and that's the kind of thing a CS degree is better for. Industry matured, your credentials didn't. I think it's pretty obvious that it has to do with your degree. Sucks for you, but you could do something about it, other than make ridiculous claims about this conspiracy that only engineers can be good software designers, and that this somehow justifies (or is even at all related to) the anon and lulzsec tards.
Truth of it is, as software has grown into an industry that lots of things depend on, people want engineers to be supporting it, just like designing bridges or buildings or computer chips or cars. It's not that no one else can do it; it's simply that those with engineering degrees have verifiable credentials for doing it, and that historically societies place trust in engineers. It's not taken so seriously in the US, but in Canada for example, all practicing engineers have to be registered and take an oath and whatnot. Not unlike those practicing medicine.
If hacking is what you want, major software isn't the place for you to be working. Try getting a job as an indie game dev or something, or shoring up your credentials. As (I suppose) a CS grad, it shouldn't take much for you to pick up the SE degree. If you've been good at your job, chances are your employer will support you to keep you around.
But if you're just going to whine profanely on /. and blame everyone else, then I have no sympathy. Life isn't always fair. Man up and make the best of it.
Re:Sure (Score:5, Interesting)
So a PE can get out of being liable for a badly designed bridge by putting the blueprints and the bill of materials on a sign before you get on the bridge?
There is a point where I agree that programmers should be liable for their code, to the extent that it shows negligence. The fact that software has for so long gotten away with a "good luck, thanks for the cash" mentality is kinda sad.
I am a programmer, and I would be willing to stand behind my code used in the environment for which it was intended... but at the same time I would want to be compensated for the risk, the same way a PE gets compensated based on the scope of work they have to sign off on.
What truly irks me about discussions such as this is that everyone wants to lay the blame on the programmer. It is the organization that is at fault. Matter of fact, the responsibility for a defective software product lies squarely with upper management. Frankly, I just don't get this perceived need to roast programmers and software engineers alive, when defective designs in every other industry cause harm, and nobody talks about throwing those engineers under a bus.
I work in an industry where my codebase, if it were to malfunction in any serious way, would be a major problem for some rather large plants worldwide. But here's the thing: if the responsibility (and legal penalties) for such problems were mine, and mine alone... well, guess what. I wouldn't be a software engineer anymore. Why should I go to jail, or be bankrupted with legal fees, when I did a perfectly competent job, but a bug still managed to get by QC? Might as well put the QC team on the hot seat too: they're the ones that missed it. Fact is, the corporate veil is there for a reason.
Standing by your code is one thing: taking legal responsibility for a finished, shipping application with problems you would certainly have fixed had you known about them is something else again. Management decides who works on what project, how much (if any) quality-control time is assigned to that project, and which bugs are minor enough to defer to an update (and sometimes they're wrong about that). Management decides who to hire in the first place.
In any organization it's the people at the top (the people who get the big salaries and golden parachutes) who ultimately bear responsibility for such failures. And that is how it should be: they make the big decisions, they're the ones who allocate resources. Your average code monkey is no more at fault for a product failure than the janitor. That's why, unless there's gross mismanagement, it's the company that is penalized, not the individual employees. There are supposed to be checks and balances. Face it people: we know how to do code right, but most vendors simply don't want to spend the money.
That bridge you were talking about is a perfect example: the reason bridges so rarely fail because of design flaws is that those designs are reviewed and cross-checked and signed off on by a slew of other engineers and designers who make sure the design is solid. It's that way because nobody is perfect. Again, who decides how much code review and design assurance is necessary? Yeah, you got it: management.
All the disclaimers in the world don't mean squat in court if your software causes significant economic or physical harm. The company that produced it (not the individual developers) certainly can be sued and redress granted. But penalizing individuals for systemic problems within a given organization? Even discussing that is patently ridiculous.
There's no good reason to burn engineers at the stake. Plenty of reason to boil a lot of CEOs and managers in oil though.