Outlining a World Where Software Makers Are Liable For Flaws
CowboyRobot writes with this piece at the ACM Queue, in which "Poul-Henning Kamp makes the argument for software liability laws. 'We have to do something that actually works, as opposed to accepting a security circus in the form of virus or malware scanners and other mathematically proven insufficient and inefficient efforts. We are approaching the point where people and organizations are falling back to pen and paper for keeping important secrets, because they no longer trust their computers to keep them safe.'"
Sure (Score:5, Insightful)
It will just cost 100x more, just like healthcare with its torts. Time to take out software-developer insurance, similar to the malpractice insurance of approximately a million dollars a year that doctors pay these days.
Re: (Score:3, Insightful)
Re:Sure (Score:5, Informative)
What liability?
Clause 1. If you deliver software with complete and buildable source code and a license that allows disabling any functionality or code by the licensee, then your liability is limited to a refund.
Re:Sure (Score:4, Insightful)
So a PE can get out of being liable for a badly designed bridge by putting the blueprints and the bill of materials on a sign before you get on the bridge?
There is a point where I agree that programmers should be liable for their code, to the extent that it shows negligence. The fact that software has for so long gotten away with a "good luck, thanks for the cash" mentality is kinda sad.
I am a programmer, and I would be willing to stand behind my code when it's used in the environment for which it was intended, but at the same time I would want to be compensated for the risk, the same way a PE gets compensated based on the scope of work they have to sign off on.
Re:Sure (Score:4, Insightful)
I am a programmer, and I would be willing to stand behind my code when it's used in the environment for which it was intended...
ROFL! Wow, you actually expect liability to be limited to the scope the product was INTENDED for? That ranks up there with lawsuits against toy makers because little Jimmy choked on a Lego brick or Peggy Sue shoved a jet-fighter figure up her nose and shot the plastic missile into her sinus. There is no limit to the stupid, unintended uses people will put things to. There is NO SUCH THING AS IDIOT-PROOF. The world keeps making better idiots. If this becomes law, at some point you WILL be sued. No ifs, ands, or buts about it.
Re:Sure (Score:4, Informative)
Re:Sure (Score:4, Insightful)
The fact that software has for so long gotten away with a "good luck, thanks for the cash" mentality is kinda sad
Genuinely critical software isn't usually handled like this.
The whole premise is absurd. You want guarantees? Great; we already have a handy tool of commerce for that. They're called contracts. Just a heads-up... it's going to cost more.
Re: (Score:3)
Re: (Score:3)
There is a point where I agree that programmers should be liable for their code, to the extent that it shows negligence.
This has good and bad points. A few things of note:
1) One major function of tort liability is cost-shifting: the programmer's negligent behavior causes an actual cost to the business owner who uses his software, and maybe the programmer should have to reimburse him. If he does, then part of the total cost of making that particular software--the part otherwise paid through the loss the business owner suffers--gets built into the expected costs of making the software on the part of
Re: (Score:3)
Obviously, that is an extreme case, and most software is not designed with the expectation of having lives depend on it. Just like you have different standards for military grade hardware and consumer hardware. There are a lot of options we have as a society in deciding how to treat risk.
You should see the amount of code that goes into a modern car, an elevator, or the sum of the PLCs in a plant. There is plenty of code these days that is expected not to kill its users.
Re: (Score:3)
And yet the PLC manufacturers themselves specifically disclaim using them in elevators or medical equipment, or other places where lives could be lost.
They also sell a "safe" version, but if you want it, it costs way more than the regular version (and is usually just the same product, or an older and well-known one, plus insurance). As recolidesnake said, this can be very, very expensive.
Re:Sure (Score:5, Interesting)
So a PE can get out of being liable for a badly designed bridge by putting the blueprints and the bill of materials on a sign before you get on the bridge?
There is a point where I agree that programmers should be liable for their code, to the extent that it shows negligence. The fact that software has for so long gotten away with a "good luck, thanks for the cash" mentality is kinda sad.
I am a programmer, and I would be willing to stand behind my code when it's used in the environment for which it was intended, but at the same time I would want to be compensated for the risk, the same way a PE gets compensated based on the scope of work they have to sign off on.
What truly irks me about discussions such as this is that everyone wants to lay the blame on the programmer. It is the organization that is at fault. Matter of fact, the responsibility for a defective software product lies squarely with upper management. Frankly, I just don't get this perceived need to roast programmers and software engineers alive, when defective designs in every other industry cause harm, and nobody talks about throwing those engineers under a bus.
I work in an industry where my codebase, if it were to malfunction in any serious way, would be a major problem for some rather large plants worldwide. But here's the thing: if the responsibility (and legal penalties) for such problems were mine, and mine alone... well, guess what. I wouldn't be a software engineer anymore. Why should I go to jail, or be bankrupted with legal fees, when I did a perfectly competent job, but a bug still managed to get by QC? Might as well put the QC team on the hot seat too: they're the ones that missed it. Fact is, the corporate veil is there for a reason.
Standing by your code is one thing; taking legal responsibility for a finished, shipping application that has problems you would certainly have fixed had you known about them is something else again. Management decides who works on what project and how much (if any) quality-control time is assigned to it; management decides which bugs are minor enough to defer to an update (and sometimes they're wrong about that); management decides whom to hire in the first place.
In any organization it's the people at the top (the people who get the big salaries and golden parachutes) who ultimately bear responsibility for such failures. And that is how it should be: they make the big decisions, and they're the ones who allocate resources. Your average code monkey is no more at fault for a product failure than the janitor. That's why, unless there's gross mismanagement, it's the company that is penalized, not the individual employees. There are supposed to be checks and balances. Face it, people: we know how to do code right, but most vendors simply don't want to spend the money.
That bridge you were talking about is a perfect example: the reason bridges don't fail very often because of design flaws is that those designs are reviewed and cross-checked and signed off on by a slew of other engineers and designers who make sure the design is solid. It's that way because nobody is perfect. Again, who decides how much code review and design assurance is necessary? Yeah, you got it: management.
All the disclaimers in the world don't mean squat in court if your software causes significant economic or physical harm. The company that produced it (not the individual developers) certainly can be sued and redress granted. But penalizing individuals for systemic problems within a given organization? Even discussing that is patently ridiculous.
There's no good reason to burn engineers at the stake. Plenty of reason to boil a lot of CEOs and managers in oil though.
Re: (Score:3)
No. As a software engineer myself, I see this 'blame the management, not the engineer' mindset as an unacceptable abdication of responsibility. Management doesn't have the technical expertise; the engineer does. If you're a civil engineer PE and your MBA boss asks you to sign off on a design, that's great and all, but you don't sign off unless you're sure the designs are sound and acceptable according to your trained, professional opinion. The company is paying you to make that call honestly; they did not and can
Re: (Score:3)
Real engineers build things that can kill people if they do things wrong. They have all the same pressures from management, but they still (theoretically) have standards, and licensing bodies, and, like, rules and stuff.
Yes, all of which are designed to ensure competence, not to assign blame. If an executive hires an incompetent, the fault for any future problems lies with that executive. Who is more the fool: the fool ... or the man who hires him?
Re: (Score:3)
Real engineers build things that can kill people if they do things wrong. They have all the same pressures from management, but they still (theoretically) have standards, and licensing bodies, and, like, rules and stuff.
This is part of the current problem. Software engineers are writing lots of things that can kill you, and we have no licensing body and no laws requiring a PE to sign off on specific applications. That generally means we can't be held responsible, but it cuts both ways: if we're working on a serious application, we have nothing to hold over management when we know the design doesn't pass muster. A PE must attach his signature to his work to approve it, so a PE has leverage in the ability to refuse to do so
A broken bridge kills people (Score:3)
Software Liability is just the big companies trying to take control. Nothing else (well, there's a healthy dose of fearful stupidity there, but those people are silly, so I don't count 'em).
Re: (Score:3)
Re: (Score:3)
No, you hold the aircraft manufacturer liable because they're the one who put buggy software in an airplane.
Or, if you're an aircraft manufacturer and you want the person who developed the software to assume liability, you make them sign a contract to that effect before you pay them.
Re:Crash! (Web of responsibility) (Score:4, Insightful)
"Wouldn't you hold the software developer accountable for that?"
Which gets to why this idea by itself won't work.
First, who is the "software developer" of a system that uses lots of modules from a variety of vendors (including hardware aspects)? You have an entire ocean of people involved with a big project like that, from designers to coders to testers to users...
Second, companies will just use corporate law to create liability shields where each part that could go wrong will be in its own sue-able unit with minimal assets.
Third, let's say something does go wrong, and you can point at a bit of offending code. But was that really the problem? What about the compiler that wasn't smart enough to catch a *semantic* error? What about the simulators that were not good enough to discover the bug in advance? What about the testing procedures? What about the broken CS training programs that focus on theory and not practice? What about the managers who picked a poor development platform because it was popular? When you can go up a chain (or web) of responsibility, why blame the coder at the bottom when there are so many factors involved in making that accident, some of which operate on different timescales?
This whole issue is part of the reason why things like Forth and Smalltalk were so wonderful as small and understandable self-reflective systems, but we got mainstream adoption of buggy C/C++ and bloated Java instead. When the plane crashes from a pointer error, maybe we should blame those who did not choose to support Smalltalk decades ago?
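For readers who haven't been bitten by one: the "pointer error" class of bug is specific to languages like C and C++. A minimal hypothetical C example (names made up) of the kind of mistake a memory-safe language such as Smalltalk or Java makes unrepresentable:

    #include <stdio.h>

    /* Classic C lifetime bug: returning a pointer to stack memory.
     * The buffer dies when the function returns, so the caller reads
     * garbage -- or whatever later overwrote that stack slot. */
    static char *format_reading(double value) {
        char buf[32];                      /* lives on this stack frame */
        snprintf(buf, sizeof buf, "%.2f", value);
        return buf;                        /* dangling pointer! */
    }

    int main(void) {
        char *r = format_reading(3.14159);
        printf("sensor: %s\n", r);         /* undefined behavior */
        return 0;
    }

Nothing in the language stops this from compiling; at best the compiler warns.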
Re: (Score:3)
In that particular industry, they are held accountable. This is why the software for aviation is so heavily tested and costs many times what commercial software costs.
Re:Sure (Score:4, Insightful)
Ah, idealism! The proposed law, with Clause 1 in place, and enforced, doesn't sound too bad. Do you really think that's the way it would work? In the real world, any software liability law would be written by lobbyists working for Microsoft, Oracle, Adobe, EA, et al., and there is no way in hell it would make life easier for open source developers than for the big commercial developers.
Re: (Score:2)
Dang. Stupid reality. Maybe stones and bombs could help. Where should we put them? ;) Just kidding. I don't wish to stone and/or bomb anybody. We must forgive them, for they don't know what they're doing. Or so I'm told.
Re:Sure (Score:5, Insightful)
They already have the beginnings in place.
It's called "patent indemnification," which they insist that vendors must have. Yes, effectively "patent violation insurance" to keep other companies off your back. Granted it's not entirely "liability insurance" but it's a step towards the state where you cannot develop software independently, but instead must be under the thumb of some larger corporation (or somehow have millions in insurance) to write and distribute software.
Re: (Score:3)
There is little evidence to suggest the NT kernel is especially insecure. The vast, vast majority of "exploits" don't rely on kernel design flaws o
From TFA ... (Score:3)
From TFA:
So if you're distributing the source code (and license it correctly) the most you'll be out (aside from malicious intent) is a refund.
Re:Sure (Score:4, Insightful)
It's very important we decimate the last industry the US has that's still mostly functional, profitable, and productive.
Re: (Score:2)
1. That is medical-industry bullshit. They just want tort reform to improve their profit margins.
2. When you make $250k+ a year, a million dollars of insurance is not too surprising. People routinely insure homes that cost more than four times their yearly income.
Re: (Score:3)
Re: (Score:3)
Define middle class.
It used to mean all the wealth of the aristocracy and none of the privilege. By that metric, there hasn't been much change.
If you mean they were considered middle income and paid like other white-collar workers, then we can be pretty sure this is the result of the regulations they have protecting them.
Re: (Score:2)
Define middle class.
It used to mean all the wealth of the aristocracy and none of the privilege. By that metric, there hasn't been much change.
If you mean they were considered middle income and paid like other white-collar workers, then we can be pretty sure this is the result of the regulations they have protecting them.
40 years ago I went to public schools with children of Dow Chemical CEOs.
Re: (Score:3)
Well, as a median, it implies that it goes both up and down from there.
And http://www1.salary.com/Physician-Family-Practice-salary.html [salary.com] suggests the median might be a bit lower, and that curve looks pretty bell-shaped (not sure if that's by definition at the source or from actual sampling).
Re: (Score:3)
Re: (Score:2, Insightful)
If console game developers could put in the added effort to make products that were reasonably bug-free (anything less would have been unplayable) back before consoles could update their software, then I'm sure MS can debug Office a little better before shipping.
Office has a heck of a lot more code than Atari 2600 Space Invaders. And a heck of a lot more ways to interact with the user.
Office bugs aren't 'I press the left button and go right', they're 'I embed an Excel spreadsheet with 500,000 columns and when I change the font to 96-point Comic Sans the first column displays in the wrong font'.
Game Bugs Look More Like (Score:3)
Re: (Score:2)
Re: (Score:2)
What statistics do you have to demonstrate the cost savings "tort reform" would bring to healthcare?
Or, did you just lazily accept what you were spoon fed by people who don't want to be responsible for their actions?
Re: (Score:2)
If the medical industry were serious about this, it would already be the case: you could get your surgery X% off if you signed some tort-limitation form. That, however, would run counter to what they want, which is to limit torts and pocket the cash.
Re: (Score:3)
How could there be evidence for something like that? The closest you get is opinions from economists, practitioners barely one step above Voodoo priests, and you can always find one that supports your position. Clearly what we have isn't working very well, and many of us who've been around long enough to see the way the system has changed over the years don't find it hard to imagine torts as a cause.
Re:Sure (Score:4, Interesting)
Re:Sure (Score:5, Insightful)
Re:Sure (Score:5, Insightful)
No, licensed engineers just cover their asses better.
Or do you think the engineer should be held liable when someone parks a 30-ton vehicle on a bridge rated for 10 tons and the bridge fails? Well, then why should a software developer be held liable when the software asks you to enter your name and, instead, you feed it data that causes a buffer overrun that lets you root the database server and steal everyone's credit card numbers? If you had just entered your name correctly, that never would have happened. A clear case of misuse if I ever saw one.
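For the non-programmers, the "enter your name" overflow described above looks roughly like this in C (a hypothetical sketch, not code from any real product):

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char name[32];

        /* The bug: gets() doesn't know the buffer is 32 bytes, so a long
         * enough "name" writes past the end of the array and can overwrite
         * the return address -- the classic stack smash. (gets() was
         * removed in C11 for exactly this reason.)
         *
         *     gets(name);
         *
         * The boring fix: read at most sizeof(name) - 1 characters. */
        if (fgets(name, sizeof name, stdin) != NULL) {
            name[strcspn(name, "\n")] = '\0';  /* strip trailing newline */
            printf("Hello, %s\n", name);
        }
        return 0;
    }

Whether shipping the gets() version counts as negligence is exactly the question a liability law would have to answer.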
I think software developers should be liable, but the liabilities need to be defined first. And if someone hacks the software outside the scope of security standards and practices set by the government, implemented correctly by the developer, and verified by the assigned regulatory bodies, then there is no liability if something goes wrong.
Meanwhile the cost and time required to develop software will skyrocket. If you need any evidence of that, just look at how much time and money it takes to build a bridge these days.
bad example (Score:3)
Is it still your fault... (Score:3)
If you design it before the invention of the hairpin?
Another law? No thanks. (Score:3, Insightful)
"There should be a law!"
No. No, there shouldn't. There also shouldn't be disclaimers that "this coffee can burn your ass," "don't point this gun at your face" or "don't use this curling iron to stir your bathwater while it's plugged in."
If organizations see pen and paper as the only alternative, then they're probably getting the quality of IT support that they're paying for.
Re: (Score:2)
There also shouldn't be disclaimers that "this coffee can burn your ass," "don't point this gun at your face" or "don't use this curling iron to stir your bathwater while it's plugged in."
No, there should be laws that make the people who made such things liable! If I'm stupid I should be able to profit from it, dammit! I tried to snort my latte and Starbucks didn't stop me; I deserve a million dollars!
Re:Another law? No thanks. (Score:4, Insightful)
The author is talking about making the producer of bad software liable, just as we would hold a gun manufacturer liable if the gun blows up in a person's face.
Re:Another law? No thanks. (Score:4, Insightful)
The buyers bewared, ganged up together, and started to act pre-emptively.
Re:Another law? No thanks. (Score:5, Funny)
Funny, none of my firearms actually say don't point at face
It's usually engraved at the end of the barrel. Look closely.
Great idea (Score:2, Insightful)
Yeah, let's drive the cost of software through the roof. That will solve everything! Companies will employ a lot more people to do testing but will still have to invest in huge insurance policies just in case they miss something. Your next copy of Windows will cost more than a well equipped car.
Re: (Score:2)
A terrible idea... (Score:2)
Re: (Score:2)
If you read the proposal, it isn't likely to harm hobbyists much -- it basically exempts open source from the requirements. The likely result will be for proprietary software companies to either go out of business or become service companies, and for software to be developed in the future as open source by hardware and service companies.
Which I suppose wouldn't be the end of the world.
Re: (Score:2)
Automobiles are complex enough that even the most diligent automotive designer will produce bugs. It's nigh impossible to create 100% bug free autos. I think this would pretty much kill the industry as well as be detrimental to hobbyists.
See how foolish it sounds when you start crying like the sky is falling?
What's foolish about that? Automobiles routinely ship with potentially disastrous bugs, particularly now that they're full of software; one big manufacturer recently had a recall because repeatedly switching between drive and reverse on some of their automatic transmissions could destroy the transmission, due to a bug in the transmission-controller software, for example.
If you want a car with no bugs, you'd better be prepared to pay $500,000 for a Honda Civic.
Re: (Score:2)
"one big manufacturer recently had a recall"
sane would be requiring software manufacturers to do the same thing (or have a way to patch things when these bugs come to light.
Re: (Score:2)
See how foolish it sounds when you start crying like the sky is falling?
...like doctors are liable for taking out the wrong kidney...
...giving you a vasectomy when you were in for Lasik...
...architects are liable for designing a home without support columns that collapses and kills your family...
...chef would be liable for feeding you rat poison...
Wait. Who's crying that the sky is falling, Mr. Hyperbole?
Re: (Score:3)
I'm sure you are aware that even NASA doesn't always get it right.
http://en.wikipedia.org/wiki/List_of_software_bugs [wikipedia.org]
It's a great article, by the way. But still...
"...on a dollars-per-line basis, it makes the group among the nation's most expensive software organizations."
"The specs for that one change run 2,500 pages, a volume thicker than a phone book."
Re: (Score:2)
For more details on the Mariner 1 error, see http://en.wikipedia.org/wiki/Mariner_1 [wikipedia.org]
It's really interesting to see that it's believed to have been an error "...in hand-transcription of a mathematical symbol in the program specification, in particular a missing overbar". If the specs you write code from aren't right, it's very hard to write code that passes muster. Even more fascinating is the way the error was simplified when it was explained to Congress and other groups (the press, etc.), since most people woul
You can't trust code ... (Score:5, Informative)
"You can't trust code that you did not totally create yourself."
I can't trust the code that I did totally create myself, either.
Re: (Score:2)
You know, it just occurred to me that there really isn't a secular alternative to 'Amen' that gets the point across quite as well (at least not one I know of).
Re: (Score:2)
A-Freaking-men. You know, it just occurred to me that there really isn't a secular alternative to 'Amen' that gets the point across quite as well (at least not one I know of).
I think that's what the new batch of kids are using the word "This!" for... even though it doesn't quite jibe with the cut of my jib. :-)
Re: (Score:2)
Re:You can't trust code ... (Score:5, Interesting)
That reminds me of an anecdote one of my CS professors mentioned. When fly-by-wire technology for passenger planes was starting to get rolled out, they polled people about their willingness to fly on a plane controlled by a computer. One of the groups with the strongest negative response was programmers. For everyone else, the software is just magic.
Re: (Score:3)
Yep. DO-178B is a bitch.
Re: (Score:3)
The quote is from Knuth:
"Beware of bugs in the above code; I have only proved it correct, not tried it."
Re:You can't trust code ... (Score:4, Informative)
I can't trust the code that I did totally create myself, either.
When was the last time any of us totally created code? I've been coding to various operating system APIs for a long, long time. Even back in the DOS days I made quite a few DOS and BIOS calls. We use(d) lots of 3rd party libraries for various things. Not to mention the libraries that come with your compiler/IDE.
I'm pretty sure I've never totally created any runtime code. Maybe some useless crap I did back in an assembler class would count?
I did have a Radio Shack 8-bit processor kit when I was a kid, though. That was all machine language (there was no ROM or non-volatile storage). However, I still had to trust that the opcodes did what they were supposed to do. Intel (and others) have shown us you can't even count on that all of the time.
Re: (Score:3)
Probably never, because the only way to totally create code is to generate machine code (not assembler) directly on the bare iron. Even at the assembler level, let alone at higher levels, you're dependent on the guy who wrote the compiler.
You Can't Trust An Assembler, Either (Score:3)
Trust an assembler? Who wrote it? The closest I've come to creating software of my own hand has been on a PDP-11 test station, and the embedded processor it tested... writing hex values directly into memory. But even while massaging words by "hand", I was still relying on someone else's tools to get my intention from the keyboard to the flip-flops, and thus still suffering from more levels of abstraction than any civil or mechanical engineering effort.
Re: (Score:2)
Maybe part of the reason is that the skilled professionals are actually liable for any damage that is due to their error or neglect?
Engineering liability (Score:2, Insightful)
I need you to design a bridge. We've already promised the customer that it'll be light and strong, but we only have budget for paper (so we're ok on 'light', just make sure that it's strong), and the deadline is next Monday.
If you think it can't be done, I have the "outsourcing provider" on the phone telling me that there are 500 engineers who would do it. I need an answer in two hours. I know that you've just bought a house and have a new baby on the way, so think again before you protest.
By the way, we've
Re: (Score:2)
And any PE would walk away.
People need to stop equating software to buildings (Score:5, Insightful)
Re: (Score:2)
In this context, "over-coding" software refers to, for starters, defensive programming techniques (i.e., checking the return values of all the functions you call, fully validating external inputs, etc.). It does not reduce the number of points of failure, but it does require the programmer to consider them and then gracefully handle them or transparently report them.
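A small hypothetical C sketch of that style (the function and names are made up for illustration): every call that can fail has its return value checked, and external input is fully validated before use.

    #include <errno.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Defensively parse a TCP port number from untrusted input:
     * every failure mode is detected and reported, not ignored. */
    static int parse_port(const char *s, int *out) {
        char *end = NULL;
        errno = 0;
        long v = strtol(s, &end, 10);
        if (end == s || *end != '\0') return -1;  /* not a number */
        if (errno == ERANGE)          return -1;  /* overflowed a long */
        if (v < 1 || v > 65535)       return -1;  /* out of port range */
        *out = (int)v;
        return 0;
    }

    int main(int argc, char **argv) {
        if (argc != 2) {                          /* validate arg count */
            fprintf(stderr, "usage: %s <port>\n", argv[0]);
            return EXIT_FAILURE;
        }
        int port;
        if (parse_port(argv[1], &port) != 0) {    /* check the result */
            fprintf(stderr, "invalid port: %s\n", argv[1]);
            return EXIT_FAILURE;
        }
        printf("using port %d\n", port);
        return EXIT_SUCCESS;
    }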
Re: (Score:3)
Engineering is a profession (Score:3)
Hey, engineering in general is a profession. Bridges and skyscrapers get built. And if the engineers mess up people can die. And there's liability for flaws.
Should all software hold to this standard? No; I didn't involve a civil engineer when building a clubhouse as a kid. But there are places where correctness does matter and is worth the extra discipline and professionalism.
Re: (Score:2)
No, because 90% of coding is working with pre-existing frameworks.
70% of the job is working around bugs in IE 6/7, MFC, and Win32 for all software development in the real world. Believe it or not, people need to memorize race conditions in IE 6, as sometimes code will work in a test release but randomly fail in real life.
Sure, this is Slashdot and someone may reply that they code in C, but that's a niche 3% of all programmers. No one designs things from scratch all by themselves from the ground up.
Re: (Score:2)
Sure; but this is not about a few corners being cut, this is about an order of magnitude difference in costs. The design requirements, implementation, and QA are all massively increased.
That's fine if you're building a nuclear power plant's control systems, or an autopilot, but to be blunt, people are happy with Windows because it makes the right compromises for them.
Re: (Score:2)
"For everything else, there are poorly paid, incompetent Indian programmers"? Because that's what it comes down to.
no thanks (Score:2)
All that would happen is that vendors would lock the system down totally and only allow signed, vetted code. Approved websites only. Pre-scanned email only.
I'll take my chances.
Treat software as an Engineering process (Score:2)
Unfortunately, that costs both time and money. It requires a formal systems-engineering approach: Independent Verification & Validation, testing, and, first of all, formal requirements that are traced to the implementation.
You can't get away with doing it "on the cheap." I don't know many countries that allow Rail Traffic
Re: (Score:2)
All we need is Love (Score:4, Interesting)
... All we need is love and Free Software. And even the love is not strictly a requisite.
Let's say everyone owns Free software, so nobody (i.e. everybody) is liable for faulty Free software. Everybody (i.e. nobody) pays.
In other words, sure, let the proprietors of proprietary software pay for software behaving badly.
If the software is free it's everybody's and nobody's responsibility. It's like culture and language in general. We do it together.
Who's with me?
Standing on shoulders (Score:2)
The solution seems a little too simplistic. Just look at any very large software project, like an operating system. Even a simple operating system like an iPod's has a huge set of sub-licenses (go check out the Legal menu item; at least twenty on my nano). Large commercial projects often have code contributed from other sources, some open source, some not. If the problem comes from one of those contributions or sub-licenses, what happens?
I could definitely see Microsoft set up a fully owned subsidiary tha
Outlining a World Where No One Writes Software (Score:3)
There are already far too many lawyers sucking overhead out of software development companies. Increasing liability for code will drive up how much it costs; software is only cheap because it's relatively low risk to release.
I make my living working on open-source projects. Given how many imperfect components I work with, in a world with liability issues my full time job would become contract paranoia instead. It's already extremely dangerous to try and make a living from open-source work due to the huge patent minefields you're exposed to. If something like this happened, the only companies who would still be able to afford the risk of coding would be corporations with large legal departments. I'd have to move to a country that doesn't have these laws instead, which is exactly where all the software jobs will migrate to (even faster than they are already migrating now).
What would be SANE (Score:2)
Have things set up so that REASONABLE EFFORTS are required. The lawyers can sort out the meaning of the term, but if you have done everything possible (input checking, not using known-unsafe code, etc.), then you should be safe from being sued.
Also, if something is later found to be "unsafe," then the required patch/update should be given out free to existing customers (no fair bundling an error fix with a feature update just to be able to sell the update). Now, yes, this should have a reasonable limit
Don't trust applications, ever (Score:4, Interesting)
The responsibility for preventing security problems with PCs should fall strictly in two places, the user and the OS... however, not the way 99.99% of you are thinking about it.
The user should decide what resources a program NEEDS in order to do a task, such as which folder it can access, what network connections, etc. This allows the user to decide ahead of time what they are willing to risk. Once that determination is made, the user then would give that list, along with a pointer to the program, to the operating system.
The OS should then enforce the user's choices... if it's not in the list, the application shouldn't even be able to find it, let alone access it. If the OS fails to enforce the user's will, then the OS is at fault... if the user gave away the store, well... they gave away the store.
This requires a simple change to the base design of operating systems: instead of permitting everything and limiting a running program's actions to those of the user's account, the OS should limit the program's actions to a short list of resources supplied by the user... and nothing else. Of course, refactoring everything to add this additional layer of logic is a massive undertaking.
There would still be the traditional user rights, access control lists, etc.... but there would also be a level of control where the user decides which of the resources they have should be given to the application. This is called "capability based security", or cabsec for short.
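The closest mainstream approximations today are program-initiated rather than user-initiated, but they show the shape of the idea. A minimal sketch using OpenBSD's pledge()/unveil() (OpenBSD-specific; the paths are made up, and note it's the program volunteering the resource list here, not the user imposing it):

    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        /* Only this directory, read-only, stays visible to the process... */
        if (unveil("/home/user/approved", "r") == -1) return 1;
        if (unveil(NULL, NULL) == -1) return 1;   /* lock the list */

        /* ...and only stdio plus read-access syscalls remain allowed. */
        if (pledge("stdio rpath", NULL) == -1) return 1;

        FILE *ok = fopen("/home/user/approved/data.txt", "r"); /* works */
        FILE *no = fopen("/etc/passwd", "r");  /* fails: outside the list */
        printf("approved: %s, outside: %s\n",
               ok ? "opened" : "blocked", no ? "opened" : "blocked");
        return 0;
    }

FreeBSD's Capsicum is another real design in this family; what the parent describes would add a user-facing grant step on top of mechanisms like these.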
It's going to take somewhere between 10 and 15 years before people are fed up enough to make the switch.... but it will happen eventually.
Security isn't an application issue... it never was, and never will be.
Re: (Score:2)
This is called SElinux. It already exists and is in wide use.
Re: (Score:2)
No... it's not... SELinux and AppArmor enforce static rules... not dynamic ones decided by users. However, it is a step in the right direction.
Re: (Score:2)
You can make your own SELinux policy all you want. I fail to see how that does not fulfill this.
Re: (Score:2)
You can make your own SELinux policy all you want.
Good luck with that.
AppArmor is hard enough for a typical user to configure; SELinux seems to be pretty much impossible unless you're an expert.
Re: (Score:3)
Does your toaster automatically attach itself to bread as an option? Computers are different beasts, and analogies don't always apply... be careful.
When your
Users are the biggest problem anyway (Score:2)
So who is held liable when the user gets an email that says they've won millions of dollars if they click this link, ignores the security warning telling them they probably shouldn't click it, and proceeds to install some malware from God knows where?
If you're telling me that I am... well, that's fine, because you're no longer allowed to click links. Or install stuff. Or do anything other than what I've whitelisted. Congratulations, you no longer have a PC.
Nothing wrong with paper (Score:2)
The advantage of paper is that you need physical access to break its security. Paper in a safe is even better. And every educated person understands the characteristics of pen and paper, while understanding IT security requires an expert (I am one). I personally keep some things on paper that I would not put on my computer.
So, yes, this is an indication of failure on the part of rolled-out IT security, but it is not a problem. At least I do not see one here.
"used normally" (Score:2)
Makes sense in some cases ... (Score:2)
If the consequences of the poorly written code are negligible, who cares?
If users can take well-known preventative measures to avoid damages and don't, then they are liable.
If a software fault causes damage to life or property, then the liability of the developer is a serious consideration.
A lot of research has been done to improve software engineering practices to make software more reliable. A lot of research has been done in computer science to prove algorithms. If you're writing mission critical s
why "back to"? (Score:2)
There's nothing wrong with pen and paper. Computers don't "replace" pen and paper, they "add" to pen and paper. That's always been the case. Just look at data storage. Pen and paper, subject to fire and flood, but otherwise reliable for ages. Computer storage can die for any number of reasons, but you can duplicate it thousands of times easily. That's the safety net.
Today, welcome to the internet, your computer is accessible -- by the way, you didn't need to plug it in -- so it's accessible to all. B
Already done in Contracts (Score:2)
In contracts for software projects, rules guaranteeing certain safety and security levels are already present. In embedded systems and trading platforms, there are even laws that define how safe something has to be. For example, Safety Integrity Levels are used to define how many failures may occur before the law is violated. Similar stuff exists for security. The real problem is that no one in low-risk areas is willing to pay for higher safety and security levels. Present end-user software is t
I proposed something similar in 2000 (Score:4, Insightful)
I proposed, back in 2000, that Microsoft be required to provide a full warranty on their products [animats.com] as part of their antitrust remedy. "Full warranty" has specific meaning in US law; see the article. A few vendors have provided full warranties and not found it too expensive. Notably, GTech, which builds gambling systems, is held financially responsible for errors made by those systems. This costs GTech less than half of one percent of their revenue.
It's time for the computer industry to grow up and take on warranty responsibilities. The auto industry had that forced on it by Congress in the 1960s, over the screams of the industry. Cars rapidly became safer and more reliable.
Re: (Score:2)
As opposed to the current system, where the cost of incompetent software development is borne almost entirely by the people buying the software, or by third parties. If there's a vulnerability in, say, IE that allows people to get their hands on the password for my bank, it's not going to be MS that's out the money; it's going to be me or the bank.
Introducing some liability for companies that release buggy software and then hold off on providing patches until the last minute is exactly what we need. Otherwise MS
Re: (Score:2)
Nice trolling. So, if there's a vulnerability in a browser that reveals their password, it's obviously the end user's fault for having chosen to use a browser programmed by incompetent people.
I'm not sure how one can be expected to personally audit every piece of software they install on their computer. At some point it ought to be the responsibility of the people creating the product to do the necessary QA and patching to prevent such things.
It's not likely to be 100% effective, which is why due dilige
Re: (Score:2)
OK, then who precisely should have to pay for the cost of the exploits and who has the ability to actually influence the number of exploits in the software?
As long as the developers are the only ones with the ability to patch those bugs, they're going to have to shoulder some responsibility for the vulnerabilities that exist in the software. I'm not sure who else has that level of responsibility for the software package.