Who is Responsible? The Developer? The User?

Anonymous Coward II asks: "I am working on a paper for a computer ethics course, and need to answer the following question: who must be held responsible, the person who develops software that will (or can) be used for illegal ends (to break into a computer system, to illegally monitor other users, a virus, etc.), or the person who uses it afterward? I'd like to know what Slashdot users think, and what the answer is according to the law." Software is a tool, just like any other, so when things go wrong I think this boils down to a question of personal responsibility or negligence. What are your opinions?
  • Having not thought about this very long
    My gut feeling is that if the program has a legitimate use (like BO2K can be used as a remote admin tool), then the person who misuses it is to blame.
    But if there's no legitimate use, then the author should be held to blame.

    The only problem is working out what constitutes legitimate use.

    iain
  • by JatTDB ( 29747 ) on Wednesday November 24, 1999 @02:08AM (#1508018)
    I don't blame gun manufacturers or knife manufacturers for murders. I don't blame car manufacturers for drunk drivers. And I don't blame developers for writing software that could be used in an illegal way.

  • You are to be held responsible if you release a tool that is potentially dangerous without indicating it. Companies like AOL make connecting to the internet look like a breeze, and are therefore responsible for hapless users unknowingly offering their box as a spam hub.

    At least this was the essence of our Linux User Group's discussion last night.
  • Man, I need to figure out how to get Slashdot to do all my research for me......
  • If someone is killed with a hammer (in another person's hand, of course :-) is the hammer manufacturer responsible for the crime? What if the murder was committed with a gun (pistol, shotgun, ...)? Or a chainsaw? This is a polemical issue, but truth and justice must lie somewhere between the opposite sides of the question; I believe the judge must examine each case separately.
  • Well, if you consider gun makers, cigarette makers and other such organizations, who produce harmful products that are abused by others, it's obvious that the law sees the user as the responsible party for whatever use the product is put to.

    Of course, a virus writer who writes an infectious virus and 'accidentally' leaves it where someone can find and distribute it would be guilty of at least negligence... so the producer wouldn't be totally off the hook.

    It's like this: if I make peanut butter and someone is allergic to peanuts, I shouldn't be held responsible if they eat some, unless I mislabeled it and made it not taste like peanut butter. And if someone hid some of it in the allergic person's food, I don't think I should be responsible at all. Many of the hacking apps are demonstrations of exploits or are legitimate tools to test one's own network... therefore they have legitimate uses too, and are not made expressly to cause trouble :)
  • I think a lot of it depends on the situation.
    A gun manufacturer cannot be sued if one of its guns is used to kill someone. They are simply making a 'tool' to be used for personal protection. I think that more and more these days we are seeing exploits come out that are disabled (bad shellcode) or simply print a message ('I h4v3 0wn3d j3w!') while still getting the point across that there is a problem with the software.
    --codemonky
  • My totally uneducated guess:

    If the software can be used for legitimate [response.cx] purposes, then you could argue any illegal [microsoft.cash.sales] usage is the fault of the user [antionline.com].

    However, if the software promotes illegal activities, then the fault lies with the programmer [response.cx].

  • Obviously the one who commits the crime should be held responsible. There is nothing in this world that cannot be used in a wrong way.
    Cars get you to work... but they can kill.
    Email can be used for fast, reliable communication... spammers can flood your inbox with garbage.
    Scripts can automate frequently used processes... the Melissa virus.

    In the end the question becomes similar to the statement: guns don't kill people, people kill people.
  • is that it depends on the design of the tool. A gun, for example, can be used for assault or self-defence, so the responsibility for its use comes down to the user. On the other hand, if a company designed and sold an item that only had a negative use, e.g. car bombs (extreme example, I know), then both the company and the user should be held responsible. Just common sense.
  • But what about software that has NO legal use? Viruses are such things. A gun allows you to defend yourself, not only to attack others. I think for some kinds of software, the developer HAS a responsibility.
  • Look at other tools that are used to break the law:

    Tools to break into houses, and guns. They can be developed legally and they can be sold legally (possibly with restrictions). Why should software be different?

    You can hardly draw a line between software that *can* be used for illegal purposes (almost anything) and software that is built to break into other people's computers. Look at Back Orifice: first a cracker's tool, now a remote administration toolkit. It is much easier to see that automatic guns can only be used for evil purposes (killing people), yet their development and production is normally legal.

    - Alex
  • Yes, I agree to an extent. But what if a gun manufacturer were selling rocket launchers to terrorists? (a bit extreme, I suppose) How is that different from a developer writing programs that have no purpose other than causing damage, and distributing them to script kiddies?

  • by Ratface ( 21117 ) on Wednesday November 24, 1999 @02:15AM (#1508034) Homepage Journal
    The Hacker News Network has been asking much the same question. Anti-virus companies have been labelling some programs that allow remote undetected monitoring of a computer as viruses (e.g. BO2K), while other products released by "mainstream" software companies (such as Softeyes) are not scanned for at all.

    What makes an anti-virus company label one program as a virus, while another program with similar uses is unlabelled?

    HNN asks the question at http://www.hackernews.com/orig/avindustry.html [hackernews.com]

  • There are so many other variables.

    It's all a question of 'intent'.

    Ultimately if someone is knowingly using software for illegal things then they are responsible, end of story.

    However, you can also argue that the people who develop the software can be held accountable for enabling people to perform these illegal actions, in the same way that it is illegal to sell certain guns to people in the UK unless they specifically have an owner's license.

    Then again, people use windows, linux and all sorts of other things for illegal purposes, visual interdev creates programs that do illegal things all the time (haha - sorry - had to throw that one in).

    It's an interesting ethical question, creating software purely for illegal purposes is indeed unethical, but it *can* be a fine fine line.
  • If I produce lock picks for sale to locksmiths and you manage to get hold of a set _and_ get caught possessing them, you can be charged federally. Although it's true that gun manufacturers may be held responsible for the damage caused by their weapons, I don't believe that applies here. If I write a security tool and you use it to violate one or more laws, you should be the one to go to jail.
  • Hehe, there's another precedent! UHF
  • To me, the answer is simple: the consequences should lie with the person who used some software to hurt someone or damage something. Where does the finger-pointing stop? If someone writes an extremely lethal virus and compiles it using gcc, is somebody going to try to blame GNU for providing the tools to build the virus? There is a major problem today, in the US at least, of blaming everyone and everything remotely associated with someone bad (think: Doom and the Columbine wackos). As much as I personally dislike guns, I believe the same argument holds there as well... guns don't shoot themselves. Blame the responsible person: the one who committed the act!
  • by rde ( 17364 )
    Like pretty much everyone else, I've got to say that it depends on context. In nearly all cases, though, I'd be inclined to blame the user.
    I'd like to rephrase the question slightly, though.

    Does the fact that a Virus Construction Kit can be used by sysadmins to aid in network defense justify its existence?
  • While a virus might have no legal use, what about studying it to learn from it? A virus is usually a fairly nice piece of code.

    When it comes down to it, it's just a series of 1s and 0s, like any other software. It's up to the user to use it responsibly.
  • I should presume it is *your* opinion which matters in the assignment, so just do your homework!

    I agree with the previous poster that the one who actually commits the act should be held responsible; however, what about drug dealers or tobacco companies (still remember those suits?). If a program is designed to be malicious (like a virus) then the author is ultimately responsible. People who run it (like users on infected PCs) are actually victims then.

    And then, I never liked ethics lessons.

  • To get a better feel for the issue, consider other products and their users. If a drunk driver kills a pedestrian, can we sue GM for making the car? If someone uses a steak knife to kill or maim another person, can we sue Tramontina for making the knife? If I wrote "rm", can someone sue me when their disk gets wiped?
  • Neither of those two is responsible. I think the people responsible are the company who leaves the security holes that allow cyber intrusions in the first place.
  • I notice that a lot of posters are using the gun analogy, in that gun manufacturers are not to blame for shootings. But if you look at this link on the BBC [bbc.co.uk] it seems that people *are* suing gun manufacturers, or at least makers of assault rifles, as they are not 'self-defense weapons'.

    I think we can stretch this to malicious software too - e.g. viruses. But then, what if you were to write viruses for 'educational' purposes? If you write cracking software, I think you'd have to be prepared to face some legal action.
    if you consider gun makers, cigarette makers and other such organizations, who produce harmful products that are abused by others,

    Guns are not inherently dangerous products at all. Used properly, at a target range, a gun will do no harm to anyone whatsoever. Cigarettes, on the other hand, will always do harm to the user, even if used in the proper fashion, so please don't compare the two!

  • Many years ago Dan Farmer authored a paper with a title similar to "Improving Your Computer Security By Breaking Into It". The paper illustrated a number of means of hacking into a system, some of which sadly are still very possible today. His intention wasn't to be the first enabler for script kiddies; it was actually to make the internet a better place by improving security. His thesis was that the best way to counter these attacks was to learn to think like your attacker. He didn't have any hidden motives. It wasn't like a lot of self-proclaimed security experts who say they're producing material to enhance security with a few concealed winks and nods to the script kiddies.

    He went on to write SATAN later. Apparently educating system administrators and programmers didn't help: buffer overflows were still rampant, critical security patches weren't applied, and the internet itself was rapidly growing. It wasn't just touching the most wired of the geeks anymore but was starting to become part of the general public's experience. SATAN was an automated audit system. Some moron at SGI even fired him over this.

    Both of these systems could be exploited and abused and both of them were. Dan's intentions were still honorable though. Yes, it was possible that they could fall into the wrong hands but both items in the right hands could help armour your systems against these attacks. It's a failure of system administrators everywhere that script-kiddies COULD use these tools against them.

    The responsibility here is firmly planted on two groups. Foremost are the abusers of the tools. Just because somebody leaves the doors open doesn't give anybody the right to exploit it. The administrators who were compromised by things which were EXPLICITLY EXPLAINED OR AUDITED also bear some responsibility to any users who were affected. Ideally I'd love to see the day where script kiddies are locked away or otherwise punished (I loved somebody's suggestion the other day of forced community service teaching computer skills) and administrators who are proven not to have been diligent in applying patches are open to financial repercussions.

    Some groups write scripts for the sole use of script kiddies. They may claim they're writing security tools but I find it hard to believe them when comments in the source code proclaim "// n4m3 0f z1t3 t0 b3 0wn3d" so they're liable. They're purposely producing tools to enable computer crimes.
  • To answer the question -- both author and user should be held responsible, to varying degrees and depending on the circumstances. It should be based in part on the severity of the damage and in part on the intentions of the people involved. Intent to cause harm is hard to prove, however.

    Another problem you're going to have is your idea of punishing people for writing programs which cause (or simply are capable of causing) harm. Compilers, interpreters, and even good ole' DOS DEBUG are good examples of programs that can be very useful, but can also be used for destructive purposes. I think we have to look back again at the overall intent of the person writing or using the program.
  • 1. You don't need to look at a virus to see a nice piece of code. I'm sure some things like demos are also REALLY nice pieces of code. 2. When you write a program, you offer somebody the possibility to do something. For a 'normal' program, you give the user the possibility to achieve a certain kind of work. When you write a virus, you give him the opportunity to do nasty things he could not have done without you (well, without you and the other virus authors, but let's forget the "I'm not the only one!" excuse). So YOU are responsible.
  • I usually believe a technology/process/program/etc. to be neither good, bad, illegal, nor evil. Only what we as users of said technology do with it is good, bad, illegal, etc. In fact, this is basically one of the reasons why the RIAA's suit against the Diamond Rio failed. The Rio's sole intent was to play MP3s; the human is the one that supplied it with MP3s, be they legit or non-legit. This is also the same type of argument given by the makers of Napster. Napster is a distribution and search method for MP3s. Of course they warn people against releasing copyrighted works. "We are a way for unknown bands to get their MP3s out", the makers of Napster have basically said. Unfortunately, how does one search for bands one doesn't even know existed, if the search fields are artist or song title? This is where I think the RIAA's suit may get them. So it would seem that Napster is primarily a tool for searching and distributing known works... almost all of which are copyrighted.

    Diamond Rio is ethical. Napster will probably be found to not be ethical.
  • by Bruenor ( 38111 ) on Wednesday November 24, 1999 @02:24AM (#1508054)
    Okay, I hate to reply to myself, but I just found another reason:

    I'm on BUGTRAQ. I have been for quite a few years. Often a security problem is found and a commercial vendor remains unresponsive until someone produces a working exploit. Then, once the world has access to the exploit, the vendor usually begins work on a patch. Sometimes it's the only way to get their attention.

    Now, the exploit itself has no legal purpose when you use it. It could be an educational tool to explain buffer overruns/race conditions/whatever, perhaps (see the sketch after this comment). But often someone needs to write it and publish it or the vendor will never do anything about it.

    Having viruses and exploits around should make us all more conscious of security: more prone to check our providers of software, to check digital signatures, and more apt to want to see the source code.

    The world is not a nice place and people would attempt to break into machines anyway. If having viruses and exploits out there increases the level of security in software and systems, then I am all for it.
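    To make the educational angle concrete, here is a minimal sketch of what such a buffer-overrun demonstration typically looks like (purely illustrative C; the names are invented and this is not the exploit discussed above): the classic unchecked copy into a fixed-size buffer, next to the bounds-checked fix.

        /* Educational sketch: an unchecked copy next to its fix. */
        #include <stdio.h>
        #include <string.h>

        /* Vulnerable: strcpy() writes past buf whenever name needs more
         * than 16 bytes (15 chars + terminator), corrupting the stack. */
        void greet_unsafe(const char *name) {
            char buf[16];
            strcpy(buf, name);              /* no length check */
            printf("hello, %s\n", buf);
        }

        /* Fixed: the copy is truncated to the destination's size. */
        void greet_safe(const char *name) {
            char buf[16];
            strncpy(buf, name, sizeof(buf) - 1);
            buf[sizeof(buf) - 1] = '\0';    /* strncpy may not terminate */
            printf("hello, %s\n", buf);
        }

        int main(void) {
            greet_safe("a string comfortably longer than sixteen characters");
            return 0;
        }

    Publishing the vulnerable pattern next to the fix is exactly the trade-off described above: the same listing teaches the defender and the attacker alike.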
  • by Father ( 118609 ) on Wednesday November 24, 1999 @02:25AM (#1508055)
  • I worked for a contract shop in Florida, and more than once used "hack" tools to get a job done. Occasionally the rules of engagement get you in a bind and you have to work outside those rules to get your job done. We had a source control machine that crashed: dead, inoperable, with quite a bit of source code on it that we needed to retrieve. Without hack tools we wouldn't have been able to get the data back out; we played the role of script kiddies and used hack s/w to make the drive accessible. A tool is a tool. Without those tools in particular, my company would have had to face a serious financial setback. mike
  • I see it as such: the user of the "tool", whatever purpose the tool may be for, must always be held responsible for the use of the tool, and not the author. Just think: if the NSA put up a badly configured IIS webserver and I used FrontPage to change their web, should M$ be held responsible? (I wish, but... ;) The same must go for software as it goes for hardware - S&W aren't responsible if you use one of their guns to shoot the guy next door, are they?
  • I don't blame gun manufacturers or knife manufacturers for murders. I don't blame car manufacturers for drunk drivers

    If the gun or knife were sold to anyone, with no restrictions, and blatantly advertised as being for murder; or the car given 3-foot pointy spikes at the front for killing pedestrians -- what then?

    It sounds silly, but that's essentially what things like viruses and the scripts that script-kiddies use (what is the name for those, anyway?) are doing.

    The bulk of responsibility does lie with the user, and I believe that it is the person who uses the utility that should be held to account, but there is a line that must be drawn between useful utilities that may be used by the unscrupulous to do illegal things, and programs created with the intention of letting people crack systems or whatever more easily.
  • Off the top of my head, I'd say this is rather like the question of firearms, and I'd say that "guns don't kill people, people kill people" is even more applicable for software than for guns.

    Naively making it illegal to produce software capable of being used to break the law would make a lot of vital activity - for instance producing exploits for security flaws - against the law, which would be hugely to everyone's detriment. If that were done, the inability of honest, law-abiding people to effectively investigate security issues would be a massive boost to crackers everywhere.

    As far as I can see, liability for breaking the law lies with the person whose intent it was to break it. If that is the author of some software (e.g. a program deliberately designed to spread a virus) then so be it, but if the author produces a tool with multiple functions (e.g. BO2K) then he's no more guilty than a man who makes a knife.

    There are of course some tricky cases. For instance, a friend of mine once wrote a virus as an exercise and gave it a slightly nasty payload. He never intended to release it, but unfortunately a copy got loose on his hard drive and infected several other machines before it was wiped out. If it had well and truly escaped, and done serious damage, where would the liability lie? Or is it a natural hazard? Possibly there is no criminal liability in that case, but merely civil negligence in failing to contain the virus?

    IANAL
  • If I make hammers, and someone kills someone by misusing the hammer, I would not be responsible.

    If I yell 'FIRE' in a crowded theater, and people 'use' my product, accidentally killing others in the stampede, I would certainly be responsible.

    In either case, I personally didn't do the killing, but the line of responsibility clearly falls on different sides. What if I invented and marketed a product that could only be used to kill sleeping people ? What if it had no other uses at all ? Would I be responsible ?

    The difference between who's responsible and who's not is eventually determined by the 'official' making the legal decision. If that person is on the left side of 'center', the line falls more towards the manufacturer; if the person is on the right side of 'center', it falls more towards the user. In the end, that's the difference.

  • by evilpenguin ( 18720 ) on Wednesday November 24, 1999 @02:27AM (#1508061)
    I think the law has to treat the person who uses a product for illegal means as the "guilty" party. The person who makes it bears no automatic culpability.

    This is my general take. Gun manufacturers are not responsible for murders committed with guns. Now, I'm not a gun nut, but I think this is legally right.

    The same should hold true for the authors of nmap and queso (to name a couple tools that system crackers might use) and the authors of pgp and gpg (to name a couple tools that criminals or terrorists might use).

    Now, if it is a question of ethics, you've opened an entirely different can of worms. Ethically, I think several guns need a closer look. I think Teflon tips are something that raises ethical questions. I think nmap has a few grey areas (what legitimate use requires the micro-fragmentation feature? That's there just to evade string-scanning intrusion detection.), but in each of these cases (except maybe those Teflon tips) I think the law has to protect the author/maker and hold the user accountable.

    If we hold that the maker/author is responsible for all of the ways in which their product/idea is used, then we should have locked up Darwin because his ideas contributed to the Holocaust. We should lock up the inventor of the circular saw because it has maimed and killed. And so on...

    Ethics lies behind law, but the cliched figure of justice that adorns so many government buildings (at least so many American ones) wields a scale, a sword, and she is blindfolded. The sword is two edged as well. It may be a cliche, but it is an apt one. The law is not ethics. The law is the minimum interference to maintain the social order. While many conservatives in this country will argue with me about the law being minimal, it is certainly not the opposite. You can write and buy a book about how to crack safes. That's legal. Crack somebody else's safe, and you've broken the law. It seems absurd, but it isn't. To write a book on how to crack safes (so long as you believe in the idea of private property) is unethical, but I for one would not want to see it made illegal.

  • In my opinion, in a situation like this it is the responsibility of the person/people using a piece of software to ensure that they use it responsibly. If it weren't for this ideal within computing, many systems would not be as secure as they are now, due to a lack of tools to enable people to ensure that their systems are not vulnerable to many of the commonly known attacks.

    Some might say that programs which allow this kind of detection of vulnerability should not be made available, and that they make it far too easy for someone wanting to break into a server to do so with little or no experience.

    These people would be ignoring the fact that the information is readily available on the internet already, and that these programs only serve to make life easier for people. As long as systems administrators ensure that they use these programs on their machines, and follow the advice given, they can be in no danger of someone using the software against them. In my opinion, any administrator who fails to do this simple task shouldn't be doing the job. They would also be ignoring the fact that these programs are not generally written to be easy to use. Anyone managing to make use of them must have at least some experience, and therefore would be able to do the same thing (though not as quickly) without the software to help them.

    On the legal side, I don't know if there is any law against this kind of software. But with the prevalence of these pieces of software in existence, and the fact that I have heard of no court cases relating to an author being sued (I'm sure there are several large corporations who would like to do it), I can only assume that there is no such law, or at least none strong enough to bring a court case against anyone.

  • Hehe, I meant to say potentially dangerous. :)
    Though even at a gun range and such there are still gun injuries... remember that kid who was shot by a stray bullet at a gun range? Or the fact that bullets can apparently backfire? They're a lot safer than cigarettes, of course.
  • This is a little bit complicated, but not so complicated that a 5-year-old cannot get the gist of it. Software is a tool, and like all tools, it can sometimes be used for good or evil. For example, let's look at a gun. Who is responsible, the maker or the one who used it to commit a crime? It depends. If the maker of the gun allowed the gun to be obtained by anyone, then as far as ethics go he is partially responsible. The same applies to software.

    In the software world, take a look at Back Orifice. The guys who developed it are partially responsible for all the wrong things that are done with it. I am not against them for releasing it or sharing the information, so please don't attack me. Likewise, the users who use Back Orifice are responsible.

    Now let's take a look at the latest bind overflow exploit that was released not so long ago. The developers of this code are not responsible: the exploit was crippled so that anyone who had not written or read about buffer overflows could not use it. Now, if I were to take this exploit, uncripple it, and then use it, I am to be blamed, not the developers.

    So, as you can see, it depends on why/how the software was released and deployed.


  • And an atom bomb is just a bunch of atoms... But that sure doesn't mean it can't wipe out a city.

    Just as a virus can destroy a network...
  • If you can sue a gun maker for the criminal actions of a third party (which has been allowed in U.S. courts), why not allow lawsuits against makers of software, such as compilers, that allow people to manufacture viruses?

    Consider the Chicago example. It is illegal to own guns in Chicago. People who want to own a gun but live in Chicago drive to the 'burbs, buy a handgun from a legal dealer there, and then transport the gun back to their Chicago residence.

    The city is suing gunmakers and dealers saying the dealers are acting in a negligent manner by selling to people they should know are breaking the law.

    The same principles could easily be applied to compilers and other software -- by not making sure that the buyers of the software *aren't* going to use it to create viruses, the dealer and manufacturer are negligent.

    Or to put it another way, has anyone in the software industry taken any positive steps to make sure criminals *don't* have access to their software? No, they haven't, and that's exactly the grounds that people are going after gun manufacturers in court.
  • Likewise, I wouldn't blame a gun manufacturer for illegal use of their product; however, I think that legislation restricting the ability to purchase a gun is wholly right. Consequently I think there should be legislation prohibiting the manufacture of programs whose purpose is malicious.

    Now, before I get the torrent of "the internet is not policeable" posts and all the rest of the freedom-online thing, I have a few more things to say.

    I realise that it is much harder to prevent the production of malicious code, since it takes only one person using tools that are freely available. Furthermore, recent incidents (DeCSS etc.) have shown that it is impossible to prevent the circulation of code/binaries on the internet. However, I don't think that difficulty of enforcement is reason enough not to legislate. It would give the law enforcers at least some leverage in certain situations that they don't have at the moment.

    Secondly, I realise that the line is very blurred between legitimately useful and malicious, particularly with tools that can be used both maliciously and defensively, like port scanners. Again, I see this as a challenge for the courts and legislators, rather than a reason not to even attempt to legislate.

    I think that it is also the responsibility of every developer to think about the potential illegal uses of their code, and the damage that their programs could cause. Since it requires a certain degree of brain power to be a developer in the first place, it shouldn't be too hard for everyone to realise that if they write a virus/scanner/exploit and release it to the public, it will inevitably wreck someone's day and cost someone money. Just put yourself in the position where you miss dinner with your family, have to stay up all night fixing a server because of some script-kiddie, or have to postpone the family holiday because your data was wiped off your servers and you missed a contract deadline as a result. It doesn't take long to decide not to release malicious code, does it? Just remember that there's always some arsehole who thinks it's cool to screw things up, and he might be doing it with your code.

  • I feel that the user is totally to blame. Much of the software out there can have one bad use or another. You can't blame a developer for the software being used in a destructive way any more than you can blame a fork for the user having bad table manners.
  • I'd agree with the majority that a user is responsible for the misuse of a tool (and I'd include software in this term).

    However, consider tools/processes that have no legitimate use, such as chemical weapons. I believe I'm correct in thinking that development of chemical weapons is illegal (in most civilised countries). Computer viruses should be considered in the same light.

  • ok, so, what's your point?

  • automatic guns can only be used for evil purposes (killing people)

    What if the time comes when it's necessary to overthrow our government because it has grown extremely tyrannical? Just as our forefathers overthrew the yoke of oppression of their government, we should have the means at our disposal to do so. Our forefathers tried to ensure that we had those means with the 2nd amendment to the constitution. Now, however, our rights are suppressed. You may call them evil if you wish, but I believe that our citizenry should have even more potent weapons than automatic weapons available, so that we will be able to overthrow our government if there comes a time when it becomes like Nazi Germany or pulls a Tiananmen Square. This may sound radical to some people, but I think it's anti-American to view it any other way. I mean, our forefathers started a revolution for independence from their gov't mostly because of taxation w/o representation. Pretty radical dudes.
  • ...on what the software is, and the purpose for which it was written.

    To use an analogy, guns are designed to kill. That is their sole function. They'd make lousy can openers. As such, I feel that the makers have a measure of responsibility if people use guns in that manner.

    HOWEVER, a measure is just that. A measure. The gun manufacturers don't -make- people use guns that way, that is the choice of the owner, and nobody else.

    I guess my point is that responsibility (as opposed to blame) for anyone involved is, IMHO, never 100% and very rarely 0%. Rather, it's the entire spectrum in between.

    If a software package has one, and only one, possible function, then the writer or company needs to take some responsibility if people use it that way. After all, that's what it was intended to be used for, and that alone. For the writer or the company to deny any responsibility, on the grounds they didn't actually -use- the program that way, is denial of reality.

    MOST programs, though, are multi-purpose. SAINT is an excellent example, being very useful for testing for some of the more blatant security flaws in systems. Yes, it can also be used maliciously, but so can a swiss army knife. That doesn't make either necessarily malicious in its own right.

    Summary: Where one, and only one, intended use exists in a program, the writer or company should bear some responsibility for people using that function in the manner intended. (NOT blame, just responsibility, and at most 50% of the responsibility.)

    Where more than one use exists, the writer or company should bear responsibility no greater than 50% of the fraction of possible uses that are malicious. (The user is never forced to use the program maliciously, so bears at LEAST half of any responsibility, regardless.)

  • by h2so4 ( 33298 ) on Wednesday November 24, 1999 @02:32AM (#1508076)
    As source code, I wouldn't say that the authors of these programs are necessarily the "bad guy"; the code can provide interesting insights into security flaws.

    In the case of a virus, if the developer keeps the code within a quarantined environment which he has authorisation to be using, it seems legitimate. As long as he does not distribute the code to untrusted parties, or release a binary into the wild, he has not really done any damage; it is when this boundary is crossed that he could be held responsible (to some extent) for damage.
  • A word processor can be used for illegal purposes, but no one would consider holding the developer responsible for that kind of use, and even a virus is not necessarily destructive in nature. For instance, the eicar.com virus exists for the sole purpose of testing virus scanning software (a sketch reproducing it follows this comment).

    Where the software was developed for the sole purpose of illegal use, the responsibility is on both the developer and the user.

    If the software was developed for legitimate use, then the responsibility is on the user.

    The intent of the software may be a grey area; what was Back Orifice _really_ intended to do? Be a tool or a crack? I'm not an attorney (but I play one on web-tv). IIRC, the state of mind of the developer (what he intended the software to do) is a legal point that may determine guilt or innocence.

    But, in any case, the user always has the responsibility to use _any_ software only for legal ends.
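    As a concrete illustration of the eicar.com example above, here is a small sketch that reproduces it (assuming only the published EICAR test string, which the anti-virus industry distributes precisely so scanners can be exercised harmlessly):

        /* Writes the standard 68-byte EICAR anti-virus test file;
         * the string is harmless by design. */
        #include <stdio.h>

        int main(void) {
            const char *eicar =
                "X5O!P%@AP[4\\PZX54(P^)7CC)7}$"
                "EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*";
            FILE *f = fopen("eicar.com", "w");
            if (!f) { perror("fopen"); return 1; }
            fputs(eicar, f);
            fclose(f);    /* a working scanner should now flag eicar.com */
            return 0;
        }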
  • I agree... There are programs that can be used for either end. l0pht's cracker comes to mind. As a system administrator, this is a very helpful tool for testing password security and getting into boxes of an ex-employee (sometimes disgruntled); however, in the wrong hands it can be a dangerous thing. The guys at l0pht are not to blame, because they provide a service.

    The guy who writes a malicious virus is another story. Both the coder and the user in this case are equally at fault. I'd be hard-pressed to find a useful reason for a virus. It causes trouble for end users and even more trouble for admins, who have to keep track of viruses, clean them, repair the damage, prevent them from returning, and explain to the client what it was in the first place and why.
  • How many pieces of software could NOT be used in an illegal way?
    I could write a ransom note in Word on my Windows 95 machine. I could then send it to you via E-mail. I guess that also implicates my AOL software, sendmail and the copy of Eudora you use to read it. I suppose that means we'd better round up their development teams and cart them off to jail.
    Grab the Mozilla team while you're at it... I just looked at some illegal pornography and those developers assisted me.

    The user should ALWAYS be held responsible.
    Developers are blameless... unless they play another role in the problem (such as misrepresenting their software) "Oh yeah, run this 'internet_worm' program, it's even more fun than Zork!"

    If I create an AI lifeform, and it commits a crime, who is at fault?
  • However, what if the smurf program was written to help test a firewall?

    You can't just say this is good, that is bad.

    Intent is the key - and the intent can't be known until the user has the program to use.
  • I don't see your point. The analogy was based around the idea that a coder who develops an exploit/script/whatever and releases it on the internet KNOWS that it will be used for harm, just as an arms dealer who sells to terrorists (or an oppressive regime, etc.) KNOWS that his wares will be used for harm, and should carry a moral and legal burden as a result. That is not to say that the "users" should escape legal repercussions, but that the blame should not fall exclusively on the designer or the user.
  • And why do you think an 'assault rifle' is not a self-defense weapon? Often the only difference between a hunting rifle and an assault rifle is color. And it doesn't matter if I kill you with a rifle or a spoon; in the end, you're still dead, and it is not the rifle's or spoon's manufacturer's responsibility. The only reason people are bringing these lawsuits is that you can sue anyone for anything, and they are attempting to put the gun manufacturers out of business because they haven't been able to convince enough people to outlaw the products. Rather sad, if you ask me. I think when these people lose their lawsuits, they should be 100% responsible for the legal fees of the manufacturers. These predatory lawsuits will be the undoing of business as we know it.

    Quack
  • The best answer to this question I can think of is "How is your disclaimer phrased?"

    Diamond warns their customers that the Rio product is intended for legitimate uses only; therefore they are not responsible for their customers violating copyright law.

    McDonald's did not have anything more specific than "Caution: Hot" on their coffee cups, therefore they were responsible for some woman ordering coffee at a drive-thru and scalding herself with it. They never mentioned that pouring the coffee on yourself was not an intended use, therefore they were liable for her injuries.
  • A virus could have been written to demonstrate a security flaw. Since for some big companies, like the one we all love, security flaws are only security flaws when there is an exploit available, a virus writer could force a patch to be made: a real good deed. :-)

    Seriously though, usually a virus is not "used" by anyone; the virus infects people without their knowledge (I myself never saw a pop-up saying "installing virus, please wait..." :-) ). Usually the only person that really "uses" the virus is the creator himself, when he starts the spreading, and in my opinion that is the unlawful action.

    Imagine the following: someone creates a virus to test a design or a virus-scan technique, uses it in his lab only, and never sets it free. Then a lab janitor, who happens to be a cracker, breaks into the lab, steals the virus and sets it free. Who is the criminal?

    In a few words, my opinion is that coding a virus is not a crime in itself, but spreading it to the world is.
    --
    "take the red pill and you stay in wonderland and I'll show you how deep the rabbit hole goes"

  • When I was in first year Computer Engineering, we spent quite a lot of time on this issue. (Note: laws, etc., pertain to Canada, but I believe the US is the same.)

    Currently, by law, it is the user's responsibility, totally, in every situation. However, there is starting to be significant pressure to make some systems the responsibility of a Professional Engineer, who would have to sign off on a project and take responsibility for it. The reason for this is not viruses, but other systems, such as medical software and navigation/control systems for aircraft, trains, etc.

    Numerous people think that someone who develops the software to control the administration of a drug (for example) should have to take responsibility for the safety of their code.

    I don't have a reference for it, but one of the big examples that we discussed had to do with a machine that administered chemotherapy drugs to patients in the US. In the first version there was software controlling the dosage, plus a hardware safety check to prevent overdoses. Then in the second version they removed the hardware check, and (I think) about 20 people died of overdoses because a lazy programmer didn't check whether the dose was allowed or not (see the sketch at the end of this comment). In this case, the hospitals were deemed responsible for the deaths, but personally I think that situations like this need the developer to take responsibility for safety.

    Of course, the problem with developers taking responsibility is that most projects depend on numerous other products. For example, if a developer writes code that is safe, but it is rendered unsafe by the compiler or by the OS the system is running on, who is really responsible: the developer, or the tool vendor? Which brings me to my final question: if the third-party vendor is actually an open source project, who takes responsibility for it? As an example, consider this. Some company wants to write a navigation system for a 777. They search freshmeat, and find that there is a really great AVL library that is LGPL'd. They decide to use it rather than roll their own, and some bug in the lib causes the planes to crash. Is the library developer responsible, or the company who made the nav system? I realize that most licenses have a no-liability clause in them, but if it becomes a requirement for developers, could this be a major stumbling block in the road to world domination?

    Anyway, I think I have rambled long enough, I should probably go write some code now. (Good thing I am a co-op student, so I won't be working here when the code gets released).


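    As a minimal sketch of the software interlock described in the dosage example above (the limit and names here are made up purely for illustration; nothing about the actual machine's logic appears in this thread):

        /* Illustrative dose interlock: refuses any dose outside a fixed
         * range. A hardware interlock would duplicate this same test. */
        #include <stdio.h>

        #define MAX_DOSE_MG 50.0    /* hypothetical per-administration limit */

        int administer(double dose_mg) {
            if (dose_mg <= 0.0 || dose_mg > MAX_DOSE_MG) {
                fprintf(stderr, "dose %.1f mg refused: out of range\n", dose_mg);
                return 0;           /* nothing administered */
            }
            printf("administering %.1f mg\n", dose_mg);
            return 1;
        }

        int main(void) {
            administer(25.0);     /* accepted */
            administer(400.0);    /* refused by the software check */
            return 0;
        }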
  • I think the most important factor has to be what the lawyers (at least over here in the UK) call "mens rea", which I think translates as "guilty mind". It's the intent that counts.

    Take a couple of examples: the recent DVD crack, and credit card number generators (the latter generate syntactically valid random credit card numbers; see the sketch after this comment). For the purpose of discussion I'll assume that copyright violation is unethical.

    In the case of the DVD crack the purpose of the crack was honest: to let Linux users legitimately watch films without having to pay for Windows just to run the DVD drive. This is a perfectly legitimate goal, and there is nothing unethical about doing it. Of course it is possible to use the same software for unethical purposes, but the author of the software is not responsible for such a decision.

    On the other hand the author of a credit card number generator has produced a piece of software which exists for only one purpose: to facilitate theft. The author set out to aid theft, and is therefore morally an accessory to the thefts which are carried out using the software.

    Of course there is a big grey area in between these two extremes. What do we say about software which has some minor or marginal use, but which is almost entirely used for some bad and foreseeable purpose? Back Orifice might come into this area: it has some legitimate use for remote admin, but its primary purpose is to break Windows NT security.

    Here ethics moves away from the legal domain: lawyers are concerned with proof. However ethics is more about formalising matters of conscience (although some ethical codes do carry penalties for gross violation). If you believe that cracking is wrong then it follows that the CDC acted unethically in releasing a tool which had, as its primary purpose, cracking NT.

    A program for Linux which was designed to facilitate DVD copying would be an interesting case. It may be ethical to copy a DVD for backup purposes, but the vast majority of copies made would be illegal pirate copies for sale or just given away. Would it be ethical to write such a program?

    The classic hardware scenario for this kind of ethical debate is the shopkeeper who sells a knife which is subsequently used in a murder. If the knife is a cooking knife bought in the normal course of business, then obviously the shopkeeper shares no guilt. At the other extreme, if the customer comes in and says "Give me a knife so I can kill my wife with it" and this statement appears believable, then equally obviously the shopkeeper is an accessory to the murder. But in the middle is a large grey area. What about combat knives? They are specifically designed to kill. Any individual purchaser might plead a desire for honest self-defence, but the fact remains that most of the time that such knives are used, it is not in self-defence. The vendors must therefore share to some extent in the guilt of the users of these knives.

    Paul.
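    The "syntactically valid" numbers mentioned above presumably means numbers passing the Luhn mod-10 checksum (the comment doesn't say, so this is an assumption). A validator sketch, which is the defensive half of the same arithmetic:

        /* Luhn mod-10 check: from the rightmost digit, double every
         * second digit, subtract 9 from doubles above 9, and require
         * the total to be divisible by 10. */
        #include <stdio.h>
        #include <string.h>

        int luhn_valid(const char *digits) {
            int sum = 0, alt = 0;
            for (int i = (int)strlen(digits) - 1; i >= 0; i--) {
                int d = digits[i] - '0';
                if (d < 0 || d > 9) return 0;   /* reject non-digits */
                if (alt) { d *= 2; if (d > 9) d -= 9; }
                sum += d;
                alt = !alt;
            }
            return sum % 10 == 0;
        }

        int main(void) {
            /* 4111111111111111 is the well-known Luhn-valid test number */
            printf("%d\n", luhn_valid("4111111111111111"));  /* prints 1 */
            printf("%d\n", luhn_valid("4111111111111112"));  /* prints 0 */
            return 0;
        }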

  • "Caution: The contents of this bottle should not be fed to fish." -- On a bottle of shampoo for dogs.

    "Do not use while sleeping." -- On a hair dryer.

    "Do not use in shower." -- On a hair dryer.

    "Warning: This product can burn eyes." -- On a curling iron.

    "Do not use orally." -- On a toilet bowl cleaning brush.

    "Please keep out of children." -- On a butcher knife.

    "Wearing of this garment does not enable you to fly." -- On a child sized Superman costume.

    "May be harmful if swallowed." -- On a shipment of hammers.

    Are you sure you want a warning label on anything that can be potentially dangerous?

  • There's no way I can justify, in my mind, blaming the author of the software. It's the implementer that is at fault.

    In the case of viruses: I don't believe there is anything inherently wrong with writing a virus. The author is not to blame until he unleashes it--deliberately or accidentally.

    I have yet to find a good reason to hold an author responsible for how their software is used. It would be an evil thing if we could be prosecuted for the way someone may abuse software that we write. This could certainly have a chilling effect on free software.

    I don't think any of us will be very happy if the only people that can afford to release software are companies with a full-time legal staff to fend off lawsuits brought on by misuse of software.

    numb
  • I'm going to open up a can of worms here and open myself up to a flame war. Moderators, go ahead and mark this down as flamebait, but please realize I'm not trying to advocate a political viewpoint:

    Is a gun company responsible for people who get shot?

    Some people say "yes". Like Gail Fox, a Brooklyn lady who watched somebody shoot her son. He survived, fortunately, but she felt that action needed to be taken. Not against the person who pulled the trigger. Not against the dealer who illegally sold the gun. Against the gun industry. 15 of the 25 gun companies named in the suit were found liable for the shooting, and for the deaths of 6 other children.

    Take this logic and apply it to software. If some company is hit by BO2K, it isn't the fault of the script kiddie who installed it. It isn't the fault of the administrator who didn't take proper precautions to secure the servers.

    No, according to the flawed logic detailed above, it's the cDc's fault that the company gets hacked. After all, the cDc distributed something that they knew could be used for illegal purposes! They distributed something that could be easily used by even the most inexperienced person to wreak havoc on the lives of others, right?

    In other words, personal responsibility is gone. Nobody prosecutes the people who sell illegal guns-- they prefer to make the CEO of Colt Firearms go in front of a judge and grovel for mercy. Nobody wants to prosecute the script kiddy or toughen up their system-- it's easier to blame the Cult of the Dead Cow and make them pay for the damages. Nobody wants to make a good copy protection scheme for DVD movies-- it's easier to threaten lawsuits against the people who point out how horribly fucked-up the system is.

    Responsibility for the use of any technology, be it software or guns, is in the hands of the person who uses it. I don't believe in passing the blame around like so much candy-- my actions are my own, for better or worse. If I'm willing to take the credit for my accomplishments, I should damn well be willing to take the blame for my mistakes and blunders.

    A note to the world: don't blame others. It won't do you a damn bit of good. Instead, take a little responsibility for your actions and learn from your mistakes. It's that ninth habit of highly successful people-- they don't pass the buck.
  • Absolutely. Not only hammers and crowbars, but guns, knives, ICBMs and even landmines.

    Tools are tools; they're made for a specific purpose. Their misuse, whether intentional or accidental, must not result in the manufacturer being liable.

    Furthermore, it is the intent that counts. Consider murder vs. vehicular manslaughter: in either case a car can be the machine used to kill a person, but the intent of doing so makes the difference.

    Intentional misuse is what points the finger of blame. If a chemical in a can of hair spray causes harm when used according to design, the manufacturer is to blame. If it is intentionally concentrated and then inhaled, resulting in Little Johnny becoming a vegetable, it's the kid's (or parents'?) fault.

    Now, in the case of intentional non-disclosure of harmful potential.... Ah, let's just avoid discussing Microsoft's security issues, shall we?
  • Right!

    If you design and create a tool that is harmful, like a virus or something, you are responsible for what happens to it. That means that publishing it on a mailing list makes you responsible for the fact that it may fall into the hands of someone not so noble as you (i.e. someone that uses it to harm others).

    Creating a weapon is one thing, but freely distributing it is another. You can't put a box of grenades out on the street and then leave, claiming that everybody is responsible for his own deeds because you put that box of grenades there. If some kid plays with one of the grenades and accidentally blows himself to pieces, you can be blamed for that.

    So, if you create harmful software (software whose only use is to harm others), you should license and distribute it in such a way that harmful use is prevented. The license should explicitly forbid using it in a harmful way, not just provide a disclaimer. And it should definitely not be made available for anonymous download.

    If you do the above, you can claim that you did what you could to prevent harmful use of the software.
  • by Basje ( 26968 ) <bas@bloemsaat.org> on Wednesday November 24, 1999 @03:21AM (#1508139) Homepage
    In the US, a similar debate is over guns. Possession of certain weapons is illegal. For other guns, possession is legal, while damaging other people with them isn't (obviously).

    The same can be applied (in general terms) to software. Harmful viruses and the like have only limited use: causing damage, for whatever reason. Possession of these could be considered criminal.

    OTOH, a lot of programs (e.g. portscanners) can be used for good or for bad, directly or indirectly. In the case of those, it's up to the user to use them ethically.

    The problems with legislation are rather similar to those in the guns debate too. Illegal possession will occur. How to handle that is off topic here.

  • by hobbit ( 5915 ) on Wednesday November 24, 1999 @03:23AM (#1508144)
    Off the top of my head, I'd say this is rather like the question of firearms, and I'd say that "guns don't kill people, people kill people" is even more applicable for software than for guns.

    Why? When was the last time you saw a gun with virus-like properties?

    As far as I can see, liability for breaking the law lies with the person whose intent it was to break it. If that is the author of some software (e.g. a program deliberately designed to spread a virus) then so be it, but if the author produces a tool with multiple functions (e.g. BO2K) then he's no more guilty than a man who makes a knife.

    "a program deliberately designed to spread a virus" - AKA a virus.

    There are of course some tricky cases. For instance, a friend of mine once wrote a virus as an exercise and gave it a slightly nasty payload. He never intended to release it, but unfortunately a copy got loose on his hard drive and infected several other machines before it was wiped out. If it had well and truly escaped, and done serious damage, where would the liability lie? Or is it a natural hazard? Possibly there is no criminal liability in that case, but merely civil negligence in failing to contain the virus?

    Why anyone would give a virus a 'slightly nasty payload' without malicious intent I have no idea. It isn't any more a 'natural hazard' than a bullet flying toward a crowd is a 'natural hazard'.

    Just my tuppence worth (IEIANAL).
  • I'd argue strongly that anything that can be misused can also be put to good use.

    One should also suspect anything that can be used productively of being capable of being misused, the more so with the most flexible and powerful utilities, such as remote control, port scanners and packet sniffers (see the sketch after this comment).

    That is because good or evil is determined by the intent of the user and the effect on the people around him. This is intrinsically outside the realm of technical specifications and capabilities. Even a computer virus, like a gun, could be created for moral purposes. If the Third Reich had been as dependent upon computer technology as we are today, few people would view it as immoral to disrupt the coordination of the Holocaust. For that matter, developing and releasing a virus into your own network for research purposes is also moral. It's the initial act of releasing the virus into the "wild" that's immoral -- and demonstrates that the intent of the developer was malicious.

    So, while in most cases a software developer should not be held responsible for how his software is used, if he himself uses it immorally he may be doubly culpable. Also, if he colludes with his clients to use his software in an immoral fashion he's also culpable.

    Consider a program that logs keystrokes and other events. This could be highly useful in debugging software and system problems, since users seldom can provide detailed answers. On the other hand, it could be used to spy on subordinates or even coworkers or competitors. If a developer markets the software for these purposes or encourages its use in these ways, then the developer is morally culpable.

    In this way, I find cDc to be morally ambiguous. There is a kind of anarchic, certainly ironic, spirit to this group, which normally I applaud. The act of developing BO and BO2K is not in itself right or wrong, but I cannot help but suspect that cDc intends harm to Microsoft. Even this in itself is not necessarily bad, since it is generally accepted in our society to harm competitors by discrediting them. However, implicit in this is the possibility of harming third parties -- Microsoft's customers. Unfortunately, nobody but cDc can know whether they are good or evil -- the key is locked up within their sardonic personalities. They may merely intend the existence of BO to discredit Microsoft and the values it represents, or they may be mere chaosmongers.

    So, to conclude, I think that developers should never be held responsible for any technical capability their software has, but they should be responsible for how they use their software and how they promote their software to be used.
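    Since port scanners keep coming up as the archetypal dual-use tool, it is worth seeing how little code such a check reduces to: the capability is trivial, which is why intent rather than capability has to carry the moral weight. A sketch of a single-port TCP connect() check (POSIX sockets; equally usable to audit your own host or to probe someone else's):

        /* One-port TCP connect() check: reports whether a connection
         * to the given IPv4 address and port succeeds. */
        #include <arpa/inet.h>
        #include <netinet/in.h>
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>
        #include <sys/socket.h>
        #include <unistd.h>

        int main(int argc, char **argv) {
            if (argc != 3) {
                fprintf(stderr, "usage: %s <ipv4-addr> <port>\n", argv[0]);
                return 1;
            }
            struct sockaddr_in addr;
            memset(&addr, 0, sizeof(addr));
            addr.sin_family = AF_INET;
            addr.sin_port = htons((unsigned short)atoi(argv[2]));
            if (inet_pton(AF_INET, argv[1], &addr.sin_addr) != 1) {
                fprintf(stderr, "bad address\n");
                return 1;
            }
            int fd = socket(AF_INET, SOCK_STREAM, 0);
            if (fd < 0) { perror("socket"); return 1; }
            if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) == 0)
                printf("port %s open\n", argv[2]);
            else
                printf("port %s closed or filtered\n", argv[2]);
            close(fd);
            return 0;
        }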


  • Is the primary, intended use of BO2000 really remote network admin? hmmmmm maybe. This is where the legal concept of "recklessly" is very useful.

    If I happen to be doing all sorts of useful research into a cure for malaria, and happen upon a genetic toxin that kills black people (this might happen, given that malaria immunity is linked to the sickle-cell gene), then I have a responsibility not to, for example, publish my results in "South African Racists' Weekly", noting that although my new formula is meant for innoculating white people against malaria, it can also be used to kill black people.

    Similarly, I tend to think that BO2000 is a network admin tool which is also useful for cracking, and that its developers have not been shy in pointing out how useful it is for this purpose to people who might reasonably be expected to abuse it. So yeah, I'd say that cDc bear a degree of moral (possibly legal?) responsibility for its use.

    jsm
  • What makes an anti-virus company label one program as a virus, while another program with similar uses goes unlabelled?

    Simple - one has an install kit that runs in plain sight, reports what it is, requires you to accept an EULA, allows you to configure and restrict remote access, and even has an un-install option.

    The other is BO2K.


  • We cannot (yet) solve the atom bomb problem by hacking into reality and creating a fix for it (and we would be a little worried about putting out the sun if we did). We need laws to make up for God's little system oversights.

    The same is not true for computer systems. A virus spreads because the system is broken, and only because the system is broken.

    -
    We cannot reason ourselves out of our basic irrationality. All we can do is learn the art of being irrational in a reasonable way.
  • by Salamander ( 33735 ) <jeff.pl@atyp@us> on Wednesday November 24, 1999 @04:14AM (#1508201) Homepage Journal
    IANAL, but I've been involved in legal matters and talked to lawyers a bit. There's a very difficult distinction involved here that I'll try to clarify a little. The law doesn't recognize actual intent or state of mind, rightly holding these things to be unknowable in any specific instance. However, the law does recognize that the maker of a tool or provider of a service "should have known" how that tool/service might be used. It's very similar to the standard of diligence applied in many other areas. For example, libel/slander cases often hinge not on whether the accused did know that a statement was false, but on whether they should have known and failed to exercise due diligence in checking their facts. Ignorance is not necessarily a permissible excuse under the law, especially when the claim of ignorance is either facile or tantamount to professional malpractice.

    With respect to software, I think the application of this principle is pretty obvious. The person who uses a software tool illegally always bears some responsibility; the question is whether the software author is responsible as well as - not instead of - the user. This can pretty much only be true when the maker of software "should have known" that their software would be used in such a manner, that such use could have been prevented without undue burden or compromise of other functionality, and that the author nonetheless did nothing to prevent it. The phrase "should have known" is of course vague, but I think people who work in a field generally have a pretty strong consensus on what's common knowledge and what's not. What one person in the field should have known, is what the majority of practitioners do know or could figure out in a jiffy.

    This definition obviously does not indict word processors or other common types of software. It's not even clear that it indicts something like SATAN, which the author deliberately tried to present to system administrators and such as a way to improve security. I think the line gets crossed with something like Back Orifice, which was very obviously pushed primarily as a way to hack systems; any claims about it being a remote administration tool are obviously accompanied by a smirk and a wink, which would only piss off judges and juries. Even if the tool's primary purpose was legal and positive, it's pretty bleeding obvious that it can also be used illegally and negatively. Some announcement of its presence on a system would discourage the latter use while in no way interfering with the first, and the absence of such announcement could readily be construed as an indication of the author's lack of professional diligence (remember, we can't impute malice because that comes down to a matter of concrete intent).

  • Interestingly, while I believe that the law won't hold the author responsible in some cases (the tools you mentioned, for example), there are others where it does.

    The most striking example of this is the piracy field. Under most countries' law (and especially American law), copyright owners have a broad right to go after the very freedom of thought of people who might create tools that can be used to break their copyrights. Look at the DeCSS people, who have done nothing illegal themselves, but were under attack right away. Napster is being sued for breaking the law by "aiding in copyright violation" while they themselves have not copied anything (Napster's servers never contain the file). And while the Swedish courts held firm, a kid here was dragged to trial because he LINKED to illegal mp3 sites.

    Personally, I think the difference is that while the laws about computer security are working, the copyright laws aren't. It is when society attempts to hold on to legislation that simply can't function that people's rights start being trampled (see: Prohibition, the drug war). But that is off-topic for this discussion, I guess.


    -
    We cannot reason ourselves out of our basic irrationality. All we can do is learn the art of being irrational in a reasonable way.
  • I remember answering similar questions when I was in school. The main issue that always came up was whether the programmer wrote the program knowing that complications could threaten human life.

    Typically it would be given as an example where the project was pushed into production without a test cycle, with a short product development life cycle and a boss who wouldn't listen to reason (do they ever?). But let's ignore all of that and focus on the issue of use vs. creation. I suppose you could even complicate things further and consider design vs. use vs. creation (development).

    Design and Development:

    Though slightly different, the two go hand in hand. The reason I've differentiated between them is that design precedes development. One would hope that any potential for problems would be detected before development even begins, but in a hurried job this is often not the case. Much of the design ends up in the hands of the programmers while they are developing, and that is the problem. Not that the programmers can't do a good job, but the design typically becomes ad hoc, and there is less forethought in a project developed with a loose design, or none at all.

    Use:

    I'm not so sure you can put all the blame on the user. A user may operate under the assumption that the program/product is safe; they may even have assurances from the company or person that makes the product. Sure, anyone is being naive if they think a program is bugless. But if you have a program and have been told it will work, it should. In the case of an accident, the user is free of guilt; the responsibility should lie with the coders.

    Alternatively, if the user knows that the program could threaten a life or sabotage a system, and uses the program to do so, then the responsibility should lie with the user.

    Anyone read "The Case of the Killer Robot?" I think it's a staple among Computer Science Ethics classes.

    JFL (Just For Laughs...)

    Now, if you ask me who really is responsible, I'd say it's the project manager ;)
  • Hmm, good argument. I take it you wear a bullet-proof vest?

    Hamish
  • by QuMa ( 19440 )
    Is the eicar.com 'virus' a real virus? I believe not; it's just a .com file containing a standardized signature.
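
    That is correct: the EICAR test file is nothing but a fixed 68-byte ASCII string that antivirus vendors agreed to detect, so scanners can be tested without handling any real virus. A Python sketch that writes one out (run it only where a triggered scanner alert is expected and harmless):

        # Write the standard EICAR antivirus test file. This is not a virus;
        # scanners flag this exact string by industry agreement, for testing.
        EICAR = (r"X5O!P%@AP[4\PZX54(P^)7CC)7}$"
                 r"EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*")

        with open("eicar.com", "w") as f:
            f.write(EICAR)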
  • Black and white? How is asserting they are ambiguous seeing things as black and white?

    I think cDc is deliberately obscure about its intent. If you read their writings, it's clear that they in their own way are trying to make people think, if necessary by shocking their sensibilities. What they say and do is shrewdly calculated as much for effect as for meaning. For that reason, I don't think you can realistically infer much about their opinions and values except that they dislike complacency. However, this in itself doesn't make them good or evil, only interesting.

    In any case, there can be no doubt that they abhor Microsoft, with its success in promoting technical mediocrity and its hypocritical, self-serving creed of the "right to innovate". When they released BO2K, they claimed it was a remote admin tool -- and indeed it makes a very good one -- but despite this claim nobody really believes they went through all that trouble because they believed the poor benighted NT admins deserved a better remote admin tool.

    The whole thing is clearly a hack, and like all great hacks it works on several levels. First, obviously, is the technical challenge; technical elegance is a clear criterion for a winning hack. Second, their sardonic claim of interest in the well-being of NT administrators only serves to increase those admins' fear of BO2K -- a fact that no fan of irony can fail to relish. Third, it puts Microsoft in a devastatingly clever bind: this is clearly an attack on their interests, but they can't credibly denounce it, because they themselves charge big money for a tool that does essentially the same thing, only less efficiently.

    Many hacks have no purpose other than that they're hard to do, but often they have a point -- the sharper the better. BO2K jars anybody with any propensity for thought into considering the moral nature of software. Inspiring either fear or introspection is a subversive act, and I suspect that one may be as good as the other as far as cDc is concerned.

    Making people think, shaking up the complacency of people who follow the path of least resistance, and taking a powerful bully down a notch are all admirable things. But there's a dark side to this. They had to know that this thing would go right into the toolboxes of a bunch of adolescent script kiddies. Can anyone doubt that they hoped this would be the case? There are plenty of admins out there who aren't fully competent, but that doesn't mean it's OK to make their jobs harder. It's not necessarily their fault that they aren't in the intellectual or technical elite, nor in many cases do they have a choice of tools.

    The fact that somebody is technically or intellectually inferior to you gives you no more right to interfere in their affairs than their being physically or economically inferior would. I suspect the cDc members understand this well, but to them the end (a great hack, enhanced mystique, sticking it to the man) justifies the means.

    When I was an MIT student, we had certain understood rules for hacks: Don't vandalize, clean up after yourself, and if possible leave the site of the hack a better place than you found it. So, is the world a better place with BO2K or not? I personally don't have a pat answer, but it certainly is a more interesting place.

  • To some extent, yes. Unless you can be sure of who will be downloading the code, you have placed it in the hands of untrusted parties. You have released it into the wild, and anybody, be they curious or malicious, can run and propagate it.

    Snippets of source code, intermingled with an explanation of what is going on, can be useful; but posting the complete source, or especially binaries, can only be viewed as irresponsible. You risk creating a new breed of script kiddie: nobody wants that :)

    The reality is that viruses serve little purpose; sure, they can be intriguing for the curious, but it's playing with fire, and it only takes one malicious user, or one slight coding mistake (ahem, Morris, ahem), to wreak havoc.
  • Regarding culpability, the question is not legality but the legitimate uses of a given product. Take, for instance, MP3 search engines: they don't do anything blatantly illegal; all they do is provide a listing of where on the net you might find a given MP3. It's the actions of those who put MP3s on the web and those who download them that are illegal. However, because there are no legitimate uses for such a search engine, or at least the legitimate uses are by and large unexercised (yes, I am aware that there are non-copyrighted MP3s out there), the search engine remains culpable in copyright infringement.
    Now move on to subjects such as the aforementioned tools like nmap. These tools can easily be misused to aid someone in cracking a system; however, they also have a very important legitimate use in helping sysadmins secure their networks.
    Ethically, programmers are going to have to start taking a closer look at the tools they develop, and at whether the benefits of their legitimate uses outweigh the dangers of their misuse. Back Orifice's stealth mode is one example. The benefits to a network administrator of having such a close look at a user's system are almost immeasurable; however, in stealth mode the user has no knowledge he or she may be watched, which opens the door for crackers into an organization's system. The ethical question is not whether such a tool should be developed (it should, IMHO) but whether it should be so easy to abuse (it should not). The implication is that in the not-so-distant future, programmers will be more limited in what we can program, not by lawyers, but by society's acceptance of what we do. While I don't think we'll ever be told what we can and can't write (nor should we be), our image as a community will be tarnished by our less scrupulous members, much the way the community of hackers has been tarnished by crackers and script kiddies, or the way lawyers have been given a bad rap by their (much more numerous) dickheads.
    Currently, one needs no certification to work professionally as a programmer or other computer professional. In the near future, however, this may change, unless we start policing ourselves and holding ourselves to a standard of ethics that society finds acceptable.
  • Compare software to dynamite.

    Revolutionary, it changed the way things were done. It also caused a great deal of criminal harm.

    You can never blame the maker, only the user.

    Something our (US) government should really remember.

    Also reminds me how useful college is.
  • If I'd said, "Hey, man, get me a _knife_ okay? 'Cause I want to STAB somebody, okay? Get me a knife 'cause there's this guy I don't like and if you give me a knife I'll stab him and remove his entrails and loop them decoratively around his neck, okay?" then yeah, sure :)
    If I wanted it for chopping vegetables for Thanksgiving stuffing and that was my story, then unless I was acting really unstable, I doubt you'd be blamed. But if I was totally set on stabbing somebody and, knowing this, you gave me the knife anyway, you're an idiot; you had the option and cause to refuse (hey, you're the one with the knife), and so (unless you had me at gunpoint or something) you certainly should be blamed.
    You'll find that disclaiming all responsibility for your actions has only limited usefulness in the real world :)
  • I don't want to pick nits, but the DeCSS folks are not in the USA, and what they did is not illegal under US law. If they were to use what they did to illegally copy a movie, then they would be in violation of US copyright law.

    IANAL.
  • Probably not a popular view in this forum, but...

    You don't blame gun manufacturers because their product is performing within specification. The act of shooting a gun is not inherently bad, nor is shooting someone (self-defense...); it's who you shoot and why that matters.

    With a virus, the victims are indiscriminate and the harm is almost never justified. As a result, the creator of a virus is intentionally doing harm. I believe this is the key distinction.

    With hacking tools, it depends on the tool. Exploit scripts that can only serve to mount a DoS attack or otherwise endanger a computer are clearly destructive. Tools that can be used effectively but are dangerous when misused are like guns: the creator isn't responsible, the user is.

    Creating a tool with no constructive purpose, only destructive, is obviously morally wrong. You are intentionally causing indirect harm.

    If you create a tool with a useful purpose that can be abused, you are not in the wrong, the person misusing the tool is.

    If you create a tool with malevolent purposes in mind but create a quasi-true benevolent purpose (BO2K, I'm sorry, but it was NEVER intended for remote administration), you are at least in the morally questionable area.

    Intention matters. Possible uses matter. Something with only destructive purposes is obviously NOT kosher... even with a warning label that says "Use will kill random innocent people."

    Alex
  • Nmap has a feature that puts its "stealth" scans in tiny, highly fragmented IP datagrams (such that only a few bytes of each TCP segment are in each datagram). That can have no legitimate use except to evade detection.

    But I can justify it. It is there so that network admins can test to see if their intrusion detection systems can recognize an attack that uses tiny, highly fragmented IP datagrams.
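
    To make the testing use concrete, here is a rough Python sketch of the same trick using the scapy packet library (an illustration, not nmap's actual implementation; the target address is a placeholder, and this should only ever be aimed at hosts you administer):

        # Send one TCP SYN probe split into tiny IP fragments, so an intrusion
        # detection system must reassemble the fragments before it can even
        # see the full TCP header. Requires root privileges to send raw packets.
        from scapy.all import IP, TCP, fragment, send

        probe = IP(dst="192.0.2.10") / TCP(dport=80, flags="S")
        for frag in fragment(probe, fragsize=8):  # 8 payload bytes per fragment
            send(frag, verbose=False)

    If the IDS logs nothing for such a probe, it is blind to tiny-fragment scans and should be reconfigured or upgraded.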

    By the same token, one might cogently argue that certain small, automatic handguns have only one purpose -- killing or maiming human beings. But one can also argue that they are used for competitive target shooting (which is so), so shall a gun manufacturer be held criminally responsible when one of these guns is used to kill someone?

    The lawsuits at present pending against gunmakers and tobacco companies are all based on civil notions of product liability. I think that is quite appropriate. Tobacco companies have a legal problem primarily because they conspired to hide evidence of health risks associated with use of their product. I'm not so sure gun makers are vulnerable on this score. If it can be shown that gun makers have deliberately sought to suppress technologies that would make guns safer and colluded with one another to do so, then I think they are in more trouble.

    I guess I'm wondering if a chainsaw company is civilly or criminally liable when someone attacks someone else with a chainsaw? I don't think so. The device was operating well within its capabilities. There is no reasonable way to prevent its misuse in this manner that would not also make it useless for its intended purpose.

    Your point is precisely my point. I was saying that the authors of nmap, queso, pgp, and gpg are NOT liable for deliberate misuse of their products.

    I would not, however, be surprised to see warning labels on software. I'm sitting here drinking a Diet Dr. Pepper. On the bottle is the following legend: WARNING Contents under pressure. Cap may blow off causing eye or other serious injury. Point away from face and people, especially when opening. I would expect to see labelling on software that can be used to do dangerous or illegal things that are the equivalent of that warning.
  • I tend to agree with this argument ("it's only virtual, let's be lax"), except for one reason: computers actually ARE used in situations where human lives depend on them. What if such a computer got infected with a virus and crashed?

    I have no sympathy for those who yell "punish the writers of cracker tools". Disrupting other people's computers is certainly punishable, but it's ridiculous to even compare it to a serious crime (unless it's a hospital's computers or something like that); yet people apparently get sent to jail for months for a simple web defacement.

    And, putting it all together, I think it's been more than proved by practice (and by BUGTRAQ) that full disclosure is good for the industry as a whole.


  • I said "most countries". And you are wrong about American law: intentionally writing a workaround for a copy-protection scheme IS illegal under the "Digital Millennium Copyright Act" or whatever it's called. I don't remember the exact wording, but you can look it up.

    The DeCSS people could possibly get off on the grounds that they were explicitly working to create Linux support, but just wait until someone cracks the SDMI format, and you'll see the shit really hit the fan.

    -
    We cannot reason ourselves out of our basic irrationality. All we can do is learn the art of being irrational in a reasonable way.
  • by evilpenguin ( 18720 ) on Wednesday November 24, 1999 @07:35AM (#1508302)
    Actually, you are wrong (up to a point). Here is the relevant section of Chapter 12, Title 17 (which was added to Title 17 by the Digital Millennium Copyright Act of 1998):

    (Warning: the following was cut-and-pasted from a PDF file.) The two relevant facts are that DeCSS comes in under the grandfathering 2-year period documented below, and that anyone seeking to do what DeCSS does should apply to the Librarian of Congress and the Register of Copyrights on the grounds that the lack of client software for Linux constitutes an adverse effect on their ability to make non-infringing use. Seems like perfectly good law to me.

    The section below is quoted from the text of the relevant law, and IA (still) NAL.


    Sec. 1201. Circumvention of copyright protection systems

    (a) VIOLATIONS REGARDING CIRCUMVENTION OF TECHNOLOGICAL MEASURES. (1)(A) No person shall circumvent a technological measure that effectively controls access to a work protected under this title. The prohibition contained in the preceding sentence shall take effect at the end of the 2-year period beginning on the date of the enactment of this chapter.

    (B) The prohibition contained in subparagraph (A) shall not apply to persons who are users of a copyrighted work which is in a particular class of works, if such persons are, or are likely to be in the succeeding 3-year period, adversely affected by virtue of such prohibition in their ability to make noninfringing uses of that particular class of works under this title, as determined under subparagraph (C).

    (C) During the 2-year period described in subparagraph (A), and during each succeeding 3-year period, the Librarian of Congress, upon the recommendation of the Register of Copyrights, who shall consult with the Assistant Secretary for Communications and Information of the Department of Commerce and report and comment on his or her views in making such recommendation, shall make the determination in a rulemaking proceeding on the record for purposes of subparagraph (B) of whether persons who are users of a copyrighted work are, or are likely to be in the succeeding 3-year period, adversely affected by the prohibition under subparagraph (A) in their ability to make noninfringing uses under this title of a particular class of copyrighted works. In conducting such rulemaking, the Librarian shall examine:

    (i) the availability for use of copyrighted works;
    (ii) the availability for use of works for nonprofit archival, preservation, and educational purposes;
    (iii) the impact that the prohibition on the circumvention of technological measures applied to copyrighted works has on criticism, comment, news reporting, teaching, scholarship, or research;
    (iv) the effect of circumvention of technological measures on the market for or value of copyrighted works; and
    (v) such other factors as the Librarian considers appropriate.
  • A virus is simply a program that can install itself: a self-extracting archive. It often installs itself into key areas of the system to maintain a state of high availability. The same technique is used by anti-viral software to bypass normal operation for the purpose of verifying certain software actions.

    Some viruses are badly written, like M$ Word, Internet Explorer, and Netscape. They corrupt files, exhaust disk space, and have a pernicious habit of reinstalling themselves.

    Some viruses have easter eggs or are trojan horses. M$ Word, for instance, fingerprints files; Netscape 4.x (yes, all of them) publishes every file it can find on the net, so that any web page writer can receive your files in return; RealAudio publishes your playlists; Sun's C compiler emailed Sun your compiling habits. Most people do not consider these features, and yet are Netscape or Microsoft culpable? (Real and Sun have fixed these problems.)

    There are malicious programs, too. I firmly believe NT SP6 was designed to destroy Microsoft's competition by creating incompatibilities where none existed before. The NT install process often corrupts BSD or Linux partitions, and always overwrites the boot sector. Standard malicious viral behavior.

    A virus, however, can be completely harmless, legal, and useful. A virus by the name of AutoDoubler(tm) significantly helped Mac users when hard disks were measured in tens of megabytes. It surreptitiously installed itself into EVERY application on the machine. It would even alter files, and it installed itself into system memory. I believe one version even infected the system software itself (most likely just fonts and whatnot).

    AutoDoubler would not have been useful if it did not act in a viral manner; its ease of use was due solely to the fact that it worked in the background. Whenever an application was run, it would intercept the system call and check whether the binary was UNcompressed (not infected); if so, it would add it to a compress list and wait for the first call to GetNextEvent to compress it. Remember, you naysayers, that in those days MacOS multitasking was completely "cooperative": if a process wanted multitasking, it had to depend on every other running process to explicitly give up time. Also remember that disk seeks and recursive directory scans of an 80 MB disk could take an hour. Indeed, the previous product, DiskDoubler, died because it normally took up to 6 hours to compress about 80 megs.

    AutoDoubler did not noticeably affect system performance because it used its virus-like properties to infect only those files the user actually used, or worked while the user was idle. It subverted many system calls, altered virtually every file on the system (after about a week of keeping it installed), and ran WITHOUT your explicit permission. Once you installed the "init", as they are called, it did the rest.

    Other harmless viruses might be integrity checkers: the virus installs itself into applications (slowly and quietly, so as not to grind the disk away, degrade performance, or leave some weird process "INTRUDER_DETECT -R" running for the next several hours as it scans the 10-gig disks). It would simply install checksum code into the initialization code: a secure hash of the original binary, plus code to check it. It would also infect the kernel, wait for about a week, and then start logging whenever a binary was launched without the checksum, or with an incorrect checksum. Note that initially the administrator would consent to its installation, but the virus's usefulness lies specifically in the fact that he need not worry about it after that, AND that the program's action is completely unnoticed and hidden.

    Another harmless, useful virus might be a patch installer: it is initially loaded with a domain name, company.com, and then spreads itself about using worm techniques to update all versions of the software it can find. Why not just do it manually? The whole point is ease of use and transparency. In a large corporation many computers get "lost", and their whereabouts are not always remembered, network-wise or physically. New computers sometimes spring up that are from another department, or the purchaser let the new temp fill out the forms and he forgot to do the paperwork on one, etc. If the software is an internal release (say, a proprietary database interface used at many data-entry companies) where old versions might be harmful to the database, the preferred infection method would be on connection to the database server. What if the database is distributed, with each client maintaining a certain section of it? Then whenever two clients communicate, the patch should have a chance to spread.

    That sort of update would also be helpful for seti@home, distributed.net, Napster, ICQ, and lots of other distributed products where old versions don't interact well with new versions. Of course, in those cases the program should have an option: "Prompt me before accepting a viral update".

    At any rate, it always irritates me to see virus == malicious software. T4 is a real-life virus that is perhaps most responsible for our knowledge of genetics at the DNA level; it is the virus used to infect E. coli and give them new genes. Plasmids themselves are really just viruses that bacteria have grown to love. Mitochondria are suspected to be basically co-dependent mutualistic parasites; they are just barely above the virus level.

    The viral technique is simply a paradigm for writing software, just like a GUI or an operating system. It's a way of answering "How is this software going to be used?" Viruses are supposed to run without (further) user interaction and to withstand attempts to detect or remove them (except when the person removing them makes a concerted effort, one that a hacker could not mount but the original sysadmin or owner could). Just like a tattoo, some people WANT permanent software. The viral paradigm tries to make software as permanent as possible by distributing copies in multiple places; it simply backs itself up. Amanda and most disk backup programs are viruses that infect backup tapes with copies of themselves, even without the user's explicit permission.
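
    Stripped of the viral self-installation, the integrity-checker idea above reduces to a few lines. A Python sketch (the file names are arbitrary, and a real checker would hook program launch rather than scan on demand):

        # Record a secure hash of every file under a directory, then later
        # flag any file whose contents have changed or which has disappeared.
        import hashlib
        import json
        import os

        def file_hash(path):
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        def snapshot(root, db="hashes.json"):
            hashes = {}
            for dirpath, _, names in os.walk(root):
                for name in names:
                    path = os.path.join(dirpath, name)
                    hashes[path] = file_hash(path)
            with open(db, "w") as f:
                json.dump(hashes, f)

        def verify(db="hashes.json"):
            with open(db) as f:
                hashes = json.load(f)
            for path, old in hashes.items():
                if not os.path.exists(path) or file_hash(path) != old:
                    print("MODIFIED OR MISSING:", path)

    Whether the same logic is a security tool or a virus then turns entirely on how it is installed and whether the owner consented, which is the author's point.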
  • I always thought it would be kind of cool (although dangerous) to use virus-like distribution mechanisms to distribute small binary bug patches :)
  • I didn't read all the comments on this thread, but it seems that our economists haven't stated their ideas.

    I'm not in the field of law and economics, but I know one principle we may consider: responsibility is assigned in the way that makes the outcome efficient.

    Say it's extremely easy for the manufacturer to implement measures to prevent bad use, but costly to monitor users' behavior; the natural conclusion is that the manufacturer has the responsibility.

    On the other hand, if it's difficult for the manufacturer to implement preventive measures (maybe technologically impossible), then it's the users' responsibility.

    Say, why don't we blame the knife manufacturer if someone kills someone else using a knife? There's simply no cheap, technologically feasible measure to prevent that use. In many countries outside the US, guns are forbidden because this is the cheapest way (including the opportunity cost of not using them) to prevent bad use. Maybe Americans don't think their lives are valuable enough to forbid guns, or maybe they think the normal uses are very valuable.
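
    The underlying "least-cost avoider" rule can be made concrete with toy numbers (entirely invented here) comparing prevention costs on each side:

        # Assign responsibility to whichever party can prevent misuse
        # more cheaply overall. All figures are hypothetical.
        safeguard_cost_per_unit = 1.00    # maker builds in a preventive measure
        units_shipped = 10_000
        monitoring_cost_per_user = 10.00  # society polices each user instead
        users = 10_000

        maker_total = safeguard_cost_per_unit * units_shipped   # 10,000
        user_total = monitoring_cost_per_user * users           # 100,000

        cheaper = "manufacturer" if maker_total < user_total else "users"
        print("Responsibility is assigned to the " + cheaper + ".")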
  • I must be keen to lose some karma, because I'm going to take you up on this.

    This may sound radical to some ppl, but I think it's anti-American to view it any other way. I mean, our forefathers started a revolution for independence from their gov't b/c of taxation w/o representation, for the most part. Pretty radical dudes.
    Amen brother.


    Okay. This may sound radical to you, but disagreeing with (your concept of) the American point of view on this issue is not particularly radical. You need to get out a bit (in a global sense).

    Your forefathers wrote the right to bear arms into their constitution because they wanted to protect themselves from tyranny similar to that practised by their former government.

    So... tell me again why they didn't protect the rights of Native Americans to bear arms, so that they could protect themselves against the invading Europeans?

    The answer, in a word? 'Amen', brother.

    Hamish

    p.s. Can anyone in America explain to me in what way people are more empowered against a government gone wrong (such as one which violates the constitution) by the right to bear arms?
    (disclaimer: This is a genuine question, I don't want this to turn into an you-vs-us discussion. I think that my own country's record with respect to human rights and arms is appalling).
  • "I couldn't agree more."
  • Blame, blame, blame. "Who do we point the finger at?" "Who do we sue?"

    That seems to be the main thing on the minds of managerial types when faced with the choice of competing technologies, one of which is free and reliable yet "cursed" with not having a legal "blame me" label attached to it.

    Well, we need to make them see that that is a mentality for the incompetent, that blame is a concept for those that have no other weapon at their disposal. But if they choose wisely then they *do* give themselves additional weapons, and powerful ones, namely the ability of technically competent people to fix things that are open, to modify them to suit the requirement instead of relying on external parties.

    So, I reject the premise on which this thread is based, i.e. that party A or B is responsible for the end result. We each make our own nests, and if we choose our building materials unwisely and then seek to blame others, that just shows the height of our incompetence. If you're technically clued up but your advice is ignored, well, that's their loss. Go where your skills are valued, and leave them to their problem and to their focus on whom to blame for their own lack of skill.
  • One example of this that comes to mind is the Napster [napster.com] software that was recently featured on Slashdot. For those who missed the article or are unaware of what it does, it basically establishes a "dedicated MP3 network" of users who connect into central servers and share their MP3 collections with all other users who are connected. If you want to find (for example) Billy Idol's "Rebel Yell", you enter the song and artist name into a search dialog, and it presents you with a list of dozens of users who are "sharing" that file, along with their connection speed and ping times. Then you pick who you want to download it from, initiate the transfer, and that's it.
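
    Mechanically, the design described above is just a central index plus direct peer-to-peer transfer. A toy Python sketch (none of these names or structures reflect Napster's actual protocol):

        # Central index: who is sharing which song, at what connection speed.
        index = {}  # title -> list of (user, speed)

        def share(user, speed, titles):
            for t in titles:
                index.setdefault(t.lower(), []).append((user, speed))

        def search(title):
            return index.get(title.lower(), [])

        share("alice", "T1", ["Rebel Yell"])
        share("bob", "56k", ["Rebel Yell"])

        # The client picks a peer from the results and downloads directly
        # from that user; the index server never touches the file itself.
        print(search("Rebel Yell"))

    That last point is why Napster can argue it copies nothing itself: the server holds only metadata about who shares what.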

    Naturally, the Napster web site and welcome message prominently display warnings about copyright law and piracy, and they strongly stress that the software is only to be used to trade non-copyrighted MP3 music. But nobody is actually naive enough to believe that that's what people use the software for. The Napster developers know damn well that people are going to use the software to trade copyrighted music with each other. You only have to log in once to see that this is the case; a search for any song that is or was once even remotely popular will invariably produce many matches.

    A couple of months ago or so, I bought a Creative CD-RW drive, and on the box it states that the included software will allow you to "share the latest independent MP3 music files with your friends." Here, again, Creative isn't fooling anybody. They know that there's huge demand for the ability to download music from the Internet and then burn it to CD where it can be played on a Discman, in a car, in the home stereo, or anywhere else. And guess what? Most of that music can't exactly be called "independent."

    So are the Napster folks a party to piracy? What about Creative Labs? I think the answer to these questions is "yes"; it's kind of hard to argue that MP3 location software and CD burners have not contributed to copyright violations related to digital music. But (at least in my mind) there is a difference between being a party to piracy and being a sponsor of it. By placing obligatory warning messages on their products ("Thou shalt not pirate") and by essentially saying "Hey look, we're not responsible for what people do with this", the Napster and Creative Labs folks may have absolved themselves of legal responsibility for what their users do (or have they? IANAL.)

    In the end, I think it's clear that the user is responsible. There are certainly legitimate uses for an MP3 distribution network; it's a great way for garage bands across the world to get quick and cheap recognition for their work. And of course there are tons of legitimate uses for a CD burner! Since the technology is so neat, and since the providers of these technologies have no way to prevent people from misusing them, I don't see how we can place the responsibility on them.

    Anyway, it isn't my intent to either condone or condemn music piracy. It is my intent, however, to illustrate an example of the sort of "moral dilemma" that some software and hardware makers face, instead of falling back on some lame gun analogy. :-)
  • However, because there are no legitimate uses for such a search engine, or at least the legitimate uses are by and large unexercised (yes, I am aware that there are non-copyrighted MP3s out there), the search engine remains culpable in copyright infringement.

    I hate it when people say "there are no legitimate uses" and then say, "well, except for this, but we're not going to consider that one." If there is a legitimate use for something, then you can't say there aren't any legitimate uses for it. If legal MP3s are rare compared to illegal ones, then an MP3 search engine sounds like a rather good tool for finding them. Prosecute the people who host the illegal MP3s, and even the people who download them if you like, but there's got to be a line somewhere. The search engines don't make illegal MP3s available; they just happen to make them easier to find, while at the same time making legal ones easier to find.

    The benefits to a network administrator of having such a close look at a user's system are almost immeasurable; however, in stealth mode the user has no knowledge he or she may be watched, which opens the door for crackers into an organization's system.

    Which may be exactly what the company or organization wants. From past discussions here, it seems that most people believe that the company you work for has a right to know what is being done with the company's machines. A stealthy tool such as BO may fit the bill for such a company. Since monitoring your employees is not illegal, I think it should be considered a legitimate tool.

  • The NRA says, "Guns don't kill people, people kill people."
    It is interesting to note that people without guns kill a lot fewer people.

    Think of cracking DVD encryption. People can pirate DVDs by directly bit copying from one disc to another. However, it's made a lot easier by cracking the encryption.

    If you make a program to check security (SATAN, nmap), there are those who will misuse it.

    The point I'm trying to make is that technology is neutral. If you use it for good, it's good. If you use it for evil, it's bad. Technology that makes some bad things easier should have a counter-technology or deployment to offset the effects (criminals get guns, so officers get guns, etc.).
    ---
  • But still, a gun is manufactured with a single purpose and that is to hurt or kill. If one makes a tool with a specific purpose there's no way he/she can say: "Hey, it's not my fault."

    For one, not all guns are manufactured to hurt or kill, although all are capable of it; just as not all knives are manufactured to hurt or kill, although all are capable of it. There are quite a number of guns that are designed strictly for target shooting, and which are in general never used for anything else (Olympic match rifles, for example).

    You also assume that hurting and/or killing is always wrong. In cases of self-defense, most people would disagree. At any rate, not all guns that are designed and marketed to kill are designed and marketed to kill humans; many are designed for hunting. The majority of us, at least here in the US, would argue that hunting is an acceptable and legitimate reason for killing animals. Whether you agree or not, both self-defense and hunting are certainly legal purposes for using a gun in many parts of the world.

    Same thing with Software. One can't make a password crack program and then claim that he/she didn't think anybody would use it to crack passwords and do nasty things with it.

    Similarly, password-cracking programs can be and are used for legitimate purposes by sysadmins who are trying to protect their systems by weeding out the bad passwords chosen by users.
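
    A sketch of that legitimate use in Python, with a deliberately toy unsalted hash scheme (real systems use salted crypt-style hashes, and such audits should only be run by the administrator responsible for the system):

        # Dictionary audit: flag accounts whose stored password hash matches
        # the hash of a common word, so the admin can force a password change.
        import hashlib

        def toy_hash(word):
            return hashlib.sha256(word.encode()).hexdigest()

        stored = {"alice": toy_hash("password")}  # stand-in for a password file
        wordlist = ["123456", "password", "letmein"]

        for user, stored_hash in stored.items():
            for word in wordlist:
                if toy_hash(word) == stored_hash:
                    print(user, "has a guessable password:", word)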

    It is ridiculous to try to push responsibility off onto manufacturers when products are misused. If a product is reasonably safe when used as intended, the manufacturer shouldn't be held liable. In the case of products like guns and knives, if they are safe for the operator when used properly, then I'd say the manufacturer is blameless. If a gunmaker built guns that blew up and injured the user, or a knife maker built knives that broke and injured the user, then obviously that would be different, and a reasonable person would be expected to hold the manufacturer liable.

    Likewise, if a piece of software with a legitimate use were to unexpectedly cause problems or security holes, then the manufacturer might be liable for those problems.

  • I'm not quite sure of the exact law, but I believe it states that as long as there is a single, reasonable, legal use for a product, the manufacturer is not liable for its use. This is why lockpicks are legal: locksmiths use them, even if more people use them for theft.

    The "reasonable" above gives leeway to the courts. Back Orifice (for example) is questionable as to its intent for legal use, despite the manufacturer's claim, knowing the nature of the cDc.

    Another example, handguns, was brought up. Handguns certainly have a legitimate use: they work well for personal defence. The more law-abiding citizens who own (and responsibly care for) handguns, the more reluctant criminals will be to assault people or break into homes; overall, it decreases the utility of crime if the criminal has a greater chance of being shot. And let's not forget that they could be used to revolt if the government became tyrannical (albeit with limited effectiveness); that is the entire point of the Second Amendment.

    Of course, if a gun manufacturer continually sells more guns to dealers in a city than permits are issued for, it again becomes a question of whether the gun maker can reasonably believe that their guns are being used for legitimate purposes.



  • What reasons the DeCSS authors could use to defend themselves under this chapter is not too important: I am not a lawyer, and I have no idea about the meaning of all the exceptions in that paragraph.

    But my main point, that (contrary to what the original post said) it is illegal to create a program that circumvents a copy-protection scheme regardless of whether you use it for illegal purposes or not, seems correct.


    -
    We cannot reason ourselves out of our basic irrationality. All we can do is learn the art of being irrational in a reasonable way.
  • First of all, one should understand that the legality of this depends on the country or state; if you are addressing legal issues in your work, anything you say depends on the jurisdiction. As an example, I will take a recent Finnish government bill (which is a good example, because it is quite unambiguous on this point). Soon it will be illegal to, "with an intent to harm information processing or the functionality of an information processing or telecommunications system, produce, offer or distribute a computer program or series of commands which is designed to endanger [such systems] or to damage data in [such systems], or offer or distribute instructions on how to implement such a program." The potential punishment will be fines or up to two years in jail. For those who speak Finnish, the draft is available here [edita.fi].
  • I wouldn't necessarily argue "legitimate use" without further facts. If the person also had a gun, or the threat thereof, yes; if not, no. Police have to abide by "reasonable force". So too should homeowners.

    Yes, someone breaking into your home is a disgusting affront. But if they are armed with (for example) only a knife, it is equally revolting for you to send them to the morgue.

  • Not so. You are missing an important point: the law does not go into effect until two years after it is enacted. That's next year. Any circumvention you do this year by legal reverse engineering is A-OK. Next year you would have to go through the exception process described in the law I quoted.

    They are grandfathered right now. Period.
