Security Your Rights Online

U.S. Computer Security Advisor Encourages Hackers 275

DarklordSatin writes: "According to this Associated Press article, which I was pointed to by the nice guys over at Ars Technica, Richard Clarke, Dubya's Computer Security Advisor, wants to encourage hackers to find security holes in software. Although he feels that the system only works when the hackers show 'good faith' and disclose the holes to the company before the public, he wants to start offering hackers more legal protection, and that is a very good step in the right direction." As the folks at Ars point out, though, "Naturally, Mr. Clark was using the original, more generalized definition of 'hacker', but I guess saying 'Bush Adviser Encourages Discovery of Software Bugs' just didn't have enough zing."
  • by MxTxL ( 307166 ) on Thursday August 01, 2002 @11:52AM (#3992581)
    If something like this made it anywhere near being a policy decision, when the popular press got ahold of it, it would not last very long. Joe Sixpack doesn't know much about computers, but he knows the word 'hacker' and he knows that it's mapped to the word 'bad'. So when anyone suggests letting (hackers=>bad people) near our critical computers (which all computers are...) then Joe goes on the warpath and gets it struck down.
  • Friend or Foe (Score:2, Insightful)

    by errittus ( 13200 ) on Thursday August 01, 2002 @11:52AM (#3992584) Homepage
    After going after these people for exploiting bugs in software for the wrong reasons, maybe this will lead to some gainful employment for a few ladies/fellows.
  • by Winterblink ( 575267 ) on Thursday August 01, 2002 @11:53AM (#3992597) Homepage
    At least if you post it to the public you're assured that the company's not just going to push the reported exploit under the rug and ignore it, or "quietly" patch it in a later version to bypass the bad press.

    Being publicly accountable makes a company more diligent with security and bug testing. The only downside to public announcements is that every hacker out there now knows about it. The upside to THAT is that the company now has a hell of a lot of incentive to patch the hole in a prompt manner. Just my 2c!

  • Re:break programs? (Score:2, Insightful)

    by SpatchMonkey ( 300000 ) on Thursday August 01, 2002 @11:56AM (#3992626) Journal
    Because the script kiddies they refer to don't have the skills to fix anything. Like the vandals who daub graffiti on historic monuments, these people know nothing of constructing something useful or interesting or artistic. They just want to destroy.
  • Re:Friend or Foe (Score:3, Insightful)

    by Jucius Maximus ( 229128 ) on Thursday August 01, 2002 @11:56AM (#3992629) Journal
    "Clarke said the hackers should be responsible about reporting the programming mistakes. A hacker should contact the software maker first, he said, then go to the government if the software maker doesn't respond soon."

    I suggest that a US citizen send them a fax or two about Hewlett Packard [slashdot.org].

  • by MarvinMouse ( 323641 ) on Thursday August 01, 2002 @11:58AM (#3992638) Homepage Journal
    I think what he meant was people who try to break their own systems to find bugs in them, not the people who mindlessly hack into other people's web pages and deface them because they have nothing better to do.

    He means responsible hackers who just find the problems and notify the company, not people who hack into banks or your computer.

    It is perfectly legal for someone to try to defeat their own home security system, while it is not legal for them to break someone else's (unless requested).

    Not a very slippery slope at all if you look closer. All he wants is for people who discover or uncover problems on their own little systems or in their own labs to be allowed to tell the companies, or even just to let these people find the problems on their own. As well, he wants to legislate it a bit more, so while they can notify the companies, they won't be able to release the exact details of how to break in to the public.

    It's just like if I discovered that the security system on my car was easily breakable: I could tell the company and let my friends know there is a problem, but I couldn't publish a detailed paper explaining how to unlock the doors with a screwdriver and some patience.
  • judgemental (Score:3, Insightful)

    by skydude_20 ( 307538 ) on Thursday August 01, 2002 @12:01PM (#3992662) Journal
    system only works when the hackers show 'good faith'

    Who gets to decide whether what a hacker did was in 'good faith'? These proposed laws, mixed with the DMCA, will drive the credibility of the system even lower than where it currently stands...

  • by Anonymous Coward on Thursday August 01, 2002 @12:06PM (#3992690)
    I heard him on the radio this morning.

    He encouraged hackers who are also "professionals" to look for bugs like this, and then report the bugs to the government and the software maker. There was no policy about what happens when both moribund entities laugh and sit on it.

    Nor did he want the hoi polloi hackers out there looking for software bugs. He was explicit about this: Only Security Professionals Need Apply.

    Allow me to take this moment to reassure you that he is as disconnected from things as you could ever imagine. This is just the same crud in a new can. He will happily prosecute you if you do something to make the world better but don't wear a suit, or if this is not your "job" by his lights.

    So don't take it too much to heart... he really didn't mean you regular people, folks.
  • by ayden ( 126539 ) on Thursday August 01, 2002 @12:15PM (#3992761) Homepage Journal
    I heard the NPR Morning Edition interview [npr.org] with Richard Clarke this morning. Yes, Clarke encourages "hackers" to find security holes, but be responsible: after discovering the security hole, notify the government and the manufacturer, but DO NOT tell the world. Clarke argues that he wants the software manufacturer to have time to develop a patch before the vulnerability is announced.

    Clarke also said he wants "Computer Security Specialists" to do the hacking, not people doing it for fun. This ambiguity is the problem: how do you define "Computer Security Specialist"? Most of what I learned about IT came from hacking for fun. Now I'm employed as a "Computer Security Specialist."
  • Re:Ethics (Score:3, Insightful)

    by Restil ( 31903 ) on Thursday August 01, 2002 @12:19PM (#3992798) Homepage
    Good point. I can't really see what the Government can do anyway. There's no law that says you have to write secure software. There are just laws regarding disclosure of bugs/holes. Some software companies will be genuinely concerned about the security of their software and will respond promptly. Others weigh it against other bottom-line concerns and will wait until a convenient time to address the problem (the next major software release).

    The government can't do much more than tell the company what they already know. I suppose the government could stop using such software, but beyond the operating systems and generic office applications, I doubt the government makes widespread use of any other commodity software packages. All the government can really do is make an announcement about the problem, and the "hacker" can probably do the same thing, more efficiently, and in a way that more effectively reaches those who need to know about it.

    -Restil
  • by Blkdeath ( 530393 ) on Thursday August 01, 2002 @12:39PM (#3992932) Homepage
    Ten days seems a little steep. The thing that rolls around in my mind, over and over again, is this:

    If a noble, virtuous person can find the vulnerability and report it responsibly, who's to say that an evil, amoral person can't do the same? Isn't this blatantly obvious? Shouldn't this be pointed out to companies, the government, and the media? What are the chances that a "Black Hat" is going to be responsible about disclosing the latest vulnerability, especially if it will allow him to gain access to, or wreak havoc on, thousands of systems?

    I think something to the tune of 30 days for a company to get something rolling before it's released is perfectly reasonable. After all, the hard work of finding the source of the problem is already done for them. If they're not actively trying to find and solve the problem, IMHO they're acting in bad faith and are putting all of their customers in jeopardy. After 30 days, tell the public that a vulnerability exists. Put the heat on them. If they don't budge (or worse, threaten to sue), let a few other white hats know about it so that they too may "discover" the vulnerability.

    Now we have proofs-of-concept coming from half a dozen different sources. This should be more than ample proof (to the company, the media, and a court of law) that the probability of somebody exploiting it for their own ends is very real, and more dangerous than somebody who remains quiet (and cooperative) for over a month.

    If they haven't acted within a week of the 30-day mark, publish the code to every online forum you can find: Slashdot, ZDNet, Ars Technica, Kuro5hin (sp?), your own website, etc.

    As to 'full disclosure' - that, IMHO, is not a proper solution. If somebody releases a fully disclosed [BIND/SendMail/Apache/ProFTPD/IMAPD/etc.] vulnerability while I'm asleep, I could be '0wn3d' before I even wake up to read my e-mail.

    Most importantly, the public has to be made aware of the fact that software companies do, in fact, stonewall whitehat hackers. This leads to the customers' systems being in danger from blackhats, which is a Very Bad Thing. I thought this would be obvious, but then I've always been of the radical mindset of personal responsibility.

    Is that even politically correct?!?

  • by evenprime ( 324363 ) on Thursday August 01, 2002 @12:41PM (#3992945) Homepage Journal
    Be careful when you say that Clarke "encourages discovery of software bugs". On NPR this morning they brought up Ed Felton [eff.org] and Dmitri [eff.org] (though not by name) and asked Clarke whether his statements at Black Hat were consistent with the government's prosecution of people who find holes in software. Clarke responded that US law prohibits people who are not "security professionals" from intentionally looking for security holes in software, and that reverse engineering software to find holes in it is prohibited.
  • What is mine? (Score:4, Insightful)

    by gmhowell ( 26755 ) <gmhowell@gmail.com> on Thursday August 01, 2002 @12:46PM (#3992972) Homepage Journal
    What is 'my system'? I am responsible for the whole shebang: NT servers, 2k terminal servers, Linux firewalls and web servers, NT desktops, wireless access points.

    How can I attack my own systems without attacking someone else's 'intellectual property' or some such BS? I can't. But by the terms of the licenses (even the GPL and BSD, I believe) I can't blame the people I got the software from.

    Anyone living in the US, connecting to the US, or who has even heard of the US should not be doing computer security. Anyone who is doing even a reasonable job of it is checking into and poking into the products supplied by vendors. But this is illegal. The vendors can't be blamed. Only you. You can be blamed, but you don't legally have the right to do the thing/s that will make your work effective.

    Run. Run and hide.

    I said it in a response to a journal on this story (posted yesterday, BTW) but I'll say it again: in a fight between this guy and Ashcroft (which is what this essentially is), Ashcroft will win every time. The only way to get around the problem is to invalidate the disclaimer of the warranty of merchantability of a product. If nothing else, computer software must be fit for a specific purpose. At that point, GM and Walmart become aligned with anti-DMCA forces. Then Microsoft and the Senator from Disney get to see REAL political power.

  • by Mr.Sharpy ( 472377 ) on Thursday August 01, 2002 @01:15PM (#3993250)
    This guy was on NPR this morning. When asked about his remarks in context of the laws against such hacking he specifically said that he was talking about hacking by "security professionals" only and then only for the purpose of quietly notifying the software maker. In fact, he explicitly said it should remain illegal for any regular joe to hack or reverse engineer software looking for exploits just for the fun of it.

    This guy is not your friend. He, like the rest of the administration, is solely concerned with corporate interests. What he has in mind here is definitely not exposing exploits and causing bad corporate PR. It is the quiet uncovering of holes and the quiet informing of the software makers so they can issue mystery patches.

    The reasoning behind that, I suppose, is to keep malicious hackers from using public exploits. But in reality, by the time the so-called "security experts" stumble on the holes, the real evil hackers have already known about them for a long time. This is just more of the "keep the problem secret and it will go away" policy that has gotten us into trouble.
  • by zenray ( 9262 ) on Thursday August 01, 2002 @01:22PM (#3993303) Journal
    The way I see this issue is that I have an ethical responsibility to other users of a product to inform them of any security flaws I find. The EULAs of most proprietary software contain disclaimers as to fitness of use, and the end users have no legal recourse for any damages incurred. In other words, they put out crappy, bug-ridden, security-flawed software and they expect us to shut up and just use it. To not publish any security problem is to leave every user unaware of the problem and therefore open to potential damage. I say full public disclosure up front of all bugs and security problems, with just enough technical detail to verify the problem. No need to provide the script kiddies with automatic tools that they can use. Perhaps the proprietary software companies will start to put out a better quality product if they know that any security problem or bug will be quickly published. The end users' decision might be to start using some open source software that can be fixed a lot quicker than the insecure proprietary software.
  • by sayno2ms ( 559159 ) on Thursday August 01, 2002 @02:43PM (#3993927) Homepage
    I also heard Mr. Clarke on NPR this morning and liked most of what I heard, until he said that only Security Professionals should try to find bugs and that anyone else who does is assumed to be doing it with criminal motive. I'm sorry, I thought that in our country guilt was proven, not assumed.

    "If there are legal protections they don't have that they need, we need to look at that,"

    No hurry!!
  • Re:Ethics (Score:5, Insightful)

    by jafac ( 1449 ) on Thursday August 01, 2002 @02:53PM (#3993997) Homepage
    That's bullshit.

    If some shadetree mechanic is working on his buddy's Camaro and finds a manufacturing defect that ought to prompt a safety recall, he goes to the manufacturer and most likely gets promptly ignored (for the sake of argument here). He can then go to something like Consumer Affairs, but he sure as hell doesn't have to. He can go straight to the press to warn people that their Camaros (or whatever) are going to fall apart at 88 miles per hour.

    That is PRECISELY what the hackers are doing - they're going to the press.
    Respected, established journalistic entities specializing in the field of computer security: 2600 magazine, BugTraq, etc.

    Not publishing a security hole, not being able to report something to the press, THAT is an abridgement of free speech. It's BULLSHIT that someone needs to be an "employed security professional" to have the right to work on computers and find bugs.

  • Two things bother me about this statement:

    1) Define "Security Professional". How do I get to be one? Do I have to hire a lawyer? Am I an SP if I config user passwords? Write code? Use keys in doors? Write papers on PKI systems and techniques? Hack my DreamCast to play Simon? Can I get an SP license from somewhere (other than the BAR Assoc. ;) that excuses me from the DMCA? How much is it?

    2) How do you get SPs if "ordinary Joes" are forbidden from studying and learning the techniques required to do the hacking?

  • by stratjakt ( 596332 ) on Thursday August 01, 2002 @05:14PM (#3994971) Journal
    Ok, so the security professional finds a big flaming hole, yet can't come up with the code to prove his hypothesis. He calls up software company A, and if he's lucky he manages to wade through the phone system and find a human: "blah blah, thank you for your interest in our products, we here at Co. A take our customers' satisfaction very seriously, we'll take that issue under advisement..."

    So he calls up some magical government agency (department of computer experts?). Hell, he calls the FDA, for all the good it's going to do. "Thank you for calling the FDA, we care deeply about your concerns, blah blah, don't smoke, winners don't use drugs."

    So he's fed up and wants the problem fixed; perhaps NEEDS the problem fixed, because he's got script kiddies driving herds of elephants through that hole in his system. So he goes public, without writing an exploit, and posts "Software Co. A is knowingly selling insecure software" on the web somewhere or in some industry mag. Now, without proof to back up his claims, he's on the receiving end of a libel lawsuit. After all, a security expert talking down Software Co. A costs them a gazillion dollars a word in a lawyer's eyes.

    So he proves it with an exploit, or even worse a workaround/patch of his own, violating the DMCA, and spends the next 5 years doing all his port-sniffing in a prison shower.

    His response to the Felton case is that a Uni. comp sci professor isn't an 'expert'? A cryptographer like Dmitri isn't either? Is he? Because if he isn't, how dare he suggest any software has bugs in the first place. Where do I go to enroll in Security Expert school? Sounds even better than Bovine University.
