
U.S. Computer Security Advisor Encourages Hackers 275

DarklordSatin writes: "According to this Associated Press article, which I was pointed to by the nice guys over at Ars Technica, Richard Clarke, Dubya's Computer Security Advisor, wants to encourage hackers to find security holes in software. Although he feels that the system only works when the hackers show 'good faith' and disclose the holes to the company before the public, he wants to start offering more legal protection to hackers, which is a step in the right direction." As the folks at Ars point out, though, "Naturally, Mr. Clark was using the original, more generalized, definition of "hacker", but I guess saying 'Bush Adviser Encourages Discovery of Software Bugs' just didn't have enough zing."
  • by FortKnox ( 169099 ) on Thursday August 01, 2002 @10:49AM (#3992565) Homepage Journal
    If only the left hand knew what the right hand was doing...
    • You don't want to know what the right hand is doing... ;-)
    • ...where the RIAA is legally allowed to break into your computer and DDoS you, and you are legally allowed to use any hacking trick necessary to plug the software's "security holes," bugs, flaws and other "undocumented features" (to stop them), and so on. Boy, it could be fun for just...minutes!

      Ok, I'm removing my tongue from my cheek now!
    • ...and he is obviously ignoring the recent flap over the HP-Tru64 su vulnerability.

      He said that he encourages those in the computer security field (but not anybody else) to run and attempt to crack industry software on their own computers (but not anybody else's) - ignoring the fact that this violates the DMCA - and then report any vulnerabilities to the government (as well as the manufacturer).

      This seems like a thinly-veiled attempt to give the NSA a few more backdoors to me.

      I vote for a 1-week courtesy notification period before a full, public disclosure - no matter who you are, or how much money you have.

      • ...and he is obviously ignoring the recent flap over the HP-Tru64 su vulnerability.

        There's a good chance he has never heard of the HP-Tru64 su vulnerability.

        He said that he encourages those in the computer security field (but not anybody else) to run and attempt to crack industry software on their own computers (but not anybody else's) - ignoring the fact that this violates the DMCA

        Who says this violates the DMCA? The DMCA only applies to hacking systems which protect copyright. HP has threatened to sue over the HP-Tru64 su vulnerability. It's not illegal until they actually follow through with the lawsuit and win.

        -a
      • " and then report any vulnerabilities to the government (as well as the manufacturer)."

        If this message from Snosoft is any indication, I wouldn't have much confidence in reporting to the government either.

        From: KF
        To: full-disclosure@lists.netsys.com ; bugtraq@securityfocus.com ; recon@snosoft.com
        Sent: Wednesday, July 31, 2002 7:42 PM
        Subject: [Full-Disclosure] for the record... (Tru64 / Compaq)

        http://www.msnbc.com/news/788216.asp?0dm=T14JT

        Clarke cautioned that hackers should be responsible in reporting programming mistakes. A hacker should contact the software maker first, he said, then go to the government if the software maker does not respond soon.

        --

        For the record... we contacted HP (at the time Compaq), and CERT several times. I attached the original version of our su exploit (not the one that phased leaked) to NIPC and to CERT BOTH. We received an extremely long delay at CERT before they even responded. At that point I called CERT 2 times to see what the heck was going on and eventually I established contact (Ian Finley). I also mailed nipc.watch@nipc.gov or whatever the email address on their page was. They didn't mail back... no auto responder or nothing. (I mailed them back weeks later and said I was shocked that I got no response and still got nothing back). I then called the NIPC hotline 3 times. The first 2 times I called I spoke to someone that should have been flopping whoppers: "uhhhh a non-executable computer security what... let me send you to so and so's voicemail". Then I called back a week later and gave them the CERT vu numbers (after CERT finally responded). I left my cell phone number on someone's voicemail again at NIPC... no one called me back.

        I deeply regret the fact that one of my team members plagiarized another and leaked some code, but my god people, WE TRIED to give SEVERAL people a heads up!

        -KF

  • The government encourages People to go to work.

  • by MxTxL ( 307166 ) on Thursday August 01, 2002 @10:52AM (#3992581)
    If something like this made it anywhere near being a policy decision, when the popular press got ahold of it, it would not last very long. Joe Sixpack doesn't know much about computers, but he knows the word 'hacker' and he knows that it's mapped to the word 'bad'. So when anyone suggests letting (hackers=>bad people) near our critical computers (which all computers are...) then Joe goes on the warpath and gets it struck down.
    • ..Actually I heard him interviewed on NPR this morning. His whole story was that ONLY computer security professionals should engage in this type of "hacking". For everyone else no attempt should be made to reverse engineer or post exploits to the world. He did stop short of adding the popular closing "or the terrorists win" but really he was pushing M$'s security-through-obscurity line over and over again.

      To his credit though, he did explain the difference between the current perception of hackers as being evil lawbreakers and the original definition of the old MIT hackers. He did broaden it just a bit by saying that old hackers were anyone who was into computers...whatever that means.

  • by stray ( 73778 ) on Thursday August 01, 2002 @10:52AM (#3992582) Homepage
    From the article: A presidential advisor encouraged the nation's top computer security professionals and hackers Wednesday to try to break computer programs, but said they might need protection from the legal wrath of software makers.

    ... and there I was, thinking that most computer programs were broken to begin with. How about encouraging computer professionals to *fix* programs?

  • Friend or Foe (Score:2, Insightful)

    by errittus ( 13200 )
    After going after these people for exploiting bugs in software for the wrong reasons, maybe this will lead to some gainful employment for a few ladies/fellows.
  • Too Late (Score:2, Interesting)

    by ShishCoBob ( 516335 )
    It's a little too late for this. We already have a number of people in jail for finding software bugs and releasing the details without doing any damage... And isn't there already a law against this exact thing here?
  • by Maran ( 151221 ) on Thursday August 01, 2002 @10:53AM (#3992596)
    Which is more surprising: a government representative supports hackers, or a government representative uses the correct meaning of "hacker"?

    Maran
  • by Winterblink ( 575267 ) on Thursday August 01, 2002 @10:53AM (#3992597) Homepage
    At least if you post it to the public you're assured that the company's not just going to push the reported exploit under the rug and ignore it, or "quietly" patch it in a later version to bypass the bad press.

    Being publicly accountable makes a company more diligent with security and bug testing. The only downside to public announcements is that every hacker out there now knows about it. The upside to THAT is that the company now has a hell of a lot of incentive to patch the hole in a prompt manner. Just my 2c!

    • Disclose it to the company, then if they ignore that, post it publicly. That way the good companies get to fix things quick and be applauded, whilst the baddies get embarrassed even more by "we sent this to them a month ago but they blew us off" comments.
  • by Jucius Maximus ( 229128 ) on Thursday August 01, 2002 @10:53AM (#3992598) Journal
    They recognise that 'hacking' is a good way of helping to secure systems, which is good.

    Now I hope that a USA Citizen tells them that they are encouraging something that is outlawed by the DMCA.

  • by Anonymous Coward
    No wonder a Trojaned version of OpenSSH was put on OpenBSD's FTP server. They were acting on Presidential recommendation!
  • CNN story:
    Linky Linky [cnn.com]

    it said WE have to be the world's debuggers
  • by Rude Turnip ( 49495 ) <valuation@NOsPam.gmail.com> on Thursday August 01, 2002 @10:53AM (#3992605)
    There's a pretty good chance you'll get sued/fined/imprisoned due to the DMCA. Of course, the advisor did say that some legal protection for hackers should be in place to prevent such a mess.

    These days, with "corporate fraud" being the buzzword du jour, one could make a very strong argument that the DMCA encourages corporate fraud because it allows companies to sweep their product defects under the carpet.
    • pretty good chance you'll get sued/fined/imprisoned due to the DMCA.

      Sued/fined? I have a hard enough time convincing the people that I work with that there is a difference between PHP and HTML. And they are reasonably intelligent people. Try convincing G.W. that there is a difference between "hackers" and "malicious hackers". Problem solved, label them all as terrorists and throw them in jail forever. The DMCA is the last thing I'd be worried about.
    • Of course. Plea to all the competent computer folks to get themselves locked up so the gov't can look like it knows what it's doing in the eyes of all the non-criminals.
  • by iritant ( 156271 ) <lear&ofcourseimright,com> on Thursday August 01, 2002 @10:55AM (#3992616) Homepage
    There was the incident of the fellow who discovered that the New York Times was left wide open by FrontPage. So he called to tell them, and was promptly arrested. I wonder if Mr. Clarke thinks that's fair.
  • then put you in jail for DMCA violations.
  • A top Bush-administration official, in a tie in with Richard Clarke's press release on hackers today gave his support to the Cult of the Dead Cow, a hacker group responsible for creating the juvenile-hacking utility known as "Back Orifice" or simply B.O. Whether this official's support is a tie in with the Bush administration's fundamentalist leanings is unknown. CotDC representatives were quoted as saying, "5w33t! 7h1s r0x0rs! w3 w1ll 0wnz j00 4ll n0w! ph34r u5!" President Bush was unavailable for comment.
  • Ethics (Score:4, Interesting)

    by YanceyAI ( 192279 ) <IAMYANCEY@yahoo.com> on Thursday August 01, 2002 @11:00AM (#3992653)
    This is an interesting ethical question. Clarke said the hackers should be responsible about reporting the programming mistakes. A hacker should contact the software maker first, he said, then go to the government if the software maker doesn't respond soon. The philosophy is good in theory, but often large companies ignore problems to avoid the press and/or expense of fixing the security hole.

    I wonder how long the "hacker" should give the company. And is the government really the next best step? I work for the government and I seriously doubt that will get the ball rolling.

    The obvious problem with full disclosure, of course, is making malicious hackers and even terrorists aware of the problem. Solutions anyone?

    • I wonder how long the "hacker" should give the company. And is the government really the next best step? I work for the government and I seriously doubt that will get the ball rolling.

      Well, Microsoft and others have been pushing their "reasonable disclosure" requests, and those specify 30 days.

      Which I think is fair enough.

      Let them know. If they haven't released a fix within that time then fully disclose it.

      Yes, people will argue that as soon as it's found out, others may be using it so it would be better to know all the details immediately.

      However the likelihood of someone finding the problem and writing a worm or something that exploits it is substantially less when they don't have all the gory details laid out for them in a nice document. Which is the major downside to full disclosure.

    • Well, obviously, if the government cannot influence the company within a reasonable timeframe, then the vulnerability should be disclosed.

      If I wasn't living in the United States, perhaps I would try this tactic at least once to give the US government benefit of doubt. If they fail, then no need to try it again. If they actually force the company to make the patch, it may be a good thing.

      Shouldn't we report the su exploit in Tru64 to the US government now? Like "a company in your country is making unsafe software and refuses to fix it, please consider if you still want to buy their software for the government and the military".

    • Re:Ethics (Score:3, Insightful)

      by Restil ( 31903 )
      Good point. I can't really see what the Government can do anyway. There's no law that says you have to write secure software. There are just laws regarding disclosure of bugs/holes. Some software companies will be genuinely concerned about the security of their software and will respond promptly. Others weigh it against other bottom-line concerns and will wait until a convenient time to address the problem (the next major software release).

      The government can't do much more than tell the company what they already know. I suppose the government could stop using such software, but beyond the operating systems and generic office applications, I doubt the government makes widespread use of any other commodity software packages. All the government can really do is make an announcement about the problem, and the "hacker" can probably do the same thing, more efficiently, and in a way that more effectively reaches those that need to know about it.

      -Restil
    • I'm not sure that there is one. So long as a bug exists, malicious people can find it and exploit it. Keeping it a secret won't help, because anyone who really wants to cause damage is also going to go looking for holes. The only people who wait for a bug to be made public before exploiting it are the lazy kiddies, people who want to cause damage but don't have the inclination or the energy to find exploits themselves. While this latter group can cause damage, it is the former (the "real" crackers) that I am worried about.

      IMHO bugs should be made public, even in a private company's software. Because, it isn't really "theirs" and theirs alone. I depend upon the OpenSSH and SSL systems as well as my Windows box to keep my data secure. So do many many other people. If there is a hole then it puts my credit card numbers, medical info, me in danger. At that point it is a public problem, and a private company should not be permitted to "just hide it" any more than Ford/Firestone should be able to just hide the dangers of their products.

      IMHO you should give the developer "sufficient time" (depending upon the size of the bug, number of developers, etc.) to fix it. If they don't make any reasonable attempts then yeah, make it public. If there is a danger to the rest of us from the truly malicious then we ought to know about it. Yes it will unleash a torrent of script kiddies, but that's better than a sea of quiet thefts, or god knows what else, perpetrated by the truly determined and vicious.

      I'd be surprised if you couldn't argue this under existing whistleblower laws.
    • Suppose I find a vulnerability in some random company's web site. After telling them about it, whom else do I tell? The NIPC?

      And same for a widely used piece of software - after the software company, who in the government gets the report?
    • Re:Ethics (Score:5, Insightful)

      by jafac ( 1449 ) on Thursday August 01, 2002 @01:53PM (#3993997) Homepage
      That's bullshit.

      If some shadetree mechanic is working on his buddy's Camaro, and finds a manufacturing defect that ought to prompt a safety recall - he goes to the manufacturer and most likely gets promptly ignored (for the sake of argument here). He can then go to something like Consumer Affairs, but he sure as hell doesn't have to. He can go straight to the press to warn people that their Camaros (or whatever) are going to fall apart at 88 miles per hour.

      That is PRECISELY what the hackers are doing - they're going to the press.
      Respected, established, journalistic entities, specializing in the field of computer security. 2600 magazine, BugTraq, etc.

      Not publishing a security hole, not being able to report something to the press, THAT is an abridgement of free speech. It's BULLSHIT that someone needs to be an "employed security professional" to have the right to work on computers and find bugs.
  • judgemental (Score:3, Insightful)

    by skydude_20 ( 307538 ) on Thursday August 01, 2002 @11:01AM (#3992662) Journal
    system only works when the hackers show 'good faith'

    who gets to decide what a hacker did was in 'good faith'? These proposed laws, mixed with the DMCA, should drive the credibility of the system even lower than it currently stands...

  • Run to Uncle Sam? (Score:4, Interesting)

    by Rogerborg ( 306625 ) on Thursday August 01, 2002 @11:05AM (#3992686) Homepage

    A more interesting quote is in this CNN article. [cnn.com]

    • "A hacker should contact the software maker first, he said, then go to the government if the software maker does not respond soon."

    Umm, really? To whom in the government? The Department of Fixing Stuff? The FBI? The FTC? The DoJ? Gosh, that'll keep (e.g.) Microsoft on their toes. Bwahahahaha!

    Precedent would suggest that a more likely result will be the jailing of the hacker, and the awarding of a fat contract to the vendor.

    Thanks all the same, but this is just some guy in a suit. When it's written up in law by Congress, signed by G.W.Bush, and delivered to the Library of Congress by flying pig courier, I might change my mind.

    • Isn't there another member of Dubya's computer security staff that is a former Microsoft senior executive? One can imagine two scenarios:

      (1) that fellow is messing his pants and wants Mr. Clarke canned for pushing hacking, or

      (2) Mr. Clarke was put up to this as a way of getting free security labor for Microsoft while restricting press leaks about their software.

      And whom would you be required to contact in the case of Open Source?
        • And whom would you be required to contact in the case of Open Source?

        Perhaps we could resurrect the House Unamerican Activities Committee.

        "Are you, or have you ever been, a member of the mind control cult known as 'The Free Software Foundation'?"

  • by Anonymous Coward
    I heard him on the radio this morning.

    He encouraged hackers who are also "professionals" to look for bugs like this, and then report the bugs to the government and the software maker. There was no policy about what happens when both moribund entities laugh and sit on it.

    Nor did he want the hoi polloi hackers out there looking for software bugs. He was explicit about this: Only Security Professionals Need Apply.

    Allow me to take this moment to reassure that he is as disconnected from things as you could ever imagine. This is just the same crud in a new can. He will happily prosecute you if you do something to make the world better and don't wear a suit / this is not your "job" by his lights.

    So don't take it too much to heart... he really didn't mean you regular people, folks.
  • by ayden ( 126539 ) on Thursday August 01, 2002 @11:15AM (#3992761) Homepage Journal
    I heard the NPR Morning Edition interview [npr.org] with Richard Clarke this morning. Yes, Clarke encourages "hackers" to find security holes, but be responsible: after discovering the security hole, notify the government and the manufacturer, but DO NOT tell the world. Clarke argues that he wants the software manufacturer to have time to develop a patch before announcing the vulnerability.

    Clarke also said he wants "Computer Security Specialists" to hack and not the people doing it for fun. This ambiguity is the problem: how do you define "Computer Security Specialist"? Most of everything I learned about IT came through hacking for fun. Now I'm employed as a "Computer Security Specialist."
  • Mailing address (Score:2, Informative)

    by tww-china ( 171273 )
    Anyone have the mailing address of the President's Critical Infrastructure Protection Board (PCIPB)? Their home page is http://www.whitehouse.gov/pcipb/ but there's no address and the email address for feedback, feedback@who.eop.gov, doesn't work.
  • INTERVIEW THIS GUY (Score:5, Interesting)

    by geekoid ( 135745 ) <dadinportland@y[ ]o.com ['aho' in gap]> on Thursday August 01, 2002 @11:26AM (#3992854) Homepage Journal
    We need to get Richard Clarke to do a Slashdot interview. I think this would be an enormous opportunity for the Slashdot readers to find out what someone high up thinks about the DMCA and its effects on the community. It will also give Richard Clarke the opportunity to hear the concerns right from the community instead of from corp. reps.

    • by pmz ( 462998 ) on Thursday August 01, 2002 @12:54PM (#3993527) Homepage
      we need to get Richard Clarke to do a slashdot interview.

      This is a good idea. A natural extension to this would be to invite other goverment figures, such as Justice Department officials or members of Congress. People who have an interest in federal or international technology policies might appreciate the open, yet moderated, forum of Slashdot. This could be an example of the U.S. goverment at its best.

      This could be an easier way for people to "write their Congressmen", since there really is a lower courage threshold when posting to Slashdot (yes, writing Congressmen isn't trivial for many people, even though it should be).
      • This could be an easier way for people to "write their Congressmen"...

        Especially if they are more tolerant of typographical errors on such a forum: "goverment" --> "government"

        Also, posts to Slashdot are harder to ignore (no trashcans, only trashmouths)
    • I can't help but think that this must be some sort of mistake. Maybe this is some guy who found Richard Clarke's ID card, and is claiming to be Richard Clarke. Or maybe Richard Clarke had a brain hemorrhage last week, and doesn't know it. I mean, really. Don't you expect the BSA lobbyists to be on the phone to Dub telling him to rein in that "loose cannon"? (Just like the drug companies did for Clinton's Surgeon General when she had the gall to say that maybe we ought to look into legalizing marijuana for medical use).
  • by davebooth ( 101350 ) on Thursday August 01, 2002 @11:27AM (#3992860)

    Disclaimer: My personal side in the above-mentioned debate is already decided. I advocate responsible full disclosure. Tell the vendor first, but don't agree to any NDAs, and always make it clear to the vendor that after a reasonable delay you'll go public with everything you've got relating to the hole.

    Having proclaimed my bias, it was interesting to hear the guy's own words on NPR this morning. On the positive side, he correctly defined "hacker." On the negative side, he clearly preferred a more restrictive disclosure policy that could be summarized as "tell the vendor, then shut the hell up and go away." When gently pressed he was prepared to allow notification of a "responsible" coordinating agency, but he made very sure to never advocate anything so liberal as responsible full disclosure. I was busily making breakfast and coffee at the time, so I might have missed an implication or two, but these days the usual spin on "responsible," when linked to the word "agency," means either government-sanctioned-&-corporate-owned or government-operated. Some security hackers find this a potentially scary thought.

    Personally, I take responsibility for my own systems' security. Based on the information I have, I do my best to keep them buttoned down. Only in that way can I ethically place any blame on the persons that might try to crack them. (Of course I also know my limitations - if a true expert wants to smoke my systems, I know they're gone. I'll be satisfied with keeping the worms and kiddies out whilst trusting that there's nothing on my own boxes that a true expert wants badly enough to put in the effort.)

    From this standpoint, anything other than responsible full disclosure denies me knowledge I need in order to make an informed decision about the risks I'm assuming. Similarly to do anything less myself, should I discover a security hole, is failing in my obligations to my colleagues.

    To my mind he's advocating using the community as a source of free QA services whilst at the same time making sure that the vendors can get away with the old oxymoron of security through obscurity. Who'd bet against a government sponsored coordinating body being followed rapidly by laws prohibiting disclosure of holes other than through that body?

  • HP (Score:2, Interesting)

    by Osiris Ani ( 230116 )
    In the wake of the recent HP debacle [slashdot.org], I'd have to say that this is very interesting.

    Regardless of the fact that it wasn't actually SnoSoft that officially published the exploit, even if they had, Clarke is basically saying that they went about things in pretty much the most appropriate manner.

  • by JUSTONEMORELATTE ( 584508 ) on Thursday August 01, 2002 @11:33AM (#3992889) Homepage
    On the drive in, NPR had an interview with this guy (Yes, I listen to NPR in the car. Yes, I'm old.) and his remarks there made it clear that he thinks reverse-engineering software to find security holes should be criminal unless the person doing it is employed as a computer security professional.

    I'd rate him above-average on the clue-o-meter (certainly as federal gov't employees go!) but he's not a friend to the hackers by any stretch.
  • by evenprime ( 324363 ) on Thursday August 01, 2002 @11:41AM (#3992945) Homepage Journal
    Be careful when you say that Clarke "encourages discovery of software bugs". On NPR this morning they mentioned Ed Felton [eff.org] and Dmitri [eff.org] (though not by name) and asked Clarke whether his statements at Black Hat were consistent with the government's prosecution of people who find holes in software. Clarke responded that US law prohibits people who are not "security professionals" from intentionally looking for security holes in software, and that the reverse engineering of software to find holes in it is prohibited.
    • Just curious, does, for example, passing a
      CISCO security specialist certification [cisco.com]
      make you "security professional" ?
    • I forgot to mention that the NPR interview with Clarke can be found here:

      http://www.npr.org/me3.smil

      It is segment number 11

      He did say "...hackers can help" at his defcon talk, but is using a *VERY* limited definition of the H-word. During this morning's NPR interview Clarke said:

      "the law currently says that if you discover a flaw, that's ok, but if you intentionally go out and reverse engineer code, unless you are somebody who is in the computer security business, the law assumes that you've done it for uh, nefarious purposes. So the only people we are encouraging to go out and find these flaws are computer security professionals, not people who are doing it just for the fun of it"

  • What is mine? (Score:4, Insightful)

    by gmhowell ( 26755 ) <gmhowell@gmail.com> on Thursday August 01, 2002 @11:46AM (#3992972) Homepage Journal
    What is 'my system'? I am responsible for the whole shebang: NT servers, 2k terminal servers, Linux firewalls and web servers, NT desktops, wireless access points.

    How can I attack my own systems without attacking someone else's 'intellectual property' or some such BS? I can't. But by the terms of the licenses (even the GPL and BSD, I believe) I can't blame the people I got the software from.

    Anyone living in the US, connecting to the US, or who has even heard of the US should not be doing computer security. Anyone who is doing even a reasonable job of it is checking into and poking into the products supplied by vendors. But this is illegal. The vendors can't be blamed. Only you. You can be blamed, but you don't legally have the right to do the thing/s that will make your work effective.

    Run. Run and hide.

    I said it in a response to a journal on this story (posted yesterday, BTW) but I'll say it again: in a fight between this guy and Ashcroft (which is what this essentially is), Ashcroft will win every time. The only way to get around the problem is to invalidate the disclaimer of warranty of merchantability of a product. If nothing else, computer software must be fit for a specific purpose. At that point, GM and Walmart become aligned with anti-DMCA forces. Then Microsoft and the Senator from Disney get to see REAL political power.

    • If nothing else, computer software must be fit for a specific purpose.

      When this becomes commonplace (I hope that it does), software makers will be much more modest about their claims, and software will be developed only with the discipline it deserves.

      With genuine warranties on software, companies like Microsoft will have to make their marketing department shut up (finally!) and will have to start selling really good, tried and true, software. However, I believe this sort of software industry is quite a few years away.

      Right now, the software industry is going through some sort of adolescence, where righteous bastards like Senator Hollings (SSSCA), Microsoft (.NET, Palladium), and others are trying to write the law of the future in their own image, and the equally-unreasonable "free love" software culture is opposing them in the wrong way. I really hope reason wins in the end, and we fall back on traditional good ol' hard work and quality products. Some regulation may be appropriate, but it can't go as far as the DMCA already has.
  • So if a member of the executive branch of the government publicly encourages you to break a law (DMCA), and you're then arrested, it would be considered entrapment right?
  • Does anyone really trust these clowns?
    I mean, their past actions truly don't inspire a single grain of trust. Look at last week where the guy in Houston got busted by the court house for EXPOSING their wifi total lack of security (remember that they claimed he did $5000.00 in damage - no doubt that's exactly how much they paid for all the wifi stuff they had to shut down). Plus...just look at how easy they make it...try to do one good thing and some lawyer begins the mantra: DMCA..DMCA..DMCA.

    Nice words you speak guy, but what did Clara say in the Wendy's commercials: "Where's the beef?"

    Until I see the beef, I'm not trusting a single word you say....
  • by AB3A ( 192265 ) on Thursday August 01, 2002 @12:06PM (#3993167) Homepage Journal
    I heard this interview this morning. What he said was not encouraging. He wants "security professionals" to do the hacking --not programmers or kids down the street. He wants them to reveal the exploit without offering code demonstrating it, and he wants to keep it all secret. He made no mention of any time limits before one should give up and go public with this information.

    So let me see where this puts us. Phred Programmer discovers a buffer overflow that crashes IE. He tells his security professional about his discovery. Our "security professional" says "what's a buffer overflow?" and the whole thing falls on the floor.

    Wait, let's try this again. Phred Programmer discovers a buffer overflow problem that crashes IE. He puts on his "security professional" hat and calls Microsoft. Microsoft says "So what? It crashes. BFD. We'll fix it on the next major release."

    Phred Programmer waits until the next major release and the mess is still there. Remember, he's not supposed to write code to demonstrate this problem, or the potential harm, so Microsoft has no idea whether they've really fixed this problem.

    So Phred Programmer calls the feds. They respond with "Huh? What's the big deal?" "Well, you could exploit this and hack with full administrator privileges," says Phred Programmer. "Sounds far-fetched," say the feds. "But just in case you're right, I don't want you writing any code. Why don't you post your notions with Microsoft?" "But I already have, and they promised a fix by the next major release," complains Phred Programmer.

    "Hmm. We'll have to take it up with them."

    And so, another major release goes by and still nothing. Meanwhile, somebody else figures out the breached security and, because they don't live in the US, they post a script for the kiddies to use.

    Back to the present: Somebody explain to me why this scenario is not likely. Restricting this information to "security professionals" seems to me like an effort to sweep security problems under the rug.

    Richard Clarke's ideas suck, IMNSHO. He clearly has no concept of how bugs are discovered, demonstrated, and how the repair of those bugs is prioritized by software companies. Does anyone here really think Microsoft would have fixed those buffer overflow problems if no one had written an exploit and published it? Does anyone here think that users in other countries will have any respect for stupid US policy (never mind the law)? Sheesh.

  • by Mr.Sharpy ( 472377 ) on Thursday August 01, 2002 @12:15PM (#3993250)
    This guy was on NPR this morning. When asked about his remarks in context of the laws against such hacking he specifically said that he was talking about hacking by "security professionals" only and then only for the purpose of quietly notifying the software maker. In fact, he explicitly said it should remain illegal for any regular joe to hack or reverse engineer software looking for exploits just for the fun of it.

    This guy is not your friend. He, like the rest of the administration, is solely concerned with corporate interests. What he has in mind here is definitely not exposing exploits and causing bad corporate PR. It is the quiet uncovering of holes and the quiet informing of the software makers so they can issue mystery patches.

    The reasoning behind that, I suppose, is to keep malicious hackers from using public exploits. But in reality, by the time the so-called "security experts" stumble on the holes, the real evil hackers have already known about them for a long time. This is just more of the "keep the problem secret and it will go away" policy that has gotten us into trouble.

    • Two things bother me about this statement:

      1) Define "Security Professional". How do I get to be one? Do I have to hire a lawyer? Am I an SP if I config user passwords? Write code? Use keys in doors? Write papers on PKI systems and techniques? Hack my DreamCast to play Simon? Can I get an SP license from somewhere (other than the BAR Assoc. ;) that excuses me from the DMCA? How much is it?

      2) How do you get SPs if "ordinary Joes" are forbidden from studying and learning the techniques required to do the hacking?

  • by zenray ( 9262 )
    The way I see this issue is that I have an ethical responsibility to other users of a product to inform them of any security flaws I find. The EULAs of most proprietary software contain disclaimers as to fitness of use, and the end users have no legal recourse for any damages incurred. In other words, they put out crappy, bug-ridden, security-flawed software and they expect us to shut up and just use it. To not publish a security problem is to leave every user unaware of it, and therefore open to potential damage.

    I say full public disclosure up front of all bugs and security problems, with just enough technical detail to verify the problem. No need to provide the script kiddies with automatic tools they can use. Perhaps the proprietary software companies will start to put out a better-quality product if they know that any security problem or bug will be quickly published. The end users' decision might be to start using some open source software that can be fixed a lot quicker than the insecure proprietary software.
  • "but I guess saying 'Bush Adviser Encourages Discovery of Software Bugs' just didn't have enough zing."

    Getting a little nit-picky here? I suspect he used hackers to describe anybody who can gain unauthorized access to otherwise restricted systems, not someone who is encouraged to find out why a "bug" caused the DoD's wargames application to crash. Yep, there's a reason he used the word "hacker" and not "software bug hunter". I know entry can be exploited using system bugs, but hacking is obviously more than just exploiting "bugs", or did the poster just happen to miss the story immediately following this one? [slashdot.org] A hacker is a combination of skills, not just a "bug hunter"... Which is probably why good ol' Clarke used the popular definition in the first place.
  • That would make the DMCA inapplicable. He'll get fired for sure. M$ and the xxAAs will have a hit squad gunning for his ass.
    One thing I learned when listening to the Steven Soderbergh commentary on Traffic was that... set your faces to shocked... politicians are much more objective than you think.

    The problem is that we, the constituents, do not elect them for objectivity but for being subjective, stubborn, and close-minded. It's true... that's how you get elected (or stay in office).

    So what is Richard Clarke doing here? It is quite possible he is beginning to switch popular perception. Using "hacker" correctly is a good start. And I assume most of us can agree that this is a step in the right direction.

    The problem is that too many of the posts in this thread say "He isn't going far enough, therefore it's a complete waste of time," because "the end users will never know any better."

    Well I hate to say it, but this is how you get the end users informed: slowly start moving in the right direction, educating the masses, letting them put their fears to rest bit by bit. I think Clarke could really start something here IF we, the supposed IT professionals, didn't just discard what he says right off of the bat.

    As a sidebar, I always wondered why people don't try for more publicity campaigns to get laws passed... especially in foreign countries. Bush can say no to Kyoto because the American people don't care/want him to. You can't much expect to force a population to do what you want by saying "You are an idiot! Think differently!" (and it hasn't ever worked).

    So why don't all concerned parties deluge primetime with an ad campaign? Slowly change popular opinion? Maybe in a year you could see huge differences. The key to remember is that politicians are nothing more than fonts of popular opinion. Clinton proved it. G Dubs is proving it: it doesn't matter what you think; it matters what the people believe you think, based on what you say.

    Clarke seems to be doing that, but since it isn't the Free Software/Free Beer/Free Nekkid solution so many on /. want, it gets tossed out in favor of waiting for something better to come along. Heh, if that is your modus operandi, you're going to be waiting a very long time.
  • NPR Stream (Score:2, Informative)

    by Dr.Seuss ( 94326 )
    As mentioned previously, NPR had a good interview with Clarke on Morning Edition today. The interviewer even researched the story enough to know about the Felten case. Most impressive.
    Their stream is here. [npr.org]


    Good Lord, I've deep-linked to NPR.
  • by sayno2ms ( 559159 )
    I also heard Mr. Clarke on NPR this morning and liked most of what I heard, until he said only Security Professionals should try to find bugs and that anyone else who does is assumed to be doing it with criminal motive. I'm sorry, I thought in our country guilt was not assumed but proven.

    "If there are legal protections they don't have that they need, we need to look at that,"

    No hurry!!
  • If only "Security Professionals" can legally investigate security flaws, how does one become such a "Security Professional"?

    It seems you'd have to start your first day on the job with absolutely no experience in the field.

    I know, it's gonna be a licensed profession like doctors or lawyers, with its own lobby organization, barriers to entry, and all the rest. Oh well...
  • _ONLY_ by public disclosure can a company be put under enough pressure to ensure that they will make every effort to fix the security problems discovered. If you privately go to them and tell them about something you've discovered, they'll just rebut by saying that the chances of anyone else finding it are slim to none, since there haven't been any reported problems.

    Of course, after this, they will probably make sure to get a court order forcing you to keep your mouth shut and there won't be a thing you can do about it after that.

    At least by public disclosure you can offer the legitimate defense that for a company whose internal affairs are unknown (which would generally be the case except for people who actually worked there), public disclosure is the only way to be sure that they will actively try to fix the problem.

    Trying to talk to the company privately first will, more often than not, get you nowhere because the only bugs that a company will bother to fix are the ones that actually _cause_ problems. They have too many other things to worry about to bother to fix things that *MIGHT* be exploited later.
