U.S. Computer Security Advisor Encourages Hackers 275
DarklordSatin writes: "According to this Associated Press article, which the nice guys over at Ars Technica pointed me to, Richard Clarke, Dubya's Computer Security Advisor, wants to encourage hackers to find security holes in software. Although he feels the system only works when hackers show 'good faith' and disclose the holes to the company before going public, he wants to start offering hackers more legal protection, and that is a very good step in the right direction." As the folks at Ars point out, though, "Naturally, Mr. Clarke was using the original, more generalized definition of 'hacker', but I guess saying 'Bush Adviser Encourages Discovery of Software Bugs' just didn't have enough zing."
Left hand, meet right hand... (Score:3, Funny)
Re:Left hand, meet right hand... (Score:3, Funny)
Re:Left hand, meet right hand... (Score:2)
No, "Welcome to the Great Golden Age of Hacking!" (Score:2)
Ok, I'm removing my tongue from my cheek now!
I heard this guy on NPR this morning... (Score:2)
He said that he encourages those in the computer security field (but not anybody else) to run and attempt to crack industry software on their own computers (but not anybody else's) - ignoring the fact that this violates the DMCA - and then report any vulnerabilities to the government (as well as the manufacturer).
To me, this seems like a thinly veiled attempt to give the NSA a few more backdoors.
I vote for a 1-week courtesy notification period before a full, public disclosure - no matter who you are, or how much money you have.
Re:I heard this guy on NPR this morning... (Score:2)
There's a good chance he has never heard of the HP-Tru64 su vulnerability.
He said that he encourages those in the computer security field (but not anybody else) to run and attempt to crack industry software on their own computers (but not anybody else's) - ignoring the fact that this violates the DMCA
Who says this violates the DMCA? The DMCA only applies to hacking systems which protect copyright. HP has threatened to sue over the HP-Tru64 su vulnerability. It's not illegal until they actually follow through with the lawsuit and win.
-a
Re:I heard this guy on NPR this morning... (Score:3, Informative)
If this message from Snosoft is any indication, I wouldn't have much confidence in reporting to the government either.
From: KF
To: full-disclosure@lists.netsys.com ; bugtraq@securityfocus.com ; recon@snosoft.com
Sent: Wednesday, July 31, 2002 7:42 PM
Subject: [Full-Disclosure] for the record... (Tru64 / Compaq)
http://www.msnbc.com/news/788216.asp?0dm=T14JT
Clarke cautioned that hackers should be responsible in reporting programming mistakes. A hacker should contact the software maker first, he said, then go to the government if the software maker does not respond soon.
--
For the record... we contacted HP (at the time Compaq) and CERT several times. I attached the original version of our su exploit (not the one that phased leaked) to NIPC and to CERT BOTH. We received an extremely long delay at CERT before they even responded. At that point I called CERT 2 times to see what the heck was going on and eventually I established contact (Ian Finley). I also mailed nipc.watch@nipc.gov or whatever the email address on their page was. They didn't mail back.
I deeply regret the fact that one of my team members plagiarized another and leaked some code, but my god people, WE TRIED to give SEVERAL people a heads up!
-KF
Re:Left hand, meet right hand... (Score:2)
This reminds me of when Clinton's former surgeon general suggested that students be educated about masturbation (as, among other things, an alternative to unprotected sex). She was attacked for saying something that offended the sensibilities of many -- even if her motives were sound.
So, if the press starts hyping this as "Bush appointee encourages computer hacking" (and TV pundits looking for something to rail against will happily miss the hacker/cracker distinction), we'll see the White House backpedal in order to look tough on those damn terrorist hackers...
Some things are just too taboo for middle America. If a sound bite scares the uninformed, it will be condemned by those pandering to the uninformed.
Re:Left hand, meet right hand... (Score:2)
The fact is, that once a misrepresentation is released into the popular consciousness, it is going to persist (to some extent) regardless of how or when it is corrected. As an example, take your favorite political party and consider how they've been wronged by another party that repeats debunked charges.
Politicians are not rewarded for being even-handed or factual; they are rewarded for getting people motivated, which means presenting things in the most favorable fashion is encouraged. Sometimes this means presenting an economic plan using best/worst case projections; sometimes it means disregarding truthful corrections. (You can truthfully state that source X stated Y, even if the following day they issued correction Z, thus negating Y. It's misleading and dishonest, but may be technically truthful.)
The late California congressman and champion of government-funded science, George E. Brown Jr., was asked in an interview with the NY Times: [data.free.de]
So, even if Clarke clarifies and reclarifies his position, someone with the proper motivation will allege that he's encouraging hackers to break software.
The same thing holds for software companies. The fact is, with few exceptions (e.g., slashdotters) most computer users will think that a product that has had its flaws reported in the news (even if they've been patched) is less reliable than one that hasn't had its flaws reported (perhaps as a result of threatening researchers with DMCA suits).
In this sense, threatening researchers with lawsuits is less embarrassing than having vulnerabilities reported. Of course, if you're marketing to slashdotters this may not be true, but to J.Q. Sixpack it holds.
Companies care about profit. Selling more units typically increases profit (however, cf. Amazon.com). If reports of vulnerabilities lower sales, then the company is motivated to reduce vulnerability reports.
In related news (Score:2, Funny)
Probably won't last (Score:3, Insightful)
Re:Probably won't last (Score:3, Informative)
To his credit though, he did explain the difference between the current perception of hackers as being evil lawbreakers and the original definition of the old MIT hackers. He did broaden it just a bit by saying that old hackers were anyone who was into computers...whatever that means.
break programs? (Score:5, Funny)
Re:break programs? (Score:2, Insightful)
Fix it 'till it breaks... (Score:2)
Friend or Foe (Score:2, Insightful)
Re:Friend or Foe (Score:3, Insightful)
I suggest that a US citizen send them a fax or two about Hewlett Packard [slashdot.org].
Re:Friend or Foe (Score:2)
Right, because they know exactly how to handle these situations. [slashdot.org]
Too Late (Score:2, Interesting)
More surprising... (Score:3, Funny)
Maran
Disclosing to company vs public (Score:3, Insightful)
Being publicly accountable makes a company more diligent with security and bug testing. The only downside to public announcements is that every hacker out there now knows about it. The upside to THAT is that the company now has a hell of a lot of incentive to patch the hole in a prompt manner. Just my 2c!
Re:Disclosing to company vs public (Score:2)
Re:Disclosing to company vs public (Score:2, Insightful)
If a noble, virtuous person can find the vulnerability and report it responsibly - who's to say that an evil, amoral person can't do the same? Isn't this blatantly obvious? Shouldn't this be pointed out to companies, the government, and the media? What are the chances that a "Black Hat" is going to be responsible about disclosing the latest vulnerability, especially if it will allow him to gain access to, or wreak havoc on, thousands of systems?
I think something to the tune of 30 days for a company to get something rolling before it's released is perfectly reasonable. After all, the hard work of finding the source of the problem is already done for them. If they're not actively trying to find and solve the problem, IMHO they're acting in bad faith and are putting all of their customers in jeopardy. After 30 days, tell the public that a vulnerability exists. Put the heat on them. If they don't budge (or worse, threaten to sue), let a few other white hats know about it so that they too may "discover" the vulnerability.
Now we have proof-of-concepts coming from half dozen different sources. This should be more than ample proof (to the company, media, and court of law) that the probability of somebody exploiting it for their own ends is very real, and more dangerous than somebody who remains quiet (and cooperative) for over a month.
If they haven't acted within a week of the 30-day mark, publish the code to every online forum you can find: Slashdot, ZDNet, Ars Technica, Kuro5hin, your own website, etc.
As to 'full disclosure' - that, IMHO, is not a proper solution. If somebody releases a fully disclosed [BIND/SendMail/Apache/ProFTPD/IMAPD/etc.] vulnerability while I'm asleep, I could be '0wn3d' before I even wake up to read my e-mail.
Most importantly, the public has to be made aware of the fact that software companies do, in fact, stonewall whitehat hackers. This leads to the customers' systems being in danger from blackhats, which is a Very Bad Thing. I thought this would be obvious, but then I've always been of the radical mindset of personal responsibility.
Is that even politically correct?!?
so US security has a bit of a clue (Score:5, Interesting)
Now I hope that a US citizen tells them that they are encouraging something that is outlawed by the DMCA.
Re:so US security has a bit of a clue (Score:2)
Re:so US security has a bit of a clue (Score:4, Informative)
Uhhh...yeah, isn't this what computer security professionals do *already* as part of the normal course of their everyday jobs? (If not, they *should*.)
Right hand doesn't know what the left is doing (Score:3)
Except that HP is threatening the DMCA against the group who (notified and) publicized the Tru64 vulnerability. AFAIK, this vulnerability was found by their examination of their own systems.
Re:Right hand doesn't know what the left is doing (Score:2)
Re:Right hand doesn't know what the left is doing (Score:2)
Dipshits. If I find a flaw, I'm going right to BugTraq or Security Watch.
Re:so US security has a bit of a clue (Score:2)
Sort of like sending out "Congratulations! You have won a prize!" notices to those with outstanding warrants.
Re:so US security has a bit of a clue (Score:2)
Re:so US security has a bit of a clue (Score:2)
I don't see how this is the case. If you're only notifying the company that makes the product (as was requested by the advisor), then you aren't DISTRIBUTING a circumvention device.
Ah, that explains it (Score:2, Funny)
cnn link (Score:2)
Linky Linky [cnn.com]
it's said WE have to be the world's debuggers
Of course, if you go out and actually do this... (Score:5, Interesting)
These days, with "corporate fraud" being the buzzword du jour, one could make a very strong argument that the DMCA encourages corporate fraud because it allows companies to sweep their product defects under the carpet.
Re:Of course, if you go out and actually do this.. (Score:2)
Sued/fined? I have a hard enough time convincing the people that I work with that there is a difference between PHP and HTML. And they are reasonably intelligent people. Try convincing G.W. that there is a difference between "hackers" and "malicious hackers". Problem solved, label them all as terrorists and throw them in jail forever. The DMCA is the last thing I'd be worried about.
Re:Of course, if you go out and actually do this.. (Score:2)
Just be sure not to give out your name... (Score:3, Interesting)
Re:Just be sure not to give out your name... (Score:2)
Re:Just be sure not to give out your name... (Score:4, Informative)
The fellow was Brian West, who worked for an ISP, and he did a little more than just "discover" the security hole in the Poteau Daily News website. A link [nipc.gov] to more info..
They will first encourage you (Score:2, Informative)
In Other News (Score:2, Funny)
Ethics (Score:4, Interesting)
I wonder how long the "hacker" should give the company. And is the government really the next best step? I work for the government and I seriously doubt that will get the ball rolling.
The obvious problem with full disclosure, of course, is making malicious hackers and even terrorists aware of the problem. Solutions anyone?
Re:Ethics (Score:2)
Well, Microsoft and others have been pushing their "reasonable disclosure" requests, and those specify 30 days.
Which I think is fair enough.
Let them know. If they haven't released a fix within that time then fully disclose it.
Yes, people will argue that as soon as it's found out, others may be using it so it would be better to know all the details immediately.
However, the likelihood of someone finding the problem and writing a worm or something that exploits it is substantially lower when they don't have all the gory details laid out for them in a nice document. Which is the major downside to full disclosure.
Re:Ethics (Score:2)
If I wasn't living in the United States, perhaps I would try this tactic at least once to give the US government the benefit of the doubt. If they fail, then there's no need to try it again. If they actually force the company to make the patch, it may be a good thing.
Shouldn't we report the su exploit in Tru64 to the US government now? Like "a company in your country is making unsafe software and refuses to fix it, please consider if you still want to buy their software for the government and the military".
Re:Ethics (Score:3, Insightful)
The government can't do much more than tell the company what they already know. I suppose the government could stop using such software, but beyond the operating systems and generic office applications, I doubt the government makes widespread use of any other commodity software packages. All the government can really do is make an announcement about the problem, and the "hacker" can probably do the same thing, more efficiently, and in a way that more effectively reaches those that need to know about it.
-Restil
Re:Ethics (Score:2)
IMHO bugs should be made public, even in a private company's software, because it isn't really "theirs" and theirs alone. I depend upon the OpenSSH and SSL systems, as well as my Windows box, to keep my data secure. So do many, many other people. If there is a hole then it puts my credit card numbers, my medical info, and me in danger. At that point it is a public problem, and a private company should not be permitted to "just hide it" any more than Ford/Firestone should be able to just hide the dangers of their products.
IMHO you should give the developer a "sufficient time" (depending upon the size of the bug, number of developers, etc.) to fix it. If they don't make any reasonable attempts, then yeah, make it public. If there is a danger to the rest of us from the truly malicious then we ought to know about it. Yes, it will unleash a torrent of script kiddies, but that's worth it when you compare it against a sea of quiet thefts or, god knows what else, perpetrated by the truly determined and vicious.
I'd be surprised if you couldn't argue this under existing whistleblower laws.
So who in the government gets the report? (Score:2)
And same for a widely used piece of software - after the software company, who in the government gets the report?
Re:Ethics (Score:5, Insightful)
If some shadetree mechanic is working on his buddy's Camaro and finds a manufacturing defect that ought to prompt a safety recall, he goes to the manufacturer and most likely gets promptly ignored (for the sake of argument here). He can then go to something like Consumer Affairs, but he sure as hell doesn't have to. He can go straight to the press to warn people that their Camaros (or whatever) are going to fall apart at 88 miles per hour.
That is PRECISELY what the hackers are doing - they're going to the press.
Respected, established, journalistic entities, specializing in the field of computer security. 2600 magazine, BugTraq, etc.
Not publishing a security hole, not being able to report something to the press, THAT is an abridgement of free speech. It's BULLSHIT that someone needs to be an "employed security professional" to have the right to work on computers and find bugs.
judgemental (Score:3, Insightful)
Who gets to decide whether what a hacker did was in 'good faith'? These proposed laws, mixed with the DMCA, should leave the credibility of the system even lower than it is currently treading at...
Run to Uncle Sam? (Score:4, Interesting)
A more interesting quote is in this CNN article. [cnn.com]
Umm, really? To whom in the government? The Department of Fixing Stuff? The FBI? The FTC? The DoJ? Gosh, that'll keep (e.g.) Microsoft on their toes. Bwahahahaha!
Precedent would suggest that a more likely result will be the jailing of the hacker, and the awarding of a fat contract to the vendor.
Thanks all the same, but this is just some guy in a suit. When it's written up in law by Congress, signed by G.W.Bush, and delivered to the Library of Congress by flying pig courier, I might change my mind.
Re:Run to Uncle Sam? (Score:2)
(1) that fellow is messing his pants and wants Mr. Clarke canned for pushing hacking, or
(2) Mr. Clarke was put up to this as a way of getting free security labor for Microsoft while restricting press leaks about their software.
And whom would you be required to contact in the case of Open Source?
Re:Run to Uncle Sam? (Score:2)
Perhaps we could resurrect the House Unamerican Activities Committee.
"Are you, or have you ever been, a member of the mind control cult known as 'The Free Software Foundation'?"
His Definition of Hackers. (Score:2, Insightful)
He encouraged hackers who are also "professionals" to look for bugs like this, and then report the bugs to the government and the software maker. There was no policy about what happens when both moribund entities laugh and sit on it.
Nor did he want the hoi polloi hackers out there looking for software bugs. He was explicit about this: Only Security Professionals Need Apply.
Allow me to take this moment to reassure you that he is as disconnected from things as you could ever imagine. This is just the same crud in a new can. He will happily prosecute you if you do something to make the world better and don't wear a suit / this is not your "job" by his lights.
So don't take it too much to heart... he really didn't mean you regular people, folks.
NPR Interview this morning ... (Score:3, Insightful)
Clarke also said he wants "Computer Security Specialists" to hack, and not the people doing it for fun. This ambiguity is the problem: how do you define "Computer Security Specialist"? Most of what I learned about IT came through hacking for fun. Now I'm employed as a "Computer Security Specialist."
Mailing address (Score:2, Informative)
Re:Mailing address (Score:2)
http://www.ciao.gov/ [ciao.gov]
INTERVIEW THIS GUY (Score:5, Interesting)
Re:INTERVIEW THIS GUY (Score:4, Interesting)
This is a good idea. A natural extension to this would be to invite other goverment figures, such as Justice Department officials or members of Congress. People who have an interest in federal or international technology policies might appreciate the open, yet moderated, forum of Slashdot. This could be an example of the U.S. goverment at its best.
This could be an easier way for people to "write their Congressmen", since there really is a lower courage threshold when posting to Slashdot (yes, writing Congressmen isn't trivial for many people, even though it should be).
Re:INTERVIEW THIS GUY (Score:2)
Especially if they are more tolerant of typographical errors on such a forum: "goverment" --> "government"
Also, posts to Slashdot are harder to ignore (no trashcans, only trashmouths)
Re:INTERVIEW THIS GUY (Score:2)
Interesting fuel for the full-disclosure debate (Score:3, Informative)
Disclaimer: My personal side in the above-mentioned debate is already decided. I advocate responsible full disclosure. Tell the vendor first, but don't agree to any NDAs, and always make it clear to the vendor that after a reasonable delay you will go public with everything you've got relating to the hole.
Having proclaimed my bias, it was interesting to hear the guy's own words on NPR this morning. On the positive side, he correctly defined "hacker." On the negative side, he clearly preferred a more restrictive disclosure policy that could be summarized as "Tell the vendor, then shut the hell up and go away." When gently pressed he was prepared to allow notification of a "responsible" coordinating agency, but he made very sure to never advocate anything so liberal as responsible full disclosure. I was busily making breakfast and coffee at the time so I might have missed an implication or two, but these days the usual spin on "responsible" when linked to the word "agency" means either government-sanctioned-&-corporate-owned or government-operated. Some security hackers find this a potentially scary thought.
Personally, I take responsibility for my own systems' security. Based on the information I have, I do my best to keep them buttoned down. Only in that way can I ethically place any blame on the persons that might try and crack them. (Of course I also know my limitations - if a true expert wants to smoke my systems I know they're gone. I'll be satisfied with keeping the worms and kiddies out whilst trusting that there's nothing on my own boxes that a true expert wants badly enough to put in the effort.)
From this standpoint, anything other than responsible full disclosure denies me knowledge I need in order to make an informed decision about the risks I'm assuming. Similarly to do anything less myself, should I discover a security hole, is failing in my obligations to my colleagues.
To my mind he's advocating using the community as a source of free QA services whilst at the same time making sure that the vendors can get away with the old oxymoron of security through obscurity. Who'd bet against a government sponsored coordinating body being followed rapidly by laws prohibiting disclosure of holes other than through that body?
HP (Score:2, Interesting)
Regardless of the fact that it wasn't actually SnoSoft that officially published the exploit, even if they had, Clarke is basically saying that they went about things in pretty much the most appropriate manner.
Contrary to his remarks on NPR this morning (Score:4, Informative)
I'd rate him above-average on the clue-o-meter (certainly as federal gov't employees go!) but he's not a friend to the hackers by any stretch.
Re:Contrary to his remarks on NPR this morning (Score:2)
Like Suzuki did with the Samurai? (where they simply pointed out that it was poorly designed)
Or like Isuzu did with the Trooper? (where they pointed out the design flaw and a simple design change to improve it and still got sued)
Richard supports the DMCA (Score:3, Insightful)
Re:Richard supports the DMCA (Score:2)
Does the CISCO security specialist certification [cisco.com]
make you a "security professional"?
Re:Richard supports the DMCA (Score:2)
http://www.npr.org/me3.smil
It is segment number 11
He did say "...hackers can help" at his defcon talk, but is using a *VERY* limited definition of the H-word. During this morning's NPR interview Clarke said:
"the law currently says that if you discover a flaw, that's ok, but if you intentionally go out and reverse engineer code, unless you are somebody who is in the computer security business, the law assumes that you've done it for uh, nefarious purposes. So the only people we are encouraging to go out and find these flaws are computer security professionals, not people who are doing it just for the fun of it"
What is mine? (Score:4, Insightful)
How can I attack my own systems without attacking someone else's 'intellectual property' or some such BS? I can't. But by the terms of the licenses (even the GPL and BSD, I believe) I can't blame the people I got the software from.
Anyone living in the US, connecting to the US, or who has even heard of the US should not be doing computer security. Anyone who is doing even a reasonable job of it is checking into and poking into the products supplied by vendors. But this is illegal. The vendors can't be blamed. Only you. You can be blamed, but you don't legally have the right to do the thing/s that will make your work effective.
Run. Run and hide.
I said it in a response to a journal on this story (posted yesterday, BTW) but I'll say it again: in a fight between this guy and Ashcroft (which is what this essentially is), Ashcroft will win every time. The only way to get around the problem is to invalidate the disclaimer of warranty of merchantability of a product. If nothing else, computer software must be fit for a specific purpose. At that point, GM and Walmart become aligned with anti-DMCA forces. Then Microsoft and the Senator from Disney get to see REAL political power.
Re:What is mine? (Score:2)
When this becomes commonplace (I hope that it does), software makers will be much more modest about their claims, and software will be developed only with the discipline it deserves.
With genuine warranties on software, companies like Microsoft will have to make their marketing department shut up (finally!) and will have to start selling really good, tried and true, software. However, I believe this sort of software industry is quite a few years away.
Right now, the software industry is going through some sort of adolescence, where righteous bastards like Senator Hollings (SSSCA), Microsoft (.NET, Palladium), and others are trying to write the law of the future in their own image, with the equally unreasonable "free love" software culture opposing them in the wrong way. I really hope reason wins in the end, and we fall back on traditional good ol' hard work and quality products. Some regulation may be appropriate, but it can't go as far as the DMCA already has.
Re:What is mine? (Score:2)
If the adolescence you speak of lasts too long, people will ask for these things even less than they do today.
I think for the most part, the software industry up to this point has evolved fairly unsurprisingly: starting out with really expensive custom computers and software and leading to the current commoditization, where quality has suffered due to pricing competition and relentless market growth. This seems fairly natural, where a system starts out at one extreme, swings over time to another extreme, and eventually stabilizes somewhere in the middle.
Right now, I think we have reached the second extreme, especially since it has come to the point where software that is free can routinely exceed the quality of software that is not (in a way, this is pretty sad).
If this is all true, this adolescence has peaked, and we've begun a trend towards stability. Maturation of the software industry may take a number of years, but I think more and more people are grasping that good quality is possible again, which is encouraging. My hope is that this quality is achieved through a free-market consensus rather than legislation or industry-imposed DRM/Palladium, since legislation and DRM will probably be extremely short-sighted and damaging overall. However, some consumer-oriented legislation, such as requiring warranties, may actually be good.
Get out of jail free card (Score:2)
Sure...hack & get thanked..then get arrested! (Score:2)
I mean, their past actions truly don't inspire a single grain of trust. Look at last week, where the guy in Houston got busted by the courthouse for EXPOSING their wifi's total lack of security (remember that they claimed he did $5000.00 in damage - no doubt that's exactly how much they paid for all the wifi stuff they had to shut down). Plus... just look at how easy they make it... try to do one good thing and some lawyer begins the mantra: DMCA..DMCA..DMCA.
Nice words you speak guy, but what did Clara say in the Wendy's commercials: "Where's the beef?"
Until I see the beef, I'm not trusting a single word you say....
Rehash of NPR's Morning Edition Interview (Score:5, Interesting)
So let me see where this puts us. Phred Programmer discovers a buffer overflow that crashes IE. He tells his security professional about his discovery. Our "security professional" says "what's a buffer overflow?" and the whole thing falls on the floor.
Wait, let's try this again. Phred Programmer discovers a buffer overflow problem that crashes IE. He puts on his "security professional" hat and calls Microsoft. Microsoft says "So what? It crashes. BFD. We'll fix it on the next major release."
Phred Programmer waits until the next major release and the mess is still there. Remember, he's not supposed to write code to demonstrate this problem, or the potential harm, so Microsoft has no idea whether they've really fixed this problem.
So Phred Programmer calls the feds. They respond with "Huh? What's the big deal?" "Well, you could exploit this and hack with full administrator privileges," says Phred Programmer. "Sounds far-fetched," say the feds. "But just in case you're right, I don't want you writing any code. Why don't you post your notions with Microsoft?" "But I already have, and they promised a fix by the next major release," complains Phred Programmer.
"Hmm. We'll have to take it up with them."
And so, another major release goes by and still nothing. Meanwhile, somebody else figures out the breached security and, because they don't live in the US, they post a script for the kiddies to use.
Back to the present: Somebody explain to me why this scenario is not likely. Restricting this information to "security professionals" seems to me like an effort to sweep security problems under the rug.
Richard Clarke's ideas suck, IMNSHO. He clearly has no concept of how bugs are discovered, demonstrated, and how the repair of those bugs is prioritized by software companies. Does anyone here really think Microsoft would have fixed those buffer overflow problems if no-one had written an exploit and published it? Does anyone here think that users in other countries will have any respect for stupid US policy (never mind the law)? Sheesh.
Hacking for "Security Professionals" only (Score:5, Insightful)
This guy is not your friend. He, like the rest of the administration, is solely concerned with corporate interests. What he has in mind here is definitely not exposing exploits and causing bad corporate PR. It is the quiet uncovering of holes and the quiet informing of the software makers so they can issue mystery patches.
The reasoning behind that, I suppose, is to keep malicious hackers from using public exploits. But in reality, by the time the so-called "security experts" stumble on the holes, the real evil hackers have already known about them for a long time. This is just more of the "keep the problem secret and it will go away" policy that has gotten us into trouble.
Re:Hacking for "Security Professionals" only (Score:3, Insightful)
Two things bother me about this statement:
1) Define "Security Professional". How do I get to be one? Do I have to hire a lawyer? Am I an SP if I config user passwords? Write code? Use keys in doors? Write papers on PKI systems and techniques? Hack my DreamCast to play Simon? Can I get an SP license from somewhere (other than the BAR Assoc.)?
2) How do you get SPs if "ordinary Joes" are forbidden from studying and learning the techniques required to do the hacking?
Ethical Responsibility (Score:2, Insightful)
Picky, picky, picky... (Score:2)
Getting a little nit-picky here? I suspect he used "hackers" to describe anybody who can gain unauthorized access to otherwise restricted systems, not someone who is encouraged to find out why a "bug" caused the DoD's wargames application to crash. Yep, there's a reason he used the word "hacker" and not "software bug hunter". I know entry can be gained by exploiting system bugs, but hacking is obviously more than just exploiting "bugs", or did the poster just happen to miss the story immediately following this one? [slashdot.org] A hacker is a combination of skills, not just a "bug hunter"... which is probably why good ol' Clarke used the popular definition in the first place.
OOoooo SHIT! :-) Somebody gets it? (Score:2)
How smart is he... really (Score:2)
The problem is that we, the constituents, do not elect them for objectivity but for being subjective, stubborn, and close-minded. It's true... that's how you get elected (or stay in office).
So what is Richard Clarke doing here? It is quite possible he is beginning to switch popular perception. Using "hacker" correctly is a good start. And I assume most of us can agree that this is a step in the right direction.
The problem is that too many of the posts in this thread say "He isn't going far enough, therefore its a complete waste of time." because "the end users will never know any better."
Well I hate to say it, but this is how you get the end users informed: slowly start moving in the right direction, educating the masses, letting them put their fears to rest bit by bit. I think Clarke could really start something here IF we, the supposed IT professionals, didn't just discard what he says right off of the bat.
As a sidebar, I always wondered why people don't try for more publicity campaigns to get laws passed... especially in foreign countries. Bush can say no to Kyoto because the American people don't care/want him to. You can't much expect to force a population to do what you want by saying "You are an idiot! Think differently!" (and it hasn't ever worked).
So why don't all concerned parties deluge primetime with an ad campaign? Slowly change popular opinion? Maybe in a year you could see huge differences. The key to remember is that politicians are nothing more than fonts of popular opinion. Clinton proved it. G Dubs is proving it: it doesn't matter what you think; it matters what the people believe you think, based on what you say.
Clarke seems to be doing that, but since it isn't the Free Software/Free Beer/Free Nekkid solution so many on Slashdot demand, it gets dismissed out of hand.
NPR Stream (Score:2, Informative)
Their stream is here. [npr.org]
Good Lord, I've deep-linked to NPR.
Give me a break.... (Score:2, Insightful)
"If there are legal protections they don't have that they need, we need to look at that,"
No hurry!!
How do you become a "Security Professional" (Score:2)
It seems you have to start your first day at the job with absolutely no experience in the field.
I know, it's gonna be a licensed profession like doctors or lawyers, with its own lobby organization, barriers of entry and all the rest. Oh well...
Guh... this debate is pointless. (Score:2)
Of course, after this, they will probably make sure to get a court order forcing you to keep your mouth shut and there won't be a thing you can do about it after that.
At least by public disclosure you can offer the legitimate defense that for a company whose internal affairs are unknown (which would generally be the case except for people who actually worked there), public disclosure is the only way to be sure that they will actively try to fix the problem.
Trying to talk to the company privately first will, more often than not, get you nowhere because the only bugs that a company will bother to fix are the ones that actually _cause_ problems. They have too many other things to worry about to bother to fix things that *MIGHT* be exploited later.
Re:Hackers (not a slippery slope at all) (Score:4, Insightful)
He means responsible hackers who just find the problems and notify the company. Not hack into banks or your computer.
It is perfectly legal for someone to try to defeat their own home security system, while it is not legal for them to break someone else's (unless requested).
Not a very slippery slope at all if you look closer. All he wants is for people who discover or uncover problems on their own little systems or labs to be allowed to tell the companies. Or even just let these people find the problems on their own. As well, he wants to legislate it a bit more, so while they can notify the companies, they won't be able to release to the public exact details on how to break in.
Just like if I discovered that the security system on my car was easily breakable: I could tell the company and let my friends know there is a problem, but I could not publish a detailed paper explaining how to unlock the doors with a screwdriver and some patience.
Re:Hackers (not a slippery slope at all) (Score:2)
You mean something like: DMCA v.1 rev. 1
Re:Hackers (not a slippery slope at all) (Score:2)
Seems like he wasn't really saying that it's okay to hack software in your possession. It was really just that you can hack software in your possession if you work for a company involved in computer security.
So what kinds of people is this really aimed at? Seems to be aimed just at campaign contributors who own or run Software Security Companies?
Re:Hackers (Score:4, Informative)
Re:Hackers (Score:3, Interesting)
The difference with homes is that everyone knows what they are, what they're for and the most common routes of security breakage.
When we got a security system installed at my current place, I slinked around and tried to get around without being seen by the motion detectors. Eventually I found a way to get from the back door to my computer without triggering a single motion detector. This resulted in us having them moved around.
Computers, in contrast, are big nebulous boxes, and most people don't know much about how they work or how to secure them. This is why they should be treated differently than homes with respect to how their security is tested.
Re:Hackers (Score:2)
This case is more like a builder or an engineer (or Bob Villa) testing different building materials, home construction methods, and security products for safety and applicability. Even materials that have been generally approved for use often need to be tested before (and sometimes after) being used in a particular way. You're not breaking into someone else's house, and you're not stealing or destroying someone else's technology. You're simply thoroughly testing something to see if it meets your needs. In general, you should be free to tell others the results of your testing. If it doesn't even stand up to specification, then you're pretty well obliged to warn others (legally so if you're an engineer), including the supplier. In no case should you be prosecuted for telling people that the product doesn't work or shouldn't be used for certain applications, and for telling people why or why not (unless you're being maliciously libelous).
This perspective on hacking is much closer to the original sense of the word and is what's done every day by virtually any manufacturing or construction company, as well as individuals, academics, journalists, and consumer groups. I think that the U.S. computer security advisor is simply suggesting that computer products should be treated no differently from building materials: even though companies might want to restrict testing, reverse engineering, and negative publicity, doing so is not in the interest of public rights and safety. The only grey area is where computer systems include both public and private elements and there is less of a natural distinction between testing and trespassing. In the real world such evaluations might be done by third-party audit, but again, the boundaries are much clearer, and as the parent comment mentioned, computer technology is less mature and harder to test exhaustively.
Re:Hackers (Score:2)
mark
Re:Hackers (Score:2)
Re:This is Consistent (Score:2)
Remind me, please -- cite your source. Everything I've read (in mainline newspapers, Union-Tribune and North County Times) indicated that all the recounts indicated the opposite. That's why there was no big media splash; no change is no news.
-Billy
Re:Or maybe it's... (Score:2)
No ACCIDENTAL WEAKNESSES (Score:3, Informative)
The thing is, network security weaknesses are rarely accidental. You can reliably predict the top five causes of security weaknesses:
Re:No ACCIDENTAL WEAKNESSES (Score:2)
Perhaps analogous to building codes for new construction, or FCC requirements for electronics?
Another approach would be certification, such as "Professional Engineer" for software engineers.
I know that many people will lash out at these sort of ideas, but as long as there are strict distinctions between "professional" and "non-professional", everyone should be able to get their way. Hobbyists can still do everything they want, while Software PE's can develop commercial software in the same way as building contractors develop office complexes--disciplined and with checks and balances. The end result is that software projects will cost what they really need to cost, and shoddy commercial software will be much rarer.
It has been this way for a long time in other disciplines, such as Mechanical Engineering or Electrical Engineering; the software industry just hasn't matured to this point, yet.
I would be more careful with words like "crime", however, because hobbyists really need to be protected. "Professional liability" may be more appropriate.
Re:No ACCIDENTAL WEAKNESSES (Score:2)
The thing is, it's the professionals who have been doing it "the unsafe way" for years who will keep on doing the same thing. It's the upstart hobbyists who have a reliable set of utilities that are much more immune to buffer overflows.
Just as an example, on all commercial Unices that I've had a chance to play with I've been able to make the 'pwd' command dump core. The GNU 'pwd' has never dumped core on me, despite my attempts.
The scary thing is, 'pwd' is perhaps one of the simplest shell commands there is. It takes no arguments. Yet it still took many years before the GNU one became as refined as it is today. Compare that to your typical network service and it's nightmare time. How many security patches have there been for vixie-cron? wu-ftpd? Those are relatively simple things!
No language is going to be able to force a programmer to not do stupid things, but things like perl 'taint' mode do help a little. Even then you have to worry about file race conditions in some circumstances.
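(For readers who haven't run into the term: a file race condition, often called a TOCTOU bug, looks roughly like the sketch below. This is a minimal, hypothetical C illustration I'm adding for context - the filename and logic are invented, not taken from any real program or from anything Clarke or the posters above described.)

/* toctou_sketch.c - illustrative only: the classic check-then-use race */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

/* Racy pattern: in the window between the access() check and the open()
   call, an attacker can replace /tmp/report.txt with a symlink to some
   other file the victim is allowed to write, tricking the program into
   clobbering it. */
int racy_write(void)
{
    if (access("/tmp/report.txt", W_OK) != 0)
        return -1;
    return open("/tmp/report.txt", O_WRONLY);   /* the window of opportunity */
}

/* Safer pattern: skip the separate check, create the file atomically,
   and fail if anything (file or symlink) already sits at that path. */
int safer_write(void)
{
    return open("/tmp/report.txt", O_WRONLY | O_CREAT | O_EXCL, 0600);
}

int main(void)
{
    int fd = safer_write();     /* swap in racy_write() to see the bad pattern */
    if (fd == -1) {
        perror("open");
        return 1;
    }
    close(fd);
    return 0;
}

Nothing exotic here; it's exactly the kind of thing that slips through when the schedule says ship.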
Re:No ACCIDENTAL WEAKNESSES (Score:2)
It's interesting that I didn't consider Linux or OpenBSD, for example, as hobbyist systems, even though they really are, in a way. Many people at GNU and other free software organizations have been doing professional-quality work, just not often under the professional guise.
In another way, GNU, OpenBSD, XFree86, and KDE, for example, technically aren't made by hobbyists, because they went through the effort to create non-profit organizations, boards of directors, or otherwise distinguished themselves. There is something about them that has elevated them above a "hobbyist" status. Many of them should be considered professionals, even though most of them don't write software for-profit.
Re:No More Buffer Overflows (Score:2)
There are several languages in wide use today where the most idiomatic way to handle strings is immune to buffer overflows. Perl, for example. The worst a buffer-overflow attacker could do against a well-written Perl service is cause the network service to run out of memory and die. Admittedly that is a kind of denial-of-service attack, but it's not the worst thing that could happen.
And I'm sure that a dedicated C programmer could write a Perl program that would be vulnerable to buffer overflows, but only if he departed from "idiomatic Perl" and lapsed back into his bad C habits. Sort-of a variation of "A good Fortran programmer can write spaghetti code in any language!".
But even Perl is no magic bullet. Fix the buffer overflow problem and then the attackers start chiseling away at other stuff, like file race conditions. In the end, there's no substitute for solid software engineering.
For that matter, who set the standard so low that buffer overflows were ever tolerated?
Simple economics. It mostly works, no we didn't test every boundary condition, but the way we wrote it such testing/verification would be impossible, so ship it.
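(Since this subthread keeps coming back to buffer overflows, here's a minimal, hypothetical C sketch of the idiom being complained about and its bounded replacement - my own illustration, not code from any shipping product.)

/* overflow_sketch.c - illustrative only */
#include <stdio.h>
#include <string.h>

/* The bad old habit: copy with no length check. Any input longer than
   15 bytes runs off the end of buf and tramples whatever follows it on
   the stack, which is exactly the opening an exploit needs. */
void risky_copy(const char *input)
{
    char buf[16];
    strcpy(buf, input);
    printf("got: %s\n", buf);
}

/* The bounded version: never writes past the buffer and always
   NUL-terminates; oversized input is simply truncated. */
void bounded_copy(const char *input)
{
    char buf[16];
    snprintf(buf, sizeof buf, "%s", input);
    printf("got: %s\n", buf);
}

int main(int argc, char **argv)
{
    const char *arg = (argc > 1) ? argv[1] : "hello";
    bounded_copy(arg);   /* feed risky_copy() a long argument to watch it fall over */
    return 0;
}

The point of the Perl comparison above is that the language's native strings behave like bounded_copy() by default, so a programmer has to go out of their way to reintroduce the risky pattern.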
Re:No More Buffer Overflows (Score:2)
It's an attitude thing.
Your classical C programmer regards memory management as something too important for the compiler to take care of.
OTOH your classical Perl programmer regards memory management as too important for the programmer to take care of.