U.S. Computer Security Advisor Encourages Hackers 275
DarklordSatin writes: "According to this Associated Press article, which I was pointed to by the nice guys over at Ars Technica, Richard Clarke, Dubya's Computer Security Advisor, wants to encourage hackers to find security holes in software. Although he feels that the system only works when the hackers show 'good faith' and disclose the holes to the company before going public, he wants to start offering hackers more legal protection, and that is a very good step in the right direction." As the folks at Ars point out, though, "Naturally, Mr. Clarke was using the original, more generalized definition of 'hacker', but I guess saying 'Bush Adviser Encourages Discovery of Software Bugs' just didn't have enough zing."
Probably won't last (Score:3, Insightful)
Friend or Foe (Score:2, Insightful)
Disclosing to company vs public (Score:3, Insightful)
Being publicly accountable makes a company more diligent with security and bug testing. The only downside to public announcements is that every hacker out there now knows about it. The upside to THAT is that the company now has a hell of a lot of incentive to patch the hole in a prompt manner. Just my 2c!
Re:break programs? (Score:2, Insightful)
Re:Friend or Foe (Score:3, Insightful)
I suggest that a US citizen send them a fax or two about Hewlett Packard [slashdot.org].
Re:Hackers (not a slippery slope at all) (Score:4, Insightful)
He means responsible hackers who just find the problems and notify the company. Not hack into banks or your computer.
It is perfectly legal for someone to try to defeat their own home security system, while it is not legal to break someone else's (unless requested).
Not a very slippery slope at all if you look closer. All he wants is for people who discover or uncover problems on their own little systems or labs to be allowed to tell the companies. Or even just let these people find the problems on their own. As well, he wants to legislate it a bit more, so while they can notify the companies, they won't be able to release to the public exact details on how to break in.
It's just as if I discovered that the security system on my car was easily breakable: I could tell the company, and let my friends know there is a problem. But I could not publish a detailed paper explaining how to unlock the doors with a screwdriver and some patience.
judgemental (Score:3, Insightful)
Who gets to decide whether what a hacker did was in 'good faith'? These proposed laws, mixed with the DMCA, should drive the system's credibility even lower than where it is currently treading...
His Definition of Hackers. (Score:2, Insightful)
He encouraged hackers who are also "professionals" to look for bugs like this, and then report the bugs to the government and the software maker. There was no policy about what happens when both moribund entities laugh and sit on it.
Nor did he want the hoi polloi hackers out there looking for software bugs. He was explicit about this: Only Security Professionals Need Apply.
Allow me to take this moment to reassure you that he is as disconnected from things as you could ever imagine. This is just the same crud in a new can. He will happily prosecute you if you do something to make the world better while not wearing a suit, or if this isn't your "job" by his lights.
So don't take it too much to heart... he really didn't mean you regular people, folks.
NPR Interview this morning ... (Score:3, Insightful)
Clarke also said he wants "Computer Security Specialists" to hack, and not the people doing it for fun. This ambiguity is the problem: how do you define "Computer Security Specialist"? Most of what I learned about IT came from hacking for fun. Now I'm employed as a "Computer Security Specialist."
Re:Ethics (Score:3, Insightful)
The government can't do much more than tell the company what they already know. I suppose the government could stop using such software, but beyond operating systems and generic office applications, I doubt the government makes widespread use of any other commodity software packages. All the government can really do is make an announcement about the problem, and the "hacker" can probably do the same thing more efficiently, and in a way that more effectively reaches those who need to know about it.
-Restil
Re:Disclosing to company vs public (Score:2, Insightful)
If a noble, virtuous person can find the vulnerability and report it responsibly, who's to say that an evil, amoral person can't do the same? Isn't this blatantly obvious? Shouldn't this be pointed out to companies, the government, and the media? What are the chances that a "Black Hat" is going to be responsible about disclosing the latest vulnerability, especially if it will allow him to gain access to, or wreak havoc on, thousands of systems?
I think something to the tune of 30 days for a company to get something rolling before it's released is perfectly reasonable. After all, the hard work of finding the source of the problem is already done for them. If they're not actively trying to find and solve the problem, IMHO they're acting in bad faith and are putting all of their customers in jeopardy. After 30 days, tell the public that a vulnerability exists. Put the heat on them. If they don't budge (or worse, threaten a lawsuit), let a few other white hats know about it so that they too may "discover" the vulnerability.
Now we have proof-of-concepts coming from a half-dozen different sources. This should be more than ample proof (to the company, the media, and a court of law) that the probability of somebody exploiting it for their own ends is very real, and more dangerous than somebody who remains quiet (and cooperative) for over a month.
If they haven't acted within a week of the 30-day mark, publish the code to every online forum you can find: Slashdot, ZDNet, Ars Technica, Kuro5hin<sp?>, your own website, etc.
As to 'full disclosure' - that, IMHO, is not a proper solution. If somebody releases a fully disclosed [BIND/SendMail/Apache/ProFTPD/IMAPD/etc.] vulnerability while I'm asleep, I could be '0wn3d' before I even wake up to read my e-mail.
Most importantly, the public has to be made aware of the fact that software companies do, in fact, stonewall whitehat hackers. This leads to the customers' systems being in danger from blackhats, which is a Very Bad Thing. I thought this would be obvious, but then I've always been of the radical mindset of personal responsibility.
Is that even politically correct?!?
Richard supports the DMCA (Score:3, Insightful)
What is mine? (Score:4, Insightful)
How can I attack my own systems without attacking someone else's 'intellectual property' or some such BS? I can't. But by the terms of the licenses (even the GPL and BSD, I believe) I can't blame the people I got the software from.
Anyone living in the US, connecting to the US, or who has even heard of the US should not be doing computer security. Anyone who is doing even a reasonable job of it is checking into and poking at the products supplied by vendors. But this is illegal. The vendors can't be blamed; only you. You can be blamed, but you don't legally have the right to do the things that would make your work effective.
Run. Run and hide.
I said it in a response to a journal on this story (posted yesterday, BTW) but I'll say it again: in a fight between this guy and Ashcroft (which is what this essentially is), Ashcroft will win every time. The only way to get around the problem is to invalidate the disclaimer of warranty of merchantability of a product. If nothing else, computer software must be fit for a specific purpose. At that point, GM and Walmart become aligned with anti-DMCA forces. Then Microsoft and the Senator from Disney get to see REAL political power.
Hacking for "Security Professionals" only (Score:5, Insightful)
This guy is not your friend. He, like the rest of the administration, is solely concerned with corporate interests. What he has in mind here is definitely not exposing exploits and causing bad corporate PR. It is the quiet uncovering of holes and the quiet informing of the software makers so they can issue mystery patches.
The reasoning behind that, I suppose, is to keep malicious hackers from using public exploits. But in reality, by the time the so-called "security experts" stumble on the holes, the real evil hackers have already known about them for a long time. This is just more of the "keep the problem secret and it will go away" policy that has gotten us into trouble.
Ethical Responsibility (Score:2, Insightful)
Give me a break.... (Score:2, Insightful)
"If there are legal protections they don't have that they need, we need to look at that,"
No hurry!!
Re:Ethics (Score:5, Insightful)
If some shadetree mechanic is working on his buddy's Camaro and finds a manufacturing defect that ought to prompt a safety recall, he goes to the manufacturer and most likely gets promptly ignored (for the sake of argument here). He can then go to something like Consumer Affairs, but he sure as hell doesn't have to. He can go straight to the press to warn people that their Camaros (or whatever) are going to fall apart at 88 miles per hour.
That is PRECISELY what the hackers are doing - they're going to the press.
Respected, established, journalistic entities, specializing in the field of computer security. 2600 magazine, BugTraq, etc.
Not publishing a security hole, not being able to report something to the press, THAT is an abridgement of free speech. It's BULLSHIT that someone needs to be an "employed security professional" to have the right to work on computers and find bugs.
Re:Hacking for "Security Professionals" only (Score:3, Insightful)
Two things bother me about this statement:
1) Define "Security Professional". How do I get to be one? Do I have to hire a lawyer? Am I an SP if I config user passwords? Write code? Use keys in doors? Write papers on PKI systems and techniques? Hack my DreamCast to play Simon? Can I get an SP license from somewhere (other than the BAR Assoc.)?
2) How do you get SPs if "ordinary Joes" are forbidden from studying and learning the techniques required to do the hacking?
find problems but not prove them? (Score:2, Insightful)