Security, Government, Your Rights Online

Security Researchers Threatened With US Cybercrime Laws

An anonymous reader writes "The Guardian reports that many of the security industry's top researchers are being threatened by lawyers and law enforcement over their efforts to track down vulnerabilities in internet infrastructure. 'HD Moore, creator of the ethical hacking tool Metasploit and chief research officer of security consultancy Rapid7, told the Guardian he had been warned by U.S. law enforcement last year over a scanning project called Critical.IO, which he started in 2012. The initiative sought to find widespread vulnerabilities using automated computer programs to uncover the weaknesses across the entire internet. ... Zach Lanier, senior security researcher at Duo Security, said many of his team had "run into possible CFAA issues before in the course of research over the last decade." Lanier said that after finding severe vulnerabilities in an unnamed "embedded device marketed towards children" and reporting them to the manufacturer, he received calls from lawyers threatening him with action.'"
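For a sense of what "automated computer programs to uncover the weaknesses across the entire internet" looks like in practice, here is a minimal banner-grab sketch in Python. It illustrates the general technique only, not Critical.IO's actual tooling; the addresses, ports, and timeout are placeholder values (the reserved TEST-NET addresses will not answer).

```python
# Minimal banner-grab sketch: connect to each (host, port) pair, read whatever
# greeting the service volunteers, and print it. The target addresses come from
# the reserved TEST-NET-1 block and are placeholders, not real scan targets.
import socket

TARGETS = ["192.0.2.10", "192.0.2.11"]   # placeholder hosts (RFC 5737 TEST-NET-1)
PORTS = [21, 22, 25, 110, 143]           # services that usually greet on connect
TIMEOUT = 3.0                            # seconds before giving up on a host


def grab_banner(host, port):
    """Return the first chunk of text a service sends on connect, or None."""
    try:
        with socket.create_connection((host, port), timeout=TIMEOUT) as sock:
            sock.settimeout(TIMEOUT)
            data = sock.recv(256)
            return data.decode(errors="replace").strip() or None
    except OSError:
        return None


if __name__ == "__main__":
    for host in TARGETS:
        for port in PORTS:
            banner = grab_banner(host, port)
            if banner:
                print(f"{host}:{port} -> {banner}")
```

Whether running something like this against machines you do not own counts as research or as unauthorised access under the CFAA is exactly what the comments below argue about.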
This discussion has been archived. No new comments can be posted.

  • by Ynot_82 ( 1023749 ) on Friday May 30, 2014 @02:21PM (#47131111)

    ...when ill-thought-out laws are passed.

    In the UK, it is a crime (under the Computer Misuse Act) to test a third-party system for vulnerabilities.

    The Heartbleed incident caused a lot of people to break the law by testing whether websites were affected.
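As a point of contrast with probing other people's servers, auditing your own systems carries no such legal exposure. The sketch below is a simplified illustration only: it checks whether the OpenSSL build linked into the local Python interpreter reports a version in the Heartbleed-affected 1.0.1 through 1.0.1f range. Distribution vendors often backport fixes without changing the version string, so a match is a prompt to investigate, not proof of vulnerability.

```python
# Simplified local Heartbleed check: flag an OpenSSL version string in the
# affected range (1.0.1 through 1.0.1f). Vendor backports can patch the bug
# without bumping the version, so a match only means "look closer".
import re
import ssl

AFFECTED = re.compile(r"^OpenSSL 1\.0\.1([a-f])?(\s|$)")  # 1.0.1 .. 1.0.1f


def local_openssl_in_affected_range():
    """Return True if the interpreter's linked OpenSSL matches the affected range."""
    return AFFECTED.match(ssl.OPENSSL_VERSION) is not None


if __name__ == "__main__":
    print("Linked OpenSSL:", ssl.OPENSSL_VERSION)
    if local_openssl_in_affected_range():
        print("Version string is in the Heartbleed-affected range; check for patches.")
    else:
        print("Version string is outside the Heartbleed-affected range.")
```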

  • This is why we can't have nice things. Companies won't audit themselves, and they get bent out of shape if others do it for them...
    • This is why we can't have nice things. Companies won't audit themselves, and they get bent out of shape if others do it for them...

      You discovered my credit card number. You think it is now OK to run a program to try random PINs until you find the correct one?

      Why would you do that?

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        I would say that this is more like:

        You leave your credit card on a table under a wet napkin. I look at the napkin and think I can read the number. I look closer and can indeed read the number and exp date. I tell you that your credit card is easily readable, and you should probably do something about it. You then report me to the police for stealing your credit card number.

        • by Anonymous Coward

          If you are reading my credit card number while it is under a wet napkin on a table inside my house then you can be sure I will call the police.

        • That's a really bad analogy. Peering at someone's credit card - even if it is under a napkin - is quite obviously very bad manners indeed. If you're saying unauthorised penetration testing is like peering at someone's credit card, then it's clearly wrong.

          And speaking as someone who has his own little toy server out in the cloud, I'd very much prefer to do my own damn penetration testing, thank you.

          • Re: See... (Score:5, Insightful)

            by arshat ( 3675763 ) on Friday May 30, 2014 @04:57PM (#47132373)

            That's a really bad analogy.

            It is. It's more like the wet napkin has retained an imprint of the credit card and you have left the napkin behind on the bar. Someone then takes the napkin, hands it to you and says "you want to be careful with these wet napkins, look". You call the police because someone you don't know has your credit card details.

        • More like you left a note on my kitchen table that I shouldn't leave my key under the doormat.

      • by thaylin ( 555395 )
    That is in no way what is being reported here. Possibly the initial discovery of the number because of a poorly designed application, but not the trying of PINs. Do you like to come up with fallacies?
  • by Anonymous Coward

    In America, any good intentions are met by defensive idiots.

    Fuck them. Don't even try to help them anymore; use your research to secure the rest of the world and let them rot in the festering cesspool they created.

  • NSA (Score:5, Insightful)

    by BradMajors ( 995624 ) on Friday May 30, 2014 @02:22PM (#47131133)

    The NSA and other security services will not want security researchers to find and fix vulnerabilities the security services are exploiting.

  • Company Assets (Score:4, Informative)

    by just_another_sean ( 919159 ) on Friday May 30, 2014 @02:28PM (#47131181) Journal

    Yeah, how dare they ask these companies to take their heads out of the sand and do something about their customers' security/privacy!

    I'm appalled at the amount of "Good, they broke the law" comments in this thread...

    • Many of these people are essentially trying to get the company with the vulnerabilities to pay for the service of fixing them. People who've gotten the sorts of emails that say "I broke into your system and I can help you fix it" probably don't end up as fans of these drive-by services.

    • by HiThere ( 15173 )

      Consider who is issuing the posts. Or just assume that they come from astroturfers...you won't be far wrong.

  • by Opportunist ( 166417 ) on Friday May 30, 2014 @02:30PM (#47131199)

    Odd as it may sound, for security research, you have WAY more liberties there.

  • by Kevin Fishburne ( 1296859 ) on Friday May 30, 2014 @02:33PM (#47131239) Homepage
    Why, with such an abundance of cheap resources, technology, entertainment and knowledge, are people still complete assholes? There must be an asshole gene that natural selection has yet to make dormant.
  • Good (Score:5, Funny)

    by nurb432 ( 527695 ) on Friday May 30, 2014 @02:38PM (#47131321) Homepage Journal

    Everything is going according to plan.

  • by russotto ( 537200 ) on Friday May 30, 2014 @02:40PM (#47131337) Journal

    And it's about time the so-called "ethical security researchers" got off their high horses and realized that. There are far too many laws for there to be white hats. If you want to do useful research into vulnerabilities other than those of the company you are a security researcher for, you're going to have to put on the black hat.

  • Mock up a few copies and then dare folks to hack them (sorted by remote and physical access type hacks).

    When you get something that can stand up to a decent number of hacks (remote hacks that require you to be on the same subnet, on a blue moon, with a Big$ tool, between the hours of 22:00 and 23:59, with the product in mode X don't count, and neither do obvious physical hacks), then as a last check you put up a BIG$ bounty on hacks.

    Then you can release a cyber product targeting children.

  • by idontgno ( 624372 ) on Friday May 30, 2014 @03:00PM (#47131541) Journal

    the mayors of several crime-plagued cities release a joint announcement that reporting apparent crimes in progress to police would result in the arrest and summary punishment of the person making the police report.

    "If you losers would stop reporting crimes, we wouldn't have so much crime," one prominent mayor stated to this reporter. "We're going to push down crime rates the only way that works: make it impossible to report a crime."

    When asked for a comment, the aforementioned mayor's Chief of Police muttered "Whaddyawant, I'm busy here" through a mouthful of donut while pocketing a thickly-stuffed brown paper envelope proffered by an unidentified man flanked by several apparent bodyguards.

  • by ArcadeMan ( 2766669 ) on Friday May 30, 2014 @03:03PM (#47131579)

    I suppose lawyers don't have locks on their homes because there are laws against illegal entry.

  • Identifying the good guys is a question of trust, so you can imagine why lawmakers are hesitant to throw trust around willy-nilly. Building a system that shows how that trust is reciprocated and enforced would be a good start.

    Seems like there could be a law that tries to differentiate "Research Hacking" by setting requirements to qualify as a researcher. They must provide full transparency to prove they have no malicious intent. They inform law enforcement authorities of their activities before and after
    • But how do you trust people? Someone sending you a threatening email saying they found a vulnerability that their consulting company can fix for you is not the sort of person likely to be trusted. Just saying "I'm one of the good guys" isn't good enough, as the mafia uses the same argument when selling protection services.

  • Finding and reporting a vulnerability is one thing; releasing working programs that exploit the vulnerability to the mass public is the main problem. They don't belong in the public domain. If a hairdresser needs to get a license to cut hair, why in hell do we not demand security researchers be licensed as well? The answer: they should be required to get one, and tools to exploit vulnerabilities should only be available to licensed researchers. Stop handing over tools to the criminals and stupid teens. That
    • by arshat ( 3675763 )

      tools to exploit vulnerabilities should only be available to licensed researchers. Stop handing over tools to the criminals and stupid teens. That is IMO

      Fair enough, but it's not particularly achievable, is it? How would you go about stopping people from getting hold of the software or, heaven forbid, from writing their own?

      • by ubrgeek ( 679399 )
        Simple: Classify GCC as a WMD.
      • I don't claim to have all the answers, but common sense says what is happening right now is because of bad researchers: those who give no notice, like the Google employee. If a hairdresser can have the industry licensed, I don't see why researchers can't be also. And companies like Target that don't follow the standard rules must be held accountable as well; I don't think placing all the blame on the researchers is fair at all either. The whole world shouldn't be held hostage by a single unaccountable person as
  • Of course security researchers are being targeted by US cybercrime laws.

    Who do you think they were designed to stop? Security researchers, whistleblowers and anyone who wants to see the nation's security apparatus held accountable were always the intended targets of these laws. So was anyone who believes the Internet should be free and that research which impacts the public welfare should be readily available to all.

    You didn't think these laws were about Estonian hackers, did you?

  • by dave562 ( 969951 ) on Friday May 30, 2014 @04:01PM (#47131983) Journal

    I work for a company that does a lot of forensics work, including collections activities and incident response. The company has to be licensed as a "private investigator" in all of the states that our employees do collections in.

    It seems like a similar licensing regime would be a good place to start for computer security researchers.

    It might also be worth considering making the researchers or their employer carry a bond as collateral against any potential damage that they might inadvertently cause.

    It has been my experience that when people and organizations have something to lose (like forfeiture of a bond or loss of a license / ability to do business), they tend to act in a more predictable manner, and within well established guidelines.

    There might also be some lessons to be learned from maritime law. In a way, researchers are sort of like privateers on the digital oceans. (So yes, once again, pirates ARE better than ninjas. Just in case there was ever any doubt.)

  • by Anonymous Coward

    I'm a student at Naval Postgraduate School, and every single "cyber" security course taught here could be renamed to "How to use Metasploit to [blank]". There are all of half a dozen CS students here who came from any kind of background involving coding, making it necessary to dumb things down to "How to be a script kiddie".

    So the makers of the primary tool taught to service members from all branches (Air Force, Marines, etc all attend there), many of which are absolutely dependent upon it, are als

  • Law enforcement doesn't want researchers uncovering the backdoors they put into consumer products? Or some sleazy manufacturer with defective crap getting a buddy in the FBI to lean on people who might go public?

  • They are not finding vulnerabilities.

    They are finding features.
