Security Government Your Rights Online

Dutch Gov't Offers Guidance For Responsible Disclosure Practices 37

An anonymous reader sends this quote from an IDG News report: "The Dutch government's cyber security center has published guidelines (in Dutch) that it hopes will encourage ethical hackers to disclose security vulnerabilities in a responsible way. The person who discovers the vulnerability should report it directly and as soon as possible to the owner of the system in a confidential manner, so the leak cannot be abused by others. Furthermore, the ethical hacker will not use social engineering techniques, nor install a backdoor or copy, modify or delete data from the system, the NCSC specified. Alternatively a hacker could make a directory listing in the system, the guidelines said. Hackers should also refrain from altering the system and not repeatedly access the system. Using brute-force techniques to access a system is also discouraged, the NCSC said. The ethical hacker further has to agree that vulnerabilities will only be disclosed after they are fixed and only with consent of the involved organization. The parties can also decide to inform the broader IT community if the vulnerability is new or it is suspected that more systems have the same vulnerability, the NCSC said."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by terec ( 2797475 ) on Friday January 04, 2013 @02:02PM (#42478395)

    "Responsible disclosure" is nice. But what about holding banks and businesses responsible for the harm they are causing when their security practices fail? What about the worry and wasted time they cause to customers? What about compensating the victims of identity theft due to sloppy security practices? Businesses seem to be able to screw up arbitrarily without a lot of consequences right now.

    • From TFS:

      The ethical hacker further has to agree that vulnerabilities will only be disclosed after they are fixed and only with consent of the involved organization.

      This sounds like NEVER for the disclosure date, because either (i) the vulnerability won't get fixed if it's not known to the public, or (ii) even if it does get fixed, why should a company agree to expose the fact that it screwed up?

    • On the flip side of that, if a hacker is caught penetrating a system, they can go to jail. There was a story on here a little while ago about a guy who did just what these guidelines describe and got charges pressed against him by the company whose systems he penetrated. Best bet is to publish the vulnerability on forums anonymously and let it circulate until somebody from the organization catches wind.
  • Been Done (Score:5, Insightful)

    by shawnhcorey ( 1315781 ) on Friday January 04, 2013 @02:07PM (#42478455) Homepage
    "Responsible disclosure" means "We don't want to bother fixing it." If the vulnerability is not make public, it is never fixed. This has been done many times before. The only way to get them fixed is to make them public.
    • Re:Been Done (Score:5, Informative)

      by sofar ( 317980 ) on Friday January 04, 2013 @02:29PM (#42478745) Homepage

      Being a native Dutch speaker, I read the entire guidelines in Dutch. They include disclosure terms that encourage companies to fix issues rapidly (within 60 days) and to make agreements with the discloser about the disclosure.

      This is already common and rather well-accepted practice. So, in essence, the document encourages public disclosure. Any company that ignores a vulnerability will have its ass handed to it anyway, so this guideline actually helps: security researchers can use it to show companies that they are acting in good faith, as long as companies play by the same rules.

      So personally, I highly encourage governments to do something like this.

      This Dutch variant is interesting in that it creates a possible middleman who can mediate and monitor the disclosure. This protects disclosers and puts more pressure on companies to abide by these standards, not the other way around.

      • by CBravo ( 35450 )
        In four years, only once has a security researcher contacted me about a problem. I really welcome that. On the other hand, I've caught dozens of wannabes, customers, and pros who didn't even bother to contact us before scanning our web application (which is not allowed in Holland). A request to our system is either valid or you are trying to do bad ****
  • by Tator Tot ( 1324235 ) on Friday January 04, 2013 @02:08PM (#42478461)

    There are only two things I hate in this world:

    People who are intolerant of other people's cultures... and the Dutch.

    • by sofar ( 317980 )

      Two thoughts on your message:

      1) you must hate yourself.

      2) the Dutch will still love you. :^D

      • Wooooossssshhhhh

        http://www.imdb.com/title/tt0295178/quotes [imdb.com]

        Nigel Powers: All right Goldmember. Don't play the laughing boy. There's only two things I hate in this world. People who are intolerant of other people's cultures and the Dutch.
        Goldmember: What? Take the fahza away! Dutch hater! And now, it is time to say goodbye. Dr. Evil's orders. Which, for you, is bad news bears,
        [talks in a deep voice]
        Goldmember: Walter Matthau.

  • by the_B0fh ( 208483 ) on Friday January 04, 2013 @02:08PM (#42478469) Homepage

    Seriously? Who's going to consent?

    Also, where's the responsibility on the part of the organization to show that they *HAVE* a secure coding practice, that they don't simply outsource to $2 coders, and that they have a program in place to review security issues?

    • by sofar ( 317980 )

      The documents create a neutral middle-man organization that can mediate between companies refusing to cooperate and disclosers. It effectively puts irresponsible companies directly in the line of sight of the government and thus legal action. What's not to like?

      • by Len ( 89493 )
        What's not to like? How about a government agency enforcing a company's decision to conceal and not fix vulnerabilities? As far as I can see, there's no requirement that a company must agree to disclose at all; and white-hats who don't follow the guidelines are not offered any legal protection.

        We've seen time and again what happens when "responsible disclosure" is abused to allow security holes to go unfixed and exploited. This is a big step backwards.

        • by sofar ( 317980 )

          The problem is that enforcing public disclosure by the organization itself is equivalent to self-incrimination. Think about that for a second. Do you really want to put that in law? In the US, it would be thrown out immediately as unconstitutional.

          • The 5th Amendment (which, for those who may not know, is the US Constitutional amendment forbidding the government from forcing individuals to self-incriminate) only applies to people.

            Oh, wait. Corporations are "people" now, too. It's official. The world is insane. I think I shall have to build an asylum to keep it in and prevent it from hurting itself.

            How do I change my /. username to Wonko?

    • by Anonymous Coward

      All big Dutch telcos already have such a policy in place. I've spoken to quite a few organisations that are interested in adopting a responsible disclosure policy, including banks and a security company. In my experience, most companies that value security want this.

  • by Todd Knarr ( 15451 ) on Friday January 04, 2013 @02:58PM (#42479171) Homepage

    The only way I'd accept the "only disclose to the owner" condition is if it included a time limit within which the owner must either fix the vulnerability or disclose the vulnerability to the public, and if the owner fails to meet the limit the confidentiality is lifted and the hacker is free to disclose the vulnerability himself. If software makers want their mistakes kept confidential then the cost is a binding commitment to fixing those mistakes, and the penalty for failing to meet their commitment is that the hackers are freed from theirs.

  • by wabrandsma ( 2551008 ) on Friday January 04, 2013 @03:49PM (#42479857)
    The problem is that the definition of hacking is overly broad. If you enter a URL in the address bar and change just a serial number in it, that is considered hacking. Like finding Queen Beatrix's Christmas speech before it was officially published http://www.nrc.nl/nieuws/2012/12/25/hacker-kersttoespraak-van-geen-kwaad-bewust-tijdens-strafbare-actie/ [www.nrc.nl] (in Dutch). Or proving access to medical files by MP Henk Krol http://nos.nl/artikel/447718-krol-vervolgd-om-hacken-dossiers.html [nos.nl] (in Dutch).
    IT journalist Brenno de Winter calls the guidance useless. "If hackers first have to report the vulnerability, they lose their anonymity without having a guarantee that they will not be prosecuted. And even if a company promises that it will not press charges, the Public Prosecutions Department can start a case." Link here: http://www.trouw.nl/tr/nl/5133/Media-technologie/article/detail/3372108/2013/01/04/Richtlijn-ethisch-hacken-lost-niets-op.dhtml [trouw.nl] (in Dutch).
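    For what it's worth, the "change just a serial number in the URL" scenario is what security folks call an insecure direct object reference: the server trusts the numeric ID in the URL instead of checking whether you're authorized to see that resource. A minimal sketch of the idea (the domain and path are made up for illustration, and no actual requests are made):

    ```python
    # Hypothetical sketch of tampering with a predictable numeric ID in a URL.
    # The URL is invented; this only manipulates the string, it performs no
    # network access. If the server serves whatever ID you ask for without an
    # authorization check, merely incrementing the number exposes other records.

    def next_resource_url(url: str) -> str:
        """Return the same URL with its trailing numeric ID incremented by one."""
        prefix, _, ident = url.rpartition("/")
        return f"{prefix}/{int(ident) + 1}"

    url = "https://example.nl/speeches/2012"
    print(next_resource_url(url))  # https://example.nl/speeches/2013
    ```

    Under the broad Dutch definition described above, even typing that second URL by hand can count as computer intrusion, which is exactly the legal exposure the linked cases illustrate.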
    • The problem is that the definition for hacking is overly broad.

      It is clearly advised by the published guidelines that an organisation should define for themselves what they consider acceptable and what's not acceptable. An organisation might, for example, rule out social engineering attacks or DDoS.

      IT journalist Brenno de Winter calls the guidance useless. "If hackers first have to report the vulnerability, they lose their anonymity without having a guarantee that they will not be prosecuted. And even if a company promises that it will not press charges, the Public Prosecutions Department can start a case."

      A published responsible disclosure policy is a legally binding document. If an organisation states that it finds certain behavior acceptable, and even clearly states that it won't take legal action against people who hold themselves to that document, it has to follow that promise.

  • by Anonymous Coward

    That's the only thing that really counts, and it's missing. The Public Prosecutions Department has absolutely no obligation to follow these guidelines.

  • by Anonymous Coward

    Companies will never give consent, and will only use that to hide from customers how shitty their software is! In other words: FRAUD!

    And I want to know that there is a security hole, and close it, *right when it is found*! Not half a year later, when the fix is out, after some Russian kid used it for his botnet for *months*! That is *deliberately* endangering me! Another CRIME. (And on top of it, assistance in the botnet operator's crimes!)

    NOT ACCEPTABLE! From the security standpoint of the customer.

  • The article lists a number of actions that the hacker shall not do. Most are to be expected, such as not modifying the system, not bringing it down, not exposing private information. The first and last points in the list are strange though:

    • Not using social engineering to gain access
    • Not using brute forcing to gain access

    Eh? Why are these not valid attack vectors?
