Full Disclosure and Why Vendors Hate It

An anonymous reader writes "Well known iPhone hacker Jonathan Zdziarski gave a talk at O'Reilly's Ignite Boston 3 this week in which he called for the iPhone hacking community to embrace full disclosure and stop keeping secrets that were leading to the iPhone's demise. He has followed up with an article about full disclosure and why vendors hate it. He argues that vendor-only disclosure protects the vendors and not the consumer, and that vendors easily abuse this to downplay privacy concerns while continuing to sell insecure products. In contrast, he paints full disclosure as a capitalist means to keep the vendor accountable, and describes how public outcry can be one of the best motivating factors to get a vulnerability addressed."
  • Well of course (Score:4, Insightful)

    by eneville ( 745111 ) on Saturday May 31, 2008 @09:46AM (#23609535) Homepage
    It's pretty obvious since vendors have to do more work and package another release to fix bugs. It's easier to keep this information secret and just bundle all the bug fixes into a bulk package when it suits the vendor (I expect money comes into this equation somewhere).
    • Re: (Score:3, Funny)

      by peragrin ( 659227 )
      That's only if they ever fix it to begin with. Some companies can't actually fix their software, as it is broken to begin with and the security hole is what allows the software to run at all. Years of software developers dealing with MSFT have set this mindset in place.

      Why fix the problem when you can gloss over it with a fresh coat of paint?

      Note: while MSFT is guilty of doing this, third-party developers are the real guilty party.
    • Re:Well of course (Score:5, Insightful)

      by manwal ( 648106 ) on Saturday May 31, 2008 @10:07AM (#23609685)
      It's only about money. With few or no public security flaws/fixes, your company, product and brand look safe. With many, they look dangerous. It doesn't matter that security often works the other way around.
      • Re: (Score:3, Interesting)

        by davester666 ( 731373 )
        It's not just about security. It's also about features. Things like the broadcast flag: the analog flag that NBC accidentally set and that Microsoft had implemented support for, which disabled recording of some shows. Hell, both MS and NBC said it was a mistake that the flag was turned on. But even though there is no legal basis for even noticing that flag, Microsoft did NOT say "we'll update our software to ignore that flag".

        You don't know what agreements have been made between Microsoft, Tivo, other DVR m...
    • Re:Well of course (Score:4, Interesting)

      by Adambomb ( 118938 ) on Saturday May 31, 2008 @10:14AM (#23609729) Journal

      (I expect money comes into this equation somewhere)
      Development costs for the fixes, and effectively retooling costs for the production line. I would expect that making a new master and swapping it up in pressing wouldn't be the big portion of the cost, but it's there.

      Of course companies hate the concept of full disclosure. It would not allow them to set patch timetables based on business needs as opposed to customer needs. But then, I'll never understand why consumers accept the notion that businesses need to keep such secrecy in the name of security through obfuscation, and then smile and nod when things fall apart: "yep, that's dealing with computers for you".

      Why in the hell has this become one of the few fields where it's considered normal to have a broken product? Granted, it's nigh impossible to have a 100% bug-free product, but the standards seem to keep falling and falling.
  • by NDPTAL85 ( 260093 ) on Saturday May 31, 2008 @09:52AM (#23609589)
    This guy really thinks highly of himself. He claims the iPhone's "secrecy" or Apple's inattention to the "privacy flaws" have hurt the product.

    Ridiculous.

    The biggest complaints about the iPhone are the lack of 3G, lack of GPS and no current support for cut and paste or MMS.

    I've never seen anyone anywhere complain that it's insecure and vulnerable to hackers.
    • Re: (Score:3, Informative)

      by JustNilt ( 984644 )

      The biggest complaints about the iPhone are the lack of 3G, lack of GPS and no current support for cut and paste or MMS.

      This is somewhat true. The average consumer simply isn't aware of the security issues with most things they use. It doesn't matter whether it's their phone, their computer or their front door locks. This is actually kind of the guy's point. Companies are able to keep people in the dark at will, generally.

      I've never seen someone anywhere complain that its insecure and vulnerable to hackers.

      That's funny. Here's a link to a Forbes article from last summer regarding a lack of security.
      http://tinyurl.com/2huxru [tinyurl.com]

      Here's another link regarding an actual exploit vector, reported by the N...

      • by samkass ( 174571 )
        What's interesting is most people that actually own an iPhone don't seem to give a rat's ass about security on it.

        Exactly. Which proves this article's premise completely wrong. The only people who ARE interested are the malicious folks, which will be almost your entire "full disclosure" audience. Full disclosure is a great way to give the malicious folks a head start, and won't do one tiny little thing towards linking a product's popularity to its security.

        • by RAMMS+EIN ( 578166 ) on Saturday May 31, 2008 @11:04AM (#23610093) Homepage Journal
          ``Which proves this article's premise completely wrong. The only people who ARE interested are the malicious folks, which will be almost your entire "full disclosure" audience. Full disclosure is a great way to give the malicious folks a head start, and won't do one tiny little thing towards linking a product's popularity to its security.''

          I am offended by your comment. I am in favor of full disclosure, and I am not a black hat. I know there are many people like me.

          Also, your analysis is wrong on both counts. Full disclosure doesn't give anyone a head start. On the contrary, it informs everybody of the flaw at the same time. That does indeed include the black hats, but also the vendor and the users. This allows the black hats to develop exploits, but it also allows the vendor to work on a fix, and the users to implement temporary stopgaps. The alternative is, pretty much, not informing the users of the flaw - thereby leaving them unaware that a vulnerability has been discovered. As for the black hats: they work hard to find security flaws and avoid full disclosure - after all, as long as only they know the flaw exists, they can exploit it for fun and profit.

          With regard to linking a product's popularity to its security: I know of two things that will do that. The first is users feeling victimized by the bad security of the product they have. The other is making actual and potential users aware of the security risks of a product. Full disclosure brings the insecurity of a product out into the open, which is a step towards the latter and can also help with the former. Of course, the effect is going to be rather limited as long as users don't care very much, but I can tell you that the effect is there.
          • by ibbie ( 647332 )

            On the contrary, it informs everybody of the flaw at the same time. That does indeed include the black hats, but also the vendor and the users.

            I have to agree, for the same reason I'd rather know if a product I buy might set my house on fire. Or, perhaps a better analogy, if the product might allow someone else to set my house on fire.

            Knowing a product has a security flaw might prevent me from purchasing it, yes, but only if there isn't a reliable way to prevent the flaw from being exploited until the vendor can release a patch.

          • Re: (Score:3, Insightful)

            by Kjella ( 173770 )

            This allows the black hats to develop exploits, but it also allows the vendor to work on a fix, and the users to implement temporary stopgaps. The alternative is, pretty much, not informing the users of the flaw - thereby leaving them unaware that a vulnerability has been discovered. As for the black hats: they work hard to find security flaws and avoid full disclosure - after all, as long as only they know the flaw exists, they can exploit it for fun and profit.

            To take the last sentence first: I don't see your point, as obviously black hats would not disclose anything to anybody; the question would be whether black hats prefer white hats to do vendor disclosure or full disclosure, and then to do the opposite of what the black hats prefer. The real question is, how often can users apply any meaningful stopgaps? Let's say, for example, there's a spoofing bug in Firefox (or IE) and a parsing bug in OpenOffice (or MS Office), and there's no meaningful fix except "don't use the product". Well, people aren't going to sto...

          • by samkass ( 174571 )
            I am offended by your comment. I am in favor of full disclosure, and I am not a black hat. I know there are many people like me.

            You need to grow a thicker skin. I'm sorry you're offended, but any reasonable person should realize my comment was not a personal attack, but an observation about the tendencies of the market. Unless you consider yourself as an individual an entire "market segment", my point that full disclosure will do nothing to tie product popularity to security stands. As does the assertion...
      • His point is wrong, though. It's usually not the companies that keep people in the dark, but people's own lack of care.

        I'm not saying iPhone exploits don't exist; I'm just saying they're not a big deal. For an exploit finder, every exploit is a big deal. The thing is, most people don't spend their days trying to crack their phones. And every device has exploits: BlackBerries, Windows Mobile, Symbian, Palm OS, Linux-based phones... etc. No device is immune.

        Ease of use trumps security. Windows XP should prove th...
      • I very much doubt anyone on Slashdot is stupid enough to click on tinyurl links. Perhaps you should consider learning how to make links [w3.org].
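        (An aside for the tinyurl-averse: a minimal, stdlib-only Python sketch, assuming the shortener answers a HEAD request with a 30x redirect and a Location header, that reveals where a short link points without following it. The URL is the one from the parent comment.)

            # Ask the shortener for its Location header without following the
            # redirect, so the destination is visible before anyone clicks.
            import http.client
            from urllib.parse import urlparse

            def peek_redirect(short_url: str) -> str:
                """Return the redirect target of a shortened URL without visiting it."""
                parts = urlparse(short_url)
                conn = http.client.HTTPConnection(parts.netloc, timeout=10)
                try:
                    conn.request("HEAD", parts.path or "/")
                    return conn.getresponse().getheader("Location", "(no redirect)")
                finally:
                    conn.close()

            print(peek_redirect("http://tinyurl.com/2huxru"))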
    • by thegnu ( 557446 )

      This guy really thinks highly of himself. He claims the iPhone's "secrecy" or Apple's inattention to the "privacy flaws" have hurt the product.

      Ridiculous.

      The biggest complaints about the iPhone are the lack of 3G, lack of GPS and no current support for cut and paste or MMS.

      So you're saying that since the biggest problems aren't the closed platform and security issues, they haven't hurt the product. Good use of the argument "A is true, therefore R".

      And anyway, about cut and paste, if the platform weren't locked down (going off-topic a bit), there would've been a third party cut and paste app within a couple weeks of release for $10. Within a couple months, there would've been 20 free options.

      Not the biggest problem with the iPhone, but do we need to approach problems in...

    • This guy really thinks highly of himself. He claims the iPhone's "secrecy" or Apple's inattention to the "privacy flaws" have hurt the product.
      I am with you on your opinions, my friend. According to this fellow's logic, T-Mobile and RIM would have been gone long ago. So would Microsoft.

      If I took investment advice from hackers, I would be one broke Piggy.
  • Peaks (Score:5, Funny)

    by Gracenotes ( 1001843 ) <wikigracenotes@gma i l . com> on Saturday May 31, 2008 @10:04AM (#23609653)

    One of Apple's greatest marketing strengths is this ability to add hype around their products by peaking the curiosity of the common geek.
    As an aforementioned common geek, the misspelling in this sentence is enough to put me in a peak!
    • That's the funniest post I've read in a week, as I write this on my desk made of teak
      • Re: (Score:3, Funny)

        by Dun Malg ( 230075 )

        That's the funniest post I've read in a week, as I write this on my desk made of teak
        Isn't that spelled "tique"?

        yeah, I think the joke is probably dead...
        • by thegnu ( 557446 )

          That's the funniest post I've read in a week, as I write this on my desk made of teak
          Isn't that spelled "tique"?

          yeah, I think the joke is probably dead...
          It's only dead in your hea...no, no...you're right.
  • by unixan ( 800014 ) on Saturday May 31, 2008 @10:05AM (#23609657)
    I work for a vendor and so I get to see the view from the inside out on this.

    Most times, when a vulnerability is discovered by a professional security group or an upstream vendor, they both tell us what it is, and propose an "embargo" date for when they plan to make it public.

    This gives vendors time to react properly but still serves the public with disclosure.
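    To make the mechanics concrete, here's a minimal sketch (hypothetical Python; the policy shown is illustrative, not any particular vendor's actual process): the advisory goes public at the earlier of "fix shipped" or "embargo expired".

        # Coordinated ("embargoed") disclosure timing, reduced to its core rule.
        from datetime import date
        from typing import Optional

        def publish_date(embargo: date, fix_shipped: Optional[date]) -> date:
            """The advisory goes public when the fix ships or when the
            proposed embargo expires, whichever comes first."""
            if fix_shipped is not None and fix_shipped < embargo:
                return fix_shipped
            return embargo

        # Example: embargo proposed for May 22; the fix ships May 15,
        # so the advisory (and the fix) go out together on May 15.
        print(publish_date(date(2008, 5, 22), date(2008, 5, 15)))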
    • by Anonymous Coward
      Say a week elapses between the reporting of a vulnerability and the passing of the embargo. That's another week that software I use is vulnerable without my knowledge. If I knew there was a problem, I could take appropriate precautions while waiting for a fix, but if I don't know what's going on, I'm the proverbial sitting duck. As a consumer, I demand full disclosure, and not only that, I demand to get it as soon as possible.
      • You don't actually need full disclosure for this. You can simply publicly disclose that you have found one arbitrary code execution vulnerability in product X. Either the vendor confirms it and specifies a date when they will release a fix, or you disclose the nature of the vulnerability publicly.
    • by Zoop ( 59907 ) on Saturday May 31, 2008 @05:34PM (#23612945)
      As someone who manages an open source product, I only find out about vulnerabilities (despite ample ways for the "researchers" to contact me) because I have Google Alerts set for our product's name. I have never, not once, been contacted by the discoverer of a vulnerability or by the security groups who publicize exploits.

      This has left me with a very dim view of the security community, and I sincerely doubt the earnestness of the discoverers. They act more like script kiddies out to tag something with their graffiti than like someone concerned about the consumer.

      Maybe for Apple there are more concerned people out there, but I don't have Apple's resources and would appreciate a couple of weeks to get a fix in and tested before you expose my users to more black hats (as opposed to the black hats who knew about it before).

      I WANT TO KNOW. I WANT TO FIX IT. But the experience I've had so far is that I care more about my users than the security companies and script kiddies masquerading as "researchers" do.
      • by michield ( 71625 )
        My experience is exactly the opposite. I've been contacted several times by security researchers who warned me about issues (in my open source project) and gave me sufficient time to fix them and release a new version before making the information public.

  • by fermion ( 181285 ) on Saturday May 31, 2008 @10:14AM (#23609733) Homepage Journal
    Crypto-Gram has a discussion of this issue in relation to the oft-used argument that only people who are committing crimes should be afraid of full disclosure. The issue in the note, IIRC, related to data mining and video surveillance. The counterexample to the statement was the police's apparent unwillingness to give tapes of traffic stops, for example, to the private parties involved. It seems that the tapes are there to protect the cops, which is good, but no one is willing to protect the citizen. We see this even in the taping of the very occasional police overreaction.

    Almost no one is comfortable with full disclosure, and the ultimate arrogance and hypocrisy is demanding it of others while fabricating excuses for why you yourself cannot comply. We see this in the current US presidential campaign, where it is typical to release tax returns, but some people feel too far above everyone else to do so. This includes other cases where persons who, like the police, are paid by the American taxpayer yet refuse to fully account for their work hours to the American taxpayer. The examples, private and public, are endless.

    So why would geeks, even those that never put on a tinfoil hat, demand full disclosure, especially in a marketplace where we have the option to simply not spend the money? In this case, if there are significant security issues with the iPhone, don't buy one. It sounds trite, and everyone always complains about the philosophy, but it works. MS is a target for viruses, even if it's not inherently less secure, so I don't use it on a regular basis. SUVs are less safe, as they are not inherently stuck to the ground through their tires' contact patches and require computer intervention to keep them from tipping over, so I don't buy them. I don't shop at stores with affinity cards. If the iPhone is an attack against security, buy something else.

    Back to the issue of security: there is one serious misconception that I believe many people make. Just because one does not publish one's security details on the internet does not mean that one is practicing security by obscurity. Just because I do not publish on the net my route to work, my schedule, and the times and places where my stuff is most vulnerable to theft does not mean I practice security by obscurity or have an ideological hatred of full disclosure. And giving a vendor time to fix an issue, even if everyone except the average consumer knows about it, is not unreasonable. If the vendor does nothing about it in a fairly short time frame, then the equation shifts.

    Which is why the most secure system may be open source. If something is discovered, then a slightly above-average user may be able to fix it, and no one has to wait on the vendor. But open source solutions do not seem to have traction in the marketplace, so we are where we are.

    • So why would geeks, even those that never put on a tinfoil hat, demand full disclosure, especially in a marketplace where we have the option to simply not spend the money? In this case, if there are significant security issues with the iPhone, don't buy one. It sounds trite, and everyone always complains about the philosophy, but it works.

      1. Without full disclosure, your options for determining whether there are security risks preventing you from buying one are more limited than they would otherwise be.

      2. If you mean "don't buy one if you think lack of full disclosure is itself a problem" then that seems equivalent to demanding full disclosure.

      • by Kjella ( 173770 )

        1. Without full disclosure, your options for determining whether there are security risks preventing you from buying one are more limited than they would otherwise be.

        To some degree; on the other hand, disclosure notices can have their own spin. Actually exploited vulnerabilities in the wild aren't such a bad metric, and they're much harder to hide... Just to throw one monkey wrench out there: Debian experimental, say, is more like a development snapshot than anything else. Is it fair to count one exploit made, discovered, and fixed there the same as one exploit in Windows? Or would it be fairer to compare Debian stable to Windows? In the end, it's not going to be perfect no matter how you do the math.

    • Re: (Score:2, Insightful)

      by risinganger ( 586395 )

      So why would geeks, even those that never put on a tinfoil hat, demand full disclosure, especially in a marketplace where we have the option to simply not spend the money? In this case, if there are significant security issues with the iPhone, don't buy one.

      Without disclosure, how will you know if there are significant security issues? The author wants disclosure so consumers can say "hey, your product is insecure, so I'm taking my money elsewhere".

      And giving a vendor time to fix an issue, even if everyone except the average consumer knows about it, is not unreasonable. If the vendor does nothing about it in a fairly short time frame, then the equation shifts.

      Why shouldn't the consumer be allowed the choice of continuing to use (or not) an insecure product while waiting for a patch? Take the recent Flash vulnerability: I'd much rather know straight away, so as not to leave myself at risk while they work on a patch, than discover it after my machine has been compromised. Without...

    • And giving a vendor time to fix an issue, even if everyone except the average consumer knows about it, is not unreasonable.

      TFA is not saying that "giving a vendor time to fix an issue" is unreasonable, merely that it does not produce the best results for everyone.

      Which is why the most secure system may be open source. If something is discovered, then a slightly above-average user may be able to fix it, and no one has to wait on the vendor.

      OSS is not more secure because some "above average user" can fix it; it is supposedly more secure because many people have vetted it.

      Even that has been shown to be the exception and not the rule:
      decade-old bugs pop up in all kinds of 'mature' OSS software.

      Security is a moving target.

      • TFA is not saying that "giving a vendor time to fix an issue" is unreasonable

        The vendor had plenty of time to fix the issue when they wrote the software.

        And again when they tested it.

        And again when they did their pre-release security audit of the software.

        So why, after the software has been released and widely installed for years, is it necessary to start another arbitrary clock for them to make it secure? Their patch is already overdue by time()-releasedate, and they need to explain their previous installat...
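        Taking the parent's arithmetic literally (hypothetical Python; the release date is just an example):

            # By this reasoning, a patch is already "overdue" by the product's
            # entire time in the field when the flaw is finally reported.
            import time

            releasedate = time.mktime((2007, 6, 29, 0, 0, 0, 0, 0, -1))  # e.g. iPhone launch
            overdue_days = (time.time() - releasedate) / 86400
            print(f"patch overdue by {overdue_days:.0f} days")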
  • You need to give responsible vendors a reasonable period of time to get a fix out. Defining "reasonable" isn't always easy, but if the vulnerability truly is known only by the discoverer, a good time window will be a few hours to a few weeks, depending on the complexity of the fix, the damage that can be caused, and the risk of independent discovery. (A toy sketch of weighing those factors follows this comment.)

    For vendors who have proven themselves irresponsible by not delivering fixes in a timely manner, there's no point in waiting; just publish it and let market for...
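    A toy sketch of weighing those three factors (hypothetical Python; the weights are invented, not any published disclosure policy):

        # Shrink the disclosure window as damage potential and the risk of
        # independent rediscovery grow; stretch it for genuinely hard fixes.
        def disclosure_window_days(fix_complexity: float,
                                   damage: float,
                                   rediscovery_risk: float) -> float:
            """All inputs are rough scores in [0, 1]. Returns days until
            publication, clamped between a few hours and a few weeks."""
            window = 7.0 * (0.5 + fix_complexity)       # harder fix -> more time
            window /= (1.0 + 3.0 * damage)              # worse impact -> less time
            window /= (1.0 + 3.0 * rediscovery_risk)    # rediscovery likely -> less time
            return min(21.0, max(0.2, window))          # clamp: ~5 hours to 3 weeks

        # Example: easy fix, severe damage, high chance someone else finds it.
        print(disclosure_window_days(0.2, 0.9, 0.8))    # ~0.4 days, i.e. hours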
  • by Animaether ( 411575 ) on Saturday May 31, 2008 @10:53AM (#23609991) Journal
    Full Disclosure is great - but inform the vendor first. If they don't take any action in, say, 3 days (I've used that number before; I'm sticking with it) to alleviate it, then hit the internets with it.

    But too often these types are calling for Full Disclosure - immediately! Don't even bother to inform the vendor! RAR! Cry havoc, and let loose the scriptkiddies!

    "The bad guy is already going to test and exploit these vulnerabilities long before the public even discovers them - the good guys ought to have a crack at verifying it too."
    That is an assumption: the assumption that the bad guys knew about the vulnerability -before- the "public discoverer" went with full disclosure, plus the assumption that the bad guys' work would be as bad as, or worse than, what script kiddies would do in the time between your discovery and your disclosure. I don't think those are assumptions that can be made, based on (admittedly anecdotal) evidence: crashing mIRC 6.something users' IRC application on large IRC networks using a malformed DCC command only became a problem once it was disclosed and everybody and their dog started doing it, while the developer was already in the process of fixing it.

    There's a middle ground - I put it at 3 days. Where do you put it, Jonathan Zdziarski? Your article seems to indicate "0 day", but I can't imagine you being that irresponsible.
    • This is a pointless line-in-the-sand to draw.

      When a malicious party finds out about a hole, there is no 3-day grace period. Vendors want to treat security in a casual manner, and that is truly irresponsible.

      There is no good reason to put anyone above the public.
      • I'd rather they took their time beyond 3 days and fixed it properly than broke functionality or introduced another security issue because of a mad rush. I'm just trying to be realistic.
  • by plasticsquirrel ( 637166 ) on Saturday May 31, 2008 @11:08AM (#23610121)
    The issue that he raises is a flaw in capitalism, not something specific to this case. Capitalism assumes that consumers have accurate information about their purchases. Making this information readily available is not encouraging capitalism, but rather trying to paper over the fact that the flaw exists.

    If anything, this has the trappings of libertarian or democratic socialism. The idea of democracy taking a role in putting moral standards on powerful economic institutions is not traditionally capitalist.
    • The idea of democracy taking a role in putting moral standards on powerful economic institutions is not traditionally capitalist.

      What are you talking about?

      We have all sorts of laws in place for consumer protection which place restrictions and obligations on companies. Surely by your definition there are no capitalist countries.

      Forcing companies into full disclosure would merely be another level of consumer protection/empowerment, regardless of whether you agree with it or not.

    • Capitalism doesn't assume anything. It's the one economic system that does the right thing no matter how many people fuck up.
  • When it comes to vulnerabilities, the vendors only care insofar as they cause a PR problem. However, I don't believe that these "security experts" are crusading for consumers either. They tell the company that there is a problem, but if the company doesn't decide that the flaw "they" found is the most important thing, then the "security expert" throws a hissy fit and tells everyone about it in revenge. The "security expert" appears to me to be hoping someone will develop some malware that utili...
  • Verizon's open platform is going to have more full disclosure.
  • He argues that without full disclosure, when you notify only the vendors, the bad guys will "probably" know about it too. If that were true across the board, you wouldn't have people reverse engineering the monthly Windows patches to figure out what was patched. There are many crackers doing this every time Microsoft comes out with new updates, and they then use that information to exploit people who haven't patched. (A toy illustration of that first step follows this comment.)

    If the exploits were fully disclosed instead, then most likely there would be even more...
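    A toy illustration of the first step of that reverse engineering (hypothetical Python; real attackers diff disassembly with tools like BinDiff, not raw bytes, but the principle is the same):

        # Locate which byte ranges a patch touched by comparing the
        # pre-patch and post-patch files offset by offset.
        from itertools import zip_longest

        def changed_offsets(old: bytes, new: bytes):
            """Yield offsets where the two files differ (None = past EOF)."""
            for i, (a, b) in enumerate(zip_longest(old, new)):
                if a != b:
                    yield i

        # Hypothetical usage with pre- and post-patch copies of a binary:
        # old = open("app_v1.dll", "rb").read()
        # new = open("app_v1_patched.dll", "rb").read()
        # print(list(changed_offsets(old, new))[:10])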
  • Vendor first doesn't mean "vendor only", and nobody says you need to sit on a flaw forever if the vendor doesn't fix it. You're giving them advance notice, not carte blanche.
