Crime Security IT Technology

New Attack Steals SSNs, E-mail Addresses, and More From HTTPS Pages (arstechnica.com)

Security researchers at KU Leuven have discovered an attack technique, dubbed HEIST (HTTP Encrypted Information can be Stolen Through TCP-Windows), which can exploit an encrypted website using only a JavaScript file hidden in a maliciously crafted ad or page. ArsTechnica reports: Once attackers know the size of an encrypted response, they are free to use one of two previously devised exploits to ferret out the plaintext contained inside it. Both the BREACH and the CRIME exploits are able to decrypt payloads by manipulating the file compression that sites use to make pages load more quickly. HEIST will be demonstrated for the first time on Wednesday at the Black Hat security conference in Las Vegas. "HEIST makes a number of attacks much easier to execute," Tom Van Goethem, one of the researchers who devised the technique, told Ars. "Before, the attacker needed to be in a Man-in-the-Middle position to perform attacks such as CRIME and BREACH. Now, by simply visiting a website owned by a malicious party, you are placing your online security at risk." Using HEIST in combination with BREACH allows attackers to pluck out and decrypt e-mail addresses, social security numbers, and other small pieces of data included in an encrypted response. BREACH achieves this feat by including intelligent guesses -- say, @gmail.com, in the case of an e-mail address -- in an HTTPS request that gets echoed in the response. Because the compression used by just about every website works by eliminating repetitions of text strings, correct guesses result in no appreciable increase in data size while incorrect guesses cause the response to grow larger.
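The size oracle at the heart of BREACH is easy to sketch. The toy script below (the page layout, secret, and guesses are all invented for illustration) shows that when an attacker's guess echoed into a compressed response matches a secret elsewhere in the same response, DEFLATE encodes the repeat as a back-reference and the output barely grows:

```python
import zlib

SECRET = "email=alice@gmail.com"  # hypothetical secret reflected in the page

def response_size(guess: str) -> int:
    # The server echoes the attacker-controlled guess into the same page
    # as the secret, then DEFLATE-compresses the whole thing.
    page = f"<html>search={guess} ... {SECRET}</html>".encode()
    return len(zlib.compress(page))

# Equal-length guesses, so any size difference comes from compression alone:
correct = response_size("email=alice@gmail.com")
wrong = response_size("email=bob@example.org")
print(correct < wrong)
```

An attacker who can only observe response sizes (which is exactly what HEIST recovers from inside the browser) can therefore tell right guesses from wrong ones without ever decrypting anything.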
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward

    Keep getting more stupid by the hour

    • by Maritz ( 1829006 )
      I'll just pre-empt the next few here... THIEF / 5TEAL/ T4KEY / O0-I-WANT / GIMME-MINE /
      • by Anonymous Coward

        You forgot the new defense protocol: N4CH0 K3YZ.

  • so once again... (Score:4, Insightful)

    by Anonymous Coward on Thursday August 04, 2016 @11:15AM (#52644843)

    using only a JavaScript file hidden in a maliciously crafted ad or page

    So we learn for the 1940390155th time that if you let a remote site run arbitrary scripts on your machine, that remote site might do things that are not in your best interest. Surprise surprise.

    Look: we get a constant stream of these things, at least one or two per week, literally for over 10 years. They're all the same. "Run javascript, get pwned". If you care AT ALL about security, you need to block javascript by default and white-list a few sites you care about, like your bank.

    If you are still running javascript by default, in 2016, that's on you. You've had over a decade to learn your lesson. This is like someone walking through the worst part of town at 3am flashing jewels and expensive watches. Then they get mugged. Is it the mugger's fault? Yeah, of course it is. But the person doing this is still a bloody idiot, especially after it happens for the 10th time, and then the 20th, and the 50th, and the 100th. Eventually, they need to learn from experience.

    Whitelist selected javascript, and disable everything else. It's time. Letting random sites run code was a bad idea, just like ActiveX was. The internet is not your friend. Random domains are not trustworthy. Stop letting them run code in your browser. Ignorance stops being a reasonable excuse after endless repetitions of "See this incredible new exploit(*)! (*) that requires you let the attacker run code on your computer."

    • by Anonymous Coward

      I'm not so sure. It is easy for 99.9% of the population to understand why they got mugged. It is not as easy for people to understand why their information was stolen. There isn't a guy walking up with "javascript" written on his shirt that takes your information and then you go "ah ha - I better not allow javascript around me".

      Not everyone reads this site (in fact, most people don't) so while you think everyone knows about this, they don't. Most people couldn't tell you what javascript is.

      • by Anonymous Coward

        Maybe so. But that is why we need some level of basic technical literacy, just like we expect people to have some basic awareness of geography or their own health. It does not mean they have to be hardcore technical experts, but having at least some notion of what's going on will lead to them faring better than pure ignorance alone will.

        Rather than teaching rote memorization of how to use Microsoft Word in schools, we should be teaching concepts and fundamentals, so people can make educated choices in th

        • just like we expect people to have some basic awareness of geography

          Since when do we expect that? Plenty of surveys have found that random Americans can't even point out Texas on a map of the US.

          http://news.nationalgeographic... [nationalgeographic.com]

          http://www.salon.com/2007/08/3... [salon.com]

          According to this Salon article from 2007, only 94% of young Americans could even find America on a map! That means 6% of our young population can't! And 12% can't figure out where Mexico is! That means, if you go to a place where 18-20 year-old

          • by locotx ( 559059 )
            Correct, and stupid, uninformed people are usually the ones who are the victims, right?
        • by EvilSS ( 557649 )

          ... just like we expect people to have some basic awareness of geography or their own health.

          Man, you better aim WAY higher than that for your baseline.

      • by locotx ( 559059 )
        So if you replace "javascript" with "black person" then I see your point . . .
    • by Frosty Piss ( 770223 ) * on Thursday August 04, 2016 @11:58AM (#52645253)

      "Run javascript, get pwned". If you care AT ALL about security, you need to block javascript by default and white-list a few sites you care about, like your bank.

      I understand, really I do. But for most people this approach is not practical because 99.999 percent of the web sites out there that most people visit use JavaScript for functionality.

      • by Threni ( 635302 )

        Sure, and some people are too lazy to lock their doors and shut their windows, and those people are going to be spending more time on average shopping for replacement phones and TVs. You can't save people from themselves.

      • So here's a compromise: block only third-party domains -- not just javascript, everything (advertising, tracking, etc). With an extension like RequestPolicy, a site that is stored on multiple servers might require an action to be usable, but it's pretty obvious what to allow. And you need to whitelist requests only on the first visit.

        • It's not pretty obvious; it's not obvious at all. For example, I just logged in to comment and realized it was time I changed my password.

          I use uMatrix and (kind of) know what I'm doing. I ended up allowing all third-party domains temporarily just to change my password. Go take a look yourself at the ridiculous number of domains NOT named slashdot.org that slashdot.org uses.
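          For what it's worth, the "block third parties by default, allow as needed" policy described above can be expressed in uMatrix's rule syntax roughly like this (the hostnames are just examples, and the exact syntax may vary between versions, so treat it as a sketch rather than a recipe):

          ```
          * * * block
          * 1st-party * allow
          slashdot.org fsdn.com * allow
          ```

          The first line denies everything, the second re-allows first-party requests, and the last grants one site access to one specific third-party domain.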

      • by antdude ( 79039 )

        Yeah, and there are users who go to various web sites often. :(

    • by WaffleMonster ( 969671 ) on Thursday August 04, 2016 @12:11PM (#52645359)

      So we learn for the 1940390155th time that if you let a remote site run arbitrary scripts on your machine, that remote site might do things that are not in your best interest. Surprise surprise.

      Look: we get a constant stream of these things, at least one or two per week, literally for over 10 years. They're all the same. "Run javascript, get pwned". If you care AT ALL about security, you need to block javascript by default and white-list a few sites you care about, like your bank.

      If you are still running javascript by default, in 2016, that's on you. You've had over a decade to learn your lesson.

      No, we learned that compression is vulnerable to side-channel attacks, something we all knew, and nothing more.

      Your view is strange given the unfortunate reality that many top sites employ CDNs to pipe out all manner of JavaScript frameworks and half the content of their sites. What you are essentially advocating is a nonstarter: NOTHING works without JavaScript today, and expecting people to make judgments about the validity of specific script files is equally unrealistic.

      There are persistent streams of JavaScript, browser, style-sheet, and operating-system implementation bugs which regularly require attention to prevent exploitation. It is easy to pull the plug and declare all security problems solved, yet this course of action does not actually help anyone.

      • Your view is strange given the unfortunate reality that many top sites employ CDNs to pipe out all manner of JavaScript frameworks and half the content of their sites. What you are essentially advocating is a nonstarter: NOTHING works without JavaScript today, and expecting people to make judgments about the validity of specific script files is equally unrealistic.

        After whitelisting sites like /. and my bank I find 90% of the net works just fine. When I go to a site and nothing happens I can at least look at the URL, take into account where the link came from, and decide whether or not I want to let the page run javascript temporarily. Although I don't keep track, I'd guess 90% don't get to run JS. Sites like time.com will likely get it, those like joeswebpage won't.

        • You mean the 90% of the web that you use, because that isn't the case for lots and lots of sites. When you give the temporary permission, what do you give it to? Everything on the page (which means that time.com is a danger) or the domain of the URL (which breaks quite a few sites nowadays)?

      • by I4ko ( 695382 )
        Usenet and IRC work really fine without JavaScript. Native apps (e.g. on mobile) too, though that's another beehive.
    • I want to boost your comment with my "moderator points", but my script blocker only allows the obvious /. domains (slashdot.org and slashdotmedia.com), so the script running the moderator system coincidentally doesn't work here.

      I suppose while I'm here, I should inject my newbie /. request to learn what domain(s) I should unblock for scripting purposes here, so plz feel free to enlighten me on this front.

      I also suppose, along my starting sentences with "I" run, that probably everyone reading /. knows about script blocking, but security issues among the mainstream public are seriously problematic (at least in terms of potential attack vectors). It's not a knock against your valid point, but just a reasonably relevant informational nook where people can feel free to discuss it fwiw.

      • I want to boost your comment with my "moderator points", but my script blocker only allows the obvious /. domains (slashdot.org and slashdotmedia.com), so the script running the moderator system coincidentally doesn't work here.

        I suppose while I'm here, I should inject my newbie /. request to learn what domain(s) I should unblock for scripting purposes here, so plz feel free to enlighten me on this front.

        I also suppose, along my starting sentences with "I" run, that probably everyone reading /. knows about script blocking, but security issues among the mainstream public are seriously problematic (at least in terms of potential attack vectors). It's not a knock against your valid point, but just a reasonably relevant informational nook where people can feel free to discuss it fwiw.

        a.fsdn.com , and possibly the XHR from slashdot.org

      • by I4ko ( 695382 )
        Doing it by domains is wrong. That's why hosts files don't work; that's why uBlock doesn't work in all cases. You should have a master operator console on the interpreter, inside all that DOM shite, and be able to confirm execution of each script separately, even those that are inline in the original HTML.

        But.... browsers are in a dick measuring contest on how fast (convenient) they can load a page, so the average joe (luser) says "Whoa.. this browser is fast". For fuck's sake, there is an idiot at Chrome who disabled certificate revocation checking, because going and fetching CRLs and OCSP and actually making the check costs about 200ms per host that is in the page. They did provide some small CRLs for a few big CAs, inside the browser itself, but if you wanted any sort of checking for those they didn't, you actually had to go and make changes in chrome://settings/. It might have changed later on, but that was when I dropped that stupid browser.
        • But.... browsers are in a dick measuring contest on how fast (convenient) they can load a page, so the average joe (luser) says "Whoa.. this browser is fast". For fuck's sake, there is an idiot at Chrome who disabled certificate revocation checking, because going and fetching CRLs and OCSP and actually making the check costs about 200ms per host that is in the page. They did provide some small CRLs for a few big CAs, inside the browser itself, but if you wanted any sort of checking for those they didn't, you actually had to go and make changes in chrome://settings/. It might have changed later on, but that was when I dropped that stupid browser.

          Revocation is a fantasy. It never worked, and my guess is too few care enough for this ever to change. Personally I think the world is, on balance, better off without it.

          - Low probability of any discernible benefit to average user
          - Privacy nightmare
          - Unnecessary resource consumption and delays
          - Single point of failure / DDOS magnet

          • by I4ko ( 695382 )
            Revocation isn't a fantasy. The entire US government will beg to differ, with HSPD-12 and CACs.


            You are right about the negligible discernible benefit to average users, because the browsers, in their infinite wisdom and dick measuring, are shipping root and intermediate CA trust certificates by the truckload.
            You are really wrong about the delay being unnecessary - it is necessary for security. But you are right, the average luser is "gimme, gimme, gimme", even when there is a label on something that it is hack
    • Whitelist selected javascript, and disable everything else.

      That's getting really hard to do. I used to rely on NoScript for my main defense, and when I wanted to use a site I'd allow the scripts through that came from that domain, and everything worked fairly well. In the last few years, I found that more and more sites are using scripts from somewhere else, with no clear idea of what "somewhere else" means. This means that, to get a site to work, it is necessary to enable script sources more or less r

  • ... but the intro made me afraid to click the link.

  • Comment removed based on user account deletion
  • by Calydor ( 739835 ) on Thursday August 04, 2016 @11:23AM (#52644921)

    Yet another reason to never, NEVER turn off AdBlock, NoScript, Ghostery etc.

    Advertisers and site operators, I don't CARE about your precious earnings if they come at a threat to my property.

    • by Anonymous Coward

      This. A thousand times, this.

      Most web sites actually work okay with JavaScript disabled (e.g. via NoScript). For a few you do need to whitelist something -- but it's amazing how freaking many pull in .js from half the freaking internet. No, just...no. (I also see a lot of even legit business sites linking in crap like googletags. Nope, not going to let you have that one either, and usually the site still works for what I need it to do.)

      If you've got javascript you need me to run to make your website work

      • If you've got javascript you need me to run to make your website work (why?)

        I agree that for sites presenting only static information, JavaScript ought to be unnecessary. But for browser-based video games, it's the lesser of two evils. Games like Cookie Clicker and Pirates Love Daisies could instead have been written in Flash; would that have been a better choice?

    • The internet is shit these days because of ad sponsored content and SEO anyway. Fuck those guys.

    • by antdude ( 79039 )

      Even if the web site/page doesn't work due to false positives? :(

    • Comment removed based on user account deletion
  • by TheCarp ( 96830 ) <sjc AT carpanet DOT net> on Thursday August 04, 2016 @11:25AM (#52644949) Homepage

    Once again it's proven that browsing the web is like going to a dinner party in a world where the handshake has been replaced with unprotected anal sex.

    Sure, many people you meet may be offended when you insist on a condom (plugins like RequestPolicy and NoScript) and say it's some right of theirs to not let you sit at their table because of it, or rant on about how they need to get paid....

    but at the end of the day.... it's basic security. Loading and running code from random third-party sites is not safe. It doesn't matter if it's inside a restricted environment; it's a risk. It's a risk website owners are in the habit of irresponsibly magnifying for all of their viewers without a second thought.

    You should protect yourself. Wear condoms unless you really know your partner. Get some here:
    https://requestpolicycontinued... [github.io]

    https://noscript.net/ [noscript.net]

    If you have a browser other than Firefox, you will need something else. I don't know what they are, but bottom line: protect yourself.

    • I'm...I'm interested in both the culture that gives rise to your metaphorical society as well as its outcomes. I'm interested in how you'd position that as an elevator pitch for a movie. I just...I have so many questions right now.
      • by TheCarp ( 96830 )

        Well...just look at the current culture of web browsing and you have your answer. All you really need is a lack of sexual taboo and no awareness of sexually transmitted disease, which is pretty much the case for most web users and what their browsers do.

        A man ejaculating in your ass _is_ leaving executable code behind. No different from when your browser blindly loads scripts from any site pointed to. All a virus is, biologically, is a form of automated rootkit.

        It really is a perfect analogy for every day

    • by AHuxley ( 892839 )
      Ty for requestpolicy :) Installed.
  • Synopsis (Score:5, Informative)

    by grmoc ( 57943 ) on Thursday August 04, 2016 @11:44AM (#52645111)

    I'm not a fan of that article summary.

    New summary:
    It is the same as CRIME, but we're using your browser's performance timing JS API as the man-in-the-middle.

    A review:
    Stick sensitive info into compressed stuff, and you make that sensitive info less private. If the compression is zlib-like, then the attacker can guess the information quite quickly-- a good compressor compresses substrings, not just the whole thing.
    That means that if you have an SSN in there, the attacker can guess some substrings of your SSN, and the response won't be much bigger.
    Guesses that don't share substrings with your SSN will be larger-- the attacker can reject those as bad guesses and not try those substrings again.
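    The substring-guessing loop described above can be sketched with zlib. (The secret, the page format, and the candidate alphabet are all invented; a real attack also needs retries and padding tricks, because ties at byte granularity can mislead a single greedy pass like this one.)

    ```python
    import zlib

    SECRET = "ssn=078-05-1120"  # hypothetical value embedded in every response

    def oracle(guess: str) -> int:
        # The attacker observes only the compressed length of the response.
        body = f"q={guess}&private-data&{SECRET}".encode()
        return len(zlib.compress(body, 9))

    # Greedily extend a known prefix: the candidate that continues the secret
    # compresses no larger than the wrong candidates, most of the time.
    recovered = "ssn="
    for _ in range(len(SECRET) - len(recovered)):
        recovered += min("0123456789-", key=lambda c: oracle(recovered + c))
    print(recovered)
    ```

    Each position costs only eleven size measurements here, versus brute-forcing the whole SSN -- which is exactly why mixing secrets and guesses in one compression context is so dangerous.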

    With HTTP2's HPACK compressor (only used for info in the headers), this side channel is eliminated-- only an exact guess of the data will allow this to happen. This is completely unrelated, however, to someone using entity-body compression with HTTP2. If you mix sensitive data with everything else in the compressed entity body... side-channel attacks galore!

    A mitigation: Don't put the sensitive data in the same resource as the non-sensitive data, and then don't compress the sensitive data.
    HTTP2 makes this cheaper. If sites do this, then these attacks simply do not work any better than the brute-force guessing would.
    Ensuring that this happens (no sensitive data compressed) isn't necessarily the easiest thing...

    Another obvious one is disabling the timing API for 3rd-party stuff. This is not as effective in theory, but it is way easier to deploy and makes these kinds of attacks require an external third party.

    • by tepples ( 727027 )

      A mitigation: Don't put the sensitive data in the same resource as the non-sensitive data

      Separating the template from the data is feasible if JavaScript is enabled, but not for users who block JavaScript by default.

    • by Rob Y. ( 110975 )

      How about padding the compressed data with a random-length string of random stuff? Wouldn't add much to the payload size, but it'd screw up the ability to manipulate the compression to help you guess the contents.
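      Random padding raises the noise floor, but it doesn't remove the signal: since the pad is independent of the guess, an attacker can simply average repeated measurements. A toy model (the secret and the 0-31 byte pad range are invented for illustration):

      ```python
      import random
      import zlib

      SECRET = "pin=2468"  # hypothetical secret echoed in the response

      def padded_size(guess: str) -> int:
          body = f"q={guess}&{SECRET}".encode()
          # Model appending 0-31 random, incompressible padding bytes
          # to the compressed response.
          return len(zlib.compress(body)) + random.randrange(32)

      def avg_size(guess: str, trials: int = 2000) -> float:
          return sum(padded_size(guess) for _ in range(trials)) / trials

      # A single sample is noisy, but the average still separates guesses:
      print(avg_size("pin=2468") < avg_size("pin=1357"))
      ```

      This is why length hiding via padding is generally considered a mitigation that only slows an attacker down rather than stopping them.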

    • I've mentioned this before when CRIME & BREACH were in the news: the compression used by browsers does support inserting blocks of uncompressed bytes. I'm not saying that it would be easy, but it would be possible to build APIs to mark which bytes of the stream are sensitive and should not be compressed.
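      Python's zlib doesn't expose DEFLATE's stored-block API directly, but a related trick illustrates the idea: Z_FULL_FLUSH byte-aligns the output and clears the LZ77 history, so attacker-controlled bytes before the flush can never be back-referenced by the sensitive bytes after it. (The field names below are invented; this is a sketch of the isolation idea, not a production API.)

      ```python
      import zlib

      SECRET = b"token=s3cr3t-value"  # hypothetical sensitive field

      def secret_part(guess: bytes) -> bytes:
          # Raw DEFLATE (wbits=-15): no adler32 trailer mixing the two halves.
          c = zlib.compressobj(9, zlib.DEFLATED, -15)
          c.compress(guess)
          # Z_FULL_FLUSH resets the compression history and byte-aligns the
          # stream, isolating what follows from what came before.
          c.flush(zlib.Z_FULL_FLUSH)
          return c.compress(SECRET) + c.flush()

      # The compressed secret is byte-identical no matter what was guessed:
      print(secret_part(b"token=s3cr3t-value") == secret_part(b"something else"))
      ```

      The total response length still reveals how compressible the secret itself is, but no longer its correlation with attacker-supplied input, which is what CRIME/BREACH/HEIST exploit.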
  • by Anonymous Coward

    ... the most noteworthy thing is the craftily cute name.

  • by WaffleMonster ( 969671 ) on Thursday August 04, 2016 @11:57AM (#52645245)

    The takeaway, which we all learned many, many years ago, is that compression can be used as a side channel and therefore should probably never be used in conjunction with any stream containing sensitive data.

    There is no need to invent different names based on where that compression occurs (CRIME, BREACH... etc.) or to assign even more aliases (HEIST) to the same damn thing. Wow, you found a new set of metrics to enhance a side channel we all already knew about... so what?

    This is one of the things I always hated about Defcon, at least in the early days: there were all kinds of talks about different ways to exploit this and that, when everyone knew they weren't secure in the first place. It's like the old joke about someone discovering you can mount an unencrypted drive on another operating system and access all your files without knowing the password!

    It often boiled down to nothing more than implementing what everyone understood was possible anyway. Not very useful in my opinion.

    At this point in the game, anyone with important information to protect who is still vulnerable to compression attacks should probably do everyone a favor and look for a new line of work. There really isn't a valid excuse at this point.

    • The takeaway, which we all learned many, many years ago, is that compression can be used as a side channel and therefore should probably never be used in conjunction with any stream containing sensitive data.

      AFAICT, the vulnerability isn't compression in general, but compressing sensitive data along with data controlled by an attacker. Just compressing the sensitive data by itself won't leak much; it gives away how compressible the data is, of course, but that isn't very much to go on by itself, and the same data would always compress to the same size.

      What we need is a structured format where data from different sources can be compressed separately. Classic MVC design, in other words; the sensitive data (the mo

      • AFAICT, the vulnerability isn't compression in general, but compressing sensitive data along with data controlled by an attacker. Just compressing the sensitive data by itself won't leak much

        Yes, it is compression in general that leaks relationships between length and content. You don't even need to influence the channel to benefit from the dependency between content and length.

        People have, for example, demonstrated recovery of useful information from encrypted voice communications simply through the use of complex codecs, without having to compromise the encryption or wield any influence on in-band messages.

        Obviously the more intermediate data you can collect and the more you can influence the channel (often possible in

        • People have for example demonstrated recovery of useful information from encrypted voice communications simply by use of complex codecs without having to compromise encryption or wield any influence on in-band messages.

          That I can believe, but it seems to be more a matter of analyzing the real-time bandwidth (many samples of compressibility over time) rather than the overall compressed length. If all you had was a recording of the complete conversation, compressed and then encrypted, I doubt much could be inferred from the length of the ciphertext. There wouldn't be enough entropy in the length to represent more than a few words even if the compression algorithm were specifically designed to encode the content into the len

  • Anybody who willingly allows external code like ads or third-party JavaScript should know that they are intentionally injecting potentially malicious code.
  • Nuclear powered attack submarines have been stolen? Oh my.
