Encryption Privacy IT

Schneier: We Don't Need SHA-3

Trailrunner7 writes with this excerpt from Threatpost: "For the last five years, NIST, the government body charged with developing new standards for computer security, among other things, has been searching for a new hash function to replace the aging SHA-2 function. Five years is a long time, but this is the federal government and things move at their own pace in Washington. NIST soon will announce the winner from the five finalists that were chosen last year. Despite the problems that have cropped up with some versions of SHA-2 in the past, and the long wait for the new function, there doesn't seem to be much in the way of breathless anticipation for this announcement. So much so, in fact, that Bruce Schneier, a co-author of one of the finalists, not only isn't hoping that his entry wins; he's hoping that none of them win. ... It's not because Schneier doesn't think the finalists are worthy of winning. In fact, he says, they're all good and fast and perfectly capable. The problem is, he doesn't think that the world needs a new hash function standard at all. SHA-512, the stronger version of the SHA-2 function that's been in use for more than a decade, is still holding up fine, Schneier said, which is not what cryptographers anticipated when the SHA-3 competition was conceived. 'I expect SHA-2 to be still acceptable for the foreseeable future. That's the problem. It's not like AES. Everyone knew that DES was dead — and triple-DES was too slow and clunky — and we needed something new. So when AES appeared, people switched as soon as they could. This will be different,' Schneier said via email."
  • Useful replacement (Score:4, Insightful)

    by l2718 ( 514756 ) on Tuesday September 25, 2012 @04:15AM (#41447479)
    Faster computation of hash functions is very important, especially to low-power devices. In other words, even if the improvements in cryptographic strength are irrelevant I'd expect the new standard to be adopted quickly.
    • Faster computation of cryptographic hashes adds weaknesses, as it makes brute-force collision finding faster: one can try more possibilities in the same amount of time.

      • by Goaway ( 82658 )

        This is only a problem for one single use of hashes, namely storing passwords in a database, and there are perfectly satisfactory solutions to it.

        It is not a problem for most other uses of hashes.
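
One such satisfactory solution for the password case is deliberate, salted key stretching, which keeps each guess expensive no matter how fast the raw hash is. A minimal sketch, assuming Python's standard hashlib (3.4+); the iteration count and inputs are illustrative only:

```python
import hashlib
import os

password = b"correct horse battery staple"
salt = os.urandom(16)  # unique random salt per stored password

# Hundreds of thousands of iterations make each brute-force guess
# expensive, cancelling out the speed of the underlying hash.
derived = hashlib.pbkdf2_hmac("sha512", password, salt, 200_000)
print(derived.hex())
```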

        • There are a few such uses, but yes, it only affects certain kinds of attack, and it is a weakness in those use cases. From a security standpoint, does it matter if the hashing is slightly slower when checking an HMAC? From a usability standpoint I don't want to wait 5 minutes while my computer decrypts a webpage, but the speed neither adds to nor detracts from the security of the algorithm in such use cases.

      • by nzac ( 1822298 )

        I think all the finalists are 512-bit or longer hashes, which makes collisions far harder to find than at the current bit lengths.

        If you just mean passwords, then choose a better-suited hash function, as this is not what SHA-3 is for.

      • by Anonymous Coward

        No they don't. Hashes that can be brute-forced if you can only calculate them fast enough are weak per se. 512-bit hashes cannot be brute-forced even if you can calculate 2^64 per second, so it is advantageous that they can be evaluated quickly for the mainline case of using them to hash things.

      • by delt0r ( 999393 )
        At 512 bits for a hash, we need on average 2^256 hash evaluations to find a collision, using the birthday paradox/trick. Let's assume you have 100,000 cores, each of which can do 4 billion hashes per second (far faster than we can do right now). It would take billions of trillions of zillions of gazillions of times longer than the current age of the universe to find a collision. Never mind that there are not enough fundamental particles in the universe for the required storage, even at a hash per particle.

        Hashes should be
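
As a back-of-the-envelope check of the parent's numbers (a sketch; the hardware rates are the parent's own assumptions):

```python
# 100,000 cores at 4e9 hashes/sec against the 2^256 birthday
# bound of a 512-bit hash.
rate = 100_000 * 4e9              # hashes per second (assumed)
work = 2 ** 256                   # expected evaluations for a collision
seconds = work / rate
age_of_universe = 4.35e17         # seconds (~13.8 billion years)
print(f"{seconds / age_of_universe:.2e} x the age of the universe")
# prints roughly 6.7e+44 x the age of the universe
```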
      • by fatphil ( 181876 )
        Not *brute force* attacks - dictionary attacks. The search-space is too large for brute force to be a realistic method for finding collisions.

        Assuming you're salted and not in plain text (big assumption, alas), nothing can sensibly defend against weak keys - but that's because all the security is in the key. Weak keys are weak, end of.
      • by amorsen ( 7485 )

        Go right ahead then, pick any of the contestants and bruteforce a collision. You'll be famous.

        We cannot even design a computer to COUNT to 2^128, so for any even minimally secure hash function of 256 bits or more, brute force won't happen. Not even if your GPU with 10000 cores can do a hash in one clock cycle.

          • That's the point: it doesn't have to. Each of those 10000 cores does a hash attempt in a few clock cycles. My NVIDIA GPU (a GTX 465, which is 2 years old now) can handle 2.75 million SHA-1 hashes a second (I tested it). Being the GeForce consumer model for graphics, only half the processing power is available for use in CUDA general-purpose computing. Salted MD5 passwords up to 12 characters can be brute-force cracked in about a month with ~$40,000 worth of off-the-shelf hardware (I dread to think how fast NSA or GCHQ could do it on their top secret supercomputers with classified performance specs).

          • 2.75 million sha1 hashes a second (I tested it)

            Now compare the 2^n complexity of SHA-1 with that of SHA-512.

            (I dread to think how fast NSA or GCHQ could do it on their top secret supercomputers with classified performance specs).

            NSA is a government agency. Figure their costs to do anything are 3x that of industry (I'm being generous).

            Now, figure out what they need to spend to out-R&D Intel and look at their budget. If they have working quantum supercomputers, are they building that massive western data center just

    • by commlinx ( 1068272 ) on Tuesday September 25, 2012 @05:34AM (#41447671) Homepage Journal

      True, I normally use an 8-bit checksum for my hashing, for best performance. On passwords in particular, some people think hashing and password recovery are incompatible, but on the server I simply maintain a list of 256 complex-looking passwords so a match can be quickly looked up and e-mailed back.

      Does anyone know if that idea has been thought of before? Maybe I should take out a patent.

    • by beezly ( 197427 )

      SHA-512 is a cryptographic hash function. Faster computation of hashes is exactly what you *don't* want.

      • by amorsen ( 7485 )

        Why wouldn't you want faster cryptographic hashes? It is trivial to slow a hash down as much as you want, but when you need it to go fast, it is very difficult to speed it up.
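
A naive illustration of that asymmetry (a sketch only; real systems should use a vetted construction such as PBKDF2 or scrypt rather than this ad-hoc loop):

```python
import hashlib

def slowed_sha512(data: bytes, rounds: int = 100_000) -> bytes:
    """Chain the hash into itself to make one evaluation ~rounds times slower."""
    digest = hashlib.sha512(data).digest()
    for _ in range(rounds - 1):
        digest = hashlib.sha512(digest).digest()
    return digest
```

Going the other way, making a slow hash faster, would require new cryptanalysis or new hardware, which is why a fast-but-strong primitive is the useful starting point.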

      • by fatphil ( 181876 )
        So, what do you know about crypto that NIST don't?

        They *specifically* stated that "computational efficiency" was important in their official announcement of the FIPS 180-3 competition. That was second on their list, right after "security".
  • However, SHA-2 could be broken tomorrow, and this time we won't have a decade's wait while a suitable replacement is designed.

    • by AHuxley ( 892839 )
      When the US gov says it does not "need" a new code for the unwashed masses, a good reason usually gets hinted at decades later.
      http://en.wikipedia.org/wiki/Data_Encryption_Standard#NSA.27s_involvement_in_the_design [wikipedia.org]
      • Re:Future proofing (Score:5, Interesting)

        by jd ( 1658 ) <imipak&yahoo,com> on Tuesday September 25, 2012 @05:44AM (#41447701) Homepage Journal

        To be fair, the NSA don't seem to have caused problems with the S-Boxes and differential analysis doesn't seem to have worked too well. On the other hand, COPACOBANA et al were glorified 1940s-era Colossus machines - cracking codes via a massively parallel architecture. To me, that's the scary part. Turing's work on cryptography and massively parallel code breakers was 100% applicable to breaking DES because the key length was so incredibly short. You could build enough machines to effectively break it.

        How many DES engines do you think someone could have crammed onto a wafer in the 1980s? (Remember, each die can have multiple engines, and then the dies that work can be hooked together.) Link up a bunch of such wafers and you end up with a crypto engine from hell. It would have been VERY expensive, but I would imagine it perfectly plausible that a sufficiently determined and rich organization (I would imagine the NSA might have been one such) could have potentially built such a machine when the rest of us still thought the 6502 was a really neat idea.

        Doesn't mean anyone ever did. People could have reached Mars in the 1980s, so "could have" and "did" are obviously very different things. What people actually did is anyone's guess, though "nothing" sounds about right.

        Had they built such a device, though, then near-real-time breaking of DES would have been possible at the time it was in mainstream use. Certainly, there were claims circulating that such devices existed, but a claim like that without proof is hard to accept. All I can say is that it's demonstrably not impossible, merely unlikely.

        Back to SHA-2. Are we in the same boat? Are there ways to build something today, even if nobody is likely to have actually built it yet, that could endanger SHA-2? (To me, THAT is the measure of security, not whether anyone actually has, because they're not likely to tell you when they have.) Quantum computing is the obvious threat, since 512 bits is a lot of security, too much to attack in parallel with a classical architecture. Quantum computing, though, should let you scale up non-linearly. The question is whether it's enough. (I'm assuming here that there are no issues with preimages or timing that can be exploited to reduce the problem to a scale QC can solve even if classical machines can't.)

        There have been a few murmurs that suggest SHA's security isn't as strong as the bitlength implies. Would that be enough? If Japan can build a vector machine the size of a US football stadium, then it is not physically impossible to scale a machine to those sizes. Nobody has scaled a quantum computer beyond a few bits, but I repeat, I don't care what people have publicly done, it is what is within the capacity of people TO build whether publicly or not that matters.

        If you're not 100% certain that no one could build even a quantum computer on such a scale, with every node designed at the hardware level to perform JUST the task of breaking the hash, then the hash is not safe for 20+ years. It may be unlikely, but there's nothing to say it might not be vulnerable right now. There's nothing physically impossible about it (as shown), it's merely a hard problem. And hard problems get solved. What you need in a crypto hash is something you can be sure WILL be impossible to break in a 20-year window, which means what you need is a crypto hash that is beyond anything whose components can be prototyped today. For a 30-year window, it needs to be beyond detailed theory. A 50-year window can be achieved if it's beyond any machine ANY existing theory can describe.

        (It takes time to go from theory to prototype to working system to working system on the right scale. The intervals seem to be fairly deterministic in each subject. I believe this to indicate a mathematical model that underpins things like Moore's Law and which is independent of field. Know that model and you know when Moore's Law will fail. Moore's Law is merely the equivalent of Hooke's Constant for computing; failure is inevitable.)

        • Had they built such a device, though, then near-real-time breaking of DES would have been possible at the time it was in mainstream use. Certainly, there were claims circulating that such devices existed, but a claim like that without proof is hard to accept. All I can say is that it's demonstrably not impossible, merely unlikely.

          Speculation like this needs to take historical context into account. At that time, very little important information worth the effort was stored on a computer, and far less of it was connected to the Internet where the NSA or CIA could access the data stream in real time. Such a machine may have been created, but I would hardly think there'd be a use for it.

          • by jd ( 1658 )

            True, for computer information, but plenty of data was sent via radio - it was simplicity itself to tune into civilian and military digital chatter. (See "The Hacker's Handbook", by "Hugo Cornwall" - pseudonym of Peter Sommer, an expert in information systems security.) For military purposes, it was much much easier to teach people to type messages into a portable machine which would digitize it and blast the digital form wirelessly (and encrypted) than to get them to key properly. Keying in morse was also

          • Also, DES was an American cipher, so the attacking organization was more likely to be the KGB than the NSA.
        • by pthisis ( 27352 )

          To be fair, the NSA don't seem to have caused problems with the S-Boxes and differential analysis doesn't seem to have worked too well

          In fact, the NSA's changes to the S-boxes made DES stronger against differential cryptanalysis; it appears that they and IBM knew about diff crypto back in the 1970s and designed an algorithm to resist it even though the technique wouldn't be widely known for another 15-20 years.

          Differential cryptanalysis only "doesn't seem to have worked out so well" because it's known, and algorithms are now designed to resist it.

        • There is no evidence that quantum computing will break hash functions or block ciphers beyond the square-root advantage you get from Grover's algorithm. This effectively halves the bit length of any hash function in terms of collision resistance, making it something like 512 / 2 / 2 = 128 bits of security. This is considered far out of the reach of any technology for the next few decades.
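
Making the parent's arithmetic explicit (a sketch of the reasoning, not a precise complexity claim):

```latex
\[
\underbrace{2^{512/2}}_{\text{classical birthday bound}}
\;\xrightarrow{\ \text{square-root speedup}\ }\;
\sqrt{2^{256}} = 2^{128}
\]
```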
          • by jd ( 1658 )

            SHA2 supports 256-bit modes, which by that arithmetic gives you 64 bits of security, and that is WELL within the reach of modern technology; part of the debate is whether SHA3 is needed at all. Clearly it is.

            128 bits might be "out of reach" of technology for the next few decades, but that is not enough. Nowhere near. Classified information has to be secure for 50 years, and SHA3 must be strong enough to support that requirement for at least as long as it will take to create a SHA4 (which, to judge from SHA3, might easily be a decade or more).

      • Only in this case the US government (NSA) is saying they want a new one... and Schneier is saying he doesn't think one is strictly necessary at this point.
    • by Chrisq ( 894406 )

      However, SHA-2 could be broken tomorrow, and this time we won't have a decade's wait while a suitable replacement is designed.

      And SHA-3 could be broken the day after. Or some mathematical breakthrough could undermine an assumption that both use.

      • All remaining SHA-3 candidates use different mathematical assumptions from the SHA-2 algorithms, so breaking one won't automatically break the other.

      • Re:Future proofing (Score:5, Interesting)

        by jd ( 1658 ) <imipak&yahoo,com> on Tuesday September 25, 2012 @05:53AM (#41447729) Homepage Journal

        Very true. Which is why I'm anxious that SHA-3 should have as little as possible (ideally nothing) in common with SHA-2, be it algorithmically or in terms of the underpinning mathematical problems that are assumed to be hard.

        I would have preferred Blue Midnight Wish to be still in the running (well, it's got a cool name, but more importantly it has a very different design).

        I ALSO wish Bruce and the others would pay attention to those of us on the SHA-3 mailing list advocating a SHA-3a and SHA-3b, where -3a has the best compromise between speed and security, and -3b has absolutely b. all compromise and is as secure as you can get. Why? Because that meets Bruce's objections. -3a may well be broken before SHA-2 is so threatened that it is unusable, because of all the compromises NIST want to include. -3b, because it refuses to bow to such compromises, should remain secure for much longer. You can afford to stick it in the freezer and let it sit there for a decade, because it should still be fresh BECAUSE no compromises were made. By then, computers would be able to run it as fast, or faster, than -3a can be run now.

        So I have ZERO sympathy with Schneier. He is complaining about a problem that he is, in part, responsible for making. Other views WERE expressed, he thought he knew better, but his path now leads to a solution he believes useless. So, to NIST, Bruce, et al, I say "next time, leave your bloody arrogance at home, there's no room for it, doubly so when you've got mine to contend with as well".

        • by Chrisq ( 894406 )

          Because that meets Bruce's objections. -3a may well be broken before SHA-2 is so threatened that it is unusable, because of all the compromises NIST want to include. -3b, because it refuses to bow to such compromises, should remain secure for much longer. You can afford to stick it in the freezer and let it sit there for a decade, because it should still be fresh BECAUSE no compromises were made.

          There are some applications where this is very important, for example the electronic signing of documents for copyright purposes (i.e. a hash published to prove authorship), public time-stamping of documents, etc. If someone can come back in 10 years' time with an alternative document that produces the same hash, you no longer have absolute proof!
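
As a concrete illustration of that use case (a minimal sketch; the filename is hypothetical):

```python
import hashlib

# Publish this digest somewhere timestamped and hard to alter.
# Re-hashing the document later proves it existed in this exact
# form -- unless someone finds a second document with the same hash.
with open("manuscript.pdf", "rb") as f:
    print(hashlib.sha512(f.read()).hexdigest())
```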

        • by chill ( 34294 )

          Is this still possible? Considering SHA-2 is really a take-your-pick suite of SHA-224, -256, -384 & -512, NIST could do the same with SHA-3 and create a family.

          Hell, SHA-1 is still kosher according to FIPS 180-4 as of March 2012. I expect SHA-2 to hang around for many years to come.

          I admit I have not been following the mailing lists, and they might have nixed this idea totally. Thus my question to you, which is probably quicker than trying to dig through the archives.
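
For reference, the existing take-your-pick SHA-2 family, as exposed by, say, Python's hashlib (a quick illustration, not part of the parent's question):

```python
import hashlib

# The SHA-2 family members differ mainly in digest length.
for name in ("sha224", "sha256", "sha384", "sha512"):
    h = hashlib.new(name, b"example")
    print(f"{name}: {h.digest_size * 8}-bit digest")
```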

          • by jd ( 1658 )

            Oh, it should indeed still be possible to produce a best-of-breed class as well as a best-all-round class, but the closer we get to the deadline, the more apathy and office politics subsume the process.

            It would be great to have a family. Since SHA-3 entries were to produce a fixed-size hash, the family would consist of different breeds of hash rather than different output lengths. I don't see a problem with that. People can then use what is correct for the problem, rather than changing the problem to fit the hash.

    • Re:Future proofing (Score:5, Interesting)

      by plover ( 150551 ) on Tuesday September 25, 2012 @06:55AM (#41447997) Homepage Journal

      Bruce's argument is essentially "the devil you know." Five years ago it seemed like SHA-2 might succumb to an attack. However, it's been five years, and those attacks never materialized on SHA-512. That's five more years of convincing evidence that SHA-2 is fairly secure. None of the SHA-3 finalists have had the same level of scrutiny. Sure, they've been looked at seriously, but nothing like the widespread amount of attention that an actual standard gets.

      Another consideration is practicality. If a new standard is published, millions of dollars will be expended changing systems around the world. Will all that money have been well spent? If there was no cryptographic reason for the change, all that money and effort was wasted.

      And what about security? Will every replacement be perfect? I personally doubt it; mistakes are made and people screw up implementations all the time. An organization that hired a cryptographer to design and implement a secure solution in 2007 might feel they can do the work themselves today. But we know that cryptographic protocols are notoriously finicky when it comes to unintended information leakage or security. If a secure design is modified in any way, the potential to introduce a security bug means the risk of change is much higher than the risk of sticking with SHA-2.

  • by Anonymous Coward

    I think the next hash should be called B455 DR0PP3R 2K12

  • I have an idea (Score:5, Informative)

    by diamondmagic ( 877411 ) on Tuesday September 25, 2012 @04:40AM (#41447537) Homepage

    How about we link to Schneier's actual blog post? https://www.schneier.com/blog/archives/2012/09/sha-3_will_be_a.html [schneier.com]

  • by Certhas ( 2310124 ) on Tuesday September 25, 2012 @05:11AM (#41447603)

    Is it really necessary to have a snide remark about supposed government inefficiency there? Can't we bury these ideological attacks, which are not really supported by facts or data, add nothing to the point, and are in fact grossly misleading?

    This is a hard mathematical problem. Ordinary research papers in mathematics often spend a year or more in peer review in order to verify their correctness. If you're building a key component of security infrastructure a couple of years of review is not at all unreasonable.

    • by Goaway ( 82658 ) on Tuesday September 25, 2012 @05:52AM (#41447725) Homepage

      Yeah, that bit of snark really showed the author has no clue at all what goes into a process like this. Those years are there to give researchers time to really try and break all the candidates. You don't want to rush that part only to find out someone managed to break the winner the next year.

  • So much work from everyone involved and we just throw it away??

    This is a standard for many years in the future. SHA-1 is still used in some current applications and is considered secure, and people are still using MD5.

    Everyone can just ignore the new standard, and researchers can have a decade or two to try to break it before it's needed. Where is the harm?

  • SHA-2 Veriditas, Neato!
  • As the saying goes, if it ain't broke, don't fix it. Why don't we continue to run the candidates in parallel with SHA-2 usage, and when there are signs that SHA-2 is about to be compromised or become obsolete, we just switch to whichever candidate has held up best? Naturally, if SHA-3 is significantly better in speed, security, compression, etc., then we have already made SHA-2 obsolete and need not wait until it "breaks". My two pence.
  • Has anyone ever actually read NIST SP 800? I just had to review 800-30 and 800-39 yesterday. Hand to God, they're designed to put you in a coma. There is not enough Ritalin on the planet for that.

  • Really, any increase in key length or change in algorithm ought to be done to save us from the issues that could arise from things like quantum computers, supercomputer botnets, or, further into the future, quantum-computer botnets. I mean, we don't have those things yet, but we can kinda see them coming, and we ought to be thinking long and hard about how to break that issue permanently.

  • NIST started the SHA-3 competition when SHA-1 was proven weak and no one was sure how long SHA-2 would last;
    no one liked the idea of relying solely on the wide-pipe SHA-512 when the underlying building blocks had been proved weak (using SHA-512 is a bit like using triple-DES).
    However, it is difficult to predict advances in cryptography, and though SHA-512 is not nearly as weak as we predicted it would be a few years ago, we don't know what new cryptanalysis will show up tomorrow, forcing us to leave SHA-2.

    • by Bronster ( 13157 )

      How is that different from just picking another one of those 5 and calling it SHA-4? It's not like they magically go away because one has been given a version number all of its own.

  • by dwheeler ( 321049 ) on Tuesday September 25, 2012 @11:40AM (#41451169) Homepage Journal

    I disagree. You don't wait to build a fire escape until the building is on fire. Similarly, we need a good alternative hash algorithm now, not when disaster strikes.

    I believe that, in general, we should always have two widely-implemented crypto algorithms for any important purpose. That way, if one breaks, everyone can just switch their configuration to the other one. If you only have one algorithm... you have nothing to switch to. It can take a very long time to deploy things "everywhere", and it takes far longer to get agreement on what the alternatives should be. Doing it in a calm, careful way is far more likely to produce good results.
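
A sketch of what "switch their configuration to the other one" can look like in code (a hypothetical shim; assumes Python 3.6+ for hashlib's SHA-3 support):

```python
import hashlib

# Callers name the hash via configuration, so replacing a broken
# algorithm is a one-line config change rather than a code rewrite.
HASHES = {
    "sha2-512": hashlib.sha512,
    "sha3-512": hashlib.sha3_512,
}

def digest(data: bytes, alg: str = "sha2-512") -> bytes:
    return HASHES[alg](data).digest()

print(digest(b"hello", alg="sha3-512").hex())
```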

    The history of cryptography has not been kind, in the sense that many algorithms that were once considered secure have been found not to be. Always having 2 algorithms seem prudent, given that history. And yes, it's possible that a future break will break both common algorithms. But if the algorithms are intentionally chosen to use different approaches, that is much less likely.

    Today, symmetric key encryption is widely implemented in AES. But lots of people still implement other algorithms, such as 3DES. 3DES is really slow, but there's no known MAJOR break in it, so in a pinch people could switch to it. There are other encryption algorithms obviously; the important point is that all sending and receiving parties have to implement the same algorithms for a given message BEFORE they can be used.

    Similarly, we have known concerns about SHA-2's variants, including SHA-256 and SHA-512. Maybe there will never be a problem. So what? Build the fire escape NOW, thank you.

"The only way for a reporter to look at a politician is down." -- H.L. Mencken

Working...