Encryption Privacy

Academics Should Not Remain Silent On Government Hacking

ananyo writes "The Guardian's technology editor, Charles Arthur, asks why researchers have remained largely silent in the wake of the revelation that the U.S. National Institute of Standards and Technology's standard for random numbers used for cryptography had been weakened by the NSA: 'The nature of the subversions sounds abstruse: the random-number generator, the 'Dual EC DRBG' standard, had been hacked by the NSA and the UK's GCHQ so that its output would not be as random as it should have been. That might not sound like much, but if you are trying to break an encrypted message, the knowledge that it is hundreds or thousands of times weaker than advertised is a great encouragement.' Arthur attributes the silence of UK academics, at least, to pressure from GCHQ. He goes on to say: 'For those who do care, White and Matthew Green, who teaches cryptography at Johns Hopkins University in Baltimore, Maryland, have embarked on an ambitious effort to clean up the mess — one that needs help. They have created a non-profit organization called OpenAudit.org, which aims to recruit experts to provide technical assistance for security projects in the public interest, especially open-source security software.'"

Comments Filter:
  • I'm curious, who SHOULD remain silent on government hacking?
  • The government can put a heck of a lot of pressure on an individual for not cooperating. Everything from being blacklisted to jailed on spurious charges. Truthfully, I don't think I would have said a word once they started turning the thumb screws; metaphorically or otherwise.

  • Grants. (Period) (Score:4, Interesting)

    by xxxJonBoyxxx ( 565205 ) on Wednesday December 18, 2013 @11:51AM (#45727103)

    >> why researchers have remained largely silent in the wake of the revelation that (US gov does bad thing)

    Just follow the money to...federal research grants. Even if an individual professor decided to rock the boat, their local university would do their best to shut them up.

    Related: http://www.aipg.org/students/pdf/So%20you%20want%20to%20be%20a%20professor.pdf [aipg.org]
    George D. Klein, former professor: "A professor is viewed as a profit center by university administrators... Faculty members will survive as a professor if she/he is awarded lots of grants with lots of overhead for the university coffers..."

    Also: http://www.academicmatters.ca/2012/11/the-quiet-campus-the-anatomy-of-dissent-at-canadian-universities/ [academicmatters.ca]
     

    • Basically, University administrators have become academic political officers, responsible for implementing party^H^H^H^H^Hcorporate directive and ensuring that research, teaching, and indeed knowledge itself conforms with orthodox sovie^H^H^H^H^Hcorporate ideology.

  • No documents have been released showing that the NSA intentionally weakened Dual_EC_DRBG, other than Bruce Schneier swearing up and down that he totally saw it. (He may be a good cryptographer, but how good is he at reading and understanding the internal memos of an intelligence agency? And why have these memos not been released, even in redacted form, if they contain other sensitive info?)

    Everyone already knew it was weak. The NIST specification notes (circumspectly) that there's a possible weakness and outlines m

    • by crymeph0 ( 682581 ) on Wednesday December 18, 2013 @01:24PM (#45728183)

      This NSA document [nytimes.com] published at the NYT states explicitly that the NSA is attempting to "Influence policies, standards and specifications" for public key encryption, and given that the project described in that same document is about expanding the NSA's access to data, rather than increasing the security of that data, this proves that the NSA is working to weaken, not enhance, public key crypto. That NSA document doesn't specifically mention DUAL EC DRBG, but this NYT story [nytimes.com] does say that the Snowden documents somewhere list DUAL EC DRBG as one backdoored technology.

      Of course DUAL EC DRBG is only one algorithm. How many other algorithms has NSA contributed to? At this point, they're all suspect, because it's obvious now that the NSA is more worried about decrypting communications it intercepts rather than protecting any communications transmitted. So what academics should be doing is independently vetting all widely used encryption technology, starting with anything the NSA is known to be involved with, even peripherally. That is a tall order, and it used to be tin-foil-hat thinking, but like a police officer caught lying under oath causing decades worth of court cases to be thoroughly redone or thrown out, there is no alternative if we want to be sure that nothing else got through.

      • by EdIII ( 1114411 )

        I don't understand why this is news at all.

        Like I said in another post, The Scorpion and the Frog.

        The NSA exists to protect us on their terms, not ours. It serves their purposes to weaken encryption far more than it does to protect it.

        They are acting in accordance with their nature, and in the complete absence of any real regulation of their activities. Those who would regulate them are shut up quite nicely, regardless of which of the 3 branches of government they are in.

        Big Data and all that fancy analysis

        • by rtb61 ( 674572 )

          Weakening encryption weakens it for every single attacker, not just the NSA. Part of the NSA's portfolio is defending and protecting the very institutions and individuals that will be using that encryption, so weakening it is totally and utterly idiotic and mind-bogglingly stupid. Seriously, what kind of shit-brained, professionally paranoid idiots would weaken security to make things easy to hack, when they are meant to be defending and preventing those things from being hacked? I'm surprised one sectio

  • IFIP [ifip.org] has issued a statement [ifip.org] on the issue. Excerpt:
    There is reason to believe that major practical pillars of trust in the Internet, e.g. the trustworthiness of relevant communication nodes and the security of cryptographic implementations in the web such as SSL, are being intentionally weakened in a systematic way at the behest of government organizations and other major actors in the field. This is endangering and undermining the fabric of the Internet and the Information Society, and contradicts the cla
  • One possible reason for the silence is that academic researchers get most of their money from the Government, and certainly cryptographers are funded through the three-letter agencies (DoD, DHS, CIA, NSA, etc.). Why on Earth would they speak up and potentially ruin their academic careers? You can be outspoken when you are high-profile, have already made full Professor, and have a comfortable life. If you're an assistant or associate professor, you're not rocking the boat too much if you want to keep going
    • DoD, DHS, CIA, NSA, etc will all require that you keep your research secret. They do fund cryptographers, but they are all in-house and not part of the academic community.
  • The open-audit link seems to be to a piece of software that is unrelated to the text of the summary. Rather than being anything related to people willing to help audit software, it's actually a tool for auditing your network.

    In addition, whilst it does seem to be GPL, you need to provide a name and email address to download it.

  • by WOOFYGOOFY ( 1334993 ) on Wednesday December 18, 2013 @12:45PM (#45727719)

    Every single one of us has felt the hesitation to speak out against what the NSA is doing lest we experience some sort of retaliation, typically being mechanically put on a "list" which is then used in other contexts for other decisions. The most basic one is getting on the "no-fly list", but one imagines that other lists exist too, for instance a "do not fund research" list.

    THAT'S what a chilling effect is. It's a self-perpetuating thing, because the more dissent is stifled, the more the faux consensus becomes reality, and the more license the chillers see themselves as having been given by society.

    I'll never forget the CIA film of Saddam Hussein assembling Anyone Who's Anyone In Iraq into an auditorium, then calling out the names of individuals who, when they stood up as addressed, were escorted away by security personnel to their summary executions.

    As soon as the luminaries understood what was happening, they all stood and started to applaud this monster, chanting his name, swearing fidelity at the top of their lungs, hoping that such would make it less likely that they would ever appear on any such list and, if their name was on The List, they might somehow induce a last minute change of mind.

    That's the chilling effect of compiling lists of people and assigning them properties- "enemy", "hub", "individual of special concern should X Y or Z be happening".

    Every single one of us, whether we admit it or not, has felt a pause, a fear, the need to calculate and perhaps somehow soften or even self censor what we're saying WRT the government and the NSA for fear of such lists and their possible future consequences.

    This is one of the most insidious and well documented effects of surveillance and no one is immune, and- and this is significant- they know it.

    This is why the ability to spy on anyone, all the time, without anyone outside of the people you command (or who fear you) knowing what you're doing, has to go. This is why total transparency into who does what, when, why, and for how long, without a scintilla of exception, needs to be built into the spy agencies. We need spies and spying because we have real enemies who really want to do unspeakably evil things, and will, given the chance. We have to stop those people. In order to achieve that, we need to stop the spy agencies from undermining their own democracy, however inadvertently. If they were capable of doing this, then they wouldn't have hounded Binney, Drake, and Kiriakou; they would have listened to them.

    http://dissenter.firedoglake.com/2013/08/07/imprisoned-cia-whistleblower-john-kiriakou-totality-of-punishment-is-not-limited-to-a-prison-sentence/ [firedoglake.com]

    Right now, the biggest threat to the continued effectiveness of our spy agencies is the culture which has ascended and become dominant in those spy agencies.

     

    • I'll never forget the CIA film of Saddam Hussein assembling Anyone Who's Anyone In Iraq into an auditorium, then calling out the names of individuals who, when they stood up as addressed, were escorted away by security personnel to their summary executions.

      CIA film? That was Saddam's film!

  • "Yes Academics..." (Score:5, Insightful)

    by GameboyRMH ( 1153867 ) <gameboyrmh@@@gmail...com> on Wednesday December 18, 2013 @01:01PM (#45727903) Journal

    "Tell us how you feel. If you have any criticisms you've been bottling up, then please write about them in journals, or better yet discuss them with your friends over an electronic communications medium. We'd hate to go unaware of any people with 'interesting' viewpoints." - The NSA.

  • Not silent. (Score:3, Interesting)

    by Anonymous Coward on Wednesday December 18, 2013 @01:23PM (#45728173)

    Not silent: drafting. Planning. Analysing. Discussing. Coding. Working.

    There's a lot more to it than Dual_EC_DRBG: that's just the most obvious, neon-sign "HEY LOOK AT ME I'M A BACKDOOR" backdoor. The funding document leaked by Snowden specifically mentions public-key algorithms. I can match that directly: the NIST/SecP curves were generated by Jerry Solinas of the NSA from SHA-1 hashes with no known source. Yeah, they're totally dodgy, although perhaps not in the obvious ways (it may be that they're specified in ways that make them excruciatingly hard to implement correctly without fucking up: timing attacks; random source; curve point validation; perhaps unknown attacks associated with discriminants? NSA do have a head-start on EC). ECDSA and DSA too; the big hole is very simply that pesky random number thing - as Sony know to their cost. Makes me wonder about a couple of other things too.
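    That "pesky random number thing" is easy to demonstrate. Below is a toy sketch (all parameters deliberately tiny and made up; real DSA uses a 2048-bit p and 256-bit q, and ECDSA the same algebra over a curve group) of how reusing the nonce k across two DSA signatures, which is essentially what sank Sony's PS3 ECDSA implementation, hands an observer the private key:

```python
# Toy DSA with tiny, made-up parameters -- illustration only.
p, q, g = 607, 101, 64        # g = 2**((p-1)//q) mod p, so g has order q

def sign(x, h, k):
    """DSA signature of hash h under private key x with nonce k."""
    r = pow(g, k, p) % q
    s = pow(k, -1, q) * (h + x * r) % q
    return r, s

x = 57                        # private key (unknown to the attacker)
k = 77                        # nonce reused across two signatures -- the bug
r, s1 = sign(x, 20, k)        # signature over hash 20
_, s2 = sign(x, 53, k)        # signature over hash 53, same r betrays reuse

# Attacker sees (r, s1), (r, s2) and both hashes. Subtracting the two
# s-equations eliminates x and yields k; substituting back yields x.
k_rec = (20 - 53) * pow(s1 - s2, -1, q) % q
x_rec = (s1 * k_rec - 20) * pow(r, -1, q) % q
assert (k_rec, x_rec) == (77, 57)
```

    The same algebra applies verbatim to ECDSA; only the underlying group changes.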

    djb and Tanja Lange are working on new algorithms to replace them, which are 100% non-NSA and actually also really really fast. http://safecurves.cr.yp.to/ - EdDSA with Ed25519 (aka Curve25519 - same Montgomery curve, more efficient/useful Twisted Edwards representation) and/or Curve3617 (pure Edwards, 414-bit size, 200-bit security) are absolutely top contenders to replace ECDSA and ECDHE and are better in every single way. Tanja's even come up with a way to encode curve points, on the safecurves, so that the output looks truly random, if we need that (explicitly stated use case: censorship/protocol fingerprinting evasion; could see uses in other areas too).
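    Part of Curve25519's appeal is how little code a correct (if unoptimized) implementation takes. Here is a sketch of the X25519 Montgomery ladder following the pseudocode later standardized in RFC 7748; plain Python bignums, not constant-time, so strictly for illustration, never production:

```python
P = 2**255 - 19    # the Curve25519 field prime
A24 = 121665       # (A - 2) / 4 for the curve constant A = 486662

def clamp(k: int) -> int:
    """RFC 7748 scalar clamping: clear bits 0-2 and 255, set bit 254."""
    return (k & (2**255 - 1) & ~7) | 2**254

def x25519(k: int, u: int) -> int:
    """X25519 scalar multiplication via the x-only Montgomery ladder."""
    x1, x2, z2, x3, z3, swap = u, 1, 0, u, 1, 0
    for t in reversed(range(255)):
        k_t = (k >> t) & 1
        swap ^= k_t
        if swap:                       # conditional swap (branching here is
            x2, x3, z2, z3 = x3, x2, z3, z2   # a timing leak in real code)
        swap = k_t
        a, b = (x2 + z2) % P, (x2 - z2) % P
        aa, bb = a * a % P, b * b % P
        e = (aa - bb) % P
        c, d = (x3 + z3) % P, (x3 - z3) % P
        da, cb = d * a % P, c * b % P
        x3 = (da + cb) ** 2 % P
        z3 = x1 * (da - cb) ** 2 % P
        x2 = aa * bb % P
        z2 = e * (aa + A24 * e) % P
    if swap:
        x2, x3, z2, z3 = x3, x2, z3, z2
    return x2 * pow(z2, P - 2, P) % P  # x2/z2 via Fermat inversion

# Diffie-Hellman over the base point u = 9 (scalars here are arbitrary):
a, b = clamp(0x123456789abcdef), clamp(0xfedcba987654321)
assert x25519(a, x25519(b, 9)) == x25519(b, x25519(a, 9))
```

    The final assertion is the Diffie-Hellman property itself: both sides derive the same shared secret from the other's public value.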

    AES-128-GCM seems okay, if you have hardware implementations, but has difficulty running in constant time, free of timing attacks, in software. ChaCha20_Poly1305 is a djb ciphersuite proposed by Adam Langley which does much better - it's a replacement for RC4 (which is definitely 100% no contest broken in realtime by Nation State Adversaries, we're quite sure about that now) and is probably a better AEAD, to be honest, than GCM, in my opinion. Live on Google servers, and in Chrome dev builds, right now. Draft in consideration. ChaCha20 is Salsa20's successor, and Salsa20 was an eSTREAM finalist; ChaCha20 was also used to build the SHA-3 finalist BLAKE and got a lot of cryptanalytic attention through that. Best attack (truncated differential cryptanalysis) reaches 6 of 20 rounds; a surprisingly simple ARX construct, easy to analyse with no nasty surprises, fast as hell especially with any kind of SIMD, great differential and linear resistance: can't say fairer than that!
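    That ARX simplicity is easy to see in code. A minimal Python sketch of the ChaCha quarter-round, checked against the test vector later published in RFC 8439 (the whole cipher is just this, applied column- and diagonal-wise to a 16-word state for 20 rounds, plus a final state addition):

```python
def rotl32(x, n):
    """Rotate a 32-bit word left by n bits."""
    return ((x << n) | (x >> (32 - n))) & 0xffffffff

def quarter_round(a, b, c, d):
    """One ChaCha quarter-round: four add-rotate-xor (ARX) steps."""
    a = (a + b) & 0xffffffff; d = rotl32(d ^ a, 16)
    c = (c + d) & 0xffffffff; b = rotl32(b ^ c, 12)
    a = (a + b) & 0xffffffff; d = rotl32(d ^ a, 8)
    c = (c + d) & 0xffffffff; b = rotl32(b ^ c, 7)
    return a, b, c, d

# Test vector from RFC 8439, section 2.1.1:
assert quarter_round(0x11111111, 0x01020304, 0x9b8d6f43, 0x01234567) == \
       (0xea2a92f4, 0xcb1cf8ce, 0x4581472e, 0x5881c4bb)
```

    No S-boxes and no data-dependent table lookups, which is exactly why it runs in constant time on commodity CPUs where table-based AES does not.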

    These are among the algorithms we're going to use to rebuild all this. We have trust models to fix, too, which are closer to the root of the Big Problem - more transparent auditing, pinning with DANE and DNSSEC, there's a lot of possible things we could do to harden all that, and we're going to do all that. And the protocols can be improved significantly too, and we're working on that: things like encrypting the ClientHello from the very beginning WILL be part of TLS 1.3, flying pigs be damned.

    We have a lot of work ahead of us rebuilding all this, and it's going to take a long time. Needs to be done very carefully, openly, and transparently. Feel free to come and help (if you know what the fuck you're doing).

    I'm not pulling my punches. (I haven't so far, as you might see if you look me up.) I want to make sure the right choices are made for the right reasons, and GCHQ and NSA can kiss my ass. As engineers, we try to be non-political, but that doesn't mean we can't be really pissed at literally being double-crossed and lied to, and attacked by the people (ostensibly) whose job it is to protect us. I am very, very bitterly angry with them for fucking up their COMSEC mission to further their SIGINT mission: it's nothing less than a billion-dollar funded betrayal of national and international security, from a technical perspective, and one that we have to go and fix - because they won't, and we could never trust them to anyway.

    We're not silent. We're busy. Other people can talk about political ramifications. We'll work to solve the technical ones. We'll meet in the middle. /akr

  • So should academics use the same research and release standards with security bugs/exploits as government "hacks"?

    The general consensus is that researchers that find vulnerabilities in commercial software at least notify the vendor and give them a chance to mitigate the problem before releasing a potential attack blue-print to the universe. Should researchers notify government entities before releasing their research on their "hacks", or are they to treat the government as a black-hat?

    It's an interesting c

  • What a non-story. The flaws in Dual EC DRBG were widely published shortly after release.

    The backdoor was first published by Dan Shumow and Niels Ferguson [cr.yp.to] in August 2007.

    Bruce Schneier [wired.com] wrote the same year:

    My recommendation, if you're in need of a random-number generator, is not to use Dual_EC_DRBG under any circumstances. If you have to use something in SP 800-90, use CTR_DRBG or Hash_DRBG.

    This was common knowledge if you had more than a passing interest in cryptography. I think TFA is mistaken when it says that it didn't get enough attention. The reason academics didn't take it more seriously is that it was seen as so obvious, it was mostly harmless shenanigans.

    You would only use it in a ser
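    Shumow and Ferguson's observation is simple to sketch. Below is a toy model of Dual_EC_DRBG using a multiplicative group mod a small prime in place of the P-256 curve, with no output truncation and made-up numbers throughout. The point: whoever knows the discrete log d relating the two generator constants can recover the generator's internal state from a single output and predict everything that follows.

```python
p = 30803              # toy prime; the real spec works on the P-256 curve
gQ = 2                 # stand-in for the point Q
d = 1234               # the trapdoor: only whoever chose the constants knows d
gP = pow(gQ, d, p)     # stand-in for P = d*Q

def step(s):
    """One Dual_EC_DRBG-style step: returns (next_state, output)."""
    r = pow(gP, s, p)                     # r  = x(s*P)
    return pow(gP, r, p), pow(gQ, r, p)   # s' = x(r*P), output = x(r*Q)

def generate(state, n):
    """Run the generator n steps, returning the outputs."""
    outs = []
    for _ in range(n):
        state, o = step(state)
        outs.append(o)
    return outs

outs = generate(4242, 5)
# The attacker sees only outs[0] but knows d:
#   outs[0]^d = (gQ^r)^d = (gQ^d)^r = gP^r, which is exactly the next state.
state = pow(outs[0], d, p)
assert generate(state, 4) == outs[1:]   # every later output predicted
```

    In the real algorithm the output is a truncated x-coordinate, so the attacker additionally brute-forces the dropped bits; the principle is identical.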

Research is to see what everybody else has seen, and think what nobody else has thought.
