
NSA Backdoors In Open Source and Open Standards: What Are the Odds?

New submitter quarrelinastraw writes "For years, users have conjectured that the NSA may have placed backdoors in security projects such as SELinux and in cryptography standards such as AES. However, I have yet to see a serious scientific analysis of this question, as discussions rarely get beyond general paranoia facing off against a general belief that government incompetence plus public scrutiny make backdoors unlikely. In light of the recent NSA revelations about the PRISM surveillance program, and the report that Microsoft tells the NSA about bugs before fixing them, how concerned should we be? And if there is reason for concern, what steps should we take individually or as a community?" Read more below for some of the background that inspires these questions.
quarrelinastraw "History seems relevant here, so to seed the discussion I'll point out the following for those who may not be familiar. The NSA opposed giving the public access to strong cryptography in the '90s because it feared cryptography would interfere with wiretaps. They proposed a key escrow program so that they would have everybody's encryption keys. They developed a cryptography chipset called the "clipper chip" that gave a backdoor to law enforcement and which is still used in the US government. Prior to this, in the 1970s, NSA tried to change the cryptography standard DES (the precursor to AES) to reduce keylength effectively making the standard weaker against brute force attacks of the sort the NSA would have used.

Since the late '90s, the NSA appears to have stopped opposing public cryptography and instead appears to be actively encouraging its development and strengthening. The NSA released the first version of SELinux in 2000, four years after it canceled the Clipper chip program due to the public's lack of interest. It is possible that the NSA simply gave up its fight against public access to cryptography, but it is also possible that it moved its resources into social engineering — getting the public to voluntarily install backdoors that are inadvertently endorsed by security experts because they appear in GPLed code. Is this pure fantasy? Or is there something to worry about here?"

  • by Viol8 ( 599362 ) on Tuesday July 02, 2013 @08:54AM (#44164255) Homepage

.... but unless they have the world's top obfuscating coders working there (quite possible), how long do you think it would be until someone spotted the suspect code, especially in something as well trodden as the Linux kernel or the GNU utilities? I would guess not too long.

  • by feld ( 980784 ) on Tuesday July 02, 2013 @08:58AM (#44164301)

    http://marc.info/?l=openbsd-tech&m=129236621626462&w=2 [marc.info]

    Some guy claimed to have put backdoors in the OpenBSD IPSEC stack for the FBI, but a full audit proved no such thing ever happened.

    I seriously doubt this is happening in open source.

  • by cHiphead ( 17854 ) on Tuesday July 02, 2013 @09:00AM (#44164329)

    Why not just insert something at the compiler level? Perhaps they have compromised GCC itself, or something at a different, less obvious point in the development process.

  • The Clipper chip (Score:5, Interesting)

    by Vintermann ( 400722 ) on Tuesday July 02, 2013 @09:01AM (#44164333) Homepage

    You mention the Clipper chip and its key escrow system guaranteeing government access, but what you should remember is that the cryptosystem that chip used was:

    1. Foolishly kept secret by the NSA, although it has long been understood that academic scrutiny is far more important than security through obscurity, and

    2. Built on a symmetric cipher, Skipjack, that was subject to a devastating attack (breaking half the rounds) on its first day of declassification and is by now completely broken. That remains rare for any seriously proposed cipher...

    Since the NSA presumably did not set out to make a broken cryptosystem (why would they? To help other spies? They themselves had the keys anyway!), this illustrates that yes, incompetence is a concern even at super-funded, super-powerful agencies like the NSA.

  • by zerro ( 1820876 ) on Tuesday July 02, 2013 @09:15AM (#44164465)

    Why backdoor just one brand of compiler (there are several), when you could backdoor the architecture?
    I'm pretty sure there is a special sequence of Intel instructions that opens the unicorn gate and pipes a copy of all memory writes to an NSA server.

  • Re:This is stupid (Score:5, Interesting)

    by Hatta ( 162192 ) on Tuesday July 02, 2013 @09:19AM (#44164525) Journal

    Encryption algorithms may be secure, but how sure are you that your implementation is? Debian was generating entirely insecure SSL keys for a couple of years before anyone noticed. Couldn't the NSA do something like that, but a bit more clever, and remain unnoticed?
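
    The Debian incident is a useful model of how an "innocent" change can gut key generation. Below is a minimal Python sketch of that class of flaw (illustrative only, not OpenSSL's actual code): if a patch quietly leaves the process ID as the only entropy source, the whole "random" key space collapses to at most 32,768 values, which an attacker can enumerate in seconds.

        import random

        # Illustrative sketch of the Debian OpenSSL class of bug: a patch leaves
        # the process ID as essentially the only entropy source, so every
        # "random" key is drawn from one of at most 32768 seeds.
        def weak_keygen(pid: int, nbits: int = 128) -> int:
            rng = random.Random(pid)  # seeded only by the PID
            return rng.getrandbits(nbits)

        # Enumerating the entire effective key space takes seconds:
        candidates = {weak_keygen(pid) for pid in range(1, 32769)}
        print(len(candidates), "possible keys in total")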

  • Re:This is stupid (Score:3, Interesting)

    by mitcheli ( 894743 ) on Tuesday July 02, 2013 @09:25AM (#44164607)
    The bigger threat to encryption isn't the pre-computed magic numbers that the NSA may or may not have placed into key algorithms; it is the advance of technology, which will eventually render useless the models we rely on today [wikipedia.org].
  • Depends (Score:5, Interesting)

    by Sycraft-fu ( 314770 ) on Tuesday July 02, 2013 @09:30AM (#44164653)

    Check out the Underhanded C Contest (http://underhanded.xcott.com/). There are great examples of code that looks innocuous but isn't. What's more, some of the entries look like legitimate mistakes that people might make while programming.

    So that is always a possibility. Evil_Programmer_A, who works for whatever Evil Group wants to be able to hack things, introduces a patch for some OSS project. However, there's a security hole coded in on purpose. It is hard to see, and if discovered it will just look like a fuckup. Eventually it'll probably get found and patched, but nobody suspects Evil_Programmer_A of any malfeasance. I mean, shit, security issues happen all the time. People make mistakes.
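
    For a concrete flavor of what such a deniable "mistake" can look like, here is an invented Python analogue (not an actual contest entry): a token check that passes every functional test but compares secrets with ordinary string equality, which short-circuits at the first mismatched byte and therefore leaks timing information.

        import hashlib
        import hmac

        def verify_token(supplied: str, secret: bytes, message: bytes) -> bool:
            expected = hmac.new(secret, message, hashlib.sha256).hexdigest()
            # Passes every functional test, but == bails out at the first
            # wrong byte, so response timing leaks the digest byte by byte.
            # An honest author would write:
            #     return hmac.compare_digest(supplied, expected)
            return supplied == expected

    If this were ever spotted, it would read as an ordinary oversight, which is exactly the deniability described above.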

    In terms of how long it takes to spot? That depends on how subtle it is. If you think all bugs get found really fast in OSS, you've never kept up on security vulnerabilities. Sometimes one is found that has been around for a LONG time. I remember back in 2000 there was a BIND vulnerability that applied to basically every version of BIND ever. It had been lurking for years and nobody had caught it. Worse, it was a zero-day kind of thing and people were already exploiting it. Caused a lot of grief for my roommate. By the time he heard about it (which was pretty quick; he subscribed to that kind of thing), the server at his work had already been owned.

    Don't think that just because the code is open it gets heavily audited by experts. Also don't think that just because an expert looks at it they'll notice something. It turns out a lot of security issues are still found at runtime, not by code audit. Everyone looks at the code and says "Ya, looks good to me," and only later, running it and testing how it reacts, do they discover an unintended interaction.

    I'm not trying to claim this is common, or even happening at all, but it is certainly possible. I think people put WAY too much faith in the "many eyes" aspect of OSS. They think that if the code is open, well then people MUST see the bugs! All one has to do is follow a bug tracker to see how false that is. Were it true, there'd be no bugs, ever, in released OSS code. Thing is, it is all written and audited by humans, and humans are fallible. Mistakes happen; shit slips through.

  • by time961 ( 618278 ) on Tuesday July 02, 2013 @09:34AM (#44164691)
    See Biham and Shamir, "Differential Cryptanalysis of the Data Encryption Standard" (CRYPTO '92). They showed that the S-boxes were about as strong as possible given the other design constraints.

    Subsequently, Don Coppersmith, who had discovered differential cryptanalysis while working (as a summer intern) at IBM during the development of DES in the early 1970s, published a brief paper (1994, IBM J. of R&D) saying, "Yep, we figured out this technique for breaking our DES candidates, and strengthened them against it. We told the NSA, and they said 'we already know, and we're glad you've made these improvements, but we'd prefer you not say anything about this.'" And he didn't, for twenty years.

    Interestingly, when Matsui published his (even more effective) DES Linear Cryptanalysis in 1994, he observed that DES was just average in resistance, and opined that linear cryptanalysis had not been considered in the design of DES.

    I think it's fair to say that NSA encouraged DES to be better. But how much they knew at the time, and whether they could have done better still, will likely remain a mystery for many years. They certainly didn't make it worse by any metric available today.
  • Re:This is stupid (Score:3, Interesting)

    by Anonymous Coward on Tuesday July 02, 2013 @09:38AM (#44164745)

    Mostly paranoia then....

    Misdirection rather than paranoia. They're trying to point the finger at Linux etc. when it's Secure Boot that's the vulnerability.

    When you use a board with Secure Boot, you're using pre-compromised hardware. Even when you install a secure OS, the underlying hardware hides the backdoor.

  • Re:This is stupid (Score:5, Interesting)

    by SilenceBE ( 1439827 ) on Tuesday July 02, 2013 @09:46AM (#44164855)
    The majority of people in Belgium speak Flemish (which is closely related to Dutch) and its dialects. The French and German speakers are smaller language groups. The dialect I speak locally (West Flemish) is actually more closely related to the languages of northern France than to those of the Netherlands.

    But what bothers me most about the Belgian mistaken identity is that a lot of American companies and websites serve everything in French when they detect I'm from Belgium. It is as if the rest of the world detected that you are from the States and served everything in Spanish because there is a big Hispanic community.

    It took Microsoft years to get it into their heads that most people here speak Flemish. For years, everything on Xbox Live (that had a French localization) was served in French.
  • Re:This is stupid (Score:4, Interesting)

    by Joce640k ( 829181 ) on Tuesday July 02, 2013 @09:51AM (#44164911) Homepage

    It's not so easy to make sure that a program is a correct implementation of a mathematical algorithm or of an open standard.

    There's a huge list of test vectors for AES published by NIST (among others): http://csrc.nist.gov/archive/aes/rijndael/wsdindex.html [nist.gov]

    The chances of writing code that reproduces those values but ISN'T AES are less than the reciprocal of the number of atoms in the universe.
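
    A known-answer check is only a few lines. This sketch assumes the third-party pycryptodome package for the AES primitive; the vector is the AES-128 example from FIPS-197 Appendix C.1:

        from Crypto.Cipher import AES  # assumes pycryptodome is installed

        # AES-128 known-answer test from FIPS-197 Appendix C.1
        key = bytes.fromhex("000102030405060708090a0b0c0d0e0f")
        plaintext = bytes.fromhex("00112233445566778899aabbccddeeff")
        expected = bytes.fromhex("69c4e0d86a7b0430d8cdb78070b4c55a")

        cipher = AES.new(key, AES.MODE_ECB)
        assert cipher.encrypt(plaintext) == expected, "not AES"

    Note that matching the vectors only proves the cipher core is AES; it says nothing about how keys are generated or handled around it, which is where bugs like Debian's lived.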

  • by s.petry ( 762400 ) on Tuesday July 02, 2013 @10:06AM (#44165057)

    As a followup to my other response: if this magical backdoor into every Windows system on the planet is so great, then why did Stuxnet ever need to exist?

    The NSA should have had built-in access to every Iranian Windows computer without the need for a highly complex malware package!

    You fail to understand the difference between a back door and spyware. A back door would allow the installation of such a piece of software, but it would not be the spyware itself. This code worked around Windows' normal security protections for privilege escalation, and it also evaded detection by AV software. In addition, leaked information has indicated that there are back doors in Windows for the US government (and perhaps other governments). What remains unclear is whether the NSA has people working at MS to ensure that back doors exist and are known to them, or whether MS employees simply accommodate requests to create them.

  • Not likely (Score:4, Interesting)

    by amck ( 34780 ) on Tuesday July 02, 2013 @10:09AM (#44165095) Homepage

    This misses the dual goals of the NSA:
    (1) Break other people's communications.
    (2) Protect US (govt?) ones.

    The trouble with backdoors is that they can be used by others to break US systems. So this is not the preferred solution from the NSA's perspective.

    A good lesson in this is the DES cipher. DES was a 56-bit cipher based on IBM's original 128-bit Lucifer algorithm. When it was released, everybody worried about the S-boxes and the design and wondered whether the NSA had created a backdoor for itself. As attacks on Feistel-network ciphers (such as DES) were found, it became apparent that DES was already hardened against them: the NSA knew of these attacks and had produced the hardest 56-bit cipher possible. Their strategy became apparent: by setting the strength at 56 bits, they created a cipher that they could break because they had the processing power, but that no one else could (at the time).
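
    The arithmetic behind that strategy is easy to check. Assuming a hypothetical search rate of one billion keys per second (the figure is invented for illustration):

        # Rough brute-force arithmetic under an assumed 1e9 keys/sec rate.
        SECONDS_PER_YEAR = 365 * 24 * 3600
        rate = 10 ** 9

        for bits in (56, 128):
            years = 2 ** bits / rate / SECONDS_PER_YEAR
            print(f"{bits}-bit key space: {years:.3g} years to exhaust")

        # 56 bits: ~2.3 years at this rate, tractable for an agency that can
        # run many search machines in parallel; 128 bits: ~1e22 years.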

    Similarly today: it's apparent that 22 years after PGP was created, mail is still not encrypted by default. The NSA's strategy is to help push the design of open standards to suit its goals: small enough quantities of encryption that brute force or black-bag jobs can be used as required.

  • Re:This is stupid (Score:4, Interesting)

    by interval1066 ( 668936 ) on Tuesday July 02, 2013 @11:09AM (#44165893) Journal

    Hell, the State of California practically does that now.

    Practically? In some parts of Southern California I could walk outside my front door and not be able to read the commercial signs. You'd never know the official language of the country was English.

  • by johnrpenner ( 40054 ) on Tuesday July 02, 2013 @11:37AM (#44166267) Homepage

    yeah — like ken thompson's C compiler exploit:

    http://scienceblogs.com/goodmath/2007/04/15/strange-loops-dennis-ritchie-a/ [scienceblogs.com]

    For debugging purposes, Thompson put a back-door into “login”. The way he did it was by modifying the C compiler. He took the code pattern for password verification, and embedded it into the C compiler, so that when it saw that pattern, it would actually generate code that accepted either the correct password for the username, or Thompson’s special debugging password. In pseudo-Python:

        def compile(code):
            if (looksLikeLoginCode(code)):
                generateLoginWithBackDoor(code)
            else:
                compileNormally(code)
    With that in the C compiler, any time anyone compiles login, the code generated by the compiler will include Thompson’s back door.

    Now comes the really clever part. Obviously, if anyone saw code like what’s in that example, they’d throw a fit. That’s insanely insecure, and any manager who saw that would immediately demand that it be removed. So, how can you keep the back door, but get rid of the danger of someone noticing it in the source code for the C compiler? You hack the C compiler itself:

        def compile(code):
            if (looksLikeLoginCode(code)):
                generateLoginWithBackDoor(code)
            elif (looksLikeCompilerCode(code)):
                generateCompilerWithBackDoorDetection(code)
            else:
                compileNormally(code)
    What happens here is that you modify the C compiler code so that when it compiles itself, it inserts the back-door code. So now when the C compiler compiles login, it will insert the back door code; and when it compiles the C compiler, it will insert the code that inserts the code into both login and the C compiler.

    Now, you compile the C compiler with itself – getting a C compiler that includes the back-door generation code explicitly. Then you delete the back-door code from the C compiler source. But it’s in the binary. So when you use that binary to produce a new version of the compiler from the source, it will insert the back-door code into the new version.

    So you’ve now got a C compiler that inserts back-door code when it compiles itself – and that code appears nowhere in the source code of the compiler.
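
    To make the first stage concrete, here is a toy, deliberately simplified stand-in (the function names and the master password are invented, and "compiling" is reduced to source-to-source rewriting):

        # Stage 1 only: recognize the login program and splice in a master
        # password. Stage 2 would add a second pattern that recognizes the
        # compiler's own source and re-inserts this very logic.
        MARKER = "def check_password(user, password):\n"
        BACKDOOR = '    if password == "joshua":\n        return True\n'

        def evil_compile(source: str) -> str:
            if MARKER in source:
                return source.replace(MARKER, MARKER + BACKDOOR)
            return source

        login_src = (
            "def check_password(user, password):\n"
            "    return stored_hash(user) == hash_of(password)\n"
        )
        print(evil_compile(login_src))  # shows the injected master password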
