Microsoft Opens 'Transparency Center' For Governments To Review Source Code

MojoKid writes with news that Microsoft has announced the opening of a 'Transparency Center' at their Redmond campus, a place where governments that use Microsoft software can come to review the source code in order to make sure it's not compromised by outside agencies. (The company is planning another Transparency Center for Brussels, Belgium.) In addition, Microsoft announced security improvements to several of its cloud products: As of now, Outlook.com uses TLS (Transport Layer Security) to encrypt inbound and outbound email in transit, assuming that the provider on the other end also uses TLS. The TLS standard has been in the news fairly recently after the discovery of a major security flaw in one popular implementation (GnuTLS), but Microsoft notes that it worked with multiple international companies to secure its version of the standard. Second, OneDrive now uses Perfect Forward Secrecy (PFS). Microsoft refers to this as a type of encryption, but PFS isn't a cipher like AES or 3DES; instead, it's a method of ensuring that an attacker who obtains one session key cannot use it to break the entire key sequence. Even if you manage to gain access to one file or folder, in other words, that information can't be used to compromise the entire account.
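The "assuming that the provider on the other end also uses TLS" caveat refers to opportunistic STARTTLS: mail is only encrypted in transit when the receiving server advertises support for it. A minimal sketch of how one might probe for that (the host name is a placeholder, not any real provider):

```python
# Sketch: probe whether a receiving mail server offers STARTTLS.
# "mx.example.com" is a placeholder host, not a real provider.
import smtplib

with smtplib.SMTP("mx.example.com", 25, timeout=10) as smtp:
    smtp.ehlo()
    if smtp.has_extn("starttls"):
        print("STARTTLS offered: mail to this host can be encrypted in transit")
    else:
        print("no STARTTLS: mail to this host would travel as plaintext")
```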
  • What's the point? (Score:5, Insightful)

    by Anonymous Coward on Tuesday July 01, 2014 @11:16PM (#47366143)

    Governments shouldn't be using closed source garbage to begin with. It just locks them into a specific company and keeps them at their mercy, not to mention that even if the government reviews the source, the public can't do the same. Not a good message to send.

    • Re: (Score:2, Insightful)

      by Anonymous Coward
      The alternative is for governments to use open source software and manage software development and maintenance themselves (or contract it out). Looking at the fumbling attempts at IT projects from just about any government, I wouldn't trust their competence enough to extend them more responsibilities.
      • Re:What's the point? (Score:5, Interesting)

        by AHuxley ( 892839 ) on Tuesday July 01, 2014 @11:39PM (#47366235) Journal
        At least then it's your own country's option. No colonial box or product to buy, then rent support for and beg for fixes.
        A domestic IT project at least offers your best experts to set standards and review the code.
        Other nations do not all fail at complex math, code, design or funding.
        Other nations may try to keep 5+ other countries out of a networked product as delivered.
        • At least then it's your own country's option.

          Isn't it already their option?

        • A domestic IT project at least offers your best experts to set standards and review the code.

          Providing you pay them enough and they want to. Unless you run a dictatorship, of course, and can force people to work for the government.

        • At least then it's your own country's option.

          I sympathise with the sentiment, but the well-now-it-just-doesn't-work-at-all problem is real. A large proportion of IT projects fail. Government IT projects are no different. (If anything I assume they're worse, but I don't have numbers.) Pursuing a low-risk route, even if it means depending on Microsoft, isn't necessarily a mistake.

          Other nations do not all fail at complex math, code, design or funding.

          I presume you are writing as an American. You are quite mistaken [bbc.co.uk].

          Other nations may try to keep 5+ other countries out of a networked product as delivered.

          What?

          • by AHuxley ( 892839 )
            Other nations may try to keep 5+ other countries out of a networked product as delivered.
            Re "What?"
            Recall:
            "Microsoft handed the NSA access to encrypted messages" (12 July 2013)
            http://www.theguardian.com/wor... [theguardian.com]
            Outlook.com encryption unlocked even before official launch
            "...collaborated closely with US intelligence services to allow users' communications to be intercepted, including helping the National Security Agency to circumvent the company's own encryption, according to top-secret documents obta
        • by donaldm ( 919619 ) on Wednesday July 02, 2014 @08:42AM (#47368027)
          Providing the source code for Microsoft software to governments sounds like a PR exercise. You would need the appropriate government representatives to be able to understand the source code for starters, as well as being able to test it and to certify that a specific build and its updates are actually from that source code. Personally I can't see that actually happening, especially if said representatives have to sign a Non-Disclosure Agreement.

          Still, I am quite sure Microsoft PR will state "this is our source code, and trust us, this compiles to make the binaries you are using," and I am quite sure many government representatives will be quite satisfied with this, since they are effectively "locked in" to using Microsoft products anyway and it (to them) is a better alternative to using that "Communist" Linux thingy :)
    • Re:What's the point? (Score:5, Interesting)

      by dotancohen ( 1015143 ) on Wednesday July 02, 2014 @12:58AM (#47366501) Homepage

      Governments shouldn't be using closed source garbage to begin with. It just locks them into a specific company and keeps them at their mercy, not to mention that even if the government reviews the source, the public can't do the same. Not a good message to send.

      Actually, the _real_ point here is that Microsoft is now implying, quite strongly, that open-source software is preferable for security, privacy, and other sensitive purposes.

      I hope the governments and other entities that this program targets are smart enough to read between the lines.

      • Re:What's the point? (Score:4, Interesting)

        by viperidaenz ( 2515578 ) on Wednesday July 02, 2014 @02:07AM (#47366659)

        Microsoft isn't implying that. They're trying to convince customers they don't have NSA backdoors.

        • by Dr_Barnowl ( 709838 ) on Wednesday July 02, 2014 @02:42AM (#47366713)

          And who says they build their binaries from those sources? The backdoors are probably kept in a separate branch and merged with the release branch at build time...

          • Re:What's the point? (Score:5, Informative)

            by jenningsthecat ( 1525947 ) on Wednesday July 02, 2014 @06:16AM (#47367209)

            And who says they build their binaries from those sources? The backdoors are probably kept in a separate branch and merged with the release branch at build time...

            This, exactly. Now if Microsoft allowed governments to build their own binaries from the source they had just finished reviewing, there might be some reassurance that this isn't just a smoke-and-mirrors act. Then again, the toolchain might be compromised. Somehow I don't think MS will allow governments to have access to the toolchain sources as well. And even if they did, I suspect most governments don't have the resources to conduct such a comprehensive review.

          • I said trying to convince, not proving.

        • By showing them the source code. See how that works?

        • "Microsoft isn't implying that. They trying to convince customers they don't have NSA backdoors."

          Yes, this smells more like a PR move than anything else. Any government serious about security will roll out its own software stack, which, unlike hardware, costs practically nothing after the initial development. This will limit the attack vector to rogue chips.

          • So you're saying no government is serious about security, because they all use proprietary software to some degree. Most use Windows on their desktops and at least some of their servers.

        • Microsoft isn't implying that. They're trying to convince customers they don't have NSA backdoors. ... by opening the source to their products (to certain parties). In other words, you cannot trust software unless you can see the source. It's a pretty clear implication to me.

      • Actually, the _real_ point here is that Microsoft is now implying, quite strongly, that open-source software is preferable for security, privacy, and other sensitive purposes.

        You're spinning it quite strongly.

        • You're spinning it quite strongly.

          You must be new here :)

        • It's not a spin at all. In order for Microsoft to prove to governments that their software does what it's supposed to, they are willing to share the source, because that's the only way you can really trust software. It's not spin to say that you cannot truly trust software unless you can see the source (and understand it, and be able to build it, etc., etc.)

  • If they can review the source, it's not really closed, is it?

    • More to the point - how do they know that's the code they're running?

      Unless they can compile their own binaries with their own compilers it could be all smoke and mirrors anyway.

  • by UrsaMajor987 ( 3604759 ) on Tuesday July 01, 2014 @11:17PM (#47366145)
    Ken Thompson on trusting trust. http://cm.bell-labs.com/who/ke... [bell-labs.com]
    • by Anubis IV ( 1279820 ) on Tuesday July 01, 2014 @11:53PM (#47366307)

      The TL;DR version for folks who haven't seen it before or don't want to read it (which you really should do): just because the source is trustworthy doesn't mean the binaries are. The process to accomplish this sort of attack is fairly straightforward:
      1) Modify, say, the compiler's source code so that it adds backdoors to some/all of the code it compiles.
      2) Compile it, then replace the clean binary for the compiler with this new, tainted binary.
      3) Revert the changes to the compiler's source code, erasing any evidence of wrongdoing.

      By itself, that doesn't create a backdoor, but anything compiled using the tainted binary could potentially have a backdoor secretly added, even though the source code for both that code and the compiler would appear to be perfectly clean. The problem could be very hard to discover or pin down as well, only manifesting when a particular file is getting compiled, or even a particular line of code.
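      To make that concrete, here's a toy sketch of the self-propagating taint. Everything here is invented for illustration (Thompson's original targeted the C compiler and the Unix login program), and a real attack would operate on binaries rather than source strings:

```python
# Toy model of the trusting-trust taint. The trigger strings and names are
# invented; this operates on source text only, to show the two triggers.

BACKDOOR = '\nif password == "joshua": return True  # injected, never in any source\n'

def tainted_compile(source: str) -> str:
    """Pretend compiler: returns the code it would emit for `source`."""
    output = source
    if "def check_password(" in source:
        # Trigger 1: backdoor any login-style routine it compiles.
        output += BACKDOOR
    if "def compile(" in source:
        # Trigger 2: when asked to compile a (clean) compiler, re-emit this
        # very injection logic, so the taint survives a source-code audit.
        output += "\n# ...both triggers re-inserted into the new compiler binary...\n"
    return output
```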

      I think most of us are already familiar with this sort of attack, but it's worth repeating, since it's exactly the sort of thing that Microsoft's "Transparency Centers" don't address, and exactly the sort of thing we'd be expecting a government to be doing.

      • by AHuxley ( 892839 )
        A decade later in some distant country: "but the checksums matched"
      • I've seen this repeated a few times. It looks like a great attack vector. While it's certainly within the realm of possibility, I wonder, however, if it is in the realm of practicality.

        I mean, something like this, hard to find, which creates an almost undetectable security flaw (imagine the team going through the source trying to find the bug), would be hellishly desirable, which makes me wonder why this attack hasn't been seen in the wild before.

        • I mean something like this, hard to find, which creates an almost undetectable security flaw [...]

          [...] makes me wonder why this attack hasn't been seen in the wild before.

          Seems like you answered your own question.

          Besides which, the flaw may be ephemeral. Ideally, the flaw would be done in such a way that when the compiler tried to recompile itself (e.g. updating it with new features), it would re-add the flaw to its own binary, perpetuating the cycle. But the bad guys don't have to do it that way. They could just as easily leave out that code and only allow the flaw to exist in one version of the compiler. Whenever the compiler gets updated, the backdoor logic in the compiler

        • This hole is easily found and defeated, provided you have two independent compilers. You don't actually have to trust either, only that they aren't jiggered in the same way.

          Say you suspect compiler A. Take its source (A') and compile it with compilers A and B. Let's call the results A(A') and B(A'). Since A and B doubtless do different things, there's likely to be a whole lot of differences, so you won't be able to tell if there's a backdoor in A, although if there is it will be in A(A') and not B(A'

          • This hole is easily found and defeated, provided you have two independent compilers. You don't actually have to trust either, only that they aren't jiggered in the same way.

            Say you suspect compiler A. Take its source (A') and compile it with compilers A and B. Let's call the results A(A') and B(A'). Since A and B doubtless do different things, there's likely to be a whole lot of differences, so you won't be able to tell if there's a backdoor in A, although if there is it will be in A(A') and not B(A').

            Thing is, since A' is the source for a compiler, A(A') and B(A') are compilers, and since both A and B compile the same language they should do the same thing, agreeing with the language semantics of A' - assuming, of course, that there's nothing underhanded going on. Therefore, we can use these newly compiled compilers to compile A', getting (A(A'))A' and (B(A'))A'. These should be identical, since they were compiled by compilers that ostensibly do the same thing. If they're not, you've found a problem, and you can use (B(A'))A' instead of A in the future.

            IIRC from building GCC, I believe GCC does some of that as part of its build process: it builds a version of itself to build itself, so the executable you finally get is built by itself, not by the system compiler. It's been a while, but I believe it does that even if you are not doing cross-platform builds.

            Sure you don't have a second compiler to compare against, but it's a pretty good guarantee that the compiler is what the code said it is.
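            For the curious, this procedure is essentially David A. Wheeler's "diverse double-compiling". A rough sketch of the two-stage check, where the compiler names and file paths are hypothetical:

```python
# Sketch of the double-compilation check described above, assuming two
# independently sourced compilers on PATH ("cc-a", "cc-b") and the suspect
# compiler's source "compiler_a.c"; all names are hypothetical.
import hashlib
import subprocess

def build(compiler: str, source: str, out: str) -> str:
    """Compile `source` with `compiler` and return the output's SHA-256."""
    subprocess.run([compiler, source, "-o", out], check=True)
    with open(out, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Stage 1: compile A' with both A and B. Those binaries will differ; that's fine.
build("cc-a", "compiler_a.c", "a_of_a")
build("cc-b", "compiler_a.c", "b_of_a")

# Stage 2: both stage-1 outputs implement the same language spec (A'), so
# using each of them to compile A' again should yield bit-identical results.
h1 = build("./a_of_a", "compiler_a.c", "aa")
h2 = build("./b_of_a", "compiler_a.c", "ba")
print("compilers agree" if h1 == h2 else "mismatch: a taint or non-determinism")
```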

      • By itself, that doesn't create a backdoor, but anything compiled using the tainted binary could potentially have a backdoor secretly added, even though the source code for both that code and the compiler would appear to be perfectly clean.

        ...And solutions against this do exist:

        A. Deterministic building.
        All software where security is important (Tor, TrueCrypt, and Bitcoin, to mention a few that practice this approach) has clear procedures designed to compile a binary in a perfectly repeatable form. A rogue compiler would be easy to detect, because it wouldn't create the same binary as everybody else.
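        A minimal sketch of that check, assuming the project publishes a reference hash for each release (the file name and hash here are placeholders):

```python
# Sketch: deterministic-build check. Rebuild a release yourself, then compare
# your binary's hash against the project's published value (placeholders here).
import hashlib

PUBLISHED_SHA256 = "<hash announced by the project for this release>"

with open("dist/app-1.0-release.bin", "rb") as f:
    local = hashlib.sha256(f.read()).hexdigest()

if local == PUBLISHED_SHA256:
    print("build reproduces the official release")
else:
    print("mismatch: different toolchain/flags, or a tampered compiler")
```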

        B. Comparing compilers.
        Use a small collection of different compilers (a few versions of GCC, a few others of LLVM, etc.) to compile a compiler whose sourc

    • First thing I thought of too. It's only a 30-year-old problem :)

    • by mellon ( 7048 )

      You don't even have to do the Ken Thompson trick. They're showing you source, sure, but is it the actual source from which your binary distro was compiled? Get real. Even if they have good intentions, chances are they don't have a reproducible build process.

  • by Anonymous Coward on Tuesday July 01, 2014 @11:20PM (#47366157)

    Who cares if you can look at the code? What matters is what you're running.

    Looking at the code gives you nothing if you can't compile it to the exact same binary that you are running.

    And even if they let you do that... you still need to trust the compiler, and the compiler that compiled that compiler, etc.

  • by dbIII ( 701233 ) on Tuesday July 01, 2014 @11:22PM (#47366167)
    Perfect Forward Secrecy? Why not call it Excessive Hubris Before Fuckup? Eventually something is going to be more "perfect" even if the thing is quite good.
    • People who get paid to study cryptography come up with the name.

      Eventually something is going to be more "perfect" even if the thing is quite good.

      Actually in this case, perfect refers to the fact that compromising one session's key provides no advantage in cracking another session. You cannot improve that aspect of it, if it is implemented properly.

      Ironic you should speak of hubris.

      • The summary's description of PFS is a complete clusterfuck, of course (this is /. so *obviously* the summary is going to be technically inaccurate, right?). Yours (LordLimecat) is more accurate, but the full concept isn't that hard so I'll explain it below.

        First, some quick basics of TLS (I'm leaving out a lot of details; do *NOT* try to implement this yourself!):

        • A server has a public key and a private key for an asymmetric cipher, such as RSA.
        • When a client connects, the server sends their public key to the client. The public key is used to authenticate the server, so the client knows their connection wasn't intercepted or redirected.
        • The client can also encrypt messages using the public key, and only the holder of the private key (the server) can decrypt those messages.
        • Because RSA and similar ciphers are slow, TLS uses a fast, symmetric cipher (like AES or RC4) for bulk data.
        • Before bulk data can be sent, the client and the server need to agree on a symmetric cipher and what key to use.
        • The process of ensuring that both parties have the same symmetric key is called Key Exchange.
        • Obviously, the key exchange itself needs to be protected; if the key is ever sent in plaintext, an attacker can decrypt the whole session.

        Here's the scenario where PFS matters, and why it is "perfect":

        • SSL/TLS (same concept, just different versions of the protocol really) is being used to secure connections.
        • An attacker (think NSA) has been recording the encrypted traffic, and wants to decrypt it.
        • The attacker has a way to get the private key from the server (a bug like Heartbleed, or possibly just an NSL).

        Here's where it gets interesting:

        • Without PFS (normal SSL/TLS key exchanges), the key exchange is protected using the same kind of public-key crypto used to authenticate the server. Therefore, without PFS, our attacker could use the private key material to either decrypt or re-create the key, and decrypt all the traffic.
        • With PFS, the key exchange is done using randomly generated ephemeral (non-persistent) public and private parameters (Diffie-Hellman key exchange [wikipedia.org]). Once the client and server each clear their private parameters, it is not possible for anybody to reconstruct the symmetric key, even if they later compromise the server's persistent public/private key pair (the one used for authentication).

        It is this property, where the secrets needed to recover an encryption key are destroyed and cannot be recovered even if one party cooperates with the attacker, which is termed Perfect Forward Secrecy. Note that PFS doesn't make any guarantees if the crypto is attacked while a session is in progress (in this case, the attacker could simply steal the symmetric key) or if the attacker compromises one side before the session begins (in which case they can impersonate that party, typically the server). It is only perfect secrecy going forward.
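        To make the ephemeral exchange concrete, here's a toy Diffie-Hellman sketch with deliberately tiny textbook parameters. Real TLS uses large vetted groups or elliptic curves; this only illustrates why discarding the ephemeral secrets gives forward secrecy:

```python
# Toy finite-field Diffie-Hellman with textbook-sized parameters (p=23, g=5).
# Illustration only; never use small parameters for anything real.
import secrets

p, g = 23, 5

# Fresh per-session secrets; each side erases its value when the session ends.
a = secrets.randbelow(p - 3) + 2   # client's ephemeral secret
b = secrets.randbelow(p - 3) + 2   # server's ephemeral secret

A = pow(g, a, p)   # sent in the clear
B = pow(g, b, p)   # sent in the clear, signed with the server's long-term key
                   # (that key authenticates, but never encrypts the session)

assert pow(B, a, p) == pow(A, b, p)   # both sides derive the same session key

# Once a and b are erased, a recorded transcript (A, B, ciphertext) plus the
# server's long-term private key is still not enough to recover the session key.
```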

      • People who get paid to study cryptography come up with the name.

        Just like with "wired equivalent privacy" that we laugh at now? I'd say both have the stench of marketing and excessive hubris.

          • No, it's not "just like that"; they're two entirely different things. WEP was a cipher, and as with all ciphers (other than an XOR OTP) it can have weaknesses, and will eventually be reduced in complexity by improvements in computation. It was also a remarkably weak cipher.

            PFS is not a cipher, it's a principle that isolates the encryption keys between sessions, so that getting a court order and sniffing traffic may compromise one session, but you will need to do that for each session because they all use different,

            • The AC gave a good answer without the childish "kick the puppy" bullshit you two exhibited, making sure you got some insults in on somebody from a different field who was asking a question about this one. Perhaps you should learn from that AC.

              No, it's not "just like that"; they're two entirely different things. WEP was a cipher

              I was giving an example of a name that became inappropriate, and reading between the lines beyond that is a fool's game.

              • You're criticizing the name of an appropriately named cryptographic technique with no knowledge of what it does, why it was named, or who named it. I would say that deserves criticism; Slashdot does not need more armchair experts weighing in on things they don't understand. There's way too much BS as it is now.

    • by Anonymous Coward on Tuesday July 01, 2014 @11:58PM (#47366327)

      "Perfect Forward Secrecy" is a standard term in cryptography. It seems to have been introduced by Diffie, van Oorschot, and Weiner in their paper Authentication and Autheticated Key Exchanges [carleton.ca].

      The description of Perfect Forward Secrecy in the summary seems pretty confused. A cryptographic protocol has perfect forward secrecy if the only long-term key pair is used solely for authentication; that is, to protect against man-in-the-middle attacks and the like. Since you can't perform a man-in-the-middle attack once the message has been transmitted, this means that compromise of the private key only jeopardizes future communications. In contrast, if a service uses RSA or ElGamal in the usual manner, then once the private key is compromised (e.g. via a Heartbleed-like vulnerability), all messages ever transmitted can be decrypted using this private key.

      • by dbIII ( 701233 )
        Now that's a very useful answer, unlike the sniping from cbhacking and LordLimecat.
    • by icebike ( 68054 )

      Furthermore, why is Microsoft bragging about how secure OneDrive is when the NSA documents leaked by Snowden already show that the NSA has total access to your OneDrive?

  • As plain text on a US-branded OS at the end of the fancy new encryption.
    With all the legal obligations in the telco sector, all products have to be wiretap-friendly.
    CALEA obligations should be very clear to the rest of the world by now. The options presented under CISPA should have been noted too.
    Your email, video chat, text, and chat will end up in a neat industry-standard format for law enforcement use. There will be no going dark on any US product shipped.
    "FBI: We need wiretap-ready Web sites - now" (5
  • Don't force bloatware on hapless customers. XP was 1.2GB. XP with SP2 was about 2GB. XP with SP3 is about 7GB. And now Microsoft claims XP is so insecure it cannot be patched anymore, so customers have to buy a new OS which weighs in at 20GB.

    Cut all the crap and come clean. Release the entire source code for XP if you are not going to patch it. Or keep quiet and prepare to be unbelieved even if you speak the truth.

    • by Yunzil ( 181064 )

      Whoa, a *whole* twenty gigabytes? Man, that's about $1.20 in hard drive space.

  • ...and.. (Score:5, Insightful)

    by JustNiz ( 692889 ) on Tuesday July 01, 2014 @11:35PM (#47366219)

    >> a place where governments who use Microsoft software can come to review the source code

    Where's the proof that the source code you see is exactly the same as that which gets compiled to make the Windows you buy?

    Also, does anyone else find it as highly suspicious as I do that this center is only open to governments?

    • Re:...and.. (Score:4, Interesting)

      by AHuxley ( 892839 ) on Tuesday July 01, 2014 @11:47PM (#47366265) Journal
      Re: "Where's the proof that the source code you see is exactly the same as that which gets compiled to make the Windows you buy?"
      Your experts compile/test the code as they wish over time at the site. The end result is then known.
      A magic number (checksum) is then produced for the product as tested on site. The application/suite as shipped then matches those same test numbers,
      i.e. the applications do not have ~extra code added.
      • Comment removed based on user account deletion
        • by AHuxley ( 892839 )
          The fun of the magic numbers :)
          • It wouldn't work at all; there's nothing magic about them numbers.

            The only way to be sure that you got a copy of binaries that corresponded to the source code would be for each agency concerned to get its own copy of the source and build Windows for itself, using its own audited compiler toolchain. This is not something that MS will allow to happen.

      • A magic number (checksum) is then produced for the product as tested on site. The application/suite as shipped then matches those same test numbers.

        And who writes the program that does the test? Who writes the compiler that compiles the test? Who controls the build farm that creates and compares everything? Everything is under the strict, untrustworthy, iron-handed control of the very same criminal (yes, convicted of multiple felonies in multiple courts) organization against whom governments are trying to protect themselves.

        Sorry, but this is all just smoke and mirrors from Microsoft.

      • Sounds good in theory. In practice, what really happens is that some code is shown to you. You are not given a compilation environment. You are specifically prohibited from compiling at all actually. So even if you wanted to build it, you can not. If you think grabbing source files and compiling them without the same options and libs that were used in the distributed binary will get you a binary that can be md5'ed, I have a bridge in Brooklyn to sell to you.

        I have seen the source for Windows before (NT4 and

      • by JustNiz ( 692889 )

        >> Your experts compile/test the code as they wish over time at the site. The end result is then known.

        You can never guarantee any amount of code coverage (especially not 100%) just from black box testing.

  • by Vellmont ( 569020 ) on Tuesday July 01, 2014 @11:37PM (#47366225) Homepage

    So... Microsoft lets governments of the world look at the source code at their special center, and then double-dog-swears that there's nothing fishy going on between the review and the compilation of that source code, like say a patch applied somewhere in the build process? Riiiight.

    If you WERE to put a backdoor in, that's probably how it'd be done. Would you really want a backdoor explicitly in the code for a developer to find? Of course not; you'd put in something only a few people know about. The secret to secret keeping is limiting the number of people who know.

    The other way to hide the backdoor is to make it a hard-to-find bug. Plausible deniability is quite high.

    I have to believe this is good news though. It means a lot of foreign governments are suspicious of closed source software, to the point where Microsoft has had to announce a plan to make their code somewhat less closed.

    • by AHuxley ( 892839 )
      It's the old crypto hardware trick. You can look at all the messages as sent, as much as you like. It's encryption perfection for that decade/generation.
      The plain text comes from the TEMPEST (emission security) friendly keyboard.
      The only magic is getting your gov to buy the system and then use it for years :)
      i.e. buying the system is the way in. Every trapdoor and backdoor is crafted around what the buyer might be aware of.
    • The secret to secret keeping is limiting the number of people who know.

      Not much of a secret now, is it?

    • The other way to hide the backdoor is to make it a hard-to-find bug. Plausible deniability is quite high.

      Reading a huge codebase is an unlikely way to spot backdoors anyway. After a few thousand lines the reader's eyes would glaze over, and anything subtle would be missed. This isn't as easy as looking for two-digit year fields a la Y2K reviews.

      Besides, the Heartbleed bug should have been a clue that open source alone doesn't make security issues "transparent". Somebody has to both read and understand the code to detect these things, and an OS like Windows is so huge that nobody can understand the whole th

  • by Anonymous Coward

    This is nothing more than security theater. We know of the _NSAKEY in Windows NT. All they need to do is give Microsoft an NSA letter to install backdoors and they will do so. Just like Google and everyone else. I am surprised that anyone would fall for this.

    • No such thing as an 'NSA letter' - you're thinking of a National Security Letter, the super-secret demands that are so classified recipients aren't even permitted to tell their own lawyers they received one.

  • That is a great PR move, since the US government has recently been as effective as the New Coke campaign at promoting US companies abroad.
  • So they opened a transparency center for governments. What about some transparency for everyone else who uses their software? Or are we going to continue to be left in the dark?

    To give some context on users' response to Microsoft's products, Windows 8 market share just decreased. [betanews.com] Comparative figures showed that Windows XP's share went up. That's right, the just-discontinued OS is doing better than their current system.

    I can't help but point that this is one of a painful series of mistakes that all happen w

  • Comment removed based on user account deletion
    • No-one involved can reasonably ever work on a comparable OSS project again either.

      For example, contributing to Mono isn't really allowed if you view the sources that MS provide for their .NET runtimes (to help with debugging).

  • Seriously? (Score:5, Insightful)

    by NewtonsLaw ( 409638 ) on Wednesday July 02, 2014 @12:22AM (#47366401)

    Who the hell is going to sit down and scan a few million lines of source code with Microsoft looking over your shoulder and hope to spot a backdoor or two in the process?

    Even then, how can you be sure that the source code they show you is the stuff you're actually running?

    What a PR stunt this is!

  • by erroneus ( 253617 ) on Wednesday July 02, 2014 @12:55AM (#47366495) Homepage

    1. Governments shouldn't use anything proprietary, and the US should follow its own rules (AMD exists because of gov't requirements; why not Microsoft-compatible competitors?)
    2. Vendor lock-in always leads to over-pricing and government waste (also, see #1)
    3. Microsoft did a deal with the devil (US Government) and now wants to regain trust. Sorry Microsoft. Not going to work.

    And did anyone miss the work Facebook has been doing with government? Holy crap. Not only is their censorship completely to the left, they are conducting psych experiments at the request of the US government. I personally avoid the social networking sites and [almost] always have.

    (I have used LinkedIn, due in no small part to my previous employer reducing its staff by over 90%. Oh yeah, now I can talk about it too! Turns out the Fukushima incident and the subsequent lies, deception, inaccuracies and omissions run pretty deep and even found their way to my former employer, a Mitsubishi company. Anyway, LinkedIn... I was checking it from my mobile device and it made mobile pages unusable through CSS and insisted I use an app. I loaded the app and agreed to whatever, and the next thing I knew LinkedIn grabbed my whole address book and pulled it into their servers. I can't say whether they used the data to spam others, but I can say they used it to "suggest links" to my profile. That's pretty dirty and disgusting.)

    Trust is a difficult thing these days... a fragile thing. And I hope companies everywhere, large and small, learn that lesson. They can learn the hard way or they can be good and decent people asking themselves "would I want someone doing this to me?!" (Just like government gun confiscation -- the answer is NO. The government wouldn't allow the citizens to take their guns, so why should the citizens allow government to take theirs?) Of course, too few people care about golden rules of morality because the world is run by psychopaths. Psychopaths think they can just buy trust. That may have been true, but the pendulum has reached its furthest point and is about to swing back the other way. Microsoft and others are only now figuring that out.

  • This is nothing but a feel-good publicity stunt, designed to offset international suspicions that Microsoft works a little too closely with the NSA.

    Pick your favorite product: Windows 7? Office? SQL Server? IIS? It doesn't matter; you are talking about millions of lines of source code. No government or government contractor will have the expertise, time and money to analyze such a mass of code. They will be utterly dependent on Microsoft to point them to the core routines responsible for whatever they're in

  • Microsoft is giving other governments the chance to install their own backdoors by cooperating in special "transparency centers", provided they pay for it and buy enough Microsoft products instead of switching to open source alternatives.

  • by Antique Geekmeister ( 740220 ) on Wednesday July 02, 2014 @04:23AM (#47366921)

    For highly reliable code, knowing that the code you review is the code you actually compile is vital both for stability and security. This can't be done by visual inspection: it requires good provenance at every stage of the game.

    This is actually a security problem with many open-source and freeware code repositories. The authors fail to provide GPG signatures for their tarballs, or to GPG-sign tags for their code. So anyone who can steal access can alter the code at whim. And anyone who can forge an SSL certificate can replace the HTTPS-based websites and cause innocent users to download corrupted, surreptitiously patched code or tarballs.

    I'm actually concerned for the day that someone sets up a proxy in front of github.com for a localized man-in-the-middle attack to manipulate various targeted projects.
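    As a sketch of the kind of provenance check being asked for here: refuse to build from a tarball unless its detached GPG signature verifies. The file names are hypothetical, and this assumes the maintainer's public key is already in the local keyring:

```python
# Sketch: refuse to build unless the tarball's detached GPG signature verifies.
# File names are hypothetical; the signer's key must already be in the keyring.
import subprocess

result = subprocess.run(
    ["gpg", "--verify", "project-1.0.tar.gz.asc", "project-1.0.tar.gz"],
    capture_output=True, text=True,
)
if result.returncode != 0:
    raise SystemExit("BAD or missing signature, refusing to build:\n" + result.stderr)
print("signature OK: tarball provenance checks out")
```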

  • Unless they let you compile your own binaries and distribute them, this is utterly useless.

  • If it's the government's job to review code, why not use OSS and have control as well as peace of mind? If they have experts capable of reviewing/understanding code, then wouldn't it be more productive to be using OSS so they could make changes that benefit themselves? Or BSD so they could own the solution? Being forced to review code to make sure it's safe pretty much eliminates the benefit* of the closed source software anyway.

    *The benefit being that someone else is supposedly reliably curating the co
  • Unless governments can rebuild the released version of Microsoft products with said source code, they'll be fed a sanitized version of that source code, but not the original full code base needed to build the final binaries. Backdoors could still be added later at build time, so what's the point?

"Being against torture ought to be sort of a multipartisan thing." -- Karl Lehenbauer, as amended by Jeff Daiell, a Libertarian

Working...