Microsoft Opens 'Transparency Center' For Governments To Review Source Code
MojoKid writes with news that Microsoft has announced the opening of a 'Transparency Center' at their Redmond campus, a place where governments that use Microsoft software can come to review the source code in order to make sure it's not compromised by outside agencies. (The company is planning another Transparency Center in Brussels, Belgium.) In addition, Microsoft announced security improvements to several of its cloud products:
As of now, Outlook.com uses TLS (Transport Layer Security) to provide end-to-end encryption for inbound and outbound email — assuming that the provider on the other end also uses TLS. The TLS standard has been in the news fairly recently after discovery of a major security flaw in one popular package (GnuTLS), but Microsoft notes that it worked with multiple international companies to secure its version of the standard. Second, OneDrive now uses Perfect Forward Secrecy (PFS). Microsoft refers to this as a type of encryption, but PFS isn't a standard like AES or 3DES — instead, it's a particular method of ensuring that an attacker who intercepts a particular key cannot use that information to break the entire key sequence. Even if you manage to gain access to one file or folder, in other words, that information can't be used to compromise the entire account.
What's the point? (Score:5, Insightful)
Governments shouldn't be using closed source garbage to begin with. It just locks them into a specific company and keeps them at their mercy, not to mention that even if the government reviews the source, the public can't do the same. Not a good message to send.
Re: (Score:2, Insightful)
Re:What's the point? (Score:5, Interesting)
A domestic IT project at least offers your best experts to set standards and review the code.
Other nations do not all fail at complex math, code, design or funding.
Other nations may try to keep 5+ other countries out of a networked product as delivered.
Re: (Score:2)
At least then it's your own country's option.
Isn't it already their option?
Re: (Score:3)
Not with complex trade deals demanding equal consideration for fully imported systems. The reality that a product line is open to 5+ other nations' security services is not really allowed to stop consideration early.
Re: (Score:2)
UNTRUSTED PARTY OFFERS TOOLS FOR TRUST VALIDATION (Score:2)
:-)
No text.
Re: (Score:2)
A domestic IT project at least offers your best experts to set standards and review the code.
Provided you pay them enough and they want to. Unless you run a dictatorship, of course, and can force people to work for the government.
Re: (Score:2)
At least then it's your own country's option.
I sympathise with the sentiment, but the well-now-it-just-doesn't-work-at-all problem is real. A large proportion of IT projects fail. Government IT projects are no different. (If anything I assume they're worse, but I don't have numbers.) Pursuing a low-risk route, even if it means depending on Microsoft, isn't necessarily a mistake.
Other nations do not all fail at complex math, code, design or funding.
I presume you are writing as an American. You are quite mistaken [bbc.co.uk].
Other nations may try to keep 5+ other countries out of a networked product as delivered.
What?
Re: (Score:2)
Re "What?"
Recall:
"Microsoft handed the NSA access to encrypted messages" (12 July 2013)
http://www.theguardian.com/wor... [theguardian.com]
Outlook.com encryption unlocked even before official launch
"...collaborated closely with US intelligence services to allow users' communications to be intercepted, including helping the National Security Agency to circumvent the company's own encryption, according to top-secret documents obta
Re:What's the point? (Score:4, Insightful)
Still, I am quite sure Microsoft PR will state that this is our source code and "trust us", this compiles to make the binaries you are using; and I am quite sure many government representatives will be quite satisfied with this, since they are effectively "locked in" to using Microsoft products anyway and it is (to them) a better alternative to using that "Communist" Linux thingy.
Re: What's the point? (Score:4, Insightful)
Some of the most expensive IT failures in history have come from contracting it out to the amazingly efficient do no wrong private sector.
Re:What's the point? (Score:5, Interesting)
Governments shouldn't be using closed source garbage to begin with. It just locks them into a specific company and keeps them at their mercy, not to mention that even if the government reviews the source, the public can't do the same. Not a good message to send.
Actually, the _real_ point here is that Microsoft is now implying, quite strongly, that open-source software is preferable for security, privacy, and other sensitive purposes.
I hope the governments and other entities that this program targets are smart enough to read between the lines.
Re:What's the point? (Score:4, Interesting)
Microsoft isn't implying that. They're trying to convince customers they don't have NSA backdoors.
Re:What's the point? (Score:5, Insightful)
And who says they build their binaries from those sources? The backdoors are probably kept in a separate branch and merged with the release branch at build time...
Re:What's the point? (Score:5, Informative)
And who says they build their binaries from those sources? The backdoors are probably kept in a separate branch and merged with the release branch at build time...
This, exactly. Now if Microsoft allowed governments to build their own binaries from the source they had just finished reviewing, there might be some reassurance that this isn't just a smoke-and-mirrors act. Then again, the toolchain might be compromised. Somehow I don't think MS will allow governments to have access to the toolchain sources as well. And even if they did, I suspect most governments don't have the resources to conduct such a comprehensive review.
Re: (Score:2)
I said trying to convince, not proving.
Re: (Score:2)
By showing them the source code. See how that works?
PR move (Score:2)
"Microsoft isn't implying that. They're trying to convince customers they don't have NSA backdoors."
Yes, this smells more like a PR move than anything else. Any government serious about security will roll out its own software stack, which, unlike hardware, costs practically nothing after the initial development. This will limit the attack vector to rogue chips.
Re: (Score:2)
So you're saying no government is serious about security, because they all use proprietary software to some degree. Most use Windows on their desktops and at least some of their servers.
Re: (Score:2)
Microsoft isn't implying that. They're trying to convince customers they don't have NSA backdoors. ... by opening the source to their products (to certain parties). In other words, you cannot trust software unless you can see the source. It's a pretty clear implication to me.
Re: (Score:2)
Actually, the _real_ point here is that Microsoft is now implying, quite strongly, that open-source software is preferable for security, privacy, and other sensitive purposes.
You're spinning it quite strongly.
Re: (Score:2)
You're spinning it quite strongly.
You must be new here :)
Re: (Score:2)
It's not a spin at all. In order for Microsoft to prove to governments that their software does what it's supposed to, they are willing to share the source, because that's the only way you can really trust software. It's not spin to say that you cannot truly trust software unless you can see the source (and understand it, and be able to build it, etc., etc.)
Re: (Score:2)
If they can review the source, it's not really closed is it?
Re:What's the point? (Score:5, Interesting)
If you ask any IT team lead, the real reason is the usability and it-just-works qualities of the software.
If you ask most IT team leads, the real reason is that they know that users in general treat computers like voodoo - perform a particular ritual a particular way, and you get the desired outcome. This lack of mental flexibility means that when someone learns a particular GUI they are not keen to change to a new one - which is the reason you get exactly the same inertia about switching to a new version of MS Office (viz. all that Ribbon hoo-hah) that you do for switching to another OS (with its other applications with other GUIs).
This is the "usability" part of that statement. That's the reason that people railed so heavily against Windows 8. Why do you think MS invests so heavily in giving copies of their software to schools? Get those GUI rituals in people's heads.
As for it-just-works... MS software does plenty of infuriating and irritating does-not-just-work things.
* Linux : I can move a file while I have it open in an editor, and saving the file in the editor saves to the new location
* Windows : Won't let you move the file
Microsoft would solely have to lean on selling support and consultation services after that.
I can imagine that terrifies them; presently, even if you pay for support, you get very little. You get better support for Windows and other MS software from the community. With popular OSS projects, you typically get good support from both the community and the authors, AND you get the ability to look at the source code to understand your problem better or even fix it (or hire a contractor to do this). This is one of the cornerstones of why I use OSS wherever possible in my technology stack - the larger the software company gets, the less my problems matter to them. IBM manages just fine in this model.
Windows works today, out of the box.
This is so untrue on so many levels.
When I install Linux, it usually takes about 20 minutes, with no driver downloads (because I do my homework and buy compatible hardware). Most distros leave you with a machine that has a bunch of useful applications, out of the box.
With Windows, I've had to hunt for drivers, download drivers, slipstream special drivers into special install disk images (so that the install can proceed far enough for the real drivers to be installed...). This is for machines that were sold with Windows and provided with install images. It literally took me all night to reinstall my wife's laptop (reboot! reboot! reboot!) after her office decided that because the Linux install didn't support their proprietary disk encryption program it wasn't suitable (never mind that it had perfectly good encryption on it anyway). And that's just for the core OS, never mind the vast list of applications that you have to add to make it even marginally useful.
At that moment, the Linux guy will still be applying various fancy patches and trying out different distro and desktop environment combinations to see which works best.
I use Linux for all my real, productive work on a daily basis, use stock packages for the vast majority of things, use the standard Ubuntu image, again, out of the box, without doing anything to it bar installing packages and configuring a few of the options a little.
Unlike Windows, I don't need to tweak my install; if I move to another machine (say, a hardware replacement cycle), I can literally move the disk from one machine to another and keep on trucking - Windows throws the most epic tantrum imaginable if you try that. If I want to go crazy and upgrade to a new version of the OS, I back up my home folder, install the new OS, install the packages I had before with a single command, restore my home folder and move over most of my files and config folders... and I'm off again. Again, if you try that on Windows, you're screwed, because mo
Re: (Score:3)
More to the point - how do they know that's the code they're running?
Unless they can compile their own binaries with their own compilers it could be all smoke and mirrors anyway.
Somebody has to do it (Score:3, Interesting)
Re:Somebody has to do it (Score:5, Informative)
The TL;DR version for folks who haven't seen it before or don't want to read it (which you really should do): just because the source is trustworthy doesn't mean the binaries are. The process to accomplish this sort of attack is fairly straightforward:
1) Modify, say, the compiler's source code so that it adds backdoors to some/all of the code it compiles.
2) Compile it, then replace the clean binary for the compiler with this new, tainted binary.
3) Revert the changes to the compiler's source code, erasing any evidence of wrongdoing.
By itself, that doesn't create a backdoor, but anything compiled using the tainted binary could potentially have a backdoor secretly added, even though the source code for both that code and the compiler would appear to be perfectly clean. The problem could be very hard to discover or pin down as well, only manifesting when a particular file is getting compiled, or even a particular line of code.
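The three steps above (Thompson's "trusting trust" attack) can be sketched as a toy model. Everything here is illustrative: the "compiler" is just a Python function from source text to "binary" text, not a real compiler.

```python
# Toy model of the trusting-trust attack. All names are made up for
# illustration; no real compiler works on plain strings like this.

BACKDOOR = "; grant_access_if(password == 'magic')"

def clean_compile(source: str) -> str:
    """An honest compiler: the binary faithfully reflects the source."""
    return f"BINARY[{source}]"

def tainted_compile(source: str) -> str:
    """A compromised compiler *binary*. The compiler's own source can be
    reverted to a perfectly clean state; the taint lives only here."""
    if "login" in source:
        source += BACKDOOR          # step 1: silently backdoor the target
    if "compiler" in source:
        # step 2/3: when asked to rebuild itself from clean source, emit
        # the tainted binary again, so the taint survives every rebuild
        return "BINARY[compiler]"
    return f"BINARY[{source}]"

login_src = "login: check_password(user)"
compiler_src = "compiler: clean, audited source"

print(clean_compile(login_src))            # no backdoor in the output
print(tainted_compile(login_src))          # backdoor silently appended
print(tainted_compile(compiler_src))       # rebuilt compiler is still tainted
```

The point of the sketch: auditing `compiler_src` finds nothing, yet every binary produced by the tainted build is compromised.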
I think most of us are already familiar with this sort of attack, but it's worth repeating, since it's exactly the sort of thing that Microsoft's "Transparency Centers" don't address, and exactly the sort of thing we'd be expecting a government to be doing.
Re: (Score:2)
Re: (Score:2)
I've seen this repeated a few times. It looks like a great attack vector. While it's certainly within the realms of possibility I wonder however if it is in the realm or practicality.
I mean, something like this, hard to find, creating an almost undetectable security flaw (imagine the team going through the source trying to find the bug), would be highly desirable, which makes me wonder why this attack hasn't been seen in the wild before.
Re: (Score:2)
I mean something like this, hard to find, which creates an almost undetectable security flaw [...]
[...] makes me wonder why this attack hasn't been seen in the wild before.
Seems like you answered your own question.
Besides which, the flaw may be ephemeral. Ideally, the flaw would be done in such a way that when the compiler tried to recompile itself (e.g. updating it with new features), it would re-add the flaw to its own binary, perpetuating the cycle. But the bad guys don't have to do it that way. They could just as easily leave out that code and only allow the flaw to exist in one version of the compiler. Whenever the compiler gets updated, the backdoor logic in the compiler
Re: (Score:2)
This hole is easily found and defeated, provided you have two independent compilers. You don't actually have to trust either, only that they aren't jiggered in the same way.
Say you suspect compiler A. Take its source (A') and compile it with compilers A and B. Let's call the results A(A') and B(A'). Since A and B doubtless do different things, there's likely to be a whole lot of differences, so you won't be able to tell if there's a backdoor in A, although if there is it will be in A(A') and not B(A'
Re: (Score:2)
This hole is easily found and defeated, provided you have two independent compilers. You don't actually have to trust either, only that they aren't jiggered in the same way.
Say you suspect compiler A. Take its source (A') and compile it with compilers A and B. Let's call the results A(A') and B(A'). Since A and B doubtless do different things, there's likely to be a whole lot of differences, so you won't be able to tell if there's a backdoor in A, although if there is it will be in A(A') and not B(A').
Thing is, since A' is the source for a compiler, A(A') and B(A') are compilers, and since both A and B compile the same language they should do the same thing, agreeing with the language semantics of A' - assuming, of course, that there's nothing underhanded going on. Therefore, we can use these newly compiled compilers to compile A', getting (A(A'))A' and (B(A'))A'. These should be identical, since they were compiled by compilers that ostensibly do the same thing. If they're not, you've found a problem, and you can use (B(A'))A' instead of A in the future.
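The comparison described above (essentially diverse double-compiling) can be sketched with compilers modeled as deterministic string functions. All names are invented for illustration; a real check would compare actual binaries.

```python
# Toy sketch of comparing two independent compilers A and B, both
# compiling compiler source A'. Names are illustrative only.

A_SRC = "source of compiler A"

def compile_with(compiler_name: str, source: str) -> str:
    """Stage 1: independent compilers emit different bytes from the
    same source (different optimizers, code generators, layouts)."""
    return f"{compiler_name}-built-binary-implementing[{source}]"

def run_compiler_binary(binary: str, source: str) -> str:
    """Stage 2: running a binary built from A_SRC. In the honest case
    its *output* depends only on A_SRC's semantics, not on who built
    it, so the stage-2 results agree bit-for-bit."""
    return f"code-generated-by[{A_SRC}]-for[{source}]"

a_of_a = compile_with("A", A_SRC)               # A(A')
b_of_a = compile_with("B", A_SRC)               # B(A')
print(a_of_a != b_of_a)                         # stage-1 binaries differ: True

stage2_a = run_compiler_binary(a_of_a, A_SRC)   # (A(A'))(A')
stage2_b = run_compiler_binary(b_of_a, A_SRC)   # (B(A'))(A')
print(stage2_a == stage2_b)                     # honest case: True
```

A mismatch at stage 2 is the tell: one of the stage-1 binaries is not implementing the semantics written in A'.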
IIRC from building GCC, I believe GCC does some of that as part of its build process: it builds a version of itself to build itself, so the executable you finally get is built by itself, not the system compiler. It's been a while, but I believe it's doing that even if you are not doing cross-platform builds.
Sure you don't have a second compiler to compare against, but it's a pretty good guarantee that the compiler is what the code said it is.
Deterministic building (Score:3)
By itself, that doesn't create a backdoor, but anything compiled using the tainted binary could potentially have a backdoor secretly added, even though the source code for both that code and the compiler would appear to be perfectly clean.
...And solutions against this do exist:
A. Deterministic building.
All software where security is important (Tor, TrueCrypt, Bitcoin, to mention a few projects that practice this approach) has clear procedures designed to compile a binary in a perfectly repeatable form. A rogue compiler would be easy to detect, because it wouldn't create the same binary as everybody else's.
B. Comparing compilers.
Use a small collection of different compilers (a few version of GCC, a few other of LLVM, etc) to compile a compiler whose sourc
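Option A above boils down to a hash comparison: anyone rebuilding the tagged source with the pinned toolchain must get bit-identical output. A minimal sketch (the byte strings are placeholders, not real binaries):

```python
import hashlib

def digest(binary: bytes) -> str:
    """SHA-256 fingerprint of a build artifact."""
    return hashlib.sha256(binary).hexdigest()

# Placeholder contents standing in for the vendor's release and an
# independent reproducible rebuild of the same tagged source.
official_release  = b"\x7fELF...released binary bytes..."
independent_build = b"\x7fELF...released binary bytes..."

if digest(official_release) == digest(independent_build):
    print("match: the release corresponds to the audited source")
else:
    print("MISMATCH: the toolchain or the release is suspect")
```

With a deterministic build process, a single mismatching digest from any independent builder is enough to raise the alarm.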
Re: (Score:2)
First thing I thought of too. It's only a 30 year old problem :)
Re: (Score:3)
You don't even have to do the Ken Thompson trick. They're showing you source, sure, but is it the actual source from which your binary distro was compiled? Get real. Even if they have good intentions, chances are they don't have a reproducible build process.
Code vs Binaries: Big Difference (Score:5, Insightful)
Who cares if you can look at the code? What matters is what you're running.
Looking at the code gives you nothing if you can't compile it to the exact same binary that you are running.
And even if they let you do that... you still need to trust the compiler, and the compiler that compiled that compiler, etc.
Opensource (Score:4, Informative)
The main advantages of free/libre open-source software are:
- source is available to review and hack upon for a WAY MUCH LARGER audience. It's "a few security reviewers cherry picked by a government" vs. "virtually anybody who has the time and resource to invest in it".
So you have a bigger pool from which to pick somebody who "is going to understand everything at every layer", or at least understand big enough parts of it, at a large enough number of layers, with enough overlap with the other "somebodies".
- the whole ecosystem is open. You can review lots of other stuff (compilers, libraries, etc.). You can have deterministic building to check whether you really have the code that produced the official binaries (that's already something that Tor, TrueCrypt, Bitcoin, etc. are doing).
There's lot of things that you can do to check every piece of software that you need to trust.
Well, of course, that's a lot of work required. So in the end, you'll end up having to trust multiple other people anyway. But at least, with open source, that's a choice, and in any case you can do the checks yourself (or, more realistically, ask someone you actually trust to do it for you, as in the current ongoing review of TrueCrypt, for example).
Whereas, no matter how motivated, with closed source software you'll always hit a wall. (Well, Microsoft gives you a peek at the Windows code, but not necessarily all the rest needed to check full security.)
Who thinks up these names? (Score:3)
Re: (Score:2)
People who get paid to study cryptography come up with the name.
Eventually something is going to be more "perfect" even if the thing is quite good.
Actually in this case, perfect refers to the fact that compromising one session's key provides no advantage in cracking another session. You cannot improve that aspect of it, if it is implemented properly.
Ironic you should speak of hubris.
Somebody much smarter than you, dbIII (Score:5, Informative)
The summary's description of PFS is a complete clusterfuck, of course (this is /. so *obviously* the summary is going to be technically inaccurate, right?). Yours (LordLimecat) is more accurate, but the full concept isn't that hard so I'll explain it below.
First, some quick basics of TLS (I'm leaving out a lot of details; do *NOT* try to implement this yourself!):
Here's the scenario where PFS matters, and why it is "perfect":
Here's where it gets interesting:
It is this property, where the secrets needed to recover an encryption key are destroyed and cannot be recovered even if one party cooperates with the attacker, which is termed Perfect Forward Secrecy. Note that PFS doesn't make any guarantees if the crypto is attacked while a session is in progress (in this case, the attacker could simply steal the symmetric key) or if the attacker compromises one side before the session begins (in which case they can impersonate that party, typically the server). It is only perfect secrecy going forward.
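The key-destruction property described above can be illustrated with a toy ephemeral Diffie-Hellman exchange. The parameters are deliberately simplistic (real TLS uses vetted groups or elliptic curves, plus authentication); this is only a sketch of why discarding the ephemeral secrets gives forward secrecy.

```python
import secrets

# Toy ephemeral Diffie-Hellman. Illustration only: no authentication,
# bare modular exponentiation, parameters chosen for readability.
P = 2**127 - 1   # a Mersenne prime, standing in for a real DH group
G = 3

def run_session() -> int:
    a = secrets.randbelow(P - 2) + 1   # client's ephemeral secret
    b = secrets.randbelow(P - 2) + 1   # server's ephemeral secret
    A = pow(G, a, P)                   # exchanged in the clear
    B = pow(G, b, P)
    k_client = pow(B, a, P)            # both sides derive the same key
    k_server = pow(A, b, P)
    assert k_client == k_server
    # a and b are now discarded. An eavesdropper who recorded A and B,
    # and who later even seizes a party's long-term signing key, still
    # cannot reconstruct this session key.
    return k_client

k1 = run_session()
k2 = run_session()   # a fresh, independent key for every session
```

Each session draws new secrets, so compromising one session key tells the attacker nothing about any other session: that isolation is exactly what "perfect" refers to.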
More likely people in marketing (Score:2)
Just like with "wired equivalent privacy" that we laugh at now? I'd say both have the stench of marketing and excessive hubris.
Re: (Score:2)
No, it's not "just like that"; they're two entirely different things. WEP was a cipher, and as with all ciphers (other than an XOR one-time pad) it can have weaknesses, and will eventually be reduced in complexity by improvements in computation. It was also a remarkably weak cipher.
PFS is not a cipher, its a principle that isolates the encryption keys between sessions so that getting a court order and sniffing traffic may compromise one session, but you will need to do that for each session because they all use different,
A question is a question (Score:2)
I was giving an example of a name that became inappropriate and reading between the lines beyond that is a fools game.
Re: (Score:2)
You're criticizing the name of an appropriately named cryptographic technique with no knowledge of what it does, why it was named, or who named it. I would say that that deserves criticism; Slashdot does not need more armchair experts weighing in on things they don't understand. There's way too much BS as it is now.
Re:Who thinks up these names? (Score:4, Informative)
"Perfect Forward Secrecy" is a standard term in cryptography. It seems to have been introduced by Diffie, van Oorschot, and Wiener in their paper Authentication and Authenticated Key Exchanges [carleton.ca].
The description of Perfect Forward Secrecy in the summary seems pretty confused. A cryptographic protocol has perfect forward secrecy if the only long-term key pair is used solely for authentication; that is to protect against man in the middle attacks and the like. Since you can't perform a man in the middle attack once the message has been transmitted, this means that compromise of the private key only jeopardizes future communications. In contrast, if a service uses RSA or ElGamal in the usual manner, then once the private key is compromised (e.g. via a Heartbleed like vulnerability), then all messages ever transmitted can be decrypted using this private key.
Re: (Score:2)
Re: (Score:2)
Further more, why is Microsoft bragging about how secure OneDrive is when the NSA documents leaked by Snowden already show that the NSA has total access to your OneDrive?
Re: (Score:2)
http://www.wired.co.uk/news/ar... [wired.co.uk]
Google: nsa skydrive skype
It all ends up (Score:2)
With all the legal obligations in the telco sector all products have to be wiretap-friendly.
CALEA obligations should be very clear to the rest of the world by now. The options presented under CISPA should have been noted too.
Your email, video chat, text, chat will end up as a neat industry standard format for law enforcement use. There will be no going dark on any US product shipped.
"FBI: We need wiretap-ready Web sites - now" (5
Better way for Microsoft to earn trust (Score:1, Offtopic)
Don't force bloatware on hapless customers. XP was 1.2GB. XP with SP2 was about 2GB. XP with SP3 is about 7GB. And now Microsoft claims XP is so insecure it cannot be patched anymore, so customers have to buy a new OS which weighs in at 20GB.
Cut all the crap and come clean. Release the entire source code for XP if you are not going to patch it. Or keep quiet and prepare to be unbelieved even if you speak the truth.
Re: (Score:2)
Whoa, a *whole* twenty gigabytes? Man, that's about $1.20 in hard drive space.
Re: (Score:2)
Nope; I'm not sure what the worst thing leaving XP can do, but there are things worse than losing formatting.
Some devices have computers in them running XP. Some of these are certified (medical devices, say), and updating the OS would require a complete and expensive recertification. Some companies have vital software that was written as ActiveX for IE6, and don't have the source code. Yeah, I'm not impressed by the foresight in either case, but when you're sitting there in 2014 with a really expensiv
Re: (Score:2)
Re: (Score:2, Interesting)
If you want to buy 20 machines today with a Windows OS, the only choice is Windows 8. Even though almost a billion PCs run XP, it is not possible to get a new machine with a legal licensed copy of XP without jumping through numerous hoops and shelling out loads of cash.
Microsoft wants us to trust their word that it is not feasible to offer or support XP on new machines. This is not believable. Opening up the source code is the only way to prove or disprove Microsoft's version of the facts.
Whether you agree
Re:Better way for Microsoft to earn trust (Score:5, Insightful)
Hundreds of legacy applications developed for the Windows platform using Windows development tools run only on XP and are not supported on 7 or 8.
So not only have you tied yourself to a particular version of a proprietary OS that - as we all know from previous experience - has a limited lifetime but you chose to do that by using proprietary software that won't run on anything else and you didn't think there might be a problem with that? Seriously? If you cut corners then you're going to get burned.
Re: (Score:3)
Hundreds of legacy applications developed for the Windows platform using Windows development tools run only on XP and are not supported on 7 or 8.
This is generally because they were really badly written and do things that have been recommended against for years - like storing settings in the same folder as the program, which means that in some cases non-admin users can't even use the program because they don't have permission to create the initial settings file. I'd like to say this is generally confined to amateur developers but I've seen it so many times from so-called professionals that it's sad.
It's not something specific to Windows, but not some
Re: (Score:2)
Re: (Score:2)
If you want to buy 20 machines today with a Windows OS, the only choice is Windows 8. Even though almost a billion PCs run XP, it is not possible to get a new machine with a legal licensed copy of XP without jumping through numerous hoops and shelling out loads of cash.
Odd, because the very first link I went to on Dell's website for business showed machines with Windows 7. I didn't even have to search.
Microsoft wants us to trust their word that it is not feasible to offer or support XP on new machines. Thi
...and.. (Score:5, Insightful)
>> a place where governments who use Microsoft software can come to review the source code
Where's the proof that the source code you see is exactly the same as that which gets compiled to make the Windows you buy?
Also does anyone else find it as highly suspicious as me that this center is only open to governments?
Re:...and.. (Score:4, Interesting)
Your experts compile/test the code as they wish over time at the site. The end result is then known.
A magic number is then produced for the tested product on site. The application/suite as shipped then matches that same test number.
ie the applications do not have ~extra code added.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
It wouldn't work at all - there's nothing magic about them numbers.
The only way to be sure that you got a copy of binaries that corresponded to the source code would be for each agency concerned to get its own copy of the source and build Windows for itself, using its own audited compiler toolchain. This is not something that MS will allow to happen.
Re: (Score:2)
A magic number is then produced for the tested product on site. The application/suite as shipped then matches that same test number.
And who writes the program that does the test? Who writes the compiler that compiles the test? Who controls the build farm that creates and compares everything? Everything is under the strict, untrustworthy, iron handed control of the very same criminal (yes, convicted of multiple felonies in multiple courts) organization against whom Governments are trying to protect themselves.
Sorry, but this is all just smoke and mirrors from Microsoft.
Re: (Score:2)
Sounds good in theory. In practice, what really happens is that some code is shown to you. You are not given a compilation environment. You are specifically prohibited from compiling at all actually. So even if you wanted to build it, you can not. If you think grabbing source files and compiling them without the same options and libs that were used in the distributed binary will get you a binary that can be md5'ed, I have a bridge in Brooklyn to sell to you.
I have seen the source for Windows before (NT4 and
Re: (Score:2)
>> Your experts compile/test the code as they wish over time at the site. The end result is then known.
You can never guarantee any amount of code coverage (especially not 100%) just from black box testing.
How to prove the source code maps to the binary? (Score:5, Insightful)
So... Microsoft lets governments of the world look at the source code at their special center, and then double-dog-swears that there's nothing fishy going on between then and compiling the source code, like, say, a patch applied somewhere in the build process? Riiiight.
If you WERE to put a backdoor in, that's probably how it'd be done. Would you really want a backdoor explicitly in the code for a developer to find? Of course not, you'd put in something only a few people know about. The secret to secret keeping is limiting the amount of people who know.
The other way to hide the backdoor is to make it a hard to find bug. Plausible deniability is quite high.
I have to believe this is good news, though. It means a lot of foreign governments are suspicious of closed source software, to the point where Microsoft has had to announce a plan to make their code somewhat less closed.
Re: (Score:2)
The plain text is from the tempest (emission security) friendly keyboard.
The only magic is getting your gov to buy the system and then use it for years
ie buying the system is the way in. Every trapdoor and backdoor is crafted around what the buyer might be aware of.
Re: (Score:2)
The secret to secret keeping is limiting the amount of people who know.
Not much of a secret now, is it?
Re: (Score:2)
The other way to hide the backdoor is to make it a hard to find bug. Plausible deniability is quite high.
Reading a huge codebase is an unlikely way to spot backdoors anyway. After a few thousand lines the reader's eyes would glaze over, and anything subtle would be missed. This isn't as easy as looking for two-digit year fields a la Y2K reviews.
Besides, the Heartbleed bug should have been a clue that open source alone doesn't make security issues "transparent". Somebody has to both read and understand the code to detect these things, and an OS like Windows is so huge that nobody can understand the whole th
One NSA letter will negate all of this (Score:1)
This is nothing more than security theater. We know of the _NSAKEY in Windows 95. All they need to do is give Microsoft an NSA letter to install backdoors and they will do so. Just like Google and everyone else. I am surprised that anyone would fall for this.
Re: (Score:2)
No such thing as an 'NSA letter' - you're thinking of a National Security Letter, the super-secret demands that are so classified recipients aren't even permitted to tell their own lawyers they received one.
Re: (Score:2)
Nice Strategy Attempt (Score:2)
What about everyone else? (Score:2)
To give some context on users' response to Microsoft's products, Windows 8 market share just decreased. [betanews.com] Comparative figures showed that Windows XP's share went up. That's right: the just-discontinued OS is doing better than the current system.
I can't help but point out that this is one of a painful series of mistakes that all happen w
Re: (Score:1)
Re: (Score:2)
No-one involved can reasonably ever work on a comparable OSS project again either.
For example, contributing to Mono isn't really allowed if you view the sources that MS provide for their .NET runtimes (to help with debugging).
Re: (Score:2)
I suspect it'd be about legal liability. If I work on .NET for Microsoft, and then leave and go to Xamarin to work on Mono, any Mono code I produce is a legal liability, since it's hard to prove a priori that I didn't copy Microsoft code. This means MS could at any time claim copyright infringement and start an expensive legal process. Copyright violation doesn't even have to be deliberate: it's entirely possible to remember something and then introduce it into code (or music, or literature) in the belief that it is original.
Re: (Score:2)
Both EULAs I think, but also from the POV of the projects involved, they don't want to take the risk of contributions from someone with any significant chance of having MS code in their head, because it could open them up to a potential lawsuit later.
Seriously? (Score:5, Insightful)
Who the hell is going to sit down and scan a few million lines of source code with Microsoft looking over your shoulder and hope to spot a backdoor or two in the process?
Even then, how can you be sure that the source code they show you is the stuff you're actually running?
What a PR stunt this is!
Too little too late? (Score:3)
1. Government shouldn't use anything proprietary, and the US should follow its own rules (AMD exists because of government second-source requirements; why not Microsoft-compatible competitors?)
2. Vendor lock-in always leads to over-pricing and government waste (also, see #1)
3. Microsoft did a deal with the devil (US Government) and now wants to regain trust. Sorry Microsoft. Not going to work.
And did anyone miss the work facebook has been doing with government? Holy crap. Not only is their censorship completely to the left, they are conducting psych experiments at the request of the US government. I personally avoid the social networking sites and [almost] always have.
(I have used LinkedIn, due in no small part to my previous employer reducing its staff by over 90%. Oh yeah, now I can talk about it too! Turns out the Fukushima incident and the subsequent lies, deception, inaccuracies, and omissions run pretty deep and even found their way to my former employer, a Mitsubishi company. Anyway, LinkedIn: I was checking it from my mobile device, and it made the mobile pages unusable through CSS and insisted I use an app. I loaded the app, agreed to whatever, and the next thing I knew LinkedIn had grabbed my whole address book and pulled it into their servers. I can't say whether they used the data to spam others, but I can say they used it to "suggest links" to my profile. That's pretty dirty and disgusting.)
Trust is a difficult thing these days... a fragile thing. And I hope companies everywhere, large and small, learn that lesson. They can learn the hard way or they can be good and decent people asking themselves "would I want someone doing this to me?!" (Just like government gun confiscation -- the answer is NO. The government wouldn't allow the citizens to take their guns, so why should the citizens allow government to take theirs?) Of course, too few people care about golden rules of morality because the world is run by psychopaths. Psychopaths think they can just buy trust. That may have been true, but the pendulum has reached its furthest point and is about to swing back the other way. Microsoft and others are only now figuring that out.
Publicity stunt - not practical (Score:2, Interesting)
This is nothing but a feel-good publicity stunt, designed to offset international suspicions that Microsoft works a little too closely with the NSA.
Pick your favorite product: Windows 7? Office? SQL Server? IIS? It doesn't matter; you are talking about millions of lines of source code. No government or government contractor will have the expertise, time, and money to analyze such a mass of code. They will be utterly dependent on Microsoft to point them to the core routines responsible for whatever they're interested in.
What this is more likely about ... (Score:2)
Microsoft is giving other governments the chance to install their own backdoors by cooperating in special "transparency centers", provided they pay for it and buy enough Microsoft products instead of switching to open-source alternatives.
Provenance matters (Score:3)
For highly reliable code, knowing that the code you review is the code you compile with is vital both for stability and security. This can't be done by visual inspection: it requires good provenance at every stage of the game.
This is actually a security problem with many open-source and freeware code repositories. The authors fail to provide GPG signatures for their tarballs, or to GPG-sign tags for their code. So anyone who can steal access can alter the code at whim. And anyone who can forge an SSL certificate can impersonate the HTTPS-based websites and cause innocent users to download corrupted, surreptitiously patched code or tarballs.
I'm actually concerned for the day that someone sets up a proxy in front of github.com for a localized man-in-the-middle attack to manipulate various targeted projects.
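For what it's worth, even a poor man's version of this (publishing a SHA-256 digest out of band and checking the download against it) catches the tampered-tarball case. A minimal Python sketch; the filenames and digest here are hypothetical:

```python
import hashlib
import hmac

def sha256_of(path: str) -> str:
    # Hash the file in chunks so large tarballs don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_download(path: str, pinned_digest: str) -> bool:
    # A man-in-the-middle who swaps the tarball changes its digest. A forged
    # SSL cert on the download site doesn't help the attacker as long as the
    # digest was published through a separate channel (or GPG-signed).
    return hmac.compare_digest(sha256_of(path), pinned_digest)
```

It's weaker than a GPG signature (the digest itself has to come from somewhere trustworthy), but it at least decouples "where I got the file" from "how I know it's the right file".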
Probably pointless (Score:2)
Unless they let you compile your own binaries and distribute them, this is utterly useless.
If it's the govts job to review code... (Score:2)
*The benefit being that someone else is supposedly reliably curating the code.
Backdoors will be added at build time (Score:2)
Re: (Score:3, Insightful)
1/ How can observers know that the source code shown results in the compiled binary sold.
Compile the code and compare the binaries?
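That's the reproducible-builds idea in a nutshell: build from the reviewed source, then compare byte-for-byte against the shipped binary. A sketch of just the comparison step (Python, paths hypothetical; in practice embedded timestamps and build paths will make the two differ unless the toolchain is deterministic):

```python
def same_binary(our_build: str, shipped: str) -> bool:
    # Byte-for-byte comparison of the two files, read in fixed-size chunks
    # so even very large binaries don't need to fit in memory.
    with open(our_build, "rb") as f1, open(shipped, "rb") as f2:
        while True:
            a, b = f1.read(1 << 16), f2.read(1 << 16)
            if a != b:
                return False    # differing bytes, or differing lengths
            if not a:
                return True     # both streams exhausted together
```

The hard part isn't this comparison, it's getting the compiler, flags, and environment to produce identical bits on two machines in the first place.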
2/ How can observers know that when compiled the compiler does not introduce vulnerabilities.
Same way you would for open source software: inspect the compiler code.
3/ Would not a malicious observer use the knowledge of the source to look for vulnerabilities for their intelligence agencies to exploit later.
Maybe.
4/ As a private citizen how can I be assured of o
Re: (Score:2)
1/ How can observers know that the source code shown results in the compiled binary sold.
You don't. You never do. What you do is build a system on trust. However in the legal world there is a difference. If I give you a closed source binary with a back door I have all sorts of excuses for you. "Our code review system broke down." "It was an unpatched bug." "We assume no liability, you accepted our licence agreement, right?"
But if you look at the source code and determine yourself it's not backdoored, and yet I put a back door in the final product, that would be an incredibly clear cut case of c
Re: (Score:2)
Re: (Score:2)
You don't need to convict. You just need to ban. Thinking that Microsoft is somehow immune because it is a US-based company ignores the fact that they have been taken to task by foreign governments before, and have lost. Or have we forgotten why there was an "N" version of Windows XP?
Re: (Score:2)
1/ How can observers know that the source code shown results in the compiled binary sold.
Compile the code and compare the binaries?
On your own hardware.
But, I guess, MS won't let them download the source code to their laptops, or these laptops would have to be destroyed after the review process.
Re: (Score:2)
You can't compare binaries for Microsoft's attempt at a C compiler. If you use the /GS (IIRC) flag, Microsoft will insert a different random value just before the return address of a function so any buffer overrun will change it.
/GS only allocates space for a random value; the random value isn't computed at compile time.
Re: (Score:2)
Correct! It would be a remarkably stupid stack canary (which is a security measure) otherwise. Since the value would be the same on everybody's computer, you'd only need to find it once and then when you overflow the buffer be sure to write the canary value back as it was!
Instead, getting past stack canaries is considerably more difficult than that. It's possible, of course, with the right vulnerabilities... but it's *harder* and sometimes a program that would be exploitable without them (using the vulnerab
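For anyone curious, the mechanism is easy to simulate. A toy Python sketch of a stack canary (everything here is invented for illustration; real canaries live in the compiled stack frame, not in application code):

```python
import os
import hmac

CANARY = os.urandom(8)  # fresh random value each run, like /GS at load time

def make_frame(buf_size: int) -> bytearray:
    # Simulated stack layout: [ buffer | canary | saved return address ]
    return bytearray(buf_size) + bytearray(CANARY) + bytearray(b"RET_ADDR")

def sloppy_copy(frame: bytearray, data: bytes) -> None:
    # No bounds check, like strcpy(): long input spills past the buffer.
    frame[:len(data)] = data

def canary_intact(frame: bytearray, buf_size: int) -> bool:
    # If an overflow reached the return address, it clobbered the canary
    # on the way there, so checking it before returning catches the smash.
    return hmac.compare_digest(bytes(frame[buf_size:buf_size + 8]), CANARY)
```

Because the value is random per run, an attacker can't just bake the canary bytes into the overflow payload, which is exactly why a compile-time constant would be useless.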
Re: (Score:2)
This is why I now only use open source.
Yeah, because it's not like there have been any major security flaws found lying around for years in open source software lately.
Re: (Score:2)
Microsoft is still operating under NSL restraints. That means the NSA has the keys anyway.
TLS doesn't work that way; the implementation trusts, and uses, whatever keys it's told to trust (via certificates). And that's the problem: while most implementations will allow you to manage your own certs, for example by creating self-signed certs, the Windows implementation will only trust certs from commercial CAs. You know: Diginotar, Trustwave, Comodo, those sorts of guys. So you can't just generate and manage your own keys and certs but are forced to pay for, and trust, hundreds of external CAs to manage them for you.
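On an implementation that does let you manage trust yourself, pinning your own CA is a few lines. A sketch using Python's ssl module as a stand-in (the CA filename is hypothetical):

```python
import os
import ssl

# Client context that will trust ONLY the CA we load ourselves, instead of
# the platform's bundle of hundreds of commercial CAs.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)  # CERT_REQUIRED by default
ctx.check_hostname = True
if os.path.exists("our-own-ca.pem"):           # hypothetical self-managed CA
    ctx.load_verify_locations(cafile="our-own-ca.pem")
```

With that context, a cert signed by any commercial CA is rejected outright; only certs chaining to your own root verify.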
Re: (Score:2)
...multiple international companies... Like Nokia, Skype, Visio ...?
Re: (Score:2)
The real cost is the inability to contribute to any open-source project that covers similar ground.