Schneier: The US Government Has Betrayed the Internet, We Need To Take It Back 397

wabrandsma writes "Quoting Bruce Schneier in the Guardian: 'The NSA has undermined a fundamental social contract. We engineers built the internet – and now we have to fix it. Government and industry have betrayed the internet, and us. This is not the internet the world needs, or the internet its creators envisioned. We need to take it back. And by we, I mean the engineering community. Yes, this is primarily a political problem, a policy matter that requires political intervention. But this is also an engineering problem, and there are several things engineers can – and should – do.'"
  • by Anonymous Coward on Friday September 06, 2013 @05:45AM (#44772969)

    they've got flamethrowers, man

  • Agreed (Score:4, Insightful)

    by msobkow ( 48369 ) on Friday September 06, 2013 @05:48AM (#44772977) Homepage Journal

    But in all practicality, how do you seize back control from the likes of the three-letter agencies?

    It's not like there is any party in the US which hasn't been complicit in granting them ever-greater powers. It's not like a Canadian like myself can vote against the bullshit. It's not like Canada is about to invade the US over the issue, nor is anyone else, seeing as their own three-letter agencies are doing the same god-damned thing.

  • by black3d ( 1648913 ) on Friday September 06, 2013 @06:01AM (#44773011)

    Naw, HTTPS only protects you against folks who don't already have the keys. You pretty much can't trust virtually any data communication that takes place on the internet. However, that doesn't mean stop doing stuff - it just means weigh the value of what you're doing against the expectation that the information is likely to be used against you. For example - the NSA may have my internet banking credentials - but am I worried they're going to steal my money? No: 1) they don't need to; 2) if some rogue agent decided to, there are legal protection and insurance avenues I can take to regain my money; 3) if the government decided it needed to steal my money, then even their not having my internet banking credentials wouldn't stop them anyway.

    I'm not an advocate for "if you have nothing to hide, you have nothing to worry about" at all. I'm just facing the realization that our government is completely morally corrupt, and outside of changing it by force, I can never protect my information online unless it's information I've encrypted and uploaded myself (and even then I'm still at risk if my OS is rooted or my encryption algorithm has a master key). So, I weigh that knowledge against my activities and don't worry too much. If you're concerned about being identified, you can protect yourself, but it largely involves not using your net connection, among other things.

  • by N1AK ( 864906 ) on Friday September 06, 2013 @06:02AM (#44773017) Homepage
    The issue with darknets etc. is that they'll only protect a limited proportion of what normal people do:
    1/ Email won't be secret if you want to send to or receive from normal people.
    2/ Facebook, YouTube, Skype, Amazon etc. won't be on it.

    If you've got something you want to hide badly enough, then the tools to try to do it are available. For the average person, though, it isn't a viable or effective proposition. We need to stop this happening, not just find ways for a few people to work around it.
  • Re:Union (Score:4, Insightful)

    by jabberw0k ( 62554 ) on Friday September 06, 2013 @06:05AM (#44773035) Homepage Journal
    If there had been programmer unions in the Win95 era, we never would have gotten rid of IE6, because the union would have protected all the people with certifications in IE6-specific programming. Spare us, please.
  • Re:Union (Score:3, Insightful)

    by Anonymous Coward on Friday September 06, 2013 @06:14AM (#44773061)

    You're missing the bigger picture - if we had unions, we could sit behind nice desks, and have those with computer problems make appointments to see us (at times convenient for us, when we're not playing golf). Then we'd sit down and discuss the problem with them and go "reboot it twice and if it doesn't fix it call me in the morning", and charge a hefty fee.

  • Spot On (Score:5, Insightful)

    by some old guy ( 674482 ) on Friday September 06, 2013 @06:29AM (#44773101)

    Bruce nailed it. We've sat on our collective asses and watched the politicians, spooks, and marketing clowns turn an engineering marvel into a sad parody of its former intended self. I don't think anyone nowadays can question the need for some serious re-engineering. We can solve the technical problems and propose new standards and protocols. The real question is how we implement the fix.

    Will the standards committees support it? Will the Powers that Be allow it? Like Bill the Bard wrote, "Aye, there's the rub."

  • Re:Agreed (Score:5, Insightful)

    by Joce640k ( 829181 ) on Friday September 06, 2013 @06:33AM (#44773115) Homepage

    See Robert Heinlein's book "Take Back Your Government" for details.

    Unfortunately, it needs people like you to get up from their sofas and actually do something instead of just grumbling about it.

  • by Joce640k ( 829181 ) on Friday September 06, 2013 @06:37AM (#44773125) Homepage

    Neither will make a difference so long as people use Gmail/Hotmail/Yahoo/Facebook/etc.

    If your communications go through a large US corporation then no amount of quantum-encrypted cables (or whatever) will help.

  • by aliquis ( 678370 ) <dospam@gmail.com> on Friday September 06, 2013 @06:40AM (#44773139) Homepage

    Now even better! Only 159 characters per message! .. ? =P

    Seriously. Twitter sucks. Why would I want any form of Twitter? 160 characters suck. SMS sucks too.

  • by Joce640k ( 829181 ) on Friday September 06, 2013 @07:01AM (#44773229) Homepage

    Make it really easy to encrypt/base64-encode your emails before they're uploaded to Gmail/Hotmail/Yahoo/Facebook/etc.

    "Transparent to the user" would be ideal. I don't know if a browser plugin could manage that but I don't see why not - just intercept the "send" process.

    Encryption keys could be generated automatically during the first few exchanges with another person by attaching information to the end of the email. After two or three replies the displayed email address turns green and you're good to go.

    Yes, they could do mass man-in-the-middle attacks during the key exchange but so long as two people can verify their keys by phone (or whatever) then we'll know about it. More importantly, we'll be able to prove they're doing it. That would lead to more news stories about what the NSA does and more public awareness of the importance of installing an encryption plugin.
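    The scheme this comment proposes can be sketched in a few lines. This is purely illustrative: the Diffie-Hellman group below is a toy parameter set (real deployments would use a standardized group or elliptic curves), and the XOR keystream is a stand-in for a real authenticated cipher such as AES-GCM. All names are hypothetical, and, as the comment itself notes, the exchange is vulnerable to a man-in-the-middle unless keys are verified out of band.

    ```python
    # Sketch of "generate keys automatically over the first few email exchanges,
    # then encrypt and base64-encode before upload". Toy crypto throughout.
    import base64
    import hashlib
    import secrets

    # Toy Diffie-Hellman parameters: far too small for real use.
    P = 2**127 - 1   # a Mersenne prime, fine for a demo
    G = 3

    def dh_keypair():
        """A (private, public) pair; the public value rides along in an email."""
        priv = secrets.randbelow(P - 2) + 2
        return priv, pow(G, priv, P)

    def shared_key(priv, peer_pub):
        """After one round trip, both correspondents derive the same 32 bytes."""
        return hashlib.sha256(str(pow(peer_pub, priv, P)).encode()).digest()

    def keystream(key, n):
        # Expand the key into n bytes with SHA-256 in counter mode.
        out, ctr = b"", 0
        while len(out) < n:
            out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
            ctr += 1
        return out[:n]

    def encrypt(key, plaintext):
        """XOR placeholder cipher, then base64 so any mail host passes it through."""
        data = plaintext.encode()
        ct = bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))
        return base64.b64encode(ct).decode()

    def decrypt(key, b64_ciphertext):
        ct = base64.b64decode(b64_ciphertext)
        return bytes(a ^ b for a, b in zip(ct, keystream(key, len(ct)))).decode()

    # One simulated exchange: each side attaches a public value to an email.
    a_priv, a_pub = dh_keypair()
    b_priv, b_pub = dh_keypair()
    k_alice = shared_key(a_priv, b_pub)
    k_bob = shared_key(b_priv, a_pub)
    assert k_alice == k_bob                    # both ends now share a key
    msg = encrypt(k_alice, "meet at the usual place")
    print(decrypt(k_bob, msg))                 # the provider only ever sees base64
    ```

    The point of the sketch is the workflow, not the cryptography: the plugin would run something like this transparently on "send", and the mail provider would only ever store the base64 ciphertext.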

  • by daem0n1x ( 748565 ) on Friday September 06, 2013 @07:03AM (#44773233)

    I couldn't care less whether Assange or Snowden are nice guys. Whether they're sweet little cherubs or like to fuck sheep in their spare time is completely irrelevant to the matter. Nobody does what they did by being the nice guy everybody wants to have a beer with.

    The hateful crimes they exposed are the real story here. If you focus on the messenger, you miss the message. That's what the governments, corporations and their global propaganda machine (a.k.a. the mass media) badly, badly, badly want you to do. Quite successfully.

  • by Arrogant-Bastard ( 141720 ) on Friday September 06, 2013 @07:08AM (#44773249)
    The worst part of the damage done by this isn't technical. It's human.

    The reporting on this latest disclosure reveals that the NSA has systematically inserted itself into the standard-crafting process, in order to deliberately weaken those standards. It also reveals that the NSA has bypassed the management of communications providers and recruited technical staff directly. In both cases it's reasonable to assume that the people involved have been through a security clearance process and are thus barred for life from disclosing what they know.

    I must now ask myself how many people I've worked with weren't doing so in good faith. When they argued that such-and-such a fine point of a network protocol standard didn't need improvement or that it should be changed in a certain way, were they doing so because it was their principled engineering opinion, or because it served some other purpose? Or when they were recommending that one of the many operations I've run move its colocation point or change its router hardware, was that good customer service, or was it to facilitate easier traffic capture?

    Will anyone be asking themselves the same questions about me? (They probably should.)

    The Internet was built on, and runs on, trust. Every postmaster, every network engineer, every webmaster, every system admin, every hostmaster, everyone crafting standards, everyone writing code, trusts that everyone else -- no matter how vehemently they disagree on a technical point -- is acting in good faith. The NSA, in its enormous arrogance, has single-handedly destroyed much of that trust overnight.
  • by FriendlyLurker ( 50431 ) on Friday September 06, 2013 @07:13AM (#44773289)

    I agree that peer review is no panacea and that open source is at significant risk too. However, open peer review is surely better than no open review. Silent Circle could easily continue to sell their services to the US and UK governments AND fully open-source the code. Why don't they? More $$$ instead of more security, most likely. Not a good sign.

    Also, your logic that they sell their software to the US and UK governments, so the NSA would not want to backdoor it, does not hold up to scrutiny. How do we know the NSA doesn't buy $10K worth of licenses (hardly a blip on their budget) just to shelve and never use them? In exchange, the Silent Circle product is backdoored through gag orders, threats, coercion and/or covertly subverted (all things we know they now do, regularly). How do we know that the binary we get is not different from the binary the NSA gets? Because their sales team told us?

    There is no way around it anymore: if you're a company providing security products and you're not fully open source, and that source has not been stable and well reviewed for some time, then your product cannot be trusted, no matter how many famous upstanding people are on your board of directors or how many licenses the US/UK governments buy from you.

  • by Ash Vince ( 602485 ) * on Friday September 06, 2013 @07:24AM (#44773331) Journal

    There is no way around it anymore: if you're a company providing security products and you're not fully open source, and that source has not been stable and well reviewed for some time, then your product cannot be trusted, no matter how many famous upstanding people are on your board of directors or how many licenses the US/UK governments buy from you.

    But if you do release all your source, then someone can take all your hard work and undercut you on price, in the case of something like Silent Circle where you are selling a service, not a product. Alternatively, people can take your source and just use it in-house to roll their own solution. In both cases nobody pays you a penny and you go broke real fast.

    Open source is really tricky to do well and make money from, and sometimes it is just not a viable business model.

  • Re:Union (Score:2, Insightful)

    by Anonymous Coward on Friday September 06, 2013 @07:24AM (#44773335)

    I've had a number of union programming jobs in Denmark. The union ensured that I got to take my vacation, that my contract was in order, that I got training on company time for new technology, and that if something illegal happened, I'd have access to a lawyer. I don't doubt that what you are saying was true in your case, but it's hardly a universal property of programmers' unions.

    We don't need unions. We need _good_ unions.

  • Excellent point (Score:4, Insightful)

    by bradley13 ( 1118935 ) on Friday September 06, 2013 @07:38AM (#44773391) Homepage

    You make a really excellent point. Sadly, we can only react at this point. It seems to me that there are three useful reactions:

    - Keep up the political and media pressure. Don't let this issue die in the news cycle. Americans can apply internal pressure; those of us elsewhere can do our bit to keep up international pressure. For example: I will be working the NSA revelations into a larger internet-security discussion in at least two of my university lectures this coming semester.

    - Promote open-source software for all security purposes. While not everyone can audit the software, there are enough people out there who can and will. The NSA cannot predict who will do so, and hence cannot have them all in its pay.

    - Refuse to use any American IT services where security is important. This is not only sensible, it also applies economic pressure to companies that can lobby in Washington.

  • by Anonymous Coward on Friday September 06, 2013 @08:10AM (#44773527)

    I believe one issue people think we have is that the root CAs are compromised. Now I have no information one way or the other there, but if that is true, one possibility might be a web-of-trust type approach. For instance, rather than one signing authority, you could use three, and then use three levels of public-key encryption. The assumption would be that if, say, the CAs were in countries that did not trust each other, then presumably at least one of the signing keys would remain secure regardless...
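    The quorum idea in this comment can be sketched concretely. This is a minimal illustration under stated assumptions: real CAs use public-key signatures, but Python's standard library has none, so an HMAC with a per-CA secret stands in for "a signature only that CA can produce". All the authority names and keys are made up.

    ```python
    # Sketch of "several independent signing authorities in mutually
    # distrusting jurisdictions": trust a certificate only if a quorum of
    # independent CAs vouch for it, so compromising (or coercing) one
    # authority is no longer enough to forge an acceptable chain.
    import hashlib
    import hmac

    # Hypothetical authorities in jurisdictions that do not trust each other.
    CA_KEYS = {
        "ca-us": b"secret-us",
        "ca-ch": b"secret-ch",
        "ca-br": b"secret-br",
    }

    def sign(ca, cert_der):
        # HMAC stands in for a real public-key signature here.
        return hmac.new(CA_KEYS[ca], cert_der, hashlib.sha256).hexdigest()

    def verify(ca, cert_der, sig):
        return hmac.compare_digest(sign(ca, cert_der), sig)

    def trusted(cert_der, sigs, quorum=2):
        """Accept the certificate only if >= quorum distinct CAs signed it."""
        good = {ca for ca, sig in sigs.items()
                if ca in CA_KEYS and verify(ca, cert_der, sig)}
        return len(good) >= quorum

    cert = b"example.com public key bytes"
    sigs = {ca: sign(ca, cert) for ca in ("ca-us", "ca-ch")}
    print(trusted(cert, sigs))                        # True: two valid signatures
    print(trusted(cert, {"ca-us": sigs["ca-us"]}))    # False: one CA is not enough
    ```

    The design point is the quorum check, not the stand-in signature scheme: an attacker who steals one CA's key can produce one valid signature, but the client still rejects the certificate.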

  • by PopeRatzo ( 965947 ) on Friday September 06, 2013 @08:17AM (#44773573) Journal

    Bruce Schneier is putting his name on the line with everything he publicly does and says. I trust him more than I trust someone who posts FUD wanting to know what his "game" is.

    One thing about the compromised web: don't trust anyone, but be especially suspicious when someone tries to spread FUD about someone who has generally been trustworthy.

  • by FriendlyLurker ( 50431 ) on Friday September 06, 2013 @08:41AM (#44773683)

    They can still go up to the head of the open source organization and say "you must include this back-door in your program, or go to jail". Or/and they can just hire someone to contribute code that has security flaws.

    And in the extremely unlikely event that anyone spots the bad code, just replace it with something else 2 days later.

    Yes, they could, and probably do. However, your leap to the conclusion that it is extremely unlikely anyone spots a code change is not correct. Thousands of people, even millions for the more successful products, will update their source code repositories; the exact lines of source code that have changed will be highly visible to many people, and a subset of those will be security professionals who are _very_ interested in any changes to the base code of their main security tools. You're just proposing that we close our eyes, download a binary, and trust it instead. To reiterate: today's news has told us just how far the NSA has gone to compromise ALL MAJOR proprietary closed-source security tools. All of them.

  • Dream On (Score:5, Insightful)

    by shawnhcorey ( 1315781 ) on Friday September 06, 2013 @09:54AM (#44774185) Homepage
    The internet has always been open. There have been fools who think adding "security" to it will change this. It doesn't. Get real, people. There are only two rules to security on the internet:
    1. Never put anything on the net that you can't afford to have viewed by the public.
    2. Never put anything solely on the internet that you can't afford to lose.
    Corollary: Never put anything in a cloud that you can't afford to have viewed by the public.
  • by cardpuncher ( 713057 ) on Friday September 06, 2013 @09:54AM (#44774189)
    The original Internet wasn't built on trust, it was built by the government for military purposes in the sure and certain knowledge that the only people that had the ability to mess with it knew what was likely to happen to them if they did.

    The Internet was later coopted by groups of academics who didn't really have to worry if their communications were intercepted because they were pretty much public anyway and had nothing really to gain from abuses such as faking BGP route updates. Trust wasn't required.

    The public, commercial Internet may have had an illusion of trust, based solely on the fact that nobody historically worried about it. That doesn't mean it was based on trust; it means any trust it enjoyed was based on ignorance.

    Trust in the Internet is in any case a wider issue than who is listening in. It's also knowing what really happens to the data about you provided voluntarily that gets hoovered up by all those online services chatting to each other behind the scenes.

    Nor is it merely about the Internet - it's about your phone, your car, your smart watch, your contactless payment card and all the other things that can be enabled by technology to spy on you.

    There isn't a technical fix to all of that, some of it has to be a political fix.

  • by DuckDodgers ( 541817 ) <keeper_of_the_wo ... 59o.com minus pi> on Friday September 06, 2013 @10:45AM (#44774643)
    If the root certificate private key is held by the NSA, they can bypass the entire remainder of the web of trust.

    Say I set up a website, whatever.com, and my certificate chain runs from a Verisign root certificate, through an intermediate certificate from Intermediate CA, Inc, down to my whatever.com certificate. If the NSA subpoenas, or hacks and steals, Verisign's root private key, they can make a fake public/private key pair in the name of Intermediate CA, Inc and sign it with the Verisign private key. Then they can make a public/private key pair for whatever.com and use their fake Intermediate CA, Inc certificate to sign that. Unless you, the person visiting whatever.com, specifically have an original copy of the real whatever.com public key, and you check the certificate's public key every time you visit the website, you'll never notice that the NSA has replaced the real certificate with theirs. As long as they're using the correct Verisign private key, your browser will not detect any problems.

    This of course permits the NSA to do a classic Man-In-The-Middle attack. They give your browser the fake certificate chain and a copy of the website login page, you type things in, they decrypt them, and use them to log in to the real website, they get the results back from the real website, re-encrypt them with the fake certificate chain, and send them back to you. As far as you know you're using the real website, as far as the website server knows they're speaking with a normal browser, but the NSA is capturing everything either side transmits in clear text and can inject fake content in either direction whenever they want.

    The SSL/TLS chain of trust only works if private keys of the root certificate authorities are genuinely private. If anyone gets a private key, SSL's security is demolished (unless the theft of that private key becomes public, in which case that key is added to certificate revocation lists).
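    The defence this comment alludes to, keeping an out-of-band copy of the real certificate and comparing it on every visit, is known as certificate pinning, and the comparison step is tiny. This is a sketch only: the "DER bytes" below are made-up placeholders, and real-world pinning usually hashes the SubjectPublicKeyInfo rather than the whole certificate.

    ```python
    # A forged chain signed with a stolen root key still contains a different
    # leaf certificate, so its fingerprint cannot match a pinned value
    # obtained out of band before the attack.
    import hashlib

    def fingerprint(cert_der: bytes) -> str:
        """SHA-256 fingerprint of a DER-encoded certificate."""
        return hashlib.sha256(cert_der).hexdigest()

    def connection_is_safe(served_cert_der: bytes, pinned_fingerprint: str) -> bool:
        # Compare what the server actually presented against the pin.
        return fingerprint(served_cert_der) == pinned_fingerprint

    # Placeholder byte strings standing in for real DER-encoded certificates.
    real_cert = b"DER bytes of the genuine whatever.com certificate"
    pin = fingerprint(real_cert)          # recorded out of band, e.g. by phone

    forged_cert = b"DER bytes minted with the stolen root key"
    print(connection_is_safe(real_cert, pin))    # True
    print(connection_is_safe(forged_cert, pin))  # False: the MITM is detected
    ```

    Pinning trades the CA hierarchy's convenience for a first-contact problem: you still need one trustworthy channel to learn the real fingerprint, which is exactly the "verify keys by phone" step discussed earlier in the thread.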
  • by dbIII ( 701233 ) on Friday September 06, 2013 @11:09AM (#44774965)
    People who torture will just make up evidence anyway. What you say is not going to matter. It's a terror tactic, not an information-gathering one. The KGB knew that from day one, thanks to their Czarist predecessors; it's only us in the West who get suckered by "24" or whatever into thinking it's a way to get people to confess secrets.
