Death and the NSA: A Q&A With Bruce Schneier 149
Daniel_Stuckey writes "Since Edward Snowden's disclosures about widespread NSA surveillance, Americans and people everywhere have been presented with a digital variation on an old analog threat: the erosion of freedoms and privacy in exchange, presumably, for safety and security.
Bruce Schneier knows the debate well. He's an expert in cryptography and he wrote the book on computer security; Applied Cryptography is one of the field's basic resources, 'the book the NSA never wanted to be published,' raved Wired in 1994. He knows the evidence well too: lately he's been helping the Guardian and the journalist Glenn Greenwald review the documents they have gathered from Snowden, in order to help explain some of the agency's top secret and highly complex spying programs.
To do that, Schneier has taken his careful digital privacy regime to a new level, relying on a laptop with an encrypted hard drive that he never connects to the internet. That wouldn't stop someone from simply stealing the laptop in, say, a 'black bag operation,' of course. 'I know that if some government really wanted to get my data, there'd be little I could do to stop them,' he says."
obligatory quote (Score:5, Interesting)
"Security is mostly a superstition. It does not exist in nature, nor do the children of men as a whole experience it. Avoiding danger is no safer in the long run than outright exposure. Life is either a daring adventure, or nothing." Helen Keller
Schneier is right.
Cold warriors (Score:5, Interesting)
Re:I thought (Score:5, Interesting)
One solution is technological. You can't educate users, but you can configure the technology to improve the 'idiot mode' security. SSL is a good example: The end user can get most of the benefits without even knowing what SSL is or having any understanding at all of cryptography.
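The SSL point can be made concrete: modern TLS libraries ship with safe defaults, so the end user (or a naive script) gets certificate validation and hostname checking without touching a single crypto setting. A minimal sketch using Python's standard library:

```python
import ssl

# A default client context: the library picks safe settings so the
# caller never has to understand the cryptography underneath.
ctx = ssl.create_default_context()

# Both checks are already on -- this is the 'idiot mode' security
# the comment describes, delivered by configuration, not education.
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # server cert must validate
print(ctx.check_hostname)                    # hostname must match the cert
```

Any connection opened through this context gets authenticated, encrypted transport for free; the user only sees a padlock, or an error when something is wrong.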
Schneier's privacy todo list at IETF Tech Plenary (Score:4, Interesting)
I thought it was a good speech, but this 'todo' part [youtube.com] towards making mass surveillance "expensive" stood out for me. So I used it as a list of criteria to evaluate [slashdot.org] my favorite privacy tools, I2P and Qubes OS.
Schneier's guidance does seem like a mixed bag to me, especially in this day and age: he mostly wants the privacy tech of the 1990s, only "more". I got the same impression once watching Jacob Appelbaum speak at a gathering. There is this tendency to appreciate all the neat little things that targeted crypto does within various applications and platforms, and, when asked about online privacy, to regurgitate them all in a fashion that ensures no normal person would take heed. Extra demerits for implying that large IT-industry projects need to be unleashed to address the privacy problem.
It's not hard to surmise from my other posts that I advocate a more blanket, PC-focused approach, so that ordinary people on their own can make the largest improvement in their online privacy using the fewest tools. The upshot is that those tools have to be more radical than usual in their design.
Re:Cold warriors (Score:3, Interesting)
The intelligence community ... was built as a conventional army fighting another conventional army (the GRU and KGB), and its sigint operations were hand-tailored to that kind of war. But what has happened since is that the enemy has changed.
It's like an immune system gone into overdrive, attacking its own body (cf. AIDS).
Re:false flag? (Score:4, Interesting)
Where no one reports it, or prosecutes it, it's already being done. I've recently worked with educational facilities whose compliance with basic security practices for student and staff data is in "compliance" with EU law because they passed an audit, but the audit was basically a checklist they filled out. I refused to sign the parts of it that came to my crew, because the answers were lies, and submitted my concerns to their company and my company's lawyers and security managers. The education company was very, very careful to keep the auditors away from _me_.
They have since changed their approach to a number of the security issues I raised, but their own leaders did not know about the security violations performed as a matter of common practice by their own staff, especially concerning students' private data.
Encryption can't be cracked. (Score:4, Interesting)
I'd go with Snowden's appraisal of the NSA's (in)ability to crack certain forms of encryption. Snowden is obviously not going to write a how-to for us,
but it's been reported multiple times that he's using layers of encryption. If it's possible for Snowden to craft something the NSA can't break, then it's possible for Schneier too.
Encryption fucking works. And well-done modern encryption is more or less impossible to crack.
(Just think about Bitcoin and all the other alt-coins. They all rely heavily on modern cryptography. Yet, despite the tremendous monetary incentive, nobody has managed to crack their algorithms; attackers only find implementation bugs to exploit.)
Usually, when the NSA finds something, it's not by magically cracking "impossible-to-break" crypto.
It's by getting around the crypto: using exploits or otherwise abusing bugs, bribing their way in, etc.
Crypto is the strongest link in the chain, but there are tons of other links that are much easier to break.
What makes the difference between successful security operations like Snowden's and Schneier's on the one hand, and busted failures (like Silk Road's DPR) on the other, is the rigorous discipline to do *EVERYTHING ELSE, in addition to the crypto,* absolutely right.
To come back to the example in the summary:
An online exploit won't be of any help in breaking into a computer that is never connected (and is "off" most of the time).
(Though said offline computer can still simply be stolen.)