Should Edward Snowden Trust Apple To Do the Right Thing?

Nicola Hahn writes: As American lawmakers run a victory lap after passing the USA Freedom Act of 2015, Edward Snowden has published an op-ed piece which congratulates Washington on its "historic" reform. He also identifies Apple Inc. as a champion of user privacy. Snowden states: "Basic technical safeguards such as encryption — once considered esoteric and unnecessary — are now enabled by default in the products of pioneering companies like Apple, ensuring that even if your phone is stolen, your private life remains private." This sort of talking point encourages the perception that Apple has sided with users in the battle against mass surveillance. But there are those who question Snowden's public endorsement of high-tech monoliths. Given their behavior in the past, is it wise to assume that corporate interests have turned over a new leaf and won't secretly collaborate with government spies?
  • by ColdWetDog ( 752185 ) on Tuesday June 16, 2015 @11:06AM (#49921949) Homepage

    Is that you don't trust nobody.

    I can't imagine actually 'trusting' Apple or any other corporation or government. Give them a pat on the back for making security easier - sure. Trusting them, not so much.

    • by praxis ( 19962 ) on Tuesday June 16, 2015 @11:26AM (#49922147)

You must trust some things, sometimes, with some data, or you can get no meaningful work done. Balancing that trust (the risk) against that data (the value) is what security is about. I put this data, that I just wrote, on this website because it is low value and low risk. I wouldn't post my Social Security number here, because that would be high risk. I wouldn't post my private key here either, but I do store my private key on a hard drive I neither built nor verified myself, running an OS I neither built nor verified myself, because while it's high value, it is low risk thanks to the many rounds of secure math protecting it.

As to trusting Apple or any other corporation or government: you implicitly trust your hardware manufacturers, all of them, unless you build your own hardware from scratch. Same goes for software, even open source software.

      • by s.petry ( 762400 )

I completely disagree. I trust nobody, and am glad that others follow suit. It's through this constant lack of trust that we have found flaws in numerous technologies which allowed certain people back-door access. No company should get a free pass on security. They should all have to constantly prove their worth, and people should constantly be testing.

        That is not to say I don't use products, but use does not imply 100% trust. It implies just enough trust to use something.

        • by King_TJ ( 85913 ) on Tuesday June 16, 2015 @12:16PM (#49922551) Journal

          If we really "trusted nobody", then nobody would ever build another electronic device. Heck, we'd have to pretty much destroy all of them we've got in use already.
(Say we're simply talking about a "security appliance" for your network, like a box that handles junk mail filtering, or even a firewall. If you place no trust in the idea that its components aren't back-doored at the factory, secretly leaking the data that passes through them, then why buy and implement it at all? Same goes for the firmware or software running it.)

I don't think the original poster was suggesting any company get a "free pass" ... Rather, it's an ongoing process where a company establishes trust over time by putting out products that get widely used and tested, and appear to be working as advertised. When it's discovered that they didn't, that trust evaporates quickly and people look at other options.

          So right now, yes, I have a fair amount of trust in Apple to protect my privacy. I don't "trust them absolutely" by ANY means. But the nature of the marketplace indicates to me that Apple has some strong motivations right now to make it a priority. (EG. They're competing with cloud services, head to head, with Google at the present time -- so they need to be able to show their products are advantageous over Google's because your data is safer from misuse or resale with them.)

          • If we really "trusted nobody", then nobody would ever build another electronic device. Heck, we'd have to pretty much destroy all of them we've got in use already.

            No. You're conflating two different ideas: deciding to take a calculated risk, vs trust. They are not the same things.

            When you trust, you are assuming the other party is "on your side".

            • If we really "trusted nobody", then nobody would ever build another electronic device. Heck, we'd have to pretty much destroy all of them we've got in use already.

              No. You're conflating two different ideas: deciding to take a calculated risk, vs trust. They are not the same things. When you trust, you are assuming the other party is "on your side".

              True - they are not the same things. When I trust it means I believe I know how something or someone will act - not that they/it are "on my side", just that I "believe" they are predictable. And yes - I have trouble determining what is currently a fact, and I suspect the few things I "completely trust" are the result of insufficient thought.

I trust that when I go swimming the chances are slim that I'll get eaten by a shark. It's a poorly calculated risk in which I place limited trust - which can/will be re

              • When I trust it means I believe I know how something or someone will act - not that they/it are "on my side", just that I "believe" they are predictable. And yes - I have trouble determining what is currently a fact, and I suspect the few things I "completely trust" are the result of insufficient thought.

                Good point. Trust doesn't necessarily mean they're "on your side", but can be "trusted" to act in a certain way... and often because it's in their own best interest.

                • Trust doesn't necessarily mean they're "on your side", but can be "trusted" to act in a certain way... [snipped]

                  The bit I should have bolded was believe. I "believe" intuition is my greatest bias. A failing that is of course unique to me, as the rest of the planet tests their gut instincts whereas I rarely do. When I do test my intuition (e.g. journalise and later review) I find it's wrong more often than it's right, and in the latter case it's damn hard to be certain (did I really "know" Apple was going to "improve their security"). Damn duality [mutter mutter impel, compel, confabulation, mutter mutter, nutters tal

          • by s.petry ( 762400 )

Like the person I responded to, you seem to be treating hardware and software differently. Hardware should be treated with the same level of trust you give to software.

I had to re-read to make sure I'm not being picky with wording, and I don't believe I am. GP said "You implicitly trust your hardware manufacturers" and you stated "if we really "trusted nobody", then nobody would ever build another electronic device." Both of those statements are incorrect because there should be no implicit trust with eithe

        • that's the most similar complete disagreement i've ever seen.
          • by s.petry ( 762400 )

            You implicitly trust your hardware manufacturers,

            That is the part I disagree with, and in most cases is simply not true.

I get it. you put your data on a hard drive, but you feel uneasy about it. your unease makes you feel like you aren't trusting the hard drive manufacturer. the gp (i think) would say regardless of feelings, you put your data on a hard drive. at the end of the day, your data is there and the trust was implicit. your trust may not be 100%, but your private key is 100% there.

              semantics to me.

              unless you actually are splitting your key among different drives sourced from different manufacturers. in that ca
              • by s.petry ( 762400 )

                I get it. you put your data on a hard drive, but you feel uneasy about it.

                Most of us do exactly that right? We don't just store our sensitive data on a hard drive, we create an encrypted volume and put data in that.

                I quoted the statement, and will ask you to read the definition of implicit. I don't have unqualified trust for hardware any more than I have unqualified trust for software.

                • we seem to be at an impasse then as there are a couple of definitions. One for me and one for you.
                  • by s.petry ( 762400 )
There is one common definition, and if you read the dictionary you will find several senses, none of which discounts the others. In normal use it means "implied", which indicates that no formal explanation or express statement is needed. See sense 2 for further clarification, which is "unquestioning". Even in mathematics it has the same meaning.
        • by nine-times ( 778537 ) <nine.times@gmail.com> on Tuesday June 16, 2015 @12:53PM (#49922859) Homepage

          I trust nobody

          Bullshit. As praxis pointed out, you trust some people, sometimes, with some data. Otherwise you wouldn't post here. At a bare minimum, you've trusted Slashdot with your username and password, and you've trusted us, the Slashdot readership, with the contents of your post. What's more, whatever computer you're working on has at least hardware (with BIOS/firmware), an OS, and a web browser. You've trusted whoever made all of those things. Even if you are using FOSS, unless you've performed a thorough code review of the sort that you would perform on a suspected virus, you've trusted the community to review the code and remove security threats. Even if you encrypt your data, you're trusting whoever wrote the encryption software, along with the people who created the platform that the encryption software runs on, to be both honest and competent.

          What praxis was pointing out, which is entirely correct, is that security is not about being "absolutely secure". It's about balancing "making things accessible to those who I'd like to grant access" against "making things inaccessible to those who I would not like to have access." It inherently includes trusting authorized users, but also it pretty much always includes some level of trust (not necessarily absolute trust) of some 3rd parties. When you put money in the bank, you're putting some trust in the people who own the bank, in the bank's guards and tellers, in the police to protect the bank, and in the government to oversee the whole system and provide legal recourse if anyone else violates your trust. You don't have to trust any of those people absolutely, but that's because of the security practice of dispersing trust among multiple parties.

          So no, you're trusting someone, whether you admit to it or not.

          • "Bullshit" that is. Why should I "implicitly" trust hardware as praxis stated? You seem to be happy defending him, so explain that position. I assume risk using Software, but with hardware it's only full and unqualified trust or nothing? How on Earth do you come to that conclusion? Do you know what you are defending?

            • Why should I "implicitly" trust hardware as praxis stated?

It's not so much a matter of "you should" as it is a matter of "you do." You already do trust hardware. I assume you're posting on Slashdot using some kind of electronic computing device, rather than typing this by banging rocks together.

              Do you know what you are defending?

              Yes, I'm defending the concept of security from those who have a very poor understanding of it.

              • by s.petry ( 762400 )

It's not so much a matter of "you should" as it is a matter of "you do."

                Wrong wrong wrong! If I had full trust I would not have to do something like create an encrypted file system of any type. I would not have to encrypt traffic between hosts, and I would not have to encrypt my backups. Yet for some reason, I do all of those things with critical data.

                What you are defending is an invalid generalization and statement which happens to ignore reality.

                • Ok, so let me ask you two questions: 1) Do you have any information on your computer that you would not like to share with the world? 2) Did you build the entire computer from scratch?

                  If your answers were "yes" and "no", respectively, then to some extent, you're trusting your hardware vendor to have not included some kind of... well, who knows what they could theoretically include. A chip that serves as a keylogger, but that will send the info out through the NIC without involving the OS. If they design

          • OK, I'll bite.

            Why would anybody have to "trust Slashdot" to post a comment here? Please be specific, instead of spewing rhetorical bluster that sounds good until you actually think about it.

            I don't use this username or password anywhere else. What am I "trusting" Slashdot (or "the readership".. wtf?) with exactly?

            • by praxis ( 19962 )

              What am I "trusting" Slashdot (or "the readership".. wtf?) with exactly?

              You are trusting Slashdot with your username, your password, and the text you submitted in any comments or stories you may have written. If you do not use this username or password anywhere but here, (and good for you on that), then you are making the data you give them less valuable. That's a sensible thing to do, but you do trust them with that.

              If you had zero trust in Slashdot, you wouldn't even give them that. They could, if they were malicious, alter your submitted posts to make you out to be a patholo

              • I don't see the problem. It's a reasonable deduction, but not a certainty, that my name is David Thornley, and I believe I've put enough geographical information into Slashdot posts to identify a single person of that name. Therefore, anybody could look at my posting history and tie them to an identity which is probably mine. That's why there are certain things I simply don't mention on Slashdot posts.

                Suppose that I had never heard of Slashdot. What would prevent somebody else using this identity as

You might think that "security" is a concept that only applies to some information, and that that information is either "secure" or "not secure". Essentially what I'm arguing (and I think you are too) is that "security" is a concept that applies to all information, and it's a spectrum of "how inaccessible is it to people that I don't want to have this information" vs. "how accessible is it to people that I do want to have access to this information". Nothing falls outside of that.

              So even the contents of

        • by praxis ( 19962 )

          I trust nobody

          That is not to say I don't use products, but use does not imply 100% trust. It implies just enough trust to use something.

You seem to contradict yourself there. You cannot trust nobody but then trust somebody just enough to use something. My entire point was that zero trust is impossible when using anything you did not build yourself. A point you seem to agree with despite your protestations that you completely disagree.

You cannot trust NObody. To do that, you'd need unlimited experience and access to any and all source code, and even if you had both, you'd lack the third ingredient: unlimited time.

      You simply do not have time to audit everything yourself. Every line of code in Linux, every line of code in your router's firmware, every line of code in every tool used to create any other code you use... Not possible.

      At some point you have to trust someone. A person, an entity, an organization. What I agree on is t

      • You cannot trust NObody.

        No that is easy. You just have to trust anybody, and then you won't trust nobody anymore.

Now what you and the GP probably meant was to "not trust anybody" or "trust nobody".

        • Not trusting nobody does not mean trusting anybody. It means trusting somebody. The difference is that with somebody, you pick and choose rather than letting chance take control.

    • by AmiMoJo ( 196126 )

His point isn't that you should trust Apple, it's that every company should make things as private as possible, with encryption by default. Even if there is a back door, what Apple has done will prevent a lot of low-level abuse. Police won't be able to access your phone without permission, and there is no way the NSA is sharing their back door with them. Maybe the FBI might get a look in if they have a really high value target and can explain it away with some parallel construction, but in any case it's a hug

  • None of the tech companies have been shown to be co-operating voluntarily with government spies. Telecoms have but not Google / Apple / Microsoft.
    • by gstoddart ( 321705 ) on Tuesday June 16, 2015 @11:11AM (#49921993) Homepage

      None of the tech companies have been shown to be co-operating voluntarily

      Quite honestly, does it matter if this is voluntary?

      When you have secret laws which say "give us this or else", WTF difference does 'voluntary' matter?

      Even the transparency reports say "we can't actually tell you what we did because we're under a gag order".

Unless the government no longer has secret laws, or tech companies stand up to them and implement tech which doesn't have a built-in security bypass ... "voluntary" doesn't mean a damned thing.

Well, one way would be to put responsibility in the hands of users, or within the product itself.
So even if someone comes up with "give us this or else", companies can just say: go take it from the device, it's yours. We don't have the information you request, but we can point you to the person to whom we sold the device.

This can make the difference between companies that voluntarily cooperate and plant backdoors, and those that choose to push this responsibility to end users and their devices.
      • You are absolutely right, but....
        American companies are starting to feel the paranoia about this hit their bottom line. People are trying to find alternatives to american products.
        And THAT means that the lobbyists are jumping into the game. The US government may not listen to the little people's fears, but they will listen to corporate America and we will see changes to the laws.

        Thank goodness America is run by the corporate elite, otherwise we might be in trouble.....
        • Right, and, I'll tell you what's a bitch:

          American companies are subject to attack by people just like you and me, if we felt like it.

          They aren't running stuff that's much better than we have.

          The whole goddam Internet is porous.

      • When you have secret laws which say "give us this or else", WTF difference does 'voluntary' matter?

        Lawyers.

        If it is involuntary, the company pays intelligent lawyers to use the law to (1) hold the government to the law, even if secret and even if it's less protective than it should be. This in turn (2) makes the government less likely to make absurd requests and (3) costs the government resources, which provides at least some limitation on what they do.

        It's not enough, of course--we really need more robust protections on the secret side by cleared personnel with automatic publication a decade or two dow

      • It is a shame that I used up all of my mod points already. You are +4 but your words and their meaning should be +5. I doubt many folks actually think about the consequences of all of this secrecy garbage.

        Voluntary? Involuntary? When choices are forcibly removed through secrecy, it seems a lot like enslavement to me.

    • Define spies. Google has turned data over to the feds that has led to US citizens being arrested. Google cooperates with China's great firewall policies and therefore contributes to human rights violations. Google is as evil as they come when it comes to data security and protection, do not consider your data secure with them.

      • by Luthair ( 847766 )
        Google pulled out of China years ago. How is a lawful request co-operating?
      • Telecoms have fallen over their feet to hand information over to DHS. (Qwest didn't, so the powers that be had to make an example of them.) Do you have any evidence that Google handed over information without a legal requirement? You can't expect a company to break the law for you, because if they have a pattern of lawbreaking that annoys the authorities they will not survive.

  • by kwoff ( 516741 )
    said Betteridge. https://en.wikipedia.org/wiki/... [wikipedia.org]
  • What reform? (Score:5, Insightful)

    by CrimsonAvenger ( 580665 ) on Tuesday June 16, 2015 @11:10AM (#49921981)

    The only real change as a result of this law is that the telecoms have to pay to collect & store the information that the Feds used to do themselves.

    So now they'll have to get their secret court to rubberstamp a warrant for them instead of just emailing a request downstairs when they want some information on someone. Big whoop!

    • So now they'll have to get their secret court to rubberstamp a warrant for them instead of just emailing a request downstairs when they want some information on someone. Big whoop!

I'm not even sure it requires that much effort. Since the data is being held by a 3rd party, they could always just issue an NSL or, given how compliant telecoms were in the past, just ask nicely and get the data they want, all without a warrant. Having not read the entire USA FREEDOM Act I don't know if this is the case, but it seems the most likely. If someone more knowledgeable than myself chimes in with this info that would be great, but until I hear otherwise I will assume this was a shitty deal.

    • ... and the NSA has a new revenue stream as they sign contracts to co-locate the telco's metadata storage in their data centers...

    • This is actually a very important technical difference, even if it's not a big practical difference. Essentially, the NSA was already collecting all of the data first, and just saying, "we promise we won't look at it unless we have a warrant." If the procedure is now to have telecoms (who inherently have access to that information) turn over records when they're presented with a warrant, then this falls back into something resembling normal law enforcement procedures. The police can get your phone record

Also, one of the protections afforded by the Stored Communications Act was that while, yes, the government needs a warrant to get the records from the phone company, the phone company also MAY NOT hand the data over to the government WITHOUT a warrant. So the phone company is incentivized to protect records about you, because it's illegal to turn them over otherwise.

        Which is actually why they need a warrant, and not merely a subpoena. A warrant is written authorization from the government immunizing you fro

  • by Anonymous Coward

Anyone characterizing that single line as an endorsement is just clickbaiting. It is absolutely appropriate to give Apple praise for improving their baseline. Just because you recognize the improvements they've made doesn't mean you've let them off the hook for continuing on that path.

The article's author seems to be taking the tack that nothing short of perfection is worthy of praise. That's a recipe for maintaining the status quo.

  • by 93 Escort Wagon ( 326346 ) on Tuesday June 16, 2015 @11:14AM (#49922021)

    It's still an open question how much we should trust companies like Google and Apple... with regards to their internal motivation and plans. However (anecdotally, at least) it seems pretty obvious these companies learned from Snowden's leaked documents just how much the government was screwing them, and they've seen how it's hit their bottom line - any trust that might've previously existed is gone.

    Remember the (anecdotal) reaction of the Google engineers when they heard how the NSA was tapping their unencrypted intra-datacenter communications?

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      It's still an open question how much we should trust companies like Google and Apple... with regards to their internal motivation and plans. However (anecdotally, at least) it seems pretty obvious these companies learned from Snowden's leaked documents just how much the government was screwing them, and they've seen how it's hit their bottom line - any trust that might've previously existed is gone.

      Remember the (anecdotal) reaction of the Google engineers when they heard how the NSA was tapping their unencrypted intra-datacenter communications?

Snowden's revelations have either hurt or, in some cases, come close to ruining the business of many US companies. While it is the norm on this forum to assume the worst about large corporations in particular, in many cases companies have suffered damage without any proven collaboration with the NSA. Cisco, for example, has been losing sales not because they allowed the NSA to screw around with their equipment but rather because the NSA intercepted their shipments. I don't expect corpor

      • by 93 Escort Wagon ( 326346 ) on Tuesday June 16, 2015 @02:12PM (#49923613)

        Snowden's revelations have either hurt or in some cases come close to ruining the business of many US companies.

        No, in reality it is the NSA's formerly-hidden behavior that has hurt these businesses.

        If you are stealing from your employer, and I provide documentation of that fact which results in your getting fired - I'm not to blame for the loss of your job, you are.

  • by engineerErrant ( 759650 ) on Tuesday June 16, 2015 @11:18AM (#49922073)

    The poster's interpretation seems completely off-base to me; not only is Snowden not encouraging us to blindly trust Apple et al with our privacy, he explicitly warns of the very danger the OP brings up.

    As an iOS developer, my perception is certainly not that Apple is trying to grab our data instead of the government - in recent years, they have started a major cultural shift toward real protections of user data - simply not collecting it, encrypting it in transit, etc., etc., even if it's a burden on third-party developers to make the transition. This is a Good Thing, full stop. Props to Apple (as well as Google, who is also making its own efforts).

Why do you assume that Snowden is not an NSA operative? An active one. He took a story which was in the public eye (the SLC building is huge, so everyone knew about it) and turned it into a story about him for at least half of the population. He could have stayed anonymous. Instead he made the story that would have galvanized 80% of the population against the NSA into a story that galvanized 30% of the population against him, 40% against the NSA, and 30% not caring at all. He couldn't do the NSA a bigger favor if
  • by gstoddart ( 321705 ) on Tuesday June 16, 2015 @11:23AM (#49922115) Homepage

    I trust corporations to do the "right thing" inasmuch as PR dictates there is a public perception that this is important.

    But I do not trust corporations to ever do the "right thing" out of a corporate sense of morality.

    I expect corporations to act like vicious sociopaths trying not to be noticed and miming "the right thing" without actually giving a damn.

    Trusting the moral compass of a corporation is a pathetic joke and a lie.

    • Actually, some corporations do things out of a corporate sense of morality, although they may have to present it as PR. This doesn't mean you can trust them to do so, since no corporation is more than one stockholder revolt from turning into a moral sleazeball. I find it easier to trust revenue streams. Apple makes only limited money from getting data from its customers, and lots and lots from selling stuff to its customers. Being known as a government snitch would have a great impact on their public i

  • Apple technically has end-to-end encryption, but the problem is the key exchange. Apple retains the keys for all of your devices, which is how one iMessage can be sent to multiple devices. The way it works is that the sender communicates with Apple's servers to obtain a list of public keys for devices registered to the recipient. The sender then encrypts the message once per key, and sends the encrypted messages to Apple, who then distributes them to each device. In theory, and likely in practice, Apple
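    The fan-out described here can be sketched as a toy model. This is NOT real cryptography: the `xor` function is a stand-in cipher so the protocol shape is visible without a crypto library, and the device names and key directory are invented for illustration.

    ```python
    # Toy model of iMessage-style per-device fan-out encryption.
    # NOT real crypto: xor() stands in for public-key encryption.
    import secrets

    def xor(data: bytes, key: bytes) -> bytes:
        """Stand-in for real encryption: XOR with a repeating keystream."""
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    # The directory the server controls: recipient device -> device key.
    key_directory = {
        "alices-iphone": secrets.token_bytes(32),
        "alices-ipad": secrets.token_bytes(32),
    }

    def send(message: bytes) -> dict:
        # The sender asks the server for the recipient's key list,
        # then encrypts the same message once per listed device.
        return {dev: xor(message, key) for dev, key in key_directory.items()}

    msg = b"meet at noon"
    ciphertexts = send(msg)
    # Each device decrypts only its own copy with its own key.
    assert xor(ciphertexts["alices-ipad"], key_directory["alices-ipad"]) == msg
    ```

    The point of the sketch: the sender never sees the recipient's devices directly, only whatever list the server returns, which is exactly where the trust question lives.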

    • Apple retains the keys for all of your devices, which is how one iMessage can be sent to multiple devices.

      Do you actually know this, or is this your guess? Because my understanding is that iMessage encryption was designed explicitly to avoid having Apple hold the kind of private keys that can decrypt the message. I thought there was some scheme where each device got its own decryption key, and that those keys never left the device.

      Add to that that iMessage silently falls back to SMS,

      Well, not entirely "silently". Messages sent via SMS turn green, so you know whether they were sent via iMessage. You don't necessarily know ahead of time whether, when you hit "S

1) They hold the public keys. But the risk isn't them decrypting with your private key, it's them adding their own public key (or one they generate) to your list of keys without your knowledge. Apple could send your public iPhone key, your public iPad key, and their own generated snooping public key to the sending device, and no one would be the wiser. You can read about it here: http://techcrunch.com/2014/02/... [techcrunch.com]

        2) There's a setting to turn off "Send as SMS," but under some circumstance, it might not hon
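        The key-injection risk in point 1 can be shown with a toy model. Again NOT real cryptography: `xor` stands in for public-key encryption, and all names and keys are invented; the sketch only shows the protocol shape.

        ```python
        # Toy model of a server silently injecting its own key.
        # NOT real crypto: xor() stands in for public-key encryption.
        import secrets

        def xor(data: bytes, key: bytes) -> bytes:
            return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

        # The recipient's genuine devices, as registered with the server.
        directory = {
            "iphone": secrets.token_bytes(32),
            "ipad": secrets.token_bytes(32),
        }

        # A malicious server appends its own key. Neither sender nor
        # recipient ever sees the directory, so nothing looks different.
        snoop_key = secrets.token_bytes(32)
        directory["snooper"] = snoop_key

        # The sender trusts whatever list the server returns.
        msg = b"private"
        fanout = {dev: xor(msg, key) for dev, key in directory.items()}

        # The real devices still decrypt normally...
        assert xor(fanout["iphone"], directory["iphone"]) == msg
        # ...but the injected key reads the message too,
        # with no endpoint ever being compromised.
        assert xor(fanout["snooper"], snoop_key) == msg
        ```

        This is why an auditable key directory matters: the attack leaves no trace visible to either endpoint.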

        • But the risk isn't them decrypting with your private key, it's them adding their own public key (or one they generate) to your list of keys without your knowledge.

          Well yeah, or they could also backdoor the whole device without doing anything half so subtle or sneaky. So could RIM, Microsoft, or Android phone manufacturers. On some level, with every device you use, every service you use, and every piece of software you use, you are assuming that the manufacturer/provider/developer isn't a malicious evil mastermind.

          But in general, their system is designed so that it won't add a public key without approval from an already approved device, or some other authorization.

          • The difference is that the HTTPS certs are auditable -- they are truly public, in the common understanding of the word. To my knowledge, there is no way to audit Apple's repository.

            My line of thinking when it comes to security is that "good enough" has been repeatedly shown not to be. We know the best practices, so why not follow them? Every time someone compromises, whether it's in the RNG, or allowing infinite login attempts, or allowing degraded connections, or inventing their own untested encryption

  • Given their behavior in the past is it wise to assume that corporate interests have turned over a new leaf and won't secretly collaborate with government spies?

    No, but it is wise to use free market forces to force corporate interests to offer encryption, to be seen as encryption friendly, and eventually to even be friendly to end-to-end user-controlled encryption.

  • by GuB-42 ( 2483988 ) on Tuesday June 16, 2015 @11:29AM (#49922187)
    • The celebrity photo hack was not confined to iCloud, and it was not a result of anything Apple did. It was a matter of bad end-user passwords. Unfortunately, not even Steve Jobs and Alan Turing together could make easy-to-use security strong (even with possible additional posthumous abilities).

  • Both have proven themselves untrustworthy in my view.

  • Why would you even ask that question?

    They will do what's best for them, not "the right thing". That's what Steve Jobs did.

  • by tom229 ( 1640685 )
    For Apple, maybe. Will they do the right thing for you? I highly doubt it. And if Mr. Snowden thinks they will, I would advise him to brush up on the history of that company.
  • No reason to trust (Score:5, Insightful)

    by endus ( 698588 ) on Tuesday June 16, 2015 @11:59AM (#49922417)

    I see no reason to trust Apple or any similar companies whatsoever. They have betrayed consumers' trust in the past, have cooperated with illegal surveillance programs, etc. If a given company has cleaned up its act, great, but independent verification, open standards, etc. are the only way to gain assurance. Trust is irrelevant.

    • Got evidence to back up your statements? Apple has made mistakes, but as far as I know they've never betrayed the trust of customers deliberately, or cooperated with illegal surveillance programs. They have jerked the rug out from under their developers more than once (anybody remember OpenDoc?).

  • Regardless of how bad a corporation or government agency has been in the past, there's nothing wrong with lauding them whenever they take a step in the right direction. It might not get them all the way to the place you want them to already be, but they're all going to move in the direction of encouragement and what gets them better results. And the faster they get the positive or negative feedback, the more effective it will be. Continuously lambasting Apple today for something that Jobs did in the past is counterproductive.
  • We can only judge them on what we know they have done and are doing, not on what they MIGHT do in the future or don't know they are doing.

  • Apple is a profiteering corporation, not a human being. As such, it has only one purpose - the bottom line (profit$). Take as much as you can, and give back as little as possible.

    Trust Apple? Sure, trust them to do what they will to increase profits. Trust them to make moral decisions? Nope. Not unless it prevents cuts to their bottom line.

    "Trusting" a corporation is about as foolish as believing that a corporation "cares" about you. Corporations do not care because they cannot care. They are a legal construct.
  • If 'Right Thing' means lining the pockets of upper management, then yes he can.

  • What you have to understand in all this is motivation.

    Google has ALWAYS had a significant monetary motivation to collect and analyze as much data as possible about you.

    Apple has NEVER had that motivation. They just have never had a need to collect information about you because it doesn't do them any good, therefore collecting it is only an added expense with no return.

    Now it turns out that not only does Apple not gain by collecting user data, in fact they have figured out how to PROFIT from not collecting it.

  • ... feed the tuna mayo.

    Why the hell don't we all just give it up and sign off on FOIA to each other?

    I mean, all the people on the planet.

    I'll know your shit; mine; theirs and everybody can have mine and stuff.

    --

    You may say I'm a dreamer
    But I'm not the only one
    I hope someday you'll join us
    And the world will be as one

    ~ John Lennon

  • Especially as a publicly held company, apple could change management literally tomorrow.

    The new management could monetize user data instantly.
