EFF Position on Trusted Computing

Seth Schoen writes "EFF has just released our analysis of Trusted Computing. We find that the technology could benefit computer security, but must be fixed to ensure that the computer owner is always in control. We also propose a specific way of fixing it. There's coverage of our position at news.com. More articles should be up in the near future at the new EFF Trusted Computing page. Thanks to all the people who helped us understand this technology!"

  • In short (Score:2, Funny)

    by Anonymous Coward
    Don't trust trusted computing. It does not compute.
  • Bad assumption (Score:5, Insightful)

    by Jason1729 ( 561790 ) on Thursday October 02, 2003 @06:16PM (#7119061)
    This seems to be assuming "Trusted Computing" is intended to benefit users.

    The real reason it exists is precisely to take control away from the computer owner and give it to the content owner. Given that, what is the point of the EFF proposing "fixes" to help keep the computer owner in control, when its primary design goal is the exact opposite?

    Jason
    ProfQuotes [profquotes.com]
    • by dpilot ( 134227 ) on Thursday October 02, 2003 @06:23PM (#7119130) Homepage Journal
      Not just Executive, but Legislative, as well.

      Our government responds to campaign finance, and the lion's share of that is done by large corporations and other aggregates that want to make sure that THEIR rights come first.

      Most people don't understand enough about computers to understand how completely OUR rights in this realm have been trampled, already.
    • Re:Bad assumption (Score:5, Insightful)

      by pla ( 258480 ) on Thursday October 02, 2003 @06:33PM (#7119224) Journal
      Given that, what is the point of the EFF proposing "fixes" to help keep the computer owner in control, when its primary design goal is the exact opposite?

      Because it throws the ball back over the fence to those trying to force DRM on us.

      In essence, the EFF has given these folks an ultimatum - "You want a trusted computing environment, but not the public backlash? You can fix it like this. Now put up or shut up".

    Up to this point, the Palladium group et al. could safely ignore most of us, since all of us opposed to DRM have basically just whined about it. Now that someone (and a respectable someone, at that) has offered them a way to get what they claim they want, choosing to ignore it will very tangibly clarify the real intent: if they ignore the EFF's recommendations completely, they all but publicly admit they only care about stripping users of the right to use their own machines, rather than creating some fictional "safe" computing environment.
    • by Morgaine ( 4316 ) on Thursday October 02, 2003 @06:36PM (#7119242)
      The point of the EFF doing this is precisely to underline the fact that big business is attempting to take control of the end-user computing platform away from the user.

      You see, the problem is not so much that big business is doing this, but that it is doing so by subterfuge rather than out in the open.

      The EFF is just flushing out the rats here. If business were trying to take control of people's property openly, then the EFF wouldn't need to put on this act of innocence, merely "identifying dangers" and proposing solutions as if business weren't already aware of them.

      It's a good strategy. Big business can only respond by saying either "Oh yeah, we hadn't realized" (LOL), or else it can reply that this was indeed the intention. In both cases, the user wins.

      My bet though is that the EFF will be met by total silence.
    • The reality is that most of the time, the user is the content owner, having created the content. And for these people, having control over their documents will be a direct result of having created them. They will also have the ability to control the way they are disseminated. This point seems missed in a lot of these discussions. Someone has to create and lock down the files... and that person probably meant to. They had every right to do it, and MS is giving them what they want, control over the access ri
    • by Atario ( 673917 ) on Thursday October 02, 2003 @06:51PM (#7119352) Homepage
      You're exactly right. In "Trusted Computing", as the analysis points out:
      ...the computer's owner is sometimes treated as just another attacker or adversary who must be prevented from breaking in and altering the computer's software.
      I can't put it any more directly than that without risking being modded "Funny". Your computer, in effect, belongs to them. (See?)

      Even the proposed "Owner Override" seems to me a "how are you going to do that" issue. How are you going to assure that a change was made by you and not by some software pretending to be you?

      There are other oversights too:
      • "Identity" of software is determined by submitting a hash value, but how can you be sure someone's not sending a canned hash value?
      • "Secure output can prevent information displayed on the screen from being recorded" -- until someone invents a screen-scraping monitor. If information exists, there's a way to copy it. That's just what information is.
      • The most serious point of all -- that the EFF is lending credibility to this blatant grab for dictator-like powers by suggesting that it can be "fixed" and the problems "addressed", at which point we should all happily adopt it. Not me, brother.
      I would have much preferred the factual analysis and then a great big "run away from this as fast as you can".
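      The "canned hash" objection above can be made concrete with a toy sketch (hypothetical names, plain Python standing in for what the trusted hardware would actually do): a verifier that trusts a self-reported hash accepts a replayed "approved" value, while one that measures the program bytes itself rejects anything modified.

      ```python
      import hashlib

      # Hash of the one program the verifier considers "approved".
      APPROVED = hashlib.sha256(b"approved program v1").hexdigest()

      def naive_check(claimed_hash):
          # Trusts whatever hash the client reports -- a canned value passes.
          return claimed_hash == APPROVED

      def measured_check(program_bytes):
          # The measurement is taken by the verifier (or by hardware, in the
          # TC design), not self-reported, so a modified program yields a
          # different hash and fails.
          return hashlib.sha256(program_bytes).hexdigest() == APPROVED

      print(naive_check(APPROVED))                    # True: canned hash accepted
      print(measured_check(b"malicious program"))     # False
      print(measured_check(b"approved program v1"))   # True
      ```

      The whole argument for putting the measurement in hardware is to move the hashing out of reach of the software being measured.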
      • Even the proposed "Owner Override" seems to me a "how are you going to do that" issue. How are you going to assure that a change was made by you and not by some software pretending to be you?

        That question is the machine owner's problem to deal with, and "Trusted Computing" spec and app developers need not concern themselves with it. It doesn't matter how it gets done (or even if it really gets done). The owner takes responsibility.

        The most serious point of all -- that the EFF is lending credibility to

        • I agree. I think the EFF is saying, essentially, "TCPA is anti-customer; remove [core feature that sounds minor] and it won't be." They're doing it in a way that people can't help but agree with ("Owner override? Of course I trust myself"), so that the TC advocates can't ignore it. But if they comply, the intended 100% trust won't be there, and none of the friendly companies advocating this will want to be involved.

          It's the computer equivalent of fluffy kittens and kissing babies, of "won't someone please thi
      • Even the proposed "Owner Override" seems to me a "how are you going to do that" issue. How are you going to assure that a change was made by you and not by some software pretending to be you?

        The idea would be to use the secure I/O capabilities to make sure the user approves the change/override at the keyboard, which can't be spoofed by software in a TC system.

        "Identity" of software is determined by submitting a hash value, but how can you be sure someone's not sending a canned hash value?

        The hash val
      • by Alsee ( 515537 ) on Friday October 03, 2003 @01:39AM (#7121680) Homepage
        How are you going to assure that a change was made by you and not by some software pretending to be you?

        Actually that is pretty easy, you press a special button/switch. Malicious software is incapable of faking actual physical control. I proposed exactly such a modification to TCPA months ago.

        I e-mailed one of the main TCPA proponents about this back in January. It was David Safford, author of Why_TCPA [ibm.com] and TCPA_Rebuttal. [ibm.com] I explained this system and pointed out that every single claimed benefit in Why_TCPA works just as well with actual and full owner control like my (and the EFF's) proposed modification grants. He did not dispute this.

        His only reply was to suggest this change would no longer keep laptops secure against a thief. This suggestion fails on two grounds. First, it directly contradicts TCPA_Rebuttal, where he claims TCPA is not designed to be secure against physical access and that this supposedly 'proves' that TCPA is not designed for DRM. If TCPA is not supposed to be secure against physical access, then it is disingenuous to claim it is supposed to protect a laptop against theft. Second, it is simple to combine a physical button-press with an owner ID code or password before full control is given. A thief cannot get this owner password, and software can neither get the password nor press the button.

        Granting the owner of the machine access to his own keys (passwords) that are locked in the TCPA chip gives the owner full control over the system. There is absolutely no justification for denying the owner access to his own keys. The only purpose for this design requirement is to use it as a weapon against the owner and for various variants of DRM.

        Of course Microsoft and the TCPA proponents will never accept my proposal (and the EFF's proposal) because the only real motivation for this hardware change is for DRM-type purposes. If owners maintain actual control over their machines and it can't be used for DRM systems then the entire project is a waste of time. Everything else is just a smoke-screen. TCPA will not prevent your computer from being infected with a virus, and it will not prevent that virus from slagging your entire hard drive and everything on it. The only thing it will do is prevent the virus from distributing copies of your 'secure' music files.
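        The button-plus-password override described above can be sketched in a few lines (a toy model with made-up names, not any real TCPA interface): full control is released only when a physical button press, which software cannot fake, coincides with the owner's password.

        ```python
        import hashlib
        import hmac

        # Digest of the owner's password, as it might be stored inside the chip.
        # (The password itself is illustrative, of course.)
        OWNER_PW_DIGEST = hashlib.sha256(b"correct horse battery staple").digest()

        def owner_override(button_pressed: bool, password: bytes) -> bool:
            """Release owner-level control only for a physical press + password."""
            if not button_pressed:
                # Software alone can never reach the success path: it has no
                # way to assert the physical button line.
                return False
            supplied = hashlib.sha256(password).digest()
            # Constant-time comparison, as a real implementation would use.
            return hmac.compare_digest(supplied, OWNER_PW_DIGEST)

        print(owner_override(False, b"correct horse battery staple"))  # False: no button press
        print(owner_override(True, b"guess"))                          # False: thief with the button
        print(owner_override(True, b"correct horse battery staple"))   # True: the owner
        ```

        This is exactly why the "laptop thief" objection fails: the thief has the button but not the password, and malware has neither.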

        -
        • "Of course Microsoft and the TCPA proponents will never accept my proposal (and the EFF's proposal) because the only real motivation for this hardware change is for DRM-type purposes. If owners maintain actual control over their machines and it can't be used for DRM systems then the entire project is a waste of time."

          Actually, it only takes some owners. Let them implement remote attestation, but make sure it remains legal to build hardware with the EFF's owner override feature. If it's legal then there'l
          • Let them implement remote attestation, but make sure it remains legal to build hardware with the EFF's owner override feature.

            Won't work. The remote attestation involves checking a sort of serial number (key) hidden in the chip without ever actually revealing the number to anyone. Part of attestation verifies that you have an approved key number.

            If someone else makes hardware with an owner override feature it will fail all attestation requests because it doesn't have an approved key number. And the key n
            • "Pretty much the only way to override the system they've designed is to chemically peel the secure chip you bought and read your key number out with a really high-power microscope and lab equipment. The good news is that once you do that you can get "god level" control over the system, like the EFF override. The bad news is that each key you dig out is pretty much only good for one person. If they ever detect that you've done this they'll revoke that key."

              If they try doing key per user (chip) I bet it blows u
              • If they try doing key per user (chip) I bet it blows up in their faces.

                That is the TCPA and Palladium/NGSCB plan. I certainly HOPE it does blow up in their faces. But they have every intention of making sure that every new computer sold comes with this locked chip inside and a unique code for each computer. I've been reading the TCPA design documents. Detailed specs on Palladium/NGSCB aren't really available, but the capabilities are fundamentally the same as for TCPA and the hardware would need the same
        • The DRM applications of this technology are small potatoes compared to the ability to lock-in consumers to an application suite (major score for the capitalists) and the ability to lock-out subversive information (major score for government censors).

          That said, something absolutely must be done to protect end-user computers better; the current state of affairs is intolerable. I thought the EFF did a nice job not just crying Chicken Little, but making a specific suggestion on how to prevent the abuse of this
          • something absolutely must be done to protect end-user computers better

            This is actually an important question - what exactly do you want to protect them from?

            Despite all of the propaganda, Palladium/TCPA are not designed to secure the computer against outside attack and intruders. They are designed to secure the computer against the owner. Despite the fact that they both use the word "secure" those two goals actually have almost nothing in common. Despite the claims to the contrary, Palladium/TCPA are nea
    • This seems to be assuming "Trusted Computing" is intended to benefit users.

      I don't necessarily expect there to be end users on the machines I intend to run Palladium on. I want an effective means of hardening a server against compromise.

      Slashweeniedom might want to take a look at the people who have worked on Palladium before claiming that Microsoft employees know nothing about security. Butler Lampson won the Turing award for his work on computer security.

      There are only two applications I can think


      • This seems to be assuming "Trusted Computing" is intended to benefit users.

        I don't necessarily expect there to be end users on the machines I intend to run Palladium on. I want an effective means of hardening a server against compromise.


        Yeah, but DRM is being proposed in the hardware and OS for *ALL* computers, not just servers.
      • Palladium and similar "trusted" environments won't help your server at all. It's not like you've got an "Upload and run program - non-malicious only please" page on your server, where all you need is a decent way of checking digital signatures. Attacks happen when there's a hole in software: when bad data (and that's all code is, until your machine decides to execute it) causes a program to do something it has the access to do, but is not supposed to be doing.

        For example, Outlook has access (intentionally) to y
    • Re:Bad assumption (Score:2, Insightful)

      by fermion ( 181285 )
      The EFF is doing exactly what it should. It is taking business propaganda at face value and then comparing the actual product to the propaganda. If the two match, then fine. If not, then either the company is deceiving us through its propaganda or building a deficient product.

      In this case, trusted computing is being billed as a way to allow owners to control their content. The opportunity for deception is provided by the interpretation of the word 'owners.'

  • That users are ignorant of Computer Security, so it must be controlled by a more intelligent source, like Microsoft. (It's true most are, but does anyone believe MS will fix it?)
    • > But then they will argue That users are ignorant of Computer Security, so it must be controlled by a more intelligent source, like Microsoft. (It's true most are, but does anyone believe MS will fix it?)

      Then I will argue that Microsoft knows nothing (or does nothing) about security and I really do think I'd be better holding the controls. Of course, that assumes that I still use Windows.......

  • by Anonymous Coward
    I've been working in the security field for about 30 odd years, starting with securing mainframes back at Berkeley in the early 70s and am now providing consulting services to the major financial institutions in the US.

    I think that any corporation that invests at least 10% of their budget wisely should be on the track to provide their clients and staff a secure environment in which to deliver their products. I have to deal with a lot of intrusions on a daily basis while overhauling the infrastructure. Cu
    • Keeping things in control is all well and good, but where do you draw the line? Next, are you going to keep tabs on what is on each person's screen? I don't know about you, but I sometimes might actually feel the urge to check my personal email during the day, and having people look at my personal things without me ever knowing about it is rather disturbing.
    • This is key, we don't just want to control the desktops but also the software running on them.

      Which is nice.


      Good, if you have succeeded, you are safe from Microsoft worms and viruses running unpatched software.
      If the systems aren't safe, then you're just fooling yourself and your clients.

    • That is all well and good, but you surely know the masses of people at home are more important than the security of any Fortune 500 company, or all of them for that matter.

      So long as all DRM applications are knocked out and actively prevented, and there is some sort of guarantee that no VENDOR can use this technology, that it can only be used at the private level... then it would be a good thing. I could control what software runs on my home systems, but no vendor could generate a key and require ME to verify to THEM
  • Fear (Score:3, Insightful)

    by Davak ( 526912 ) on Thursday October 02, 2003 @06:17PM (#7119079) Homepage
    In order for a computer to be more secure, it must monitor more aggressively for changes. This seems to be point 4 in the article (remote attestation).

    However, by intuition, this would mean that your computer would monitor your system, and thus the user, more and more.

    Misconceptions about this design abound. The most common misconception denies that the trusted computing PCs would really be backwards-compatible or able to run existing software.

    Well, crap... of course there are going to be compatibility problems... I am much more concerned that my system, and my massaging of that system, is going to be tracked and recorded at higher and higher resolution of detail.

    Davak
    • In order for a computer to be more secure, it must monitor more aggressively for changes.

      The EFF's point is that this is perfectly fine, so long as it is done strictly for the benefit of the owner and the owner has actual control over it. It would be a good thing if it were a tool for the owner.

      The problem is that the current design is not doing this for the owner, and that the owner does not have actual control. The only control the owner has is to kill the system entirely. This will kill much of
  • Great timing (Score:2, Interesting)

    Right on the heels of learning that Outlook Express was mostly responsible for the HL2 Source Code Leak..
  • by Anonymous Coward
    The EFF basically wants your computer to lie to a content provider so that you can turn off the security and still receive their content. It might as well not exist in the 1st place then, which is probably their real goal.
    • Well, it's my computer, and I can damned well modify it to my heart's content. This seems more than reasonable; it seems *necessary*.
      • by Anonymous Coward
        Fine, but part of the point is to provide secure content. The user has shown that he can't be trusted, so the content providers have a useless security system. A computer is a tool. It's pretty stupid to say you have a god-given right to see every bit in every memory location. You're just limiting the tool's usefulness for secure content, which is the goal of the EFF. So don't be surprised and indignant, when people refuse to sell you content you want in a form you want, and sue you afterwards when you clai
        • Fine, but part of the point is to provide secure content. The user has shown that he can't be trusted, so the content providers have a useless security system. A computer is a tool. It's pretty stupid to say you have a god-given right to see every bit in every memory location.

          Content providers do not have a God-given right to complete control of their content. They have a government given right to partial control. As history has shown, content providers are completely against fair use. (I'm talking a

        • but part of the point is to provide secure content.

          Those advocating TCPA and Palladium are doing everything they can to deny that very fact. Forcing them to admit it is actually designed as a DRM system is a good step in defeating them.

          It's pretty stupid to say you have a god-given right to see every bit in every memory location.

          I'm not about to start on "god-given rights", but the fact is that my computer is MY PROPERTY and I have absolutely every LEGAL right to dissect my computer and analyze every s
      • To wit, Seth points out in the article that Trusted Computing still allows you to do with your computer what you want; there is nothing that prevents you from just turning off the trust mode outright and running it like a normal PC.

        It is a misconception that the trusted PC hardware will be unable to run your custom code. I have been guilty of propagating this misconception as well.

        More significantly, the trusted PC now enables a set of security primitives with implicit security policies that you can

        • The trouble is... (Score:4, Insightful)

          by tkrotchko ( 124118 ) * on Thursday October 02, 2003 @06:55PM (#7119376) Homepage
          If this is unopposed, it will not be long until everything useful requires "trust". And so my PC, the one I paid money for, will not work the way I want anymore. Oh, theoretically it will, but in a practical sense it won't.

          If a content provider wants to "trust" a device, then they should buy it for me.

          My cell phone providers wants a trusted device. Great. They give me a phone, and I pay to use it.

          Ask yourself this... is watching an HDTV version of Star Wars so compelling that you're willing to compromise your ability to control your PC? If you answered "yes", then you and I simply have a completely different viewpoint on the subject, one I suspect we'll never agree on.
    • by Highrollr ( 625006 ) on Thursday October 02, 2003 @06:28PM (#7119179)
      Having my computer do what I want it to doesn't seem particularly outrageous to me.
    • by prichardson ( 603676 ) on Thursday October 02, 2003 @09:18PM (#7120406) Journal
      How about this: since I can't control my computer, why should I have to pay for it? I would be much less opposed to not controlling it if I didn't own the hardware. Perhaps Microsoft will start licensing computers as well.
    • The ONLY LEGITIMATE use of trusted computing is for network administrators to control what applications their users can install and run.

      The other uses, like DRM, are not legitimate, and you're right that they'd be more or less useless with the EFF's changes. So what's your point?
  • They will search your hard drive for any copyrighted material.

    Next: TVs, stereos
  • Doesn't that... (Score:3, Insightful)

    by chill ( 34294 ) on Thursday October 02, 2003 @06:27PM (#7119177) Journal
    ...defeat the purpose? I mean, everyone knows that end users can't be trusted. Given the chance, they'll do nothing but pirate movies, music, television and software, etc.

    *** END SARCASM ***

    I think DRM is a *good* thing. Once people have to pay for music, movies, etc., the industry will realize exactly what they were losing to piracy -- almost nothing. If someone could wave a magic wand and people had to abide 100% by the ridiculous license agreements, you'd find that instead of buying what they were sharing, they would go without.

    Or does Microsoft, the BSA, MPAA and RIAA really think all those people in Asia are going to pay a few months worth of wages for software or entertainment?
  • We find that the technology could benefit computer security, but must be fixed to ensure that the computer owner is always in control.

    That's simple enough to solve. The computer will just be both owned and "0wnz0red" by someone else, most likely by the entity that licensed the operating system to the user, and the hardware imprinted for that specific operating system and all others irrevocably locked out.

    And it will all be done with the click on a seemingly innocuous little virtual button that reads sim
  • Trust. (Score:3, Insightful)

    by Simple-Simmian ( 710342 ) on Thursday October 02, 2003 @06:32PM (#7119214) Journal
    The EFF is correct as usual. Trusted computing = Me knowing what the hell is running on my computer and having control over it. Anything else is untrustworthy computing. Anyone that wants to control what I can do with my own property (computer) can stuff it where the sun don't shine.
    • They may be correct, but they need to work on their presentation a little. I'm afraid the general population (the non-slashdot crowd) is going to read their position and disagree.

      Companies (through this "Remote attestation" feature) checking to see if their software has been modified? What's wrong with that, my mom will ask? They own it. The subtleties of how this can run counter to users' interests will escape many.

      And what's the EFF's solution? To LIE to these companies. That sounds wrong, doesn't
      • Yeah, lying about remote attestation seems like a convoluted solution. How about just not having it?

        Companies can still lock down their computers by disabling boot from removable media and setting a BIOS password.
  • by Anonymous Coward
    Personally, I still prefer "Trustworthy Computing [trustworthycomputing.com]" over "Trusted Computing."
  • I used to intern for a company that works with a product called "WIBU KEY." Now, despite the fact that there is a single note of one group "getting around" WIBU KEY, done properly WIBU should not be feasible to break.

    Now WIBU is making something called "Codemeter" in which a user will be able to have licence information for hundreds of different software packages, that means if someone has MS word on their computer, and knows someone else that has it, they can use their licence on the Codemeter stick on their fri
    • Very fancy dongles to be sure, but here they are again. Will users reject them for the same reason?
      • They are very easy to install; at least CodeMeter is. It "looks" like a USB hard drive, but the software can make calls to it through a COM interface.

        The dongle "SCREAMS" ease of use: you merely stick it into your computer's easiest USB port and BAM, you have a working dongle. USB is also pretty robust, and the place I interned at only had a few WIBU USB dongles returned, and that was mostly due to children stepping on them, or in one case a child breaking it in an attempt to get back at his/her par
        • What usually happens is that a cracked version not requiring the key/dongle is released, so people who want to copy the program still do so. And then you, a real user, are trying to use the program at 2am when the internet goes down or the device fails, and you are stuck with an app that will not run.

          That's why if I buy a program that requires a CD key or something along those lines, I almost always download the cracked version or updater and use that instead.
    • You're talking about dongles, not trusted computing. All dongles are crackable by simply modifying the application not to check the dongle at all.

      Dongles don't provide sealed storage (which is pretty much the only useful feature of trusted computing), so they are not an alternative to trusted computing.
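        Sealed storage, the feature mentioned above, can be illustrated with a toy sketch (made-up names, XOR standing in for real encryption): the sealing key is derived from a measurement of the running software plus a secret that never leaves the chip, so data sealed under one software state is unreadable under any other.

        ```python
        import hashlib

        def derive_key(platform_measurement: bytes) -> bytes:
            # In real hardware the chip mixes the measurement with a secret
            # it never reveals; a fixed byte string stands in for it here.
            CHIP_SECRET = b"never-leaves-the-chip"
            return hashlib.sha256(CHIP_SECRET + platform_measurement).digest()

        def seal(data: bytes, measurement: bytes) -> bytes:
            # Toy XOR "cipher" just to show the binding -- not real crypto.
            key = derive_key(measurement)
            return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

        unseal = seal  # XOR is its own inverse

        good = hashlib.sha256(b"trusted OS image").digest()
        bad = hashlib.sha256(b"patched OS image").digest()

        blob = seal(b"secret", good)
        print(unseal(blob, good))                 # b'secret': same software state
        print(unseal(blob, bad) == b"secret")     # False: changed state, garbage out
        ```

        A dongle can't offer this because it has no view of what software is running on the host; the key it holds unlocks the data for any program that asks.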
      • In concurrence with an operating system they can be used for sealed storage and key decryption. Right now, Griffin Technologies has a working (though not released) program that can encrypt/decrypt data onto the hard drive based on secure keys stored in a dongle. Without the dongle you don't get access to the data. Even if you try to make whatever program is accessing the data "bypass" the dongle, it will only read it in "raw" encrypted form, which doesn't help
      • Oh, and by the way, these are the "future" of dongle-based computing. They have their own memory, their own processors, and their own trusted memory which can't be read, and only written to by a vendor. Yes, you can write a program to "re-write" the trusted sectors, but you can't read them.

        This is Trusted Computing; the only difference is the dongle is a small computer hooked to yours that you can carry around. Instead of being forced to lug your machine around to listen to that copy of Britney Spears, you
  • by kfg ( 145172 ) on Thursday October 02, 2003 @06:37PM (#7119256)
    Not a "trusted" one.

    Just as I wish with my house. I want my house to protect me, my papers, possessions and privacy. I want it to be nobody's business what my house contains, even to the point of being able to protect myself against legitimate legal prosecution.

    Oddly enough, that's what the Constitution was written to provide my house with.

    It is up to me to secure my house with whatever technological measures are available to provide that security and understand how to use that technology. I'm perfectly willing to take the same responsibility for the security of my computer. Just provide me with the tools. Then go the hell away and leave me alone.

    The second my house starts deciding for me what I may or may not keep in it or do inside it I get a new house.

    The day my computer decides it doesn't "trust" me with what I'm storing in it or doing with it I pull the plug.

    Fortunately for me, there are already hundreds of millions of "untrusted" computers out there in the wild that do everything I might require my computer to do.

    KFG
    • Just as I wish with my house. I want my house to protect me, my papers, possessions and privacy. I want it to be nobody's business what my house contains, even to the point of being able to protect myself against legitimate legal prosecution.

      I don't think you understand Trusted Computing. I suggest that you read the linked article.

      TC will not allow anyone else to look into your computer and see what software you are running, without your permission. What it does is to allow you to SHOW other people wh
      • You can't convincingly show me that you're running Mozilla or Internet Explorer or Opera. . .

        Thank God. I'll do everything I can to keep it that way.

        And if you feel the need to be "convinced" of what's in my house, get a warrant. I have no such need of demonstration.

        My house has doors and windows. My computer has ports and file ownership. I can open them. I can close them. I can let people in. I can throw people out.

        I don't have to "prove" a bloody thing. To anybody. Even if you have a warrant.

        KFG
        • You can't convincingly show me that you're running Mozilla or Internet Explorer or Opera. . .

          Thank God. I'll do everything I can to keep it that way.
          And if you feel the need to be "convinced" of what's in my house, get a warrant. I have no such need of demonstration.
          My house has doors and windows. My computer has ports and file ownership. I can open them. I can close them. I can let people in. I can throw people out.
          I don't have to "prove" a bloody thing. To anybody. Even if you have a warrant.

          The point i

          • What I have no capability of doing I cannot be forced to do either. Even with a warrant.

            It has everything to do with warrants.

            I can let you look in my house window, but that doesn't mean you can see in my file cabinet or dresser drawers. You cannot be sure of what is in those dresser drawers without coming into my house and looking into them. Even then you cannot be sure I didn't remove what you were looking for before you came in or falsely placed something there which is not mine.

            If I wish to prove to
          • At most you may be requested to prove something in return for being offered something that you value.

            And when that "something that you value" expands to include virtually every new piece of software and even basic access to the internet? Yeah, yeah, I'm skipping to the end. I've discussed this exact expansion process at length elsewhere.

            You say it's like giving people the ability to put a window on their house. But it's more like imprisoning them in their house unless they "voluntarily" live in a glass ho
      • If you wish to use a house analogy, it's like having a house whose walls are completely transparent. You've got a tarp over it most of the time, but if someone asks you a question about your house, even if you don't wish them to know, you are unable to lie because they demand to look for themselves. Further, because it's one of these transparent houses (a computer - not granted the same constitutional protection as a primary residence) the police can lift the tarp and peer in. Even worse, it's impossible fo
  • "We also propose a specific way of fixing it"

    Namely, removing it.
  • This article over at cnet [com.com] suggests that Microsoft may actually be listening to the critics of trusted computing and is instead moving towards what it calls Shield Technology - basically incorporating better firewall technology into the operating system. I for one would welcome this over trusted computing.
    • The question is whether they are listening or just using the thin end of the wedge. Shield Technology looks to be a parallel path from a different bunch of researchers, because the security problem is bad and they may not have enough time to stall user uproar before they can get everybody on the trusted computing bandwagon.

      Although, to be fair, the more they attack some of the applications they claim are problems that TC is supposed to solve, the fewer remaining reasons other than DRM can be given.

      Trusted
    • They look like separate issues to me. Trusted computing provides lock-in, DRM, secure data, etc., but it doesn't protect you from viruses. "Shield technology" may help protect against that stuff. I'm sure MS is not dropping trusted computing.
  • Whatever changes are made to DRM, just remember what the consumers' position will be when DRM is commonplace:

    Bent over, taking it in the poop chute.
  • The (Score:3, Interesting)

    by erroneus ( 253617 ) on Thursday October 02, 2003 @06:59PM (#7119403) Homepage
    That's a CRAZY idea. As usual, let's compare computing on the information super highway with driving on our own freeways.

    What would happen if we let people drive their own cars? They would repair their own cars, "upgrade" them too! But if they are in control, they may not make repairs as needed and then their cars would fall apart on a public super highway and cause other people to die and stuff.

    Oh wait... we have a "license" to help ensure that the public has a bare minimum amount of knowledge and skill to operate a vehicle safely on public roads.

    Now let's return to cyber-reality again. Instead of "trusted computing" how about "trusted users."?

    Let's say that the price of admission to the information super highway should be controlled in the same or similar way to the way we control access to the roads. What a fabulous world we'd live in! "License to SPAM" wouldn't exist. Maybe there are a lot of bad things I haven't considered, but is it much worse than requiring a driver's license to write a check?

    Wow... imagine getting a ticket and your license revoked for SPAMming... or for operating a computer with a virus...

    "The Responsible Computing Initiative" is born!
  • Trusted computing is being promoted as a computer platform that users can trust, but it's really more about creating a platform where third parties can determine whether the users themselves can be trusted.

    Trusted computing benefits content producers and service providers more than it benefits users. The reason is that producers and providers are usually the ones whose systems are being accessed, while the users are the ones accessing these well-known systems. It is the nature of the transaction that truste
    • The "trusted party" in trusted computing is the software. TC lets you trust software to behave in a certain way: software on your own computer, or software on remote computers, with the owner's permission.
      • The "trusted party" is indeed the software, but whose software, exactly?

        The point I'm making is that it's usually the user/client who must show that the software he is running on his computer may be trusted by third parties. I'm suggesting that establishing such trust relationships is the primary purpose of the trusted computing initiative, more so than users establishing trust relationships with their own software (it's easier to fool the user than it is to fool a third party's computer).
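        The asymmetry described above can be sketched with a toy model of remote attestation (my own simplification for illustration, not the actual TCG protocol): the chip in the client's machine signs a measurement of the software it is running, and the remote third party decides whether that software is on its approved list.

        ```python
        import hashlib
        import hmac

        # CHIP_KEY stands in for the TPM's attestation key; a real chip would use
        # asymmetric signatures plus a certificate chain, not a shared HMAC secret.
        CHIP_KEY = b"secret-burned-into-chip"

        def measure(software: bytes) -> str:
            """Hash of the software 'running' on the platform (roughly, a PCR value)."""
            return hashlib.sha256(software).hexdigest()

        def attest(software: bytes, nonce: bytes) -> tuple[str, str]:
            """The chip signs (measurement, nonce) so the quote can't be replayed."""
            m = measure(software)
            sig = hmac.new(CHIP_KEY, m.encode() + nonce, hashlib.sha256).hexdigest()
            return m, sig

        def verify(measurement: str, sig: str, nonce: bytes, allowed: set) -> bool:
            """The remote service checks the signature AND that the software is on
            its approved list -- this second check is the part the user can't override."""
            expected = hmac.new(CHIP_KEY, measurement.encode() + nonce,
                                hashlib.sha256).hexdigest()
            return hmac.compare_digest(expected, sig) and measurement in allowed

        nonce = b"fresh-challenge"
        approved = {measure(b"BlessedBrowser v1.0")}

        m, sig = attest(b"BlessedBrowser v1.0", nonce)
        print(verify(m, sig, nonce, approved))   # True: client runs approved software

        m, sig = attest(b"MyOwnBrowser v0.1", nonce)
        print(verify(m, sig, nonce, approved))   # False: quote is valid, but the
                                                 # software isn't on the provider's list
        ```

        Note that the user's own machine produces an honest quote in both cases; the trust decision is made entirely on the third party's side, which is the point being argued above.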
  • when the hardware gets hacked and cracked, then you will have to upgrade....

    What's the difference between hardware and software when it comes to bit flipping?

    Putting something in hardware gains the advantage of speed but brings the inflexibility of change.
  • Related News (Score:3, Informative)

    by superyooser ( 100462 ) on Thursday October 02, 2003 @07:41PM (#7119746) Homepage Journal
    Microsoft Preps Major Security Strategy Shift [wininformant.com]

    Microsoft's chief security strategist made the surprising statement that the company is about one-third of the way to its goals for Trustworthy Computing. I guess there's a lot more going on internally than we're aware of.

    The article also says, "Microsoft's short-term strategy will shift from patch management to what the company calls 'securing the perimeter.'" What this means is that they're working more closely with firewall companies.

  • Attestation (Score:2, Insightful)


    With Microsoft, IBM, and other major players involved in this process, the EFF doesn't have much of a choice but to work with what they've got. I don't think that the EFF agrees with the Trusted Computing initiative; as they say in the article, most of the changes described by the initiative can be implemented at the software level. I agree that that is where the changes should take place.

    I agree with some of the other posters here and I don't really see anything useful about the attestation process (see
  • by SiliconEntity ( 448450 ) * on Thursday October 02, 2003 @09:31PM (#7120476)
    There are some other problems with Trusted Computing that the EFF article fails to address.

    One is the difficulty of dealing with upgrades, failures and replacement of computers, if your data is locked to the old machine. TCPA had a hugely complicated process you would have to go through to migrate any of your "secure" data to the new machine. It involved going back to the manufacturer, getting a special transfer key, moving the data over and having it get re-encrypted. Microsoft hasn't said what they're going to do, but it's an extremely difficult technical problem to solve while retaining the security.

    Another problem is the PKI (public key infrastructure) issue. For remote attestation to work, it's necessary that the TC chips have some kind of crypto certificate that says that they are legitimate. Microsoft has said nothing about who will issue these certificates and who will revoke them if a machine gets broken into. Setting up a successful, global PKI is a prerequisite for DRM type applications and will be an enormous job.

    The article also overlooks that the sealed storage feature, which the EFF mostly views favorably, can also be used to achieve lock-in and secure closed formats. Microsoft Word could store data encrypted using the TC hardware, such that only Microsoft-signed applications can access the data. This kind of lock-in does not depend on the remote attestation features that the EFF is so concerned about, and would not be addressed by their Owner Overrides.
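    A rough sketch of why sealed storage yields both the migration problem and the format lock-in described above (this is a deliberate simplification, not the real TCPA key hierarchy): the sealing key is derived from a secret unique to one physical chip and from the identity of the accessing application, so changing either one makes the sealed data unreadable.

    ```python
    import hashlib

    def derive_seal_key(chip_secret: bytes, app_measurement: bytes) -> bytes:
        """Sealing key depends on BOTH the physical chip and the application hash."""
        return hashlib.sha256(chip_secret + app_measurement).digest()

    def xor_crypt(key: bytes, data: bytes) -> bytes:
        """Toy cipher (XOR keystream) standing in for real symmetric encryption."""
        stream = (key * (len(data) // len(key) + 1))[:len(data)]
        return bytes(a ^ b for a, b in zip(data, stream))

    old_chip = b"chip-A-unique-secret"
    word_hash = hashlib.sha256(b"MS Word, vendor-signed").digest()

    sealed = xor_crypt(derive_seal_key(old_chip, word_hash), b"my document")

    # Same chip, same application: unseals fine.
    print(xor_crypt(derive_seal_key(old_chip, word_hash), sealed))

    # New machine (the migration problem) -- different chip secret, garbage out:
    new_chip = b"chip-B-unique-secret"
    print(xor_crypt(derive_seal_key(new_chip, word_hash), sealed) == b"my document")

    # Different application (the lock-in problem) -- also garbage:
    other_hash = hashlib.sha256(b"competing word processor").digest()
    print(xor_crypt(derive_seal_key(old_chip, other_hash), sealed) == b"my document")
    ```

    In this toy model the only way to move the document to a new machine is to decrypt it on the old one first, which is exactly the step the complicated TCPA migration protocol exists to control.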
    • Another problem is the PKI (public key infrastructure) issue. For remote attestation to work, it's necessary that the TC chips have some kind of crypto certificate that says that they are legitimate. Microsoft has said nothing about who will issue these certificates and who will revoke them if a machine gets broken into. Setting up a successful, global PKI is a prerequisite for DRM type applications and will be an enormous job.

      More importantly, the embedded certificates used for the remote attestation (T
  • I know no one wants to hear this, but the dark picture painted by the EFF of the ills due to Trusted Computing is not likely to come to pass.

    The main point that the EFF analysis overlooks is the role of competition in the marketplace. Yes, TC could allow web sites to require you to run particular software; yes, TC could allow vendors to encrypt their data formats making it impossible for you to switch to a new software package; yes, TC could be the foundation for DRM and restrictive licensing.

    But the poi
  • Well, I thank the EFF for this analysis, but I think they've missed an important tactic. Let Microsoft and Co. lock out non-MS software all they want. They're at a fundamental disadvantage. If they wish to exacerbate their tenuous position vis-a-vis monopoly, fine. If they want to gamble shareholder confidence on a risky offensive against the general good will of the net public, we should help them.

    The EFF warns that Microsoft's IIS web-server could block web-browsers other than Microsoft's IE. Well,
  • Where can I buy a box with "owner override"?

    It will only take one vendor doing it, and I'd pay a few extra bucks for my Linux owner override feature to work.

    Ice, ice baby.
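    For reference, the Owner Override the poster wants is roughly this: the machine's owner, and only the owner, can tell the chip to attest to a measurement of the owner's choosing, so remote attestation can never be turned against the person who owns the hardware. A toy version (my own simplification of the EFF proposal, not a real TPM interface):

    ```python
    import hashlib

    class TPMWithOwnerOverride:
        """Toy chip: reports the true measurement unless the owner,
        authenticated by a secret only they hold, substitutes another."""

        def __init__(self, owner_secret: bytes):
            self._owner = hashlib.sha256(owner_secret).digest()
            self._override = None

        def set_override(self, owner_secret: bytes, fake_measurement: str) -> None:
            # Physical-owner authorization: wrong secret, no override.
            if hashlib.sha256(owner_secret).digest() != self._owner:
                raise PermissionError("only the machine's owner may override")
            self._override = fake_measurement

        def quote(self, true_measurement: str) -> str:
            # Remote parties receive this value and cannot tell whether an
            # override is in effect -- that is the whole point of the proposal.
            if self._override is not None:
                return self._override
            return true_measurement

    tpm = TPMWithOwnerOverride(owner_secret=b"hunter2")
    print(tpm.quote("hash-of-linux-plus-wine"))    # truthful report
    tpm.set_override(b"hunter2", "hash-of-approved-browser")
    print(tpm.quote("hash-of-linux-plus-wine"))    # owner-chosen report
    ```

    Security against everyone except the owner is preserved, which is why a single vendor shipping such a box would satisfy this poster without breaking the legitimate uses of attestation.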
  • Trusted computing is about creating a platform the content providers can trust. Not the computer owners.

    Whether the content provider is a network admin rightfully protecting the company owned computers on his network. Or microsoft/riaa wrongfully protecting the computer YOU own from copy infringing materials and from things they just don't like even though you have every legal right to do them.
  • All the "trusted" platform does is prevent things that were developed over decades of technology development, and replace them with things that no one thought about because they are useless.

    1. Make sure that the user is typing at the keyboard, and not another program doing it. That means no remote access, no automation, pretty much no feature that allows users to use interactive software unless it's sitting on the box behind their desk. Yes, one can try to send locally encrypted data blindly ove
