YouTube Video-Fingerprinting Due in September

Tech.Luver writes "The Register is reporting on Google's statement to a presiding judge that video-fingerprinting of YouTube material will be ready in September. The development is required to head off a three-headed suit against the company, currently being debated in a New York City courthouse. The system will, according to Google, 'be as sophisticated as fingerprinting technology used by the Federal Bureau of Investigation.' From the article: 'As Google told El Reg in an earlier conversation, the company already has two systems in place for policing infringing content - but neither are ideal. One system allows copyright holders to notify Google when they spot their videos on the company's sites. When notified, the company removes the offending videos, in compliance with the American Digital Millennium Copyright Act. A second system uses "hash" technology to automatically block repeated uploads of infringing material.'"
  • Hard AI ftw (Score:5, Interesting)

    by UbuntuDupe ( 970646 ) * on Tuesday July 31, 2007 @10:14AM (#20057035) Journal
    Earlier I had joked [slashdot.org] about Google's claim to be nearing a system that lets them check for copyrighted works. I said that they're basically claiming to have solved a hard AI problem.

    Others pointed out that, no, it's not a hard AI problem to just compare some kind of checksum of the video against a set of banned checksums. That's true. But what about once people know they're using this system? They can just trivially re-encode. Perhaps add a scene break here or there, and totally mess up the fingerprint. To prevent that, it seems, you would need to solve a hard-AI problem: that is, be able to determine if an arbitrarily-encoded video appears to a human to match some copyrighted work. It would have to be robust against minor scene shortenings and lengthenings, scene breakups, color gradients laid over the video, etc.

    Anyone know how difficult this program is to circumvent? (Just hypothetically -- not advocating criminal activity here.)
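Hypothetically, the gap between a plain checksum and something that survives a re-encode can be sketched in a few lines of Python. This toy compares a cryptographic digest with a simple perceptual "average hash" of an 8x8 grayscale frame; the random jitter stands in for re-encoding noise, and every detail (frame size, noise level) is invented for illustration, not anything Google has announced.

```python
import hashlib
import random

def average_hash(frame):
    """64-bit perceptual hash: one bit per pixel, set when the pixel is
    brighter than the frame's mean. Small noise rarely pushes a pixel
    across the mean, so most bits survive a lossy re-encode."""
    flat = [p for row in frame for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def checksum(frame):
    """Cryptographic digest of the raw bytes: any change breaks it."""
    return hashlib.md5(bytes(p for row in frame for p in row)).hexdigest()

random.seed(42)
frame = [[random.randrange(256) for _ in range(8)] for _ in range(8)]
# simulate a lossy re-encode: jitter every pixel by a couple of levels
noisy = [[min(255, max(0, p + random.choice((-2, -1, 1, 2)))) for p in row]
         for row in frame]

changed_bits = bin(average_hash(frame) ^ average_hash(noisy)).count("1")
# the checksum breaks while the perceptual hash barely moves
print(checksum(frame) == checksum(noisy), changed_bits)
```

Real fingerprinting systems are far more elaborate, but the asymmetry is the point: one flipped bit kills an MD5 match, while the 64-bit perceptual hash changes in only a handful of positions.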
    • by InvisblePinkUnicorn ( 1126837 ) on Tuesday July 31, 2007 @10:19AM (#20057097)
      That would be the ultimate mechanical turk - sitting around watching youtube videos all day and getting paid... in addition to what you are already being paid as you put off work to watch youtube videos all day.
    • Re: (Score:2, Interesting)

      by Udderdude ( 257795 )
      They could potentially scan the entire movie for a few keyframes that they know will be in there regardless of silly scene breaks, etc.

      Nobody would know what the keyframes are, so it would be hard/impossible to black out that specific frame.
      • Re:Hard AI ftw (Score:5, Interesting)

        by sholden ( 12227 ) on Tuesday July 31, 2007 @10:54AM (#20057537) Homepage
        So I guess it doesn't matter if it then blocks a video which has taken a couple of seconds of video from a TV show in a "Review of Episode X" video post, and just happens to grab one of those keyframes?

        Of course, with a little bit of coding you have a program that takes that 10-minute video, splits it into ten 1-minute videos and uploads them. The ones that got rejected it splits into ten 6-second videos and uploads those. Rinse and repeat until you have however small a set of rejections you asked for. Then it cuts out just the necessary fragments of video (replacing them with the last good frame or something?).

        Of course that can be countered at Google's end by adding a delay to the rejection report, and by banning those who rack up lots of rejections.
        • The obvious way to protect fair use would be to have keyframes every X seconds, and only block when you get more than 5 or so matches. (If it's only one or two keyframes, it's a short clip.)
          That'll also help protect against mismatches. If my yet-another-guy-getting-hurt home vid matches The Matrix in one frame (according to whatever algorithm they use), I'm fine.
          There's no way it'll match a dozen frames.
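The match-count rule described above takes only a few lines; the keyframe IDs and the cutoff of 5 matches here are made up for illustration.

```python
def blocked(upload_prints, reference_prints, min_matches=5):
    """Flag an upload only when at least min_matches reference keyframe
    fingerprints appear in it; a one- or two-keyframe overlap (a short
    excerpt, or a single fluke match) stays under the bar."""
    return len(set(upload_prints) & set(reference_prints)) >= min_matches

reference = {"k01", "k02", "k03", "k04", "k05", "k06", "k07"}
short_clip = {"k02", "k03"}                                # fair-use-sized
full_rip = {"k01", "k02", "k03", "k04", "k05", "k06"}      # most of the show

print(blocked(short_clip, reference))   # False
print(blocked(full_rip, reference))     # True
```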
      • Re:Hard AI ftw (Score:5, Interesting)

        by DDLKermit007 ( 911046 ) on Tuesday July 31, 2007 @10:59AM (#20057595)
        Sorry, but fingerprinting the video is damn near impossible. The closest any image-recognition app can get right now is "similar scene" matching. Some can do logos, but we're talking low-resolution videos here. Most likely they'll do a file checksum, and that'll kill most re-uploads. For the determined, all that'll be needed is a simple re-encode, which will completely change any checksums involved. They won't seriously spend that much machine time on checking whether they are allowed to make money off a video. They'll do just enough to say they did something, but not so much that you can't find a way around it.
        • by babyrat ( 314371 )
          reminds me of the old saying:

          "those that claim it is impossible should stay out of the way of those that are doing it"
          • Never said it was impossible, but we're nowhere near that level yet. Facial recognition is here, but in very primitive forms. Not to mention the processor time involved: checking a video against a massive catalog of banned videos is just not practical at this time, and won't be for some time. The processing time is really what kills this. We have the tools, but while still very primitive they require a lot of processing power, and that power requirement really won't go down much.
        • Pretty much they'll do a file checksum, and that'll kill most reuploads.

          Did you even read the summary? That's mentioned as one of the two already in place. YouTube is adding a different system.

        • Philips has a video fingerprinting system [philips.com]. From the site:

          The system is robust against severe degradations like low bit rate video compression, scaling, rotation, cropping, noise addition, median filter and noise removal. [...]
          A 5 second video fingerprint on any segment of video content is sufficient to uniquely identify that segment.

          You obviously need more than a simple re-encode to get around that, and I'm sure Google's system won't be fooled by simple tricks either.
          • Where this falls apart is parody, mashups, and other such legal uses. What if the clip I use is part of their 5-second fingerprint? You're kinda arsed. I'd love to see their tech in action, though. I've played with enough of these, and they've always fallen short of useful.
        • If they use a simple file hash, or checksum, changing one bit (or a few) anywhere in the file will result in a different hash. A hex editor will do this in half a second. I agree it'll still kill the majority of copy-cat 're-uploads'.

          The other way, I guess, would be to break files up into a series of chunks and hash those, then scan every new upload and look for some percentage of identical values and allow or deny based on that. Still trivially simple to bypass, though.
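A minimal sketch of that chunk-and-hash idea; the chunk size and the overlap measure are arbitrary choices, not any real service's parameters.

```python
import hashlib

def chunk_hashes(data, size=4096):
    """SHA-1 digest of each fixed-size chunk of the file."""
    return {hashlib.sha1(data[i:i + size]).hexdigest()
            for i in range(0, len(data), size)}

def overlap(a, b):
    """Fraction of the original's chunk hashes found in the upload."""
    return len(a & b) / max(len(a), 1)

original = bytes((i * 7 + i // 256) % 256 for i in range(16384))  # dummy file
tampered = bytearray(original)
tampered[10] ^= 0xFF          # flip one byte: only the first chunk changes
frac = overlap(chunk_hashes(original), chunk_hashes(bytes(tampered)))
print(frac)   # 0.75: three of the four chunks still match
```

As the comment says, this is trivially bypassed by a re-encode, which rewrites every chunk, so it only catches byte-identical or lightly edited copies.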
        • I don't think Google will try to claim their system is foolproof; I think they will claim it's as good as what the cops do with real fingerprints. I think they will claim it enforces "fair use", and if the complainant is not happy with the automated "fair use" they still have the right to serve a "takedown notice". I'm sure they will also point out that real fingerprints are auto-matched on partial information, and will claim this can be done by Google taking a few random hash strings and using it as the p
        • You are focused on fingerprinting a single frame or still picture. That is hard, I agree with that.

          Fingerprinting a movie is much simpler. For example, to create a useful hash for a series of frames, just throw away the color information and work on the grayscale channel only. Then reduce the size to something like 8x8 pixels, and compare these to the previous frame. Encode the difference (e.g., set a bit to one if the level goes up, and to 0 if it goes down; this gives a 64-bit number for those two frame
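That per-frame difference scheme can be written down almost verbatim. This toy uses an 8x8 grid with one bit per cell, i.e. 64 bits per frame pair (a larger word would need a finer grid or more levels per cell); the frames are synthetic.

```python
def frame_delta_bits(prev, cur):
    """One bit per cell of an 8x8 grayscale grid: 1 if the level rose
    since the previous frame, 0 if it fell or stayed put."""
    bits = 0
    for r in range(8):
        for c in range(8):
            if cur[r][c] > prev[r][c]:
                bits |= 1 << (r * 8 + c)
    return bits

def video_fingerprint(gray_frames):
    """One delta word per consecutive frame pair."""
    return [frame_delta_bits(a, b) for a, b in zip(gray_frames, gray_frames[1:])]

# three toy frames: a gradient that brightens, then darkens
f0 = [[r + c for c in range(8)] for r in range(8)]
f1 = [[v + 5 for v in row] for row in f0]   # everything brighter
f2 = [[v - 3 for v in row] for row in f1]   # everything darker
fp = video_fingerprint([f0, f1, f2])
print(fp[0] == (1 << 64) - 1)   # True: every cell rose
print(fp[1] == 0)               # True: no cell rose
```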
      • Could digital watermarking actually be used against the copyright holders in this case?

        Say an encoder inserts a unique watermark that can't be seen by eye but is part of the data stream. Google isn't looking for it and doesn't recognize it when the video's uploaded, so it allows the video. Somebody would then have to complain, and Google would take down the video and add it to their "banned" database. The problem is, they would then have added basically a garbage entry into their database because it only
      • That could kill fair use. Any keyframes that would be in every reasonable edit of a film or TV show would be critical to many short non-infringing excerpts from that film or TV show.
        Pity the MPAA doesn't believe in fair use....
    • Re:Hard AI ftw (Score:5, Insightful)

      by MyLongNickName ( 822545 ) on Tuesday July 31, 2007 @10:21AM (#20057135) Journal
      I think you miss the point.
      1) Is there a way around the system? Yes.
      2) Does that matter? No.
      3) Why is that? This solution shows that Google is making reasonable efforts to comply with the legal issues.

      The majority of folks aren't going to take the effort to circumvent these controls. Rates will drop significantly. Google can honestly say they are making every effort to comply with copyright protection. Lawsuits will go away.
      • Will the lawsuits go away? Or will this prove that the only way to fight these copyright claims is to shut down the YouTube service? I think it is almost impossible to fingerprint a video across all the changes that can be introduced by re-encoding it.
        • Re:Hard AI ftw (Score:5, Insightful)

          by utopianfiat ( 774016 ) on Tuesday July 31, 2007 @11:04AM (#20057681) Journal
          It's not going to get shut down because of copyright infringement. :/ That's like saying we should bomb Hong Kong because they sell copyrighted works there; just because something has an illegitimate use doesn't make it illegitimate on its face, ffs.
          Note, this can also be applied to "kitchen knives can kill, so we should ban kitchen knives" and "people can die in cars, so we should ban motor vehicles"
          and uh... "People who have killed a lot of people have played video games, so we should ban video games." The States needs to get over the damn prohibitionist culture that's removing any sense of personal responsibility from our great nation.
          • Of course it won't shut down... I just said that the only way to get rid of all the copyrighted material is to shut down the entire site. Or to kill anyone who tries to upload a copyrighted video, even with a frame changed.
          • That's like saying we should bomb Hong Kong because they sell copyrighted works there-
            Bad analogy. Hong Kong is a city; YouTube is a video site. If we bomb Hong Kong, people will die; if we shut down YouTube, people will lose their jobs.

            Not that I wish to imply that corporations estimate that human life is worth more than money :-P
      • by jZnat ( 793348 ) *
        Legally, all Google has to do is remove videos identified by DMCA take-down requests. The fact that they're going above and beyond that could end up biting them in the ass when it fails in any given situation. Any IP lawyers or law students here to clarify the matter?
        • Legally, all Google have to do is remove videos identified by DMCA take-down requests. The fact that they're going above and beyond that could end up biting them in the ass when it fails in any given situation.

          IANAL or law student, but I'd think the "above and beyond" would work in Google's favor in a court case. They can tell a copyright holder that is suing them: 1) You never bothered to use the existing laws and just ask us to take the offending material down and 2) We're making every practical effor

        • It WILL bite them in the ass for the very reason laid out in the DMCA itself:

          512(g)(1) No liability for taking down generally.-- Subject to paragraph (2), a service provider shall not be liable to any person for any claim based on the service provider's good faith disabling of access to, or removal of, material or activity claimed to be infringing or based on facts or circumstances from which infringing activity is apparent, regardless of whether the material or activity is ultimately determined to be i

      • As will its userbase.
    • Re:Hard AI ftw (Score:4, Insightful)

      by Nasarius ( 593729 ) on Tuesday July 31, 2007 @10:25AM (#20057177)
      There are already several decent systems [wikipedia.org] for fingerprinting audio; it's not particularly surprising that Google researchers would be able to do something similar for video.
    • Depending on what they release. If it's something akin to a facial-recognition system, I would think that filtering every few frames, unseen to the naked eye, or perhaps adding slight snow could throw off any facial-recognition-like software Google could roll out. Another method, if this is the case, would be to throw a lens filter over the video. E.g., assume Google's software is set to compare a current video with known content. How would it achieve this? Perhaps light-based spectrums, image positioning... S
      • Well, you could always start with the sound alone, which has already been done.

        As for image recognition, usually this is done using Fourier transforms as an edge finding algorithm. Basically, you can use a computer filter to throw out the bulk of the information, and keep only the most visually identifying features in the video. Changes in tint or timing won't affect the shapes and movements of these predominant edges.

        Umm. To visualize this, imagine you take a full colour photo, and you trace it as best you
        • The method you describe can still be defeated quite easily just by adding randomly placed contrasting edges throughout the video. These don't even need to be particularly visible, just enough to throw out the transform method used.

          In the RADAR world, electronic warfare weenies spend their lives fingerprinting emitters. Coming up with seemingly different hardware can be as simple as tapping on the klystron with a screwdriver (or a hammer for those fun situations where you can safely say "it just fell apart i
    • Why does it have to be hard AI?
      Wouldn't some sort of soft AI (expert-system, neural-net) do just as well?
      I could be wrong, but doesn't "hard" AI refer to a system that is conscious?
      Why would you have to be conscious to recognize movie clips?
      • Re:Hard AI ftw (Score:4, Insightful)

        by Hatta ( 162192 ) on Tuesday July 31, 2007 @10:41AM (#20057383) Journal
        Mostly it's an assumption: since only humans have so far been able to recognize video, it must take something like a human to do it. Viewing clip A and clip B and abstracting the parts that make them the same is not the kind of problem computers are good at, and if you've done it you've probably solved the really hard problem in AI: how abstraction works. Of course this rests on the assumption that strong AI is fundamentally different from weak AI, and the difference is not just one of degree. Personally I think that's a specious distinction, much like the one between microevolution and macroevolution.
    • or it could check audio for a small segment (let's say a 15 second clip), and have it converted from the sound waves to a check sum and compare that to a database of "blacklisted" checksums. Then if it is within an acceptable margin of error, have a few parts of video which it will look for, also within a margin of error. The margin of error can be quite large actually to account for missing frames, changes in formats, etc. based on more data they decide to use to judge by.
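A rough sketch of that audio idea: per-band average levels instead of a literal checksum, compared with a tolerance so that small encoding changes still match. The 16 bands, the tolerance, and the synthetic "samples" are all invented for illustration.

```python
def audio_print(samples, bands=16):
    """Mean level in each of `bands` equal slices of a clip."""
    n = len(samples) // bands
    return [sum(samples[i * n:(i + 1) * n]) / n for i in range(bands)]

def within_margin(p, q, tol=1.0):
    """Accept a match when every band agrees to within `tol`:
    the 'acceptable margin of error' from the comment above."""
    return len(p) == len(q) and all(abs(a - b) <= tol for a, b in zip(p, q))

clip = [float((i * 13) % 50) for i in range(1600)]
reencoded = [s + 0.4 for s in clip]              # small uniform level shift
different = [float((i * 29) % 90) for i in range(1600)]

print(within_margin(audio_print(clip), audio_print(reencoded)))   # True
print(within_margin(audio_print(clip), audio_print(different)))   # False
```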
    • The article said Google is already doing checksums and they aren't ideal. Creating fingerprints wouldn't be impossible. Very difficult, yes, but not impossible. We should also note that Google employs some of the smartest and most creative people in the world.

      Off the top of my head... I would throw out the whole notion of "checksums". They aren't really applicable because their purpose is to compare exactly. Even if doing key frames, all one would have to do is lighten/darken the video and the whole che
    • Maybe you could see from the viewer pattern if it contains something interesting to a wide audience.

      E.g., if it contains copyrighted material it will get posted to an indexing site. That should bring people from all over the world. Then you tag it and get a human to watch it and check whether it is copyrighted. Original material, by contrast, is probably viewed by a small circle of friends.

      Just dumbly checking the popular stuff would help a bit, but I think you need to look at referrer information, or the location of
    • by Kjella ( 173770 )
      Google doesn't need to solve the hard problem. They can spend several decades "improving" the filter as long as they can nab some new trivial changes, then go tell the content producers "hey, we're getting smarter but the pirates are too". Besides, if they can make videos look at least somewhat crappy that's already a win for the content producers... I certainly get annoyed by that quickly.
    • eg. in a gimp plug-in.

      But yes, it will reduce it. By how much?
      I don't know. Maybe a lot.
      • YouTube videos are already relatively low-res. You've a fine balance there--if you circumvent with random noise, you've gotta make sure that the resulting video is still watchable by human beings, or you are circumventing to no purpose.
        • yes, you're right. Also, it's surely compressed, and random noise will increase the compressed size.

          However, I think only a little noise would be necessary. Eg. +1 or -1 brightness, on one pixel per screen.

          Hmmm... or maybe just different compression parameters? This changes the artifacts introduced into the video, and so the uncompressed video would look slightly different.
    • It's not as hard a problem as you think. There are algorithms that could recognize a short segment of video even if re-encoded.. it's like iris recognition. They can recognize your iris even though the new image may be rotated and scaled. It's all about transforming the image so that these kinds of trivial transformations are made to disappear, and then comparing the result to a bank of pre-stored images.
    • by j1mmy ( 43634 )
      The easiest way to circumvent such a system is to upload your video elsewhere.
    • This is trivially easy to work around. Just put a description of the video at the beginning or end, or use freely available video editing software to make a scrolling image across the bottom. Heck, you could just insert a couple black frames randomly across the film. Or scale it down and put it on a background of white noise.

      Checksums are horrible ways of checking non-text data.

    • My AI lecturer keeps saying that AI is a tough area to work in, since you get loaded with every problem that regular software developers can't figure out ("Sorry, you need AI for that") but then after any solution is found you never get any credit ("That's not AI, that's just a database-backed checksum comparison and Fourier waveform analysis system"). The lines about what AI actually is shift depending on whether effort or praise is involved.
      • Heh ... the former is a pet peeve of mine: people thinking that their job is more complicated than it really is. It's not limited to software developers.

        As for the latter...

        "okay, okay, you *technically* passed a Turing Test, but only by having it basically ignore me and ridicule everything I did wrong."

        "Sir, you were talking to your wife the whole time."
    • When you write "checksum", do you think of an MD5 hash?

      Google could just re-encode uploaded videos with a very low resolution (say 8*8 pixels) and use the result as a fingerprint. This is trivial to implement and makes re-encoding useless. I guess that cropping, stretching and many other modifications are detectable as well without tackling any AI problems at all.

      Google is certainly able to make uploading of banned videos at least very inconvenient.
    • The really surprising thing about your post is it sounds like you think Google is stupid. I doubt you really believe that.

      >They can just trivially re-encode.

      No. You're thinking of cryptographic hashes where a one-bit change in the input leads to a totally different signature. This wouldn't be that kind of hash. It would most likely be a collection of a lot of hashes for each video, amalgamated into one or more signatures for each video.

      If I explained the basics of the problem to my eight year old and poi
      • And before you say that a collection of hashes would be vulnerable to a re-encoding attack, no, they wouldn't be, if they were hashes of the right features of the video. You don't do a hash of the bits of the file, you do a hash of the properties of the video. Properties being things like: is this pixel/group of pixels surrounded by darker or lighter pixels/groups of pixels in each compass direction. Similar approach with audio.
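The compass-direction comparison described here might look like the following; the 6x6 image, the four directions, and the uniform tint shift are illustrative only.

```python
def gradient_bits(img):
    """For each interior cell, one bit per compass direction (N, E, S, W):
    is this cell brighter than that neighbour? Relative-brightness
    features like these survive re-encoding and uniform tint shifts."""
    h, w = len(img), len(img[0])
    bits = []
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            for dr, dc in ((-1, 0), (0, 1), (1, 0), (0, -1)):
                bits.append(1 if img[r][c] > img[r + dr][c + dc] else 0)
    return bits

img = [[(r * 3 + c * 5) % 17 for c in range(6)] for r in range(6)]
tinted = [[v + 40 for v in row] for row in img]   # uniform brightness shift
print(gradient_bits(img) == gradient_bits(tinted))   # True: bits unchanged
```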
      • The really surprising thing about your post is it sounds like you think Google is stupid. I doubt you really believe that.

        No, I don't, and I don't think my statement implies that. As a matter of fact, I recall this post [slashdot.org], where I was highly critical of those who assumed Google used a crude quickfix to solve their Googlebomb problem.

        I do, however, think Google's *public statements* *imply* a belief in a higher level of robustness against circumvention than they can provide, even if they are aware of the actu
    • Take 1 frame every 2 minutes from the source material (of a scene you want banned), choosing ones where there is no motion for (say) 3 frames. The two-minute skip means that you get a 3000-to-one compression on the number of frames you're looking for, and the three frames of stillness means

      (1) that the frame will compress well (making it a better match), and

      (2) that you only have to look at 1 out of every 3 frames in uploaded material, and even then you can ignore high-motion scenes entirely.

      Combined the
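The sampling scheme above (one keyframe every couple of minutes, taken where the picture holds still) can be sketched like this; the gap, stillness window, tolerance, and 2x2 "frames" are all placeholder values.

```python
def pick_still_keyframes(frames, gap=3000, still=3, tol=2):
    """Take one keyframe roughly every `gap` frames (about two minutes at
    25 fps), but only at positions where the picture is unchanged, within
    `tol` per pixel, for `still` consecutive frames."""
    def unchanged(a, b):
        return all(abs(x - y) <= tol
                   for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    keys, i = [], 0
    while i + still <= len(frames):
        if all(unchanged(frames[i], frames[i + k]) for k in range(1, still)):
            keys.append(i)
            i += gap
        else:
            i += 1
    return keys

# toy clip: frames 0-4 hold still, 5-9 are in motion, 10-12 hold still again
still_a = [[10, 10], [10, 10]]
moving = [[[i * 9 % 40, i], [i, i]] for i in range(5)]
still_b = [[30, 30], [30, 30]]
clip = [still_a] * 5 + moving + [still_b] * 3
print(pick_still_keyframes(clip, gap=8))   # [0, 10]
```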
    • Dunno, but I think you're a bit mistaken.

      A low-resolution DCT comparison would be able to flag frames of a video with some ease, thus fuzzily* filtering out a large number of negatives. Then the paid turks sort out what's a positive from what remains.

      Mind you, when I say 'DCT', I'm not talking the 8x8 quanta in an MPEG4 stream, I'm talking a single DCT of the whole frame. More computationally intensive, sure, but you get a somewhat scalable data picture that, given a hash that can retain meaningful proxim
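A whole-frame DCT hash along those lines can be done naively in a few lines. Keeping the low-frequency coefficients, dropping the DC term, and thresholding at their median is borrowed from common perceptual-hash ("pHash"-style) schemes rather than from anything the commenter specified; the O(n^4) transform is fine for a toy frame.

```python
import math

def dct2(block):
    """Naive 2-D DCT-II of a square grayscale block."""
    n = len(block)
    return [[sum(block[x][y]
                 * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                 * math.cos((2 * y + 1) * v * math.pi / (2 * n))
                 for x in range(n) for y in range(n))
             for v in range(n)] for u in range(n)]

def dct_hash(frame):
    """Bits from the sign of the 15 lowest-frequency AC coefficients,
    thresholded at their median: a proximity-preserving signature."""
    coeffs = dct2(frame)
    low = [coeffs[u][v] for u in range(4) for v in range(4)][1:]  # drop DC
    med = sorted(low)[len(low) // 2]
    return [1 if c > med else 0 for c in low]

frame = [[(r * 7 + c * 3) % 23 for c in range(8)] for r in range(8)]
brighter = [[v + 10 for v in row] for row in frame]   # uniform tint shift
hamming = sum(x != y for x, y in zip(dct_hash(frame), dct_hash(brighter)))
print(hamming)   # near zero: AC coefficients ignore uniform brightness
```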
    • Once you have the original video, it is a simple case of pattern matching to find how close the two videos are. Google could say that any video that matches a copyrighted video at 80% or more is not allowed on YouTube.
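Given comparable fingerprints, that 80%-or-more rule is one comparison; the bit vectors below are toys.

```python
def similarity(fp_a, fp_b):
    """Fraction of positions where two equal-length fingerprints agree."""
    return sum(a == b for a, b in zip(fp_a, fp_b)) / len(fp_a)

def rejected(upload_fp, reference_fp, threshold=0.8):
    """Disallow the upload when similarity reaches the threshold."""
    return similarity(upload_fp, reference_fp) >= threshold

reference = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
near_copy = [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]   # 9 of 10 positions agree
unrelated = [0, 1, 0, 0, 1, 0, 1, 0, 0, 1]   # nothing agrees

print(rejected(near_copy, reference))   # True
print(rejected(unrelated, reference))   # False
```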

  • by 4solarisinfo ( 941037 ) on Tuesday July 31, 2007 @10:17AM (#20057065)
    As soon as Google stops indexing/posting material people want (legal or not), people will stop using Google. I believe they know what a fine line they're walking between 'do no evil' and survival here; I wonder which will prevail?
    • I'm not sure I'd complain overly much if Google went back to being 'that obscure search engine that knows all about OSS'. I'm sure their stockholders would though.
    • Re: (Score:2, Informative)

      by drrck ( 959788 )
      I've posted numerous video clips to Google Video and YouTube. I have recently received 3 e-mails from Google telling me that I have been flagged by the copyright holder. Subsequently I have already stopped using Google Video.
      • by 4solarisinfo ( 941037 ) on Tuesday July 31, 2007 @10:33AM (#20057267)
        For the unofficial record: are you the copyright holder of these clips, or could they be legitimate requests?
        • by drrck ( 959788 )
          I'm clearly not the copyright holder. But that's not the point I was attempting to make. They were clips from Robot Chicken on Adult Swim. I've just made my choice based on the reactions of the two services.
          • Re: (Score:3, Insightful)

            Soo... the system works, and "do no evil" gets a point.
    • I believe they know what a fine line they're walking between 'do no evil' and survival here; I wonder which will prevail?

      How exactly is automatically removing copyrighted content that shouldn't be posted in the first place evil?
      • How exactly is automatically removing copyrighted content that shouldn't be posted in the first place evil?

        I believe you missed my point.

        "Do no Evil" - company motto, remove all copyrighted content
        Survival - being the engine people use to find things

        I was trying to point out that they may be opposing forces in this case. The more they remove, the less people will use Google to search.
      • by Sancho ( 17056 )
        It depends upon how far they go. Don't forget Fair Use.
  • by HitekHobo ( 1132869 ) on Tuesday July 31, 2007 @10:21AM (#20057129) Homepage
    Can I just filter the video to change the general shape and size of the content and scribble all over it until humans can't recognize it? Seems to work for websites that require a signup... I had one the other day that took 4 people and 5 attempts to actually sign up.
  • Two-part Protection (Score:4, Interesting)

    by RancidPickle ( 160946 ) on Tuesday July 31, 2007 @10:26AM (#20057195) Homepage
    One part is the same: someone spots "their" video, they take it down immediately to avoid getting sued under the DMCA. Expect the takedown notices to continue, which will still kill parody videos, fair-use-compliant videos, and videos that are legal but hit with bogus takedown notices, such as the ones that Viacom "accidentally" included in its original request.

    The second part sounds more promising, but someone may be able to get around hashing the videos, such as inserting random one-frame images, as in the Fight Club movie, or adding in overlay text, or possibly adding in effects. If they try to hash a few selected time slices, someone will figure it out eventually. As with all digital protection, this just pushes off the inevitable. At least it will make Google look good in court, since they're attempting to comply with Viacom and the other copyright holder's requests for not posting their material.

    In the end, it won't count for much. It would make more sense to add in additional protections for false or malicious takedown notices, such as adding in a $50K fine for false claims. This would at least make the big companies scrutinize the videos that they're issuing a takedown notice for.
    • by kebes ( 861706 )
      Yeah the users can probably circumvent it... but will they bother?

      My impression is that most of the unauthorized clips on YouTube are put there by fans of the shows in question. They do it because it is easy and fun, and they want to share the thing they like with others. I don't think they will continue to bother if it becomes onerous to post the clips (requiring constant editing, posting, re-editing, re-posting, etc.). This is different from P2P networks, where once someone goes to the effort of making a
  • by dpbsmith ( 263124 ) on Tuesday July 31, 2007 @10:26AM (#20057199) Homepage
    We don't want "sophistication," we want reliability.

    And since they are making the comparison... just how reliable [truthinjustice.org]
    are fingerprints, really?

    True, a character in Mark Twain's 1893 novel Pudd'nhead Wilson tells a court

    "Every human being carries with him from his cradle to his grave certain physical marks which do not change their character, and by which he can always be identified -- and that without shade of doubt or question. These marks are his signature, his physiological autograph, so to speak, and this autograph cannot be counterfeited, nor can he disguise it or hide it away, nor can it become illegible by the wear and mutations of time. This signature is not his face -- age can change that beyond recognition; it is not his hair, for that can fall out; it is not his height, for duplicates of that exist; it is not his form, for duplicates of that exist also, whereas this signature is each man's very own -- there is no duplicate of it among the swarming populations of the globe! This autograph consists of the delicate lines or corrugations with which Nature marks the insides of the hands and the soles of the feet."

    and ever since Mark Twain said so everyone has believed it, but that doesn't necessarily make it true.
  • Dumb. Really dumb. (Score:4, Insightful)

    by Anonymous Coward on Tuesday July 31, 2007 @10:27AM (#20057207)
    The supposedly clever media moguls are missing a wealth-building opportunity. Lots of these "infringing videos" are short clips from longer presentations. If they had any smarts at all, they'd ask Google to set up a link on those pages where people could buy the programs/music on disk, or direct download them for a fee. Instead, the moguls want to get rid of what amounts to "free advertising" because they fear the new paradigm.
    • Apparently you never had a boss who would shoot down every idea that isn't his own. Usually, businesses with a boss like that go under in pretty short order.

      In other words, I have my hopes up that we might get rid of them pretty soon.
    • Bingo! I already posted, or I would mod you up.
  • When notified, the company removes the offending videos, in compliance with the American Digital Millennium Copyright Act.
    The trouble with the first system is that neither Google nor the copyright holders can possibly keep up with the vast number of copyrighted videos uploaded each day.

    What exactly is the compelling legal argument that spawned three lawsuits?
    That GooTube isn't complying with the DMCA?
    That compliance with the DMCA isn't enough?

    Depending on your POV, the 'right' thing to do is either to create new filters (business), or to try and win the lawsuits (users).

    • Re: (Score:3, Insightful)

      by eclectro ( 227083 )
      Well, it's the cost of fighting a copyright battle, and also the dark possibility that the judge would side with the copyright holders, which they almost invariably do these days.

      Take that and the fact that Google is actually a big fat cash cow with a bulls-eye on the side of it, and it becomes obvious that the best strategy is one of accommodation, rather than a long drawn-out battle that would also hurt Google's stock price because of the uncertainty it creates.

      So anyway you cut it, this looks like th
  • by sjonke ( 457707 ) on Tuesday July 31, 2007 @10:34AM (#20057287) Journal
    YouTube Video Strip-Searching is due in January '08
  • Alright kiddies,

    Step1. Get out your laptop & your random MAC address generator toolkit
    Step2. Drive down some random street until you find...
    Step3. A neighbour with unprotected WIFI (or just crack their non-WPA2 secure connection)
    Step4. Carry on & upload your Simpsons episodes to YouTube.
    Step5. Cause profit loss for the Simpsons' authors

    • you forgot step 3.5: sign up for e-mail addresses and a YouTube account so those can't be traced to you. Also, step 5 is not a proven profit loss by any stretch of the imagination. Oftentimes the teasers and stuff people post on YouTube make more people want to watch the movie.
    • Step2. Drive down some random street until you find...
      Step3. A neighbour with unprotected WIFI (or just crack their non-WPA2 secure connection)
      Call me a pedant, but do you not find it disconcerting to have neighbours on random streets? Maybe you meant they were somebody else's neighbour...

  • 'infringement' (Score:3, Interesting)

    by SCHecklerX ( 229973 ) <greg@gksnetworks.com> on Tuesday July 31, 2007 @11:03AM (#20057677) Homepage
    Same as with music. If people are going to buy it, they will. Just charge a fair price. Use YouTube as advertising for commercial interests (Daily Show, Colbert Report, Robot Chicken, anyone?)

    But YouTube is a little different, in that many of the things people go there for are unique, one-time events; the only way you'll ever see them again is if you recorded them yourself, or somebody else did and you're lucky enough to find the recording online.

    The biggest issue I have is stuff that you'll NEVER BE ABLE TO ACTUALLY BUY OR SEE AGAIN being taken down. My favorite example is Prince performing at half time at the Super Bowl. Now, not only are the videos gone from YouTube, but so are all of the comments about them (which IMHO are equally valuable to the community).

    Taking things like this down erodes our culture and destroys valuable records of what has gone on in our lives.
    • by tlhIngan ( 30335 )

      The biggest issue I have is stuff that you'll NEVER BE ABLE TO ACTUALLY BUY OR SEE AGAIN being taken down. My favorite example is Prince performing at half time at the Super Bowl. Now, not only are the videos gone from YouTube, but so are all of the comments about them (which IMHO are equally valuable to the community).

      <stupid question>I seem to remember that one could buy DVD sets of the Super Bowl, no? Wouldn't those DVD sets include the half-time show?</stupid question>

      I only ask because

    • The biggest issue I have is stuff that you'll NEVER BE ABLE TO ACTUALLY BUY OR SEE AGAIN being taken down. My favorite example is Prince performing at half time at the Super Bowl.

      Isn't not being able to see Prince perform a feature? ;)
      • There were enough comments about what you just said on youtube when the video was up...oh wait, you can't see those comments anymore...
  • Google is already complying with the letter of the DMCA. What right does any other organization have to compel Google to go beyond that and spend a fortune creating a video fingerprinting system? Google ought to be fighting this, not bending over and doing whatever large media companies demand.

    Doing this is manifestly against the interest of the people who made Google what it is today. What happened to doing no evil?
  • Funny- I can't post a home video with copyrighted background music on MSN Soapbox - they refused to publish it. But it works great in GooTube. A previous writer hit the nail on the head - when Google pulls down all of the illegal content, GooTube will turn into a GooGhostTown...
  • Are they gonna sue Google because its search engine aids in the acquisition of copyrighted content (like when you search for torrents)?
  • Phone companies and ISPs claim that since they are "common carriers," merely carrying content from other people without any knowledge or filtering of it, they can't be sued if naughty/illegal stuff is carried on their wires. If Google has proactive measures in place to filter submissions, won't it lose that protection and become liable for anything that slips through?
    • Re: (Score:1, Informative)

      by Anonymous Coward
      Um, no. A carrier passes traffic. YouTube hosts content. A big difference.
      • Um, no. A carrier passes traffic. YouTube hosts content. A big difference.

        YouTube is an Internet Service Provider. Generally they are not liable for content hosted on their servers as long as they honor removal requests from copyright holders in a timely manner.

        ISPs are also not required to develop copy-protection technology for the benefit of copyright holders.

        ISPs do have several requirements in order to maintain that protected status, but none of them has anything to do with copyright infringement or developing software.

  • People will find another outlet. If the content that people are looking for is not on YouTube, then they will abandon the site for somewhere else.
    The content industry should learn a lesson from the past. Right now they have a large concentration of people looking at grainy, low-resolution video in one place. Remove that, and the sites will go underground, maybe with even better-quality video, which would be a real threat to their model. They should take the opportunity to promote their product - h
    • I abandoned YouTube when I found this:

      http://stage6.divx.com/ [divx.com]

      The image quality is amazingly good, the total opposite of YouTube and Flash video in general.

      Also, and very importantly to me (because I multitask), it consumes about the same CPU in both cases; if anything, Flash video consumes a little more.
  • Thing is, the content owners will have to provide the "fingerprint," or at least the video from which the "fingerprint" will be made. Now, while I can see the bigger corporations taking the time, effort and money to do this for current and recent popular TV shows and movies, a lot of the more obscure stuff is still going to be around. Part of the fun and appeal of sites like YouTube is getting a chance to see or re-see more esoteric (i.e., not very commercially viable) clips -- old TV shows with a cult foll
  • Of course, such a fingerprinting scheme ignores fair use. If a match is found, the video will automatically be flagged as infringing without concern for context and prevented from being uploaded without any oversight.

    It's like being found guilty for murder because your fingerprints were found at the scene... on some groceries you bagged for the victim at the supermarket a week earlier, but that is of no concern to the justice droid.
  • You could easily beat any exact hash of a video with a watermark hue shift of, say, 1 value in the red channel. The video would look imperceptibly different to the naked eye, but each and every frame would differ significantly from the original as far as an exact video hash is concerned. Is this not how it would work?
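    That is essentially how a naive hash check fails. A toy Python sketch of the idea, using a fabricated 2x2 "frame" of RGB bytes (the functions `exact_digest` and `coarse_fingerprint` are made-up illustrations, not anyone's real fingerprinting system): a one-value shift breaks an exact hash, while even a crude perceptual fingerprint can survive it.

    ```python
    import hashlib

    def exact_digest(frame: bytes) -> str:
        # Exact matching: any byte-level change yields a different digest.
        return hashlib.sha256(frame).hexdigest()

    def coarse_fingerprint(frame: bytes) -> int:
        # Toy perceptual fingerprint: one bit per byte, set when the value is
        # above the frame's mean. Small uniform shifts leave it unchanged.
        mean = sum(frame) / len(frame)
        bits = 0
        for b in frame:
            bits = (bits << 1) | (1 if b > mean else 0)
        return bits

    # A 2x2 "frame": four RGB pixels, flattened to bytes.
    original = bytes([200, 10, 10, 200, 10, 10,
                      200, 10, 10, 200, 10, 10])

    # Nudge every value up by 1 -- imperceptible on screen.
    shifted = bytes(min(b + 1, 255) for b in original)

    print(exact_digest(original) == exact_digest(shifted))              # False
    print(coarse_fingerprint(original) == coarse_fingerprint(shifted))  # True
    ```

    This is why a "fingerprint" worthy of the name has to be computed from perceptual features (brightness patterns, motion, audio) rather than from the raw bytes of the file.
    
    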
  • by AmiMoJo ( 196126 ) <mojo&world3,net> on Tuesday July 31, 2007 @01:05PM (#20059579) Homepage Journal
    How is this going to work? Will Google process all copyrighted videos themselves and produce the necessary data to block them? If so, what is the backlog going to be when big media submits 90 years of video?

    If Google are not going to check it, what is to stop me downloading a Quicktime trailer of a movie, generating the data and submitting it to Google for blocking? It will quickly become impossible for even sanctioned videos to appear. Cultists/Scientologists will be screwed too.

    As usual, media companies are being idiots. They panicked about the VCR, they panicked about P2P, and they're panicking about DVRs and YouTube. In the end, new technology tends to do them good in the long run, and besides, you can't fight it.
  • Wonder if it's got anything to do with Kadokawa Group's [varietyasiaonline.com] deal with Google. I found it amusing that they were helping develop this kind of system for Google, since they're probably one of the few groups that have really gotten some sales/publicity boost from user-generated content on Youtube and NiconicoVideo [nicovideo.jp].
