
Microsoft Patents GPU-Accelerated Video Encoding

Posted by CmdrTaco
from the keep-'em-accelerated dept.
An anonymous reader writes "Microsoft has just received a patent that grants the company the rights to GPU-accelerated video encoding, which may be the primary technology that takes advantage of the horsepower of the GPU in today's consumer applications. The broad patent covers performing motion estimation on video, using the GPU's depth buffer to compare collocated video frames, mapping pixels to texels, processing frames on the GPU, and returning the output data to the CPU."

  • Extra Extra! (Score:4, Insightful)

    by WrongSizeGlass (838941) on Tuesday October 12, 2010 @11:39AM (#33871828)
    A Graphics Processing Unit has been used to accelerate video!

    If this doesn't qualify as 'obvious' then we are all doomed.
    • Re: (Score:2, Interesting)

      It may be obvious now, but Microsoft applied for this patent in 2004, well before stuff like OpenCL and CUDA came around.

      Was it obvious then? IMHO, yes: I had thought of this as early as 1996 and was disappointed to learn that GPUs at that time lacked sufficient horsepower and, more importantly, sufficient instruction sets.

      • Re:Extra Extra! (Score:5, Insightful)

        by autocracy (192714) <slashdot2007 AT storyinmemo DOT com> on Tuesday October 12, 2010 @11:55AM (#33872234) Homepage

        It seems that the idea was apparent long before the patent came about. I think the underlying reason we haven't seen it yet is that the tradeoff wasn't worth it yet: the GPU must beat the CPU by a sufficiently wide price and performance margin on the workload before anybody bothers writing specialized code for it.

        In fact, this is a math coprocessor revisited. Remember those?

        • Re:Extra Extra! (Score:5, Informative)

          by Jeremiah Cornelius (137) on Tuesday October 12, 2010 @12:10PM (#33872532) Homepage Journal

          We saw this at SIGGRAPH for YEARS before 2000.

          I think that Be Computer had colliding patents - not to mention SGI and nVidia.

        • Re:Extra Extra! (Score:4, Interesting)

          by Cornelius the Great (555189) on Tuesday October 12, 2010 @12:31PM (#33872928)

          In fact, this is a math coprocessor revisited. Remember those?

          Yes, those chips that handled floating-point operations so well that they were eventually integrated directly onto the CPU die itself; i.e., the 80386 CPU + 387 coprocessor evolved into a single 486DX with an integrated FPU.

          Still, I don't see why you're comparing them to GPUs... FPUs were small in comparison, even next to early fixed-function rasterizers from the 90s; today's GPUs are multi-billion-transistor chips with hundreds of programmable stream processors (with faster, higher-bandwidth memory) that not only cover the entire rendering pipeline but can do general-purpose computation as well. While small GPUs are being integrated into future CPUs (AMD Fusion, Intel Sandy Bridge, etc.), I doubt discrete graphics will disappear the way x86 math coprocessors did, at least for the foreseeable future.

          • So, you mean that if I take 11 cylinders instead of 10 or 12, I could get a patent on a new car trunk, just because of that? Or could I get a patent on 3-CPU word processing, or 4-CPU word processing, or double butt *%&#+/@, whatever...

          • Re: (Score:3, Insightful)

            by beelsebob (529313)

            Because while an early FPU may be drastically simpler than a modern GPU, that *is* what a GPU is... It's a big floating-point maths unit – it's particularly good at matrix/vector floating-point maths.

      • Re: (Score:2, Informative)

        by Anonymous Coward

        In 1990 I worked on a project using TMS320C30 floating-point processors to simulate radar imagery under the control of a 68040 processor. They weren't called GPUs specifically - I'm sure in lawyer-speak that wouldn't affect MicroQuack's patent trolling. But in non-lawyer speak, it surely amounted to video processing using dedicated processors, and the only thing different is the packaging - independent chips on a VME board.

      • by Joce640k (829181)

        The patent is basically "use the fastest chip in the computer to do the work".

        How is that non-obvious?

      • The ATI Rage 3D, introduced in 1995, supported MPEG-1 acceleration, and the Rage II added MPEG-2. Either there's something a bit more nuanced about the patent or the patent examiner really screwed up.
        • Re: (Score:3, Informative)

          Either there's something a bit more nuanced about the patent....

          Um, that's the case with ALL the patents Slashdot has covered. Everybody cries 'obvious' when they oversimplify the task; nobody cites elements of the patent and says 'prior art!' I haven't seen the patent yet, but I have a dollar that says they attempted to patent using a particular instruction set in a way it wasn't intended, to pull off a really slick result. That's usually what happens, but it takes time to go look that up, and we all want the word 'insightful' to appear by our posts.

          • Re: (Score:3, Informative)

            by russotto (537200)

            Um, that's the case with ALL the patents Slashdot has covered. Everybody cries 'obvious' when they oversimplify the task, nobody cites elements of the patent and says 'prior art!'

            Yes, we do. For instance, I did exactly that with a patent which covered gaming techniques which netrek had included years before.

            One problem is that one need merely make one minor variation for a patent to not be covered by prior art. You'd think that the patent would then be totally hemmed in by the prior art and cover only the

      • Re:Extra Extra! (Score:5, Informative)

        by jedidiah (1196) on Tuesday October 12, 2010 @02:00PM (#33874472) Homepage

        > It may be obvious now, but Microsoft applied for this patent in 2004, well before stuff like OpenCL and CUDA came around.

        Nevermind OpenCL. How about things like PureVideo, VDPAU and xvmc?

        Accelerating video with the video card is about as obvious as it gets.

        This sounds like YOU patenting something: visualize something that's obvious but hasn't quite arrived due to hardware limitations, then rush off to the patent office.

        The only difference between you and Microsoft is that Microsoft has lawyers on retainer.

    • Re: (Score:2, Insightful)

      by smallfries (601545)

      Why would it be obvious that hardware designed to accelerate 3d rendering - transformation, lighting and rasterisation - can accelerate the compression of video frames?

      It seems that you are 'obviously' wrong.

      • Re:Extra Extra! (Score:5, Insightful)

        by onionman (975962) on Tuesday October 12, 2010 @12:02PM (#33872366)

        Why would it be obvious that hardware designed to accelerate 3d rendering - transformation, lighting and rasterisation - can accelerate the compression of video frames?

        It seems that you are 'obviously' wrong.

        It seems incredibly obvious to me. Of course, I've worked on FFT code for Cray vector units, which were around a long time before 2004. If you can't see the relationship between vector processing, FFTs, and any form of video compression/display, then perhaps you shouldn't be in charge of determining what is "obvious" regarding this particular patent.

        I have long felt that our patent system is ridiculous because it allows such silly patents. If something is obvious to an expert in the field, then it shouldn't be patentable.
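The link the parent draws between vector hardware and video codecs can be made concrete. The sketch below is illustrative only (sizes and names are ours, not from the patent or any real codec): it computes the 8x8 2-D DCT at the heart of block-based video compression as two dense matrix multiplies, which is exactly the kernel that Cray-style vector units, and later GPUs, were built to accelerate.

```python
import numpy as np

# Illustrative only: the 8x8 block transform used by video codecs,
# written as dense matrix multiplies.

N = 8

def dct_matrix(n=N):
    """Orthonormal DCT-II basis matrix (n x n)."""
    k = np.arange(n)
    # Row j, column k: cos(pi * (2k + 1) * j / (2n))
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] /= np.sqrt(2)          # DC row gets the 1/sqrt(2) factor
    return C * np.sqrt(2.0 / n)

D = dct_matrix()

def dct2(block):
    """2-D DCT of a block: two matrix multiplies."""
    return D @ block @ D.T

def idct2(coeffs):
    """Inverse 2-D DCT; D is orthonormal, so its transpose inverts it."""
    return D.T @ coeffs @ D

block = np.arange(64, dtype=float).reshape(N, N)
assert np.allclose(idct2(dct2(block)), block)  # exact round trip
```

Whether the transform is spelled "FFT" or "DCT", the inner loop is the same dense linear algebra, which is the commenter's point.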

        • No. You are confusing a current graphics card with one from six years ago. It may be obvious now that we can do fast vector processing on a GPU, but they have come a long way in six years, each generation aiming more at GPGPU.

          In 2004 the 6600 was just making it onto reviewers' desks. CUDA was a gleam in some engineer's eye and most graphics cards still used fixed-function pipelines to perform rendering. Even the latest generation that allowed programmable pixel shaders did not automatically accelerate any vectoris

          • Re: (Score:3, Interesting)

            by Jah-Wren Ryel (80510)

            No. You are confusing a current graphics card with one from six years ago. It may be obvious now that we can do fast vector processing on a GPU, but they have come a long way in six years, each generation aiming more at GPGPU.

            While I totally disagree that it wasn't obvious well over a decade ago - TI DSP chips (aka vector processors) were being used on video cards from companies like NeXT in the 1990s - let's ignore that and assume your premise. If the patent really applies to such crippled GPUs, then it clearly can't apply to GPUs that have been specifically enhanced for such functionality, which makes the patent moot from the moment it was granted.

          • by Joce640k (829181)

            The words "graphics card" are a complete red herring.

            It's obvious that you should use the fastest bit of the computer for heavy processing jobs, whatever that part happens to be...

          • Re: (Score:3, Informative)

            by onionman (975962)

            So again I'll ask you; why is it obvious that units designed to do transformation, lighting and rasterisation (not general vector operations) should be good at encoding video?

            Because (linear) transformations, lighting and rasterisation are just large matrix-vector operations... as is the FFT which is central to every video compression algorithm. Just because the words are different doesn't mean that the math is different.
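To make the parent's claim concrete, here is a hedged sketch (all names and shapes are illustrative, not from any driver or codec) showing that a 3-D vertex transform and a codec-style block transform reduce to the same primitive: a matrix multiply. Hardware that accelerates one accelerates the other.

```python
import numpy as np

rng = np.random.default_rng(0)

# Graphics workload: a 4x4 model-view matrix applied to homogeneous vertices.
model_view = np.eye(4) + 0.1 * rng.standard_normal((4, 4))
vertices = rng.standard_normal((1000, 4))   # 1000 vertices, homogeneous coords
transformed = vertices @ model_view.T       # one matrix multiply

# Codec workload: an 8x8 basis matrix applied to a batch of pixel blocks.
basis = rng.standard_normal((8, 8))
blocks = rng.standard_normal((500, 8, 8))   # 500 pixel blocks
coeffs = basis @ blocks @ basis.T           # batched matrix multiplies

# Different operand shapes, same underlying primitive.
assert transformed.shape == (1000, 4)
assert coeffs.shape == (500, 8, 8)
```

The words ("transformation" vs. "transform coding") differ, but, as the comment says, the math does not.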

            • Re: (Score:3, Insightful)

              by smallfries (601545)

              No, but providing someone with a unit to calculate (for example) a transformation doesn't mean that you give them an efficient way of computing FFTs. Of course if you give them a general purpose matrix-vector multiplier then it does. In 2004 a graphics card used a fixed function unit, and today it uses a general purpose one. Assumptions about how obvious it is that other applications can be performed don't carry back to previous generations of the hardware.

              • Re:Extra Extra! (Score:5, Interesting)

                by onionman (975962) on Tuesday October 12, 2010 @01:40PM (#33874182)

                No, but providing someone with a unit to calculate (for example) a transformation doesn't mean that you give them an efficient way of computing FFTs. Of course if you give them a general purpose matrix-vector multiplier then it does. In 2004 a graphics card used a fixed function unit, and today it uses a general purpose one. Assumptions about how obvious it is that other applications can be performed don't carry back to previous generations of the hardware.

                I do believe that you are arguing in good faith and that you are not a troll, and I understand your argument. Here are the things which I believe were obvious (to general practitioners in the field) in the 1990s:

                1. Video display and video (de)compression are essentially large linear algebra problems which lend themselves to vector processing.

                2. Video display hardware will continue to improve.

                3. Eventually video display hardware will become sufficiently powerful to perform video (de)compression.

                The reason I believe that those were all obvious to practitioners in the field in the 1990s is because they were obvious to me in the 1990s and I wasn't even a specialist in the field. (I was, however, working on large number-crunchy stuff). I would suspect that the true experts in those fields were well aware of all three of the above points even in the 1980s.

                I'll have to end my participation in this thread, now, and get back to real work...

          • Re: (Score:3, Interesting)

            by LingNoi (1066278)

            If you truly understood how software patents worked, you'd realise that the time the patent was filed means nothing. You can submit a vague patent now and continue to make edits to it. So all you really need to do is patent flying cars, wait for someone to actually build one, then fill in the patent with the competitor's implementation.

            Microsoft is guilty of filing bogus patents [slashdot.org] already so I'm not so likely to readily believe they've come up with anything.

        • by westlake (615356)

          If something is obvious to an expert in the field, then it shouldn't be patentable.

          A patent is awarded for a clearly described and working implementation of an idea. It isn't enough to say - in a vague sort of way - that the idea alone seems obvious enough in retrospect.

          • Re: (Score:2, Interesting)

            by Anonymous Coward

            A patent should be awarded for a clearly described and working implementation of an idea.

            FTFY.

          • Re:Extra Extra! (Score:4, Insightful)

            by dgatwood (11270) on Tuesday October 12, 2010 @12:37PM (#33873068) Journal

            Except that conceptually, it's a trivial extension of texture compression, which video cards have supported natively since at least the late 1990s. The only reason we weren't doing video compression is that the video cards weren't fast enough and/or were too power hungry to offer an advantage over CPUs. The patent office should not be awarding patents for discoveries, and that's all GPU-based video decompression really is---discovering that suddenly GPUs are faster than CPUs and things that were impractical (but widely discussed) years before are now practical.

            I've only skimmed the patent, so it's possible that the summary sucks and that there's something novel and unobvious here, but at a glance, this patent really does look like an explanation of a straightforward mapping of video compression onto a GPU which, with the possible exception of the motion estimation, would have been obvious a decade ago. For that matter, if you had asked somebody "how would you do motion estimation on a GPU" a decade ago, they probably would have come up with a similar solution.

            Then again, it's a software patent, and the design process for nearly all software is obvious to someone with suitable skills in the field, which is why these patents are almost universally crap anyway.

        • Re:Extra Extra! (Score:4, Interesting)

          by IICV (652597) on Tuesday October 12, 2010 @12:42PM (#33873158)

          That's why I'm an advocate of an adversarial patent system, something kinda like this:

          You (as a private individual or a company) can sign up to be a "patent examiner" for a minor fee. You specify the areas in which you have expertise (and you (or your employees) may need certain certifications as specified by professional groups in that area). When a patent is submitted, it is required to clearly and specifically state what problem the covered art solves - e.g., in this case, it would be something like "efficiently encoding video using components found in a commodity computer".

          Then, a few examiners are picked at random. They're given a day or so in isolation, with whatever reference materials they want to bring (no networked devices, though) to figure out how they would solve the problem. At the end of their isolation, they just need to produce a couple of sketches of how they would go about solving the problem.

          If a majority of the examiners (who should be experts in the field of the patent) produce solution sketches that are largely similar to the patent, the patent is rejected - because clearly, if experts in the field who set their minds to the problem come up with the to-be-patented invention, then it's not novel; it's just an obvious evolution no one else has gotten around to yet.

          • by znerk (1162519)

            Modify your process so that if *any* of the examiners come up with the same solution, it's obvious, and I'd be willing to give that a shot.
            Oh, and throw out all the rest of the "obvious" patents while you're at it.

            As an aside, what does Microsoft think an MPEG decoder card does? Just curious...

        • by prgrmr (568806)
          Under the current patent law in the US, the obviousness test hinges upon "a person having ordinary skill in the art" [wikipedia.org]; so if it takes an expert to realize how to take an idea and make it real (i.e., the transformative part), then, by definition in the law, it's NOT obvious.
          • by onionman (975962)

            [me] If something is obvious to an expert in the field, then it shouldn't be patentable.

            [westlake] A patent is awarded for a clearly described and working implementation of an idea. It isn't enough to say - in a vague sort of way - that the idea alone seems obvious enough in retrospect.

            [prgrmr] Under the current patent law in the US, the obviousness test hinges upon "a person having ordinary skill in the art" [wikipedia.org]; so if it takes an expert to realize how to take an idea and make it real (i.e., the transformative part), then, by definition in the law, it's NOT obvious.

            I was asserting what should be, not what actually is :-)

      • Re:Extra Extra! (Score:5, Insightful)

        by dgatwood (11270) on Tuesday October 12, 2010 @12:27PM (#33872836) Journal

        You're being sarcastic, right? Why would a glorified vector engine be useful for doing video compression, which is basically lots and lots of vector math? It's so obvious that anybody with even basic knowledge of video compression would immediately understand how the two problem spaces map onto one another with no instruction whatsoever.

        It's so obvious that ATI released software to do it within a year of when that patent was first filed, which means they were working on it at least a year before that, which means that multiple people independently came up with the idea at the same time, which means it is obvious.

        Heck, other companies had already been doing this, and even held patents on it [espacenet.com] five years earlier. Okay, so texture compression and video compression aren't quite the same thing. One deals with a single image, one deals with compressing a series of images.... Yeah, that's not obvious to anyone who has never seen someone make a flipbook during class in elementary school.

        • Why would a glorified vector engine be useful for doing video compression, which is basically lots and lots of vector math?

          Because in October 2004, mainstream video cards weren't necessarily "a glorified vector engine". Many were still very much fixed-function.

          One deals with a single image, one deals with compressing a series of images.... Yeah, that's not obvious to anyone who has never seen someone make a flipbook during class in elementary school.

          Flipbooks are like motion JPEG: every frame is a keyframe, and no motion compensation is used. The claims in the present patent relate specifically to the use of motion estimation data.
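For readers unfamiliar with the term, here is a minimal sketch of what block-matching motion estimation computes: for each block of the current frame, find the offset into the reference frame that minimizes the sum of absolute differences (SAD). This is a generic, hypothetical full-search loop to illustrate the thread's subject, not the method the patent claims.

```python
import numpy as np

def motion_vector(cur_block, ref, top, left, search=4):
    """Best (dy, dx) offset of cur_block within ref, searched around (top, left)."""
    h, w = cur_block.shape
    best_sad, best_mv = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > ref.shape[0] or x + w > ref.shape[1]:
                continue  # candidate block falls outside the reference frame
            sad = np.abs(cur_block - ref[y:y + h, x:x + w]).sum()
            if sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv

# Toy check: shift a frame by (2, 3) and recover the motion vector.
rng = np.random.default_rng(1)
ref = rng.random((32, 32))
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))  # "current" frame, moved content
block = cur[8:16, 8:16]
assert motion_vector(block, ref, 8, 8) == (-2, -3)
```

This brute-force inner loop is embarrassingly parallel, which is why the thread treats offloading it to parallel hardware as the natural move.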

      • by arivanov (12034)

        The fast forms of all of these use convolution-based formulae. These are best done using vector processing.

        If Nvidia and SGI do not have prior art here, I can think of someone who does: IBM with AltiVec, and IBM/Sony with the Cell.

      • But is it still the same kind of hardware?
        With graphics pipelines that can run increasingly complex programs, the GPUs of today are not quite the specialized devices of several years ago. It seems to me that "running task X on the GPU" is an overly broad and poorly specified claim that should be rejected for lack of specificity.

    • I wonder how the hell they could actually get this patent through when GPUs have been used to speed up both decoding AND encoding for at least 10 years now. There's LOADS of prior art to prove that, and thus this patent should not have gone through in the first place.

      • by tepples (727027)

        I wonder how the hell they could actually get this patent through when GPUs have been used to speed up both decoding AND encoding for at least 10 years now. There's LOADS of prior art to prove that

        Can you cite three examples from before October 2004, three days prior to the filing date on this patent?

    • Re: (Score:3, Interesting)

      by beelsebob (529313)

      No, no it's not obvious at all – what they've patented is the methods of doing it, not the act of doing it, and those are complex, difficult algorithms to come up with – using specialised hardware to do a different specialised task. There are other ways of doing video encoding on the GPU. The headline is massively inflammatory; they haven't patented video encoding on the GPU at all – merely one way of doing it.

  • by EmagGeek (574360) <<gterich> <at> <aol.com>> on Tuesday October 12, 2010 @11:39AM (#33871846) Journal

    Are they serious? This is virtually the same thing as someone inventing a car and me winning the patent on "driving cars."

    • Agreed. This is a pretty obvious patent. A GPU is a CPU by a different name, and we've been doing it on the CPU for ages.
      • Re: (Score:3, Interesting)

        by BatGnat (1568391)
        It specifically says GPU, so don't sell them as GPUs anymore. Let's call them MFPUs, multi-function processing units (or some other use of the letters MF).
  • by Anonymous Coward on Tuesday October 12, 2010 @11:40AM (#33871872)
    Oh look, another patent that shouldn't have been granted. The only thing the modern patent system is good for is buying new boats for patent lawyers. Does this still surprise anyone?
  • by ciaran_o_riordan (662132) on Tuesday October 12, 2010 @11:40AM (#33871890) Homepage

    This patent mightn't change much, but it's the weight of the hundreds of patents that's spoiling the AV field.

    Microsoft is a member of MPEG-LA, but they pay more royalties than they make from the organisation, so they're probably eager to make their own AV thicket.

    * http://en.swpat.org/wiki/Microsoft [swpat.org]
    * http://en.swpat.org/wiki/MPEG_LA [swpat.org]
    * http://en.swpat.org/wiki/Audio-video_patents [swpat.org]

  • Badaboom? (Score:2, Informative)

    by Deathnerd (1734374)
    But what about programs like Badaboom that already use GPU acceleration in their encoding? Patents confuse me to no end.
    • But what about programs like Badaboom that already use GPU acceleration in their encoding? Patents confuse me to no end.

      The patent application was received in October of 2004, according to the article. So I assume Badaboom would have to precede that, or produce some form of prior art preceding that date, to defend themselves should Microsoft resort to litigation after failing to agree on a licensing deal with Badaboom's creators. Regardless, a cursory glance suggests that Microsoft could out-lawyer them whether they are right or not, so I believe with 98% confidence that Badaboom is facing some serious liabilities.

      • Re: (Score:3, Insightful)

        by forkazoo (138186)

        The patent application was received in October of 2004, according to the article. So I assume Badaboom would have to precede that, or produce some form of prior art preceding that date, to defend themselves should Microsoft resort to litigation after failing to agree on a licensing deal with Badaboom's creators. Regardless, a cursory glance suggests that Microsoft could out-lawyer them whether they are right or not, so I believe with 98% confidence that Badaboom is facing some serious liabilities.

        The annoying

    • by Kindgott (165758)

      If the patent process is working as intended, which limits my comment to an imaginary world, the only thing I can think of is that Microsoft applied for this patent before the prior art (Badaboom, MediaCoder, et al.) was developed.

      In the real world, however, the patent office probably just dropped the ball on prior art.

    • Re:Badaboom? (Score:5, Informative)

      by wagnerrp (1305589) on Tuesday October 12, 2010 @12:03PM (#33872396)
      This patent is specifically for running motion compensation calculations on the GPU, and everything else on the CPU. Badaboom runs everything on the GPU, so the patent does not apply.
      • I was going to go through the claims in detail, but you said it very succinctly. Slashdot never lets things like "facts" get in the way of bashing a patent even though the patent is actually pretty narrow in scope and does not cover modern GPU encoding techniques.

    • by yabos (719499)
      Now they're going to go Big Badaboom
    • Re:Badaboom? (Score:5, Insightful)

      by DrgnDancer (137700) on Tuesday October 12, 2010 @12:15PM (#33872634) Homepage

      What is confusing? Microsoft does something. Microsoft applies for a patent on that thing. A patent lawyer who knows very little about the tech in question, has about 600 applications he's currently supposed to be processing, has been instructed that he can't work overtime this week by his boss, but also that he is too far behind on his portfolio and needs to catch up, and who doesn't make near as much as his buddies from law school do to begin with, looks at it. He thinks "I don't even know what half of these words *mean*", then notices that Microsoft filed the patent. Through his haze of pain and frustration he dimly remembers that Microsoft is an "Innovative and economy driving company" and says "fuck it." He hits the "Approve" button.

      His boss is happy because his numbers are better this week, and there is no real penalty for approving patents that later get overturned. Even assuming that Microsoft ever attempts to defend the patent rather than just threatening small companies with it in hopes that they'll cave without a court battle.

      The things currently wrong with the patent system which this story demonstrates:

      1) Patent attorneys often don't understand the tech they are expected to review. This is less of a problem with "real" patents, since the device being patented is just that. A device. If it does what it says it does, in the way it says it does, understanding why isn't all that important. Software is essentially algorithms. If you don't understand them, then judging their uniqueness is difficult.

      2) The reviewers in the patent office are phenomenally overworked right now. There are literally tens of thousands of applications backed up. I saw some patent official guy at the end of the Bush administration say that if all applications stopped, right then, he could maybe catch up in a year or two. I don't imagine it's gotten better. Both Bush and Obama have authorized more reviewers, but it seems to be like filling the ocean with a teaspoon.

      3) Patent reviewers make a fraction of what patent attorneys in private practice make. This means that they're always looking to get out and get into private firms. Probably not all of them, but like any rational human, most want to make more money and get more respect.

      4) There is no real penalty for screwing up. Most patents never get defended in court, because the companies that own them mainly used them as bargaining chips, or to threaten smaller, defenseless, companies. Even if the patent does go to court, it'll take years to invalidate, and no repercussions fall on the approver.

      Eliminating software patents would, in one stroke, alleviate or eliminate two of these four problems. Probably the most serious two. It'd be awful nice if it happened. The alternative is probably the whole system collapsing under its own weight eventually.

         

    • by Yvan256 (722131)

      Leeloo: ...jella boom!
      Korben Dallas: Boom. Yeah! I understand boom.
      Leeloo: Bada boom.
      Korben Dallas: Big... yeah, big bada boom.
      Leeloo: Big! Bada big boom! Big! BOOM!
      Korben Dallas: Yeah! Big bada boom!
      Leeloo: Bada boom!
      Korben Dallas: Yeah-hahaha! Big boom! Big bada boom!

  • 2004 (Score:4, Informative)

    by iamhassi (659463) on Tuesday October 12, 2010 @11:43AM (#33871966) Journal
    Before we get a million "Adobe does this!" comments RTFA: "Microsoft applied for the patent titled “Accelerated video encoding using a graphics processing unit” in October 2004"

    Far as I know no one was doing this in 2004
    • Re:2004 (Score:5, Informative)

      by samkass (174571) on Tuesday October 12, 2010 @11:54AM (#33872212) Homepage Journal

      Before we get a million "Adobe does this!" comments RTFA: "Microsoft applied for the patent titled "Accelerated video encoding using a graphics processing unit" in October 2004"

      Far as I know no one was doing this in 2004

      Still not enough information. Patent claims can change between the original filing and the version that gets granted, by amending the patent application. It's done by trolls... get a patent application going and keep it in limbo. Someone else comes to market with something cool, and you add it to your patent application. If it gets approved, voilà! You have a patent that is "back-dated" years before the invention hit the market.

      I'd be curious to see exactly what the patent said in 2004 and compare that to the then state-of-the-art.

    • Re: (Score:3, Insightful)

      by swilver (617741)

      All I have to say is... a GPU is like a CPU. We're gonna get patents for everything twice now, just like with handheld devices?

      • Re: (Score:3, Insightful)

        All I have to say is... a GPU is like a CPU. We're gonna get patents for everything twice now, just like with handheld devices?

        Just wait until they start putting GPU's in handheld devices and we're going to see a *hat trick for each patent.


        * I would have used 'three-peat' or '3-peat' instead of 'hat trick' but I believe those are trademarked.

    • Re: (Score:3, Funny)

      by Ant P. (974313)

      I was using a GPU to encode video (to S-Video format) long before that.

    • Video games have been doing this for as long as graphics accelerator cards have been in existence. Using tools to specify the exact parameters of what is being rendered, and saving the output to a file instead of simply displaying it on the screen, does, in my opinion, constitute "bleeding obvious". Additionally, while 3DStudio Max, Maya, or After Effects might not have used GPU acceleration prior to 2004, the Amiga had all kinds of video-oriented add-in cards. Hell, Matrox RT-2000 units did real-time 3D effects o

    • by Fnkmaster (89084)

      NVidia's PureVideo product was in development in 2003. See the GeForce 6-series page [wikipedia.org] for some information. I don't know if it was doing encoding on the GPU, but it was definitely doing post-processing and I believe video decoding on the GPU. This line was released in mid-2004, and features of PureVideo were released throughout that year so must have been in development since sometime in 2003.

    • Re: (Score:3, Insightful)

      by salesgeek (263995)

      Running code on a coprocessor is not novel in any way. It's what graphics coprocessors are for. This is like patenting a lemonade stand because it is located on the corner rather than in the middle of the block.

  • Smaller Companies (Score:2, Insightful)

    by cymbeline (1792306)

    This will make it almost impossible for smaller companies to make fast video encoding applications. They will have to start paying royalties if they want to encode video using the GPU in applications such as FRAPS or any video converter. Their products will either have to become more expensive or remain inferior to products made by larger companies.

    • by wagnerrp (1305589)
      The patent specifically covers offloading motion compensation to the GPU, while running everything else in the CPU. It only covers this limited offload, because of limitations of the programmable shaders of graphics cards at the time. Any modern GPU encoding implementation would offload most, if not all, of the duties to the GPU, and thus would not be covered under this patent.
      • by Obfuscant (592200)
        The patent specifically covers offloading motion compensation to the GPU, while running everything else in the CPU. It only covers this limited offload, because of limitations of the programmable shaders of graphics cards at the time. Any modern GPU encoding implementation would offload most, if not all, of the duties to the GPU, and thus would not be covered under this patent.

        Hold on a mo', bro'.

        How can a patent that covers offloading the motion compensation part of encoding video to the GPU NOT apply t

    • by zero_out (1705074)

      This will make it almost impossible for smaller companies to make fast video encoding applications. They will have to start paying royalties if they want to encode video using the GPU in applications such as FRAPS or any video converter. Their products will either have to become more expensive or remain inferior to products made by larger companies.

      I think that's the whole purpose of filing patents. The greater purpose of patents is to provide incentive to innovate, but for the patent filer, it's to gain a temporary monopoly on an idea, which gives a competitive advantage.

  • Enforceable? (Score:5, Interesting)

    by airfoobar (1853132) on Tuesday October 12, 2010 @11:48AM (#33872084)

    The patent was filed in 2004, and there must be loads of prior art. Companies such as Nvidia and ATI have had GPU-accelerated video encoders for years now.

    Regardless, this patent should never have been granted. It's all because of the patent office's massive backlog, and their decision to accept every random patent to reduce it.

    • by MobyDisk (75490)

      Which raises the question of why it took six years to grant the patent. Since 2004, tons of software has done this. If a patent takes more than a month to grant, something is wrong, because technology can change radically in six years.

    • Re: (Score:3, Insightful)

      I bet the Roman Empire had a massive amount of patent backlogs too, right before the fall.
  • by Anonymous Coward on Tuesday October 12, 2010 @12:04PM (#33872432)

    I can PERSONALLY prove prior art on this patent.

    I have custom code from a project back in 1997-1998 that used a Chromatic Research MPACT video card to offload either the MPEG-1 or H.263 video encoding process to the card.

    I also have code from the same era that offloads both H.263 and/or MPEG-1 encoding to a video card that is based around a combination of a Trident 9xx series video chip and an 8x8 VCP.

    So, I can PROVE I have WORKING code that does what this patent is for that was written in 1998 or earlier.

    • 'nuff said. I think there's another Anonymous Coward around here who claims he has a working cold fusion reactor in his garage, and another one who invented time travel and could use it, but he just doesn't want to.
      • Re: (Score:2, Interesting)

        by Magnificat (1920274)
        So, I'll do it non-anonymously... This is very easy to verify. Go look up either the 8x8 VCP or the Chromatic Research MPACT! series of parts. Specifically, the MPACT! part was released as a VGA card that implemented, in VLIW software, 3D, DVD decode, and a soft modem (yes, a modem). They additionally had software for doing both MPEG-1 video encoding and H.263 video encoding for POTS teleconferencing. I still have a working PCI card that implemented VGA, soft modem, and H.263 encode to allow teleconferenci
        • by JustNiz (692889)

          It would be cool if you contacted Microsoft and told them you're prepared to license the rights to them. I'd love to see them wriggle, and to see their reply.

  • 3dfx anyone (Score:2, Interesting)

    by goobenet (756437)
    Pretty sure the prior art goes back waaaaaay beyond 2004. 3dfx was out of business by the time this patent was filed. In other news, the fastest counterclaim lawsuit ever has just been filed by any/all video card manufacturers in business before 2004.
  • Listing of prior art (Score:2, Informative)

    by itsybitsy (149808) *

    Crowd-sourced obliteration of this patent.

    Let's list the prior art. If you know of a patent that is prior art please list it here. If you know of a program or computer science paper or article that is prior art please list it here. Provide links if possible. If you review the patent and find a flaw please list it here with your explanation of what the flaw is. If you find any part "obvious" please indicate why. Good hunting.

  • by GreatBunzinni (642500) on Tuesday October 12, 2010 @12:13PM (#33872604)

    This patent is nothing more than a description of how to use a general-purpose processor to perform specific tasks. Adding to that, it describes a way to use computers to handle video. And using GPUs to do this kind of work is fundamentally very old technology, as they are basically glorified vector processors. So how can such an obvious and overreaching patent be granted on such fundamental technology? Is this not an obvious application of this particular technology?

    • If Microsoft can patent the ability to "shut down" a computer, and the ability to have a "user interface with buttons," then why can't they patent the ability to "use a GPU for its stated purpose" ? I'm just wondering how long it will be before Microsoft patents the ability to "use a computer to do computations." Did you know that the concept of "using a hash table to lookup information" has also been patented?

      I really love how patents on software and technology are awarded by lawyers and secretaries who

    • by devent (1627873)

      This patent is nothing more than a description on how to use a general purpose processor to perform specific tasks.

      Welcome to software patents, you must be new here.

  • Prior Art? (Score:3, Interesting)

    by psbrogna (611644) on Tuesday October 12, 2010 @12:19PM (#33872712)
    Weren't "Toaster" boards in Amigas doing video encoding on GPUs in the early 1990's?
  • I'd be pissed (and I don't mean drunk) if I was AMD (née ATI) or Nvidia. Microsoft doesn't even make GPUs, yet now they've patented a way to limit one of the major uses of them and, by extension, control the manufacturers of them. How long before the graphics houses need to license (i.e. pay extortion to) MS in order for their hardware to be fully utilized by end users without fear of a lawsuit? It's just another Microsoft tax by another name.
  • A) If you haven't read the patent, read it.
    B) If you don't understand how to read a patent (and odds are you don't if you've never written one or aren't a lawyer, even if you think you do), recognize that your knee-jerk reaction to it may not be accurate ...

    Slashdot has a long and glorious record of flamefests because 99% of its readers don't understand patents or how to read them and think a well-written fairly-narrow patent is covering some broad obvious area.

    Now I'm sure the flamefest will happen, anyway

  • If you actually read the claims of the patent, it only covers doing motion estimation on the GPU and everything else on the CPU. This seems like a rather obvious part to offload to 2004's marginally-programmable GPUs.
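To make the offloaded step concrete: motion estimation means finding, for each block of the current frame, the best-matching block in the reference frame, typically by minimizing the sum of absolute differences (SAD). The sketch below is a hypothetical pure-Python illustration of an exhaustive full search; the function names and the brute-force strategy are illustrative, not taken from the patent, and a real GPU implementation would evaluate the candidate displacements in parallel rather than in nested loops.

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equal-sized blocks."""
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
                          for a, b in zip(row_a, row_b))

def motion_vector(cur, ref, bx, by, bsize, search):
    """Exhaustively search `ref` around (bx, by) for the block of `cur`
    at (bx, by); returns the (dx, dy) displacement with the lowest SAD.
    Each candidate displacement is independent, which is what makes this
    inner loop attractive to offload to a GPU."""
    h, w = len(cur), len(cur[0])
    cur_block = [row[bx:bx + bsize] for row in cur[by:by + bsize]]
    best, best_cost = (0, 0), float('inf')
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or x + bsize > w or y + bsize > h:
                continue  # candidate block falls outside the frame
            ref_block = [row[x:x + bsize] for row in ref[y:y + bsize]]
            cost = sad(cur_block, ref_block)
            if cost < best_cost:
                best_cost, best = cost, (dx, dy)
    return best
```

For example, if the current frame is the reference frame shifted right by one pixel, the search finds a displacement of (-1, 0) back into the reference.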

  • Graphics cards are used to RENDER video all the time. But the actual encoding of video had always been done on the CPU, at least up until a few years ago. That's because encoding itself is a complex algorithm, and GPUs had not really been used to implement arbitrary algorithms before, only the simplistic, high-volume calculations needed to render 2D/3D scenes onto your monitor. It's only recently, with GPUs becoming popular for highly parallelized, general-purpose programming (GP-GPU), that th
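The "simplistic, high-volume calculations" the comment above describes are per-pixel operations where every output depends only on the same input position. The hypothetical sketch below (illustrative names, not from any real codec) shows two such steps from encoding, the residual and a uniform quantizer; because there are no dependencies between pixels, a GPU can run this over every pixel (texel) of a frame at once.

```python
def quantize(value, step):
    """Uniform quantizer: the lossy step in most video codecs."""
    return round(value / step)

def encode_block(cur, pred, step=4):
    """Per-pixel residual (current minus predicted) followed by
    quantization. Each output sample depends only on the same input
    position, so the whole frame is embarrassingly parallel -- exactly
    the uniform, high-volume arithmetic GPUs were built for."""
    return [[quantize(c - p, step) for c, p in zip(cr, pr)]
            for cr, pr in zip(cur, pred)]
```

With a zero predictor and step 4, a 2x2 block [[8, 12], [4, 0]] quantizes to [[2, 3], [1, 0]].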
  • Someone invents the knife and Microsoft patents cutting meat with it. Society is not well served by patents like this.
  • Look, we already went through the era of patenting obvious, well-known process X "with a computer". Then we went through patenting X "on the internet". We look back on that now, and we say, "Duh, putting it on a computer or on the internet didn't make it novel."

    Is putting it on a GPU any better? No.
