Microsoft Patents GPU-Accelerated Video Encoding 304

An anonymous reader writes "Microsoft has just received a patent that grants the company the rights to GPU-accelerated video encoding, which may be the primary technology that lets today's consumer applications take advantage of the GPU's horsepower. The broad patent covers tasks to perform motion estimation in videos, the use of the GPU's depth buffer, collocating video frames, mapping pixels to texels, frame processing using the GPU, and output of data to the CPU."
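The patent text itself isn't reproduced here, but the core task the summary names - block-based motion estimation - is easy to sketch. Below is a minimal, serial, pure-Python version (all names are illustrative, not taken from the patent); a GPU encoder parallelizes exactly this per-block search across shader units.

```python
def sad(a, b):
    """Sum of absolute differences between two equally sized pixel blocks."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def motion_estimate(ref, cur, block=4, search=2):
    """Exhaustive block-matching motion estimation, written serially.

    For each block of the current frame, find the offset (dy, dx) within
    +/-search pixels of the co-located block in the reference frame that
    minimizes SAD. Each block's search is independent, which is what makes
    the task a natural fit for a GPU.
    """
    h, w = len(cur), len(cur[0])

    def blk(frame, y, x):
        return [row[x:x + block] for row in frame[y:y + block]]

    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            cur_blk = blk(cur, by, bx)
            best, best_sad = (0, 0), None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    s = sad(cur_blk, blk(ref, y, x))
                    if best_sad is None or s < best_sad:
                        best_sad, best = s, (dy, dx)
            vectors[(by, bx)] = best
    return vectors

# Reference frame with distinct pixel values; current frame is the same
# image shifted one pixel to the right (the new left column is arbitrary).
ref = [[r * 8 + c for c in range(8)] for r in range(8)]
cur = [[0] + row[:-1] for row in ref]
# Blocks away from the left edge should report a (0, -1) motion vector.
vecs = motion_estimate(ref, cur)
```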
  • Enforceable? (Score:5, Interesting)

    by airfoobar ( 1853132 ) on Tuesday October 12, 2010 @12:48PM (#33872084)

    The patent was filed in 2004, and there must be loads of prior art. Companies such as Nvidia and ATI have had GPU-accelerated video encoders for years now.

    Regardless, this patent should never have been granted. It's all because of the patent office's massive backlog, and their decision to accept every random patent to reduce it.

  • Re:Extra Extra! (Score:2, Interesting)

    by morgan_greywolf ( 835522 ) on Tuesday October 12, 2010 @12:52PM (#33872168) Homepage Journal

    It may be obvious now, but Microsoft applied for this patent in 2004, well before stuff like OpenCL and CUDA came around.

    Was it obvious then? IMHO, yes: I had thought of this as early as 1996 and was disappointed to learn that GPUs at that time lacked sufficient horsepower and, more importantly, sufficient instruction sets.

  • by eldavojohn ( 898314 ) * <eldavojohn@noSpAM.gmail.com> on Tuesday October 12, 2010 @12:52PM (#33872172) Journal

    But what about programs like Badaboom that already use GPU acceleration in their encoding? Patents confuse me to no end.

The patent application was received in October of 2004, according to the article. So I assume Badaboom would have to precede that, or produce some form of prior art preceding that date, to defend themselves should Microsoft resort to litigation after failing to reach a licensing deal with Badaboom's creators. Regardless, a cursory glance shows that Microsoft could out-lawyer them whether they are right or not, so I believe with 98% confidence that Badaboom is facing some serious liabilities.

  • Re:Holy cripes! (Score:3, Interesting)

    by frinkster ( 149158 ) on Tuesday October 12, 2010 @12:57PM (#33872260)

    Isn't one requirement of a patent for it to be non-obvious?!!!

    One of the strongest justifications for patent protection is when you create something that becomes ridiculously obvious once you create it. This is pretty much the most perfect definition of

    promote the Progress of Science and useful Arts

    that you will ever find.

    It was mentioned in another comment that the patent was applied for in 2004 and that as far as that poster knew nobody else was doing it. So... Were people contemplating this back in 2004? The idea of video cards being used for general purpose computing is not very old. Transcoding user-generated video from one format to another was not very common until YouTube got popular. When was that?

  • by Anonymous Coward on Tuesday October 12, 2010 @01:04PM (#33872432)

    I can PERSONALLY prove prior art on this patent.

    I have custom code from a project back in 1997-1998 for a Chromatic Research MPACT video card that used it to offload either MPEG-1 or H.263 video encoding process to the card.

    I also have code from the same era that offloads both H.263 and/or MPEG-1 encoding to a video card that is based around a combination of a Trident 9xx series video chip and an 8x8 VCP.

    So, I can PROVE I have WORKING code that does what this patent is for that was written in 1998 or earlier.

  • 3dfx anyone (Score:2, Interesting)

    by goobenet ( 756437 ) on Tuesday October 12, 2010 @01:05PM (#33872446)
Pretty sure the prior art goes back waaaaaay beyond 2004. 3dfx was out of business by the time this patent was filed. In other news, the fastest counterclaim lawsuit has been filed by any/all video card manufacturers in business before 2004.
  • Prior Art? (Score:3, Interesting)

    by psbrogna ( 611644 ) on Tuesday October 12, 2010 @01:19PM (#33872712)
    Weren't "Toaster" boards in Amigas doing video encoding on GPUs in the early 1990's?
  • Re:Holy cripes! (Score:3, Interesting)

    by eldepeche ( 854916 ) on Tuesday October 12, 2010 @01:21PM (#33872730)

    How does it promote the progress of science and useful arts to grant a monopoly on using a processor to encode video? Now Microsoft is the only one who can sell a product that uses a GPU to encode video, a situation that doesn't give me a great deal of hope for the progress of anything.

  • Re:Extra Extra! (Score:3, Interesting)

    by Jah-Wren Ryel ( 80510 ) on Tuesday October 12, 2010 @01:31PM (#33872924)

No. You are confusing a current graphics card with one from six years ago. It may be obvious now that we can do fast vector processing on a GPU, but they have come a long way in six years, each generation aiming more at GPGPU.

While I totally disagree with the claim that it wasn't obvious well over a decade ago - TI DSP chips (aka vector processors) were being used on video cards from companies like NeXT in the 1990s - let's ignore that and assume your premise. If the patent really applies only to such crippled GPUs, then it clearly can't apply to GPUs that have since been specifically enhanced for this functionality, which makes the patent moot from the moment it was granted.

  • Re:Extra Extra! (Score:4, Interesting)

    by Cornelius the Great ( 555189 ) on Tuesday October 12, 2010 @01:31PM (#33872928)

    In fact, this is a math coprocessor revisited. Remember those?

Yes, those chips that handled floating-point operations so well that they eventually were integrated directly onto the CPU die itself; i.e., the 80386 CPU + 387 co-processor evolved into a single 486DX with an integrated FPU.

Still, I don't see why you're comparing them to GPUs... FPUs were small in comparison, even compared to early fixed-function rasterizers from the '90s; today's GPUs are multi-billion-transistor chips with hundreds of programmable stream processors (with faster/higher-bandwidth memory) that not only cover all of the rendering pipeline, but can do general-purpose computation as well. While small GPUs are being integrated into future CPUs (AMD Fusion, Intel Sandy Bridge, etc.), I'm doubtful discrete graphics will disappear the way x86 math coprocessors did, at least for the foreseeable future.

  • Re:Extra Extra! (Score:2, Interesting)

    by Anonymous Coward on Tuesday October 12, 2010 @01:36PM (#33873038)

    A patent should be awarded for a clearly described and working implementation of an idea.

    FTFY.

  • Re:Holy cripes! (Score:3, Interesting)

    by wonkavader ( 605434 ) on Tuesday October 12, 2010 @01:37PM (#33873074)

    "Were people contemplating this back in 2004? The idea of video cards being used for general purpose computing is not very old. Transcoding user-generated video from one format to another was not very common until YouTube got popular."

    Encoding isn't transcoding, though they're similar. People have been encoding for a long time and always looking to offload any of the task onto some other processor. You live in a time when some encoding can be done in real time with a general purpose CPU. In the 90's, when I was doing encoding, any way one could scratch a little more speed by getting something else to do the work made a lot of sense.

I suspect that there is prior art on the concept (though perhaps using different algorithms) dating back to the first GPUs. Things were really slow, and people were desperate. If I remember correctly, I wound up buying a PC card with two PowerPC chips on it. That was not cheap. And yes, I know, not prior art. But I strongly suspect that some people in my position tried to do the same thing with their Voodoo cards, which would be prior art.

  • Re:Extra Extra! (Score:4, Interesting)

    by IICV ( 652597 ) on Tuesday October 12, 2010 @01:42PM (#33873158)

    That's why I'm an advocate of an adversarial patent system, something kinda like this:

You (as a private individual or a company) can sign up to be a "patent examiner" for a minor fee. You specify the areas in which you have expertise (and, depending on the area, you or your employees may need certifications specified by professional groups in that field). When a patent is submitted, it is required that the patent clearly and specifically state what problem the covered art solves - e.g., in this case, it would be something like "efficiently encoding video using components found in a commodity computer".

    Then, a few examiners are picked at random. They're given a day or so in isolation, with whatever reference materials they want to bring (no networked devices, though) to figure out how they would solve the problem. At the end of their isolation, they just need to produce a couple of sketches of how they would go about solving the problem.

If a majority of the examiners (who should be experts in the field of the patent) produce solution sketches that are largely similar to the patent, the patent is rejected - because clearly, if experts in the field come up with the to-be-patented invention when they set their minds to solving that problem, then it's not novel; it's just an obvious evolution no one else has gotten around to yet.
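The proposed workflow is concrete enough to simulate. Here's a minimal sketch (all names, the panel size, and the similarity test are invented for illustration): examiners are modeled as functions from a problem statement to a solution sketch, and the claim is rejected when a majority of a randomly drawn panel independently reproduce it.

```python
import random

def adversarial_review(examiners, problem, claimed_solution, is_similar,
                       k=3, rng=random):
    """Pick k examiners at random; each independently sketches a solution
    to the stated problem. If a majority of the sketches are judged largely
    similar to the claimed invention, it is deemed obvious and rejected."""
    panel = rng.sample(examiners, k)
    sketches = [solve(problem) for solve in panel]
    matches = sum(1 for s in sketches if is_similar(s, claimed_solution))
    return "rejected" if matches > k // 2 else "granted"

# Five experts who all converge on the same approach -> the claim is obvious.
experts = [lambda p: "offload the per-block SAD search to the GPU"] * 5
verdict = adversarial_review(
    experts,
    "efficiently encode video using components found in a commodity computer",
    "offload the per-block SAD search to the GPU",
    lambda a, b: a == b,
)
```

The interesting design question is the `is_similar` judgment, which in a real system would itself need adjudication; the sketch just takes it as a parameter.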

  • Re:Extra Extra! (Score:3, Interesting)

    by LingNoi ( 1066278 ) on Tuesday October 12, 2010 @01:52PM (#33873322)

If you truly understood how software patents work, you'd realise that the time the patent was filed means nothing. You can submit a vague patent now and continue to make edits to it. So all you really need to do is patent flying cars, wait for someone to actually build one, then fill in the patent with the competitor's implementation.

Microsoft is guilty of filing bogus patents [slashdot.org] already, so I'm not inclined to readily believe they've come up with anything.

  • by Magnificat ( 1920274 ) on Tuesday October 12, 2010 @01:53PM (#33873356)
So, I'll do it non-anonymously... This is very easy to verify. Go look up either the 8x8 VCP or the Chromatic Research MPACT! series of parts. Specifically, the MPACT! part was released as a VGA card that implemented 3D, DVD decode, and a soft modem (yes, a modem) in VLIW software. They additionally had software for doing both MPEG-1 video encoding and H.263 video encoding for POTS teleconferencing. I still have a working PCI card that implemented VGA, a soft modem, and H.263 encode to allow teleconferencing over a standard phone line using a Windows 95/98 PC.
  • by Magnificat ( 1920274 ) on Tuesday October 12, 2010 @02:24PM (#33873914)
As I said, that is why I said the 8x8 VCP part is a bit iffy in terms of how it relates to this patent. However, the Chromatic MPACT! series of parts was a dedicated GPU capable of general-purpose VLIW processing, VGA, and 3D. In essence, it *was* a dedicated VGA/3D accelerator that could also be reprogrammed to do video decode/encode and general-purpose DSP functions. Basically, it was a lot like CUDA, only 10 years before the market was ready for it. The other thing that is MOST IMPORTANT about this is that Chromatic Research was acquired by **ATI** in the 1999 time frame. This means that as part of this acquisition, ATI acquired any rights to prior art on this patent!!!
  • Re:Extra Extra! (Score:3, Interesting)

    by dgatwood ( 11270 ) on Tuesday October 12, 2010 @02:26PM (#33873938) Homepage Journal

    All of you lack the experience or the memory to say how obvious it would be ON THE HARDWARE AVAILABLE THEN.

    Video decompression had been done with GPU assist (including decoding the motion compensation information) for at least a year [tomsguide.com] by NVIDIA hardware before this patent was filed, and maybe earlier. Are you saying that reversing the process---moving from decoding to encoding---is so fundamentally different that it isn't an obvious thing to do? That seems like a prima facie absurd argument that demands extraordinary proof.

    What, precisely, about this patent was novel, unobvious, and useful in 2004? Because I just don't see it.

  • Re:Extra Extra! (Score:3, Interesting)

    by beelsebob ( 529313 ) on Tuesday October 12, 2010 @02:31PM (#33874054)

No, no it's not obvious at all – what they've patented is the methods of doing it, not the act of doing it, and those are complex, difficult algorithms to come up with – using specialised hardware to do a different specialised task. There are other ways of doing video encoding on the GPU. The headline is massively inflammatory; they haven't patented video encoding on the GPU at all – merely one way of doing it.

  • by BatGnat ( 1568391 ) on Tuesday October 12, 2010 @02:39PM (#33874158)
It specifically says GPU, so don't sell them as GPUs anymore. Let's call them MFPUs, multi-function processing units (or some other use of the letters MF).
  • Re:Extra Extra! (Score:5, Interesting)

    by onionman ( 975962 ) on Tuesday October 12, 2010 @02:40PM (#33874182)

No, but providing someone with a unit to calculate (for example) a transformation doesn't mean that you give them an efficient way of computing FFTs. Of course, if you give them a general-purpose matrix-vector multiplier, then it does. In 2004 a graphics card used a fixed-function unit; today it uses a general-purpose one. Assumptions about how obvious it is that other applications can be performed don't carry back to previous generations of the hardware.

    I do believe that you are arguing in good faith and that you are not a troll, and I understand your argument. Here are the things which I believe were obvious (to general practitioners in the field) in the 1990s:

    1. Video display and video (de)compression are essentially large linear algebra problems which lend themselves to vector processing.

    2. Video display hardware will continue to improve.

    3. Eventually video display hardware will become sufficiently powerful to perform video (de)compression.

    The reason I believe that those were all obvious to practitioners in the field in the 1990s is because they were obvious to me in the 1990s and I wasn't even a specialist in the field. (I was, however, working on large number-crunchy stuff). I would suspect that the true experts in those fields were well aware of all three of the above points even in the 1980s.
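Point 1 in the list above - that video (de)compression is at bottom linear algebra - can be seen in the discrete cosine transform used by MPEG-style codecs: transforming a block of pixels is just a matrix-vector product, the bread and butter of vector hardware. A minimal pure-Python sketch (1-D, size 4 for brevity; real codecs use 2-D blocks):

```python
import math

def dct_matrix(n):
    """Orthonormal DCT-II basis as an n x n matrix. Applying the transform
    is then a plain matrix-vector product - exactly the bulk operation
    that vector processors (and later GPUs) are built to do."""
    m = []
    for k in range(n):
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        m.append([scale * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                  for i in range(n)])
    return m

def matvec(m, v):
    """Matrix-vector product over plain Python lists."""
    return [sum(a * b for a, b in zip(row, v)) for row in m]

signal = [4.0, 4.0, 4.0, 4.0]           # a flat run of pixels
coeffs = matvec(dct_matrix(4), signal)  # energy collapses into the DC term
```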

    I'll have to end my participation in this thread, now, and get back to real work...

  • Re:Extra Extra! (Score:1, Interesting)

    by Anonymous Coward on Tuesday October 12, 2010 @04:05PM (#33875390)

    Didn't Microsoft acquire many of the SGI graphics patents at one time?
