Microsoft Patents GPU-Accelerated Video Encoding
An anonymous reader writes "Microsoft has just received a patent that grants the company the rights to GPU-accelerated video encoding, which may be the primary technology for putting the GPU's horsepower to work in today's consumer applications. The broad patent covers performing motion estimation in videos, using the GPU's depth buffer, collocating video frames, mapping pixels to texels, processing frames on the GPU, and outputting data to the CPU."
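For readers who haven't met the technique: block-matching motion estimation is the core workload the patent describes offloading. Below is a minimal CPU-side sketch of that computation (a generic illustration, not the patent's claimed steps); numpy stands in for the GPU's data-parallel arithmetic, and the function name and parameters are invented for the example:

    import numpy as np

    def best_motion_vector(ref, cur, bx, by, block=16, search=8):
        # Exhaustive block matching: find the offset into the reference
        # frame that minimizes the sum of absolute differences (SAD)
        # for one block of the current frame.
        target = cur[by:by+block, bx:bx+block].astype(np.int32)
        best, best_sad = (0, 0), None
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                y, x = by + dy, bx + dx
                if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                    continue  # candidate window falls outside the frame
                cand = ref[y:y+block, x:x+block].astype(np.int32)
                sad = int(np.abs(target - cand).sum())
                if best_sad is None or sad < best_sad:
                    best_sad, best = sad, (dx, dy)
        return best, best_sad

Every candidate offset and every block is independent of the others, which is why the work maps so naturally onto a GPU: frames become textures (the pixel-to-texel mapping in the claims) and each SAD evaluation becomes a parallel fragment computation.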
Extra Extra! (Score:4, Insightful)
If this doesn't qualify as 'obvious' then we are all doomed.
Re: (Score:2, Interesting)
It may be obvious now, but Microsoft applied for this patent in 2004, well before stuff like OpenCL and CUDA came around.
Was it obvious then? IMHO, yes: I had thought of this as early as 1996 and was disappointed to learn that GPUs at that time lacked sufficient horsepower and, more importantly, sufficient instruction sets.
Re:Extra Extra! (Score:5, Insightful)
It seems that the idea was apparent long before the patent came about. I think the underlying reason we hadn't seen it sooner is that the tradeoff didn't yet pay off: the GPU must beat the CPU by a sufficiently wide price and performance margin on the workload before anybody bothers writing specialized code for it.
In fact, this is a math coprocessor revisited. Remember those?
Re:Extra Extra! (Score:5, Informative)
We saw this at SIGGRAPH for YEARS before 2000.
I think that Be Computer had colliding patents - not to mention SGI and nVidia.
Re:Extra Extra! (Score:4, Interesting)
In fact, this is a math coprocessor revisited. Remember those?
Yes, those chips that handled floating-point operations so well that they were eventually integrated directly onto the CPU die itself; i.e., the 80386 CPU + 80387 co-processor pair evolved into a single 486DX with an integrated FPU.
Still, I don't see why you're comparing them to GPUs... FPUs were small in comparison, even next to early fixed-function rasterizers from the 90s; today's GPUs are multi-billion-transistor chips with hundreds of programmable stream processors (with faster, higher-bandwidth memory) that not only cover the whole rendering pipeline but can do general-purpose computation as well. While small GPUs are getting integrated into upcoming CPUs (AMD Fusion, Intel Sandy Bridge, etc.), I'm doubtful discrete graphics will disappear the way x86 math co-processors did, at least for the foreseeable future.
ten or (Score:2)
So, you mean that if I use 11 cylinders instead of 10 or 12, I could get a patent on a new car trunk, just because of that? Or could I get a patent on 3-CPU word processing, or 4-CPU word processing, or double butt *%&#+/@, whatever...
Re: (Score:3, Insightful)
Because while an early FPU may be drastically simpler than a modern GPU, that *is* what a GPU is... It's a big floating-point maths unit – one that's particularly good at matrix/vector floating-point maths.
Re: (Score:2, Informative)
In 1990 I worked on a project using TMS320C30 floating-point processors to simulate radar imagery under the control of a 68040 processor. They weren't called GPUs specifically - and I'm sure that in lawyer speak the distinction protects MicroQuack's patent trolling. But in non-lawyer speak, it surely amounted to video processing using dedicated processors, and the only difference is the packaging - independent chips on a VME board.
Re: (Score:2)
The patent is basically "use the fastest chip in the computer to do the work".
How is that non-obvious?
Re: (Score:2)
So make an add-in card with a CPU socket and RAM slots for use in this processing?
Hmm *looks up patent records for CPU RAM Add in cards for processing*
To be honest, I thought this was the next step in high-end computer setups. At least with a card with a socket and RAM slots, you could change the CPU and RAM to better suit your needs. I don't know too many people who change the GPU or memory on graphics cards. There is a trade-off between having the socket and having the CPU and memory fixed. Is it worth it?
Re: (Score:3, Informative)
Either there's something a bit more nuanced about the patent....
Um, that's the case with ALL the patents Slashdot has covered. Everybody cries 'obvious' after oversimplifying the task; nobody cites elements of the patent and says 'prior art!' I haven't seen the patent yet, but I have a dollar that says they attempted to patent using a particular instruction set in a way it wasn't intended, to pull off a really slick result. That's usually what happens, but it takes time to go look that up, and we all want the word 'Insightful' to appear by our posts.
Re: (Score:3, Informative)
Yes, we do. For instance, I did exactly that with a patent that covered gaming techniques netrek had included years before.
One problem is that one need merely make one minor variation for a patent not to be covered by prior art. You'd think that the patent would then be totally hemmed in by the prior art and cover only the variation itself.
Re: (Score:3, Insightful)
That's because they are almost all of the type 'use a [general-purpose device] to do [specific thing].'
No, they're not. That's what the eyeball-grabbing summary says. The actual patents say "use these 10 steps to achieve this one result". For example: Slashdot ran a story that Microsoft patented page-turning on an e-book reader. There was a big dog-pile as people leapt in to claim prior art, citing a million Flash apps that allowed page turning. Nobody bothered to read the bit that described, in detail, how it was a gesture input that included things like how many pages to turn and what multiple fingers were doing.
Re:Extra Extra! (Score:5, Informative)
> It may be obvious now, but Microsoft applied for this patent in 2004, well before stuff like OpenCL and CUDA came around.
Never mind OpenCL. How about things like PureVideo, VDPAU and XvMC?
Accelerating video with the video card is about as obvious as it gets.
This sounds like YOU patenting something: visualize something that's obvious but hasn't quite arrived due to hardware limitations, then rush off to the patent office.
The only difference between you and Microsoft is that Microsoft has lawyers on retainer.
Re: (Score:2, Insightful)
Why would it be obvious that hardware designed to accelerate 3d rendering - transformation, lighting and rasterisation - can accelerate the compression of video frames?
It seems that you are 'obviously' wrong.
Re:Extra Extra! (Score:5, Insightful)
Why would it be obvious that hardware designed to accelerate 3d rendering - transformation, lighting and rasterisation - can accelerate the compression of video frames?
It seems that you are 'obviously' wrong.
It seems incredibly obvious to me. Of course, I've worked on FFT code for Cray vector units, which were around long before 2004. If you can't see the relationship between vector processing, FFTs, and any form of video compression/display, then perhaps you shouldn't be in charge of determining what is "obvious" regarding this particular patent.
I have long felt that our patent system is ridiculous because it allows such silly patents. If something is obvious to an expert in the field, then it shouldn't be patentable.
Re: (Score:2)
No. You are confusing a current graphics card with one of six years ago. It may be obvious now that we can do fast vector processing on a GPU, but they have come a long way in six years, each generation aiming more at GPGPU.
In 2004 the 6600 was just making it onto reviewers' desks. CUDA was a gleam in some engineer's eye, and most graphics cards still used fixed-function pipelines to perform rendering. Even the latest generation that allowed programmable pixel shaders did not automatically accelerate any vectorised code.
Re: (Score:3, Interesting)
No. You are confusing a current graphics card with one of six years ago. It may be obvious now that we can do fast vector processing on a GPU, but they have come a long way in six years, each generation aiming more at GPGPU.
While I totally disagree that it wasn't obvious well over a decade ago - TI DSP chips (aka vector processors) were being used on video cards from companies like NeXT in the 1990s - let's ignore that and assume your premise. If the patent really applies to such crippled GPUs, then it clearly can't apply to GPUs that have been specifically enhanced for this functionality, which makes the patent moot as granted.
Re: (Score:2)
The words "graphics card" are a complete red herring.
It's obvious that you should use the fastest bit of the computer for heavy processing jobs, whatever that part happens to be...
Re: (Score:3, Informative)
So again I'll ask you: why is it obvious that units designed to do transformation, lighting and rasterisation (not general vector operations) should be good at encoding video?
Because (linear) transformations, lighting and rasterisation are just large matrix-vector operations... as is the FFT, which is central to every video compression algorithm. Just because the words are different doesn't mean the math is different.
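To make that concrete: the transform stage of an MPEG-family codec (the poster says FFT; those codecs actually use its close cousin, the DCT) really is just a matrix-vector product, structurally the same primitive as a vertex transform. A minimal sketch, assuming nothing beyond numpy:

    import numpy as np

    # An 8-point DCT-II as a plain matrix-vector product: structurally
    # the same operation as transforming a vertex by a matrix.
    N = 8
    k, n = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    D = np.cos(np.pi * (2 * n + 1) * k / (2 * N))   # DCT-II basis matrix
    D[0] *= np.sqrt(1 / N)                          # orthonormal scaling
    D[1:] *= np.sqrt(2 / N)

    x = np.arange(N, dtype=float)                   # one row of pixels
    coeffs = D @ x                                  # "transform" = matrix * vector

Hardware that streams matrix-vector multiplies for vertices can stream these just as happily; the difference is bookkeeping, not math.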
Re: (Score:3, Insightful)
No, but providing someone with a unit to calculate (for example) a transformation doesn't mean that you give them an efficient way of computing FFTs. Of course if you give them a general purpose matrix-vector multiplier then it does. In 2004 a graphics card used a fixed function unit, and today it uses a general purpose one. Assumptions about how obvious it is that other applications can be performed don't carry back to previous generations of the hardware.
Re:Extra Extra! (Score:5, Interesting)
No, but providing someone with a unit to calculate (for example) a transformation doesn't mean that you give them an efficient way of computing FFTs. Of course if you give them a general purpose matrix-vector multiplier then it does. In 2004 a graphics card used a fixed function unit, and today it uses a general purpose one. Assumptions about how obvious it is that other applications can be performed don't carry back to previous generations of the hardware.
I do believe that you are arguing in good faith and that you are not a troll, and I understand your argument. Here are the things which I believe were obvious (to general practitioners in the field) in the 1990s:
1. Video display and video (de)compression are essentially large linear algebra problems which lend themselves to vector processing.
2. Video display hardware will continue to improve.
3. Eventually video display hardware will become sufficiently powerful to perform video (de)compression.
The reason I believe that those were all obvious to practitioners in the field in the 1990s is because they were obvious to me in the 1990s and I wasn't even a specialist in the field. (I was, however, working on large number-crunchy stuff). I would suspect that the true experts in those fields were well aware of all three of the above points even in the 1980s.
I'll have to end my participation in this thread, now, and get back to real work...
Re: (Score:3, Interesting)
If you truly understood how software patents work, you'd realise that the time the patent was filed means nothing. You can submit a vague patent now and continue to make edits to it. So all you really need to do is patent flying cars, wait for someone to actually build one, then fill in the patent with the competitor's implementation.
Microsoft is guilty of filing bogus patents [slashdot.org] already, so I'm not likely to readily believe they've come up with anything.
Re: (Score:2)
If something is obvious to an expert in the field, then it shouldn't be patentable.
A patent is awarded for a clearly described and working implementation of an idea. It isn't enough to say - in a vague sort of way - that the idea alone seems obvious enough in retrospect.
Re: (Score:2, Interesting)
A patent should be awarded for a clearly described and working implementation of an idea.
FTFY.
Re:Extra Extra! (Score:4, Insightful)
Except that conceptually, it's a trivial extension of texture compression, which video cards have supported natively since at least the late 1990s. The only reason we weren't doing video compression is that the video cards weren't fast enough and/or were too power-hungry to offer an advantage over CPUs. The patent office should not be awarding patents for discoveries, and that's all GPU-based video decompression really is: discovering that suddenly GPUs are faster than CPUs, and that things that were impractical (but widely discussed) years before are now practical.
I've only skimmed the patent, so it's possible that the summary sucks and there's something novel and unobvious here, but at a glance this patent really does look like an explanation of a straightforward mapping of video compression onto a GPU, which, with the possible exception of the motion estimation, would have been obvious a decade ago. For that matter, if you had asked somebody "how would you do motion estimation on a GPU?" a decade ago, they probably would have come up with a similar solution.
Then again, it's a software patent, and the design process for nearly all software is obvious to someone with suitable skills in the field, which is why these patents are almost universally crap anyway.
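For what it's worth, the "similar solution" most people sketch looks like this: keep both frames in texture memory and, for each candidate offset, compute the SADs of every block in one data-parallel pass. A hedged sketch with numpy standing in for a fragment program (the name and the wrap-around edge handling are illustrative, not from the patent):

    import numpy as np

    def sad_map(ref, cur, dx, dy, block=16):
        # One "shader pass": SAD of every block of cur against ref
        # translated by (dx, dy). Each output element is independent,
        # exactly the kind of work a fragment program does in parallel.
        # (np.roll wraps at the edges; a real shader would clamp.)
        h, w = cur.shape
        shifted = np.roll(ref, (-dy, -dx), axis=(0, 1)).astype(np.int32)
        diff = np.abs(cur.astype(np.int32) - shifted)
        by, bx = h // block, w // block
        # per-block reduction: group pixels into (by, block, bx, block) tiles
        return diff[:by*block, :bx*block].reshape(by, block, bx, block).sum(axis=(1, 3))

Running this once per candidate offset and taking the argmin per block is the whole search; a GPU version just replaces the numpy calls with render passes.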
Re:Extra Extra! (Score:4, Interesting)
That's why I'm an advocate of an adversarial patent system, something kinda like this:
You (as a private individual or a company) can sign up to be a "patent examiner" for a minor fee. You specify the areas in which you have expertise (and, depending on the area, you (or your employees) may need certain certifications as specified by professional groups in that field). When a patent is submitted, it is required that the patent clearly and specifically state what problem the covered art solves - e.g., in this case, it would be something like "efficiently encoding video using components found in a commodity computer".
Then, a few examiners are picked at random. They're given a day or so in isolation, with whatever reference materials they want to bring (no networked devices, though) to figure out how they would solve the problem. At the end of their isolation, they just need to produce a couple of sketches of how they would go about solving the problem.
If a majority of the examiners (who should be experts in the field of the patent) produce solution sketches that are largely similar to the patent, the patent is rejected - because clearly, if experts in the field set their minds to solving that problem and come up with the to-be-patented invention, then it's not novel; it's just an obvious evolution no one else has gotten around to doing yet.
Re: (Score:2)
Modify your process so that if *any* of the examiners come up with the same solution, it's obvious, and I'd be willing to give that a shot.
Oh, and throw out all the rest of the "obvious" patents while you're at it.
As an aside, what does Microsoft think an MPEG decoder card does? Just curious...
Re: (Score:3, Funny)
Just cross-connect the wires and run it backwards.
Re: (Score:2)
[me] If something is obvious to an expert in the field, then it shouldn't be patentable.
[westlake] A patent is awarded for a clearly described and working implementation of an idea. It isn't enough to say - in a vague sort of way - that the idea alone seems obvious enough in retrospect.
[prgrmr] Under the current patent law in the US, the obviousness test hinges upon "a person having ordinary skill in the art" [wikipedia.org]; so if it takes an expert to realize how to take an idea and make it real (i.e., the transformative part), then, by definition in the law, it's NOT obvious.
I was asserting what should be, not what actually is :-)
Re:Extra Extra! (Score:5, Insightful)
You're being sarcastic, right? Why would a glorified vector engine be useful for doing video compression, which is basically lots and lots of vector math? It's so obvious that anybody with even basic knowledge of video compression would immediately understand how the two problem spaces map onto one another with no instruction whatsoever.
It's so obvious that ATI released software to do it within a year of when that patent was first filed, which means they were working on it at least a year before that, which means that multiple people independently came up with the idea at the same time, which means it is obvious.
Heck, other companies had already been doing this, and even held patents on it [espacenet.com] five years earlier. Okay, so texture compression and video compression aren't quite the same thing. One deals with a single image, one deals with compressing a series of images.... Yeah, that's not obvious to anyone who has never seen someone make a flipbook during class in elementary school.
Flipbooks are like motion JPEG (Score:2)
Why would a glorified vector engine be useful for doing video compression, which is basically lots and lots of vector math?
Because in October 2004, mainstream video cards weren't necessarily "a glorified vector engine". Many were still very much fixed-function.
One deals with a single image, one deals with compressing a series of images.... Yeah, that's not obvious to anyone who has never seen someone make a flipbook during class in elementary school.
Flipbooks are like motion JPEG: every frame is a keyframe, and no motion compensation is used. The claims in the present patent relate specifically to the use of motion estimation data.
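The distinction in miniature, as a toy sketch (residual energy stands in for the bits a real entropy coder would spend; both functions are illustrative, not from any codec):

    import numpy as np

    def intra_cost(cur):
        # "Flipbook" / motion-JPEG coding: every frame is a keyframe,
        # so the whole frame's energy must be coded from scratch.
        return int(np.abs(cur.astype(np.int32)).sum())

    def inter_cost(cur, prev, mv):
        # Motion-compensated coding: only the residual left after
        # predicting from the previous frame, shifted by the motion
        # vector, needs to be coded. (np.roll wraps at the edges.)
        dx, dy = mv
        pred = np.roll(prev, (-dy, -dx), axis=(0, 1)).astype(np.int32)
        return int(np.abs(cur.astype(np.int32) - pred).sum())

Finding the mv that minimizes inter_cost is exactly the motion estimation the claims talk about; a flipbook never gets to subtract the prediction.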
Re: (Score:3, Interesting)
Video decompression had been done with GPU assist (including decoding the motion compensation information) for at least a year [tomsguide.com] by NVIDIA hardware before this patent was filed, and maybe earlier. Are you saying that reversing the process (moving from decoding to encoding) is so fundamentally different that it isn't an obvious thing to do? That seems like a prima facie absurd argument that demands explanation.
Re: (Score:2)
The fast forms of all of these use convolution-based formulae. These are best done using vector processing.
If Nvidia and SGI do not have prior art here, I can think of someone who does: IBM with AltiVec, and IBM/Sony with Cell.
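The point in code, as a minimal sketch assuming only numpy: convolution via the FFT collapses into a handful of bulk array operations, which is precisely what vector units (AltiVec, Cell's SPEs, GPU ALUs) are built to chew through:

    import numpy as np

    def fft_convolve(signal, kernel):
        # Circular convolution by the convolution theorem: two forward
        # FFTs, one elementwise multiply, one inverse FFT. Every step
        # is a bulk, embarrassingly vectorizable array operation.
        n = len(signal)
        S = np.fft.fft(signal, n)
        K = np.fft.fft(kernel, n)   # kernel zero-padded to length n
        return np.real(np.fft.ifft(S * K))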
Re: (Score:2)
But is it still the same kind of hardware?
With graphics pipelines that can run increasingly complex programs, the GPUs of today are not quite the specialized devices of several years ago. It seems to me that "running task X on the GPU" is an overly broad and poorly specified claim that should be rejected for lack of precision.
Re: (Score:2)
I wonder how the hell they actually got this patent through when the GPU has been used to speed up both decoding AND encoding for at least 10 years now. There's LOADS of prior art to prove that.
Can you cite three examples from before October 2004, three days prior to the filing date on this patent?
Re: (Score:3, Interesting)
No, no it's not obvious at all – what they've patented is the method of doing it, not the act of doing it, and those are complex, difficult algorithms to come up with – using specialised hardware to do a different specialised task. There are other ways of doing video encoding on the GPU. The headline is massively inflammatory; they haven't patented video encoding on the GPU at all – merely one way of doing it.
Patenting the mere use of a product? (Score:5, Insightful)
Are they serious? This is virtually the same thing as someone inventing a car and me winning the patent on "driving cars."
Nothing to see here, move along. (Score:5, Insightful)
Re: (Score:2)
I'm sort of afraid of the mayhem that will occur a bit before the current system falls apart
Re:Nothing to see here, move along. (Score:5, Insightful)
I'm sort of afraid of the mayhem that will occur a bit before the current system falls apart
Non-indemnified hardware running rampant! Patents running naked in the streets!
MS making their own thicket? (Score:5, Insightful)
This patent mightn't change much, but it's the weight of the hundreds of patents that's spoiling the AV field.
Microsoft is a member of MPEG-LA, but they pay more royalties than they make from the organisation, so they're probably eager to make their own AV thicket.
* http://en.swpat.org/wiki/Microsoft [swpat.org]
* http://en.swpat.org/wiki/MPEG_LA [swpat.org]
* http://en.swpat.org/wiki/Audio-video_patents [swpat.org]
Badaboom? (Score:2, Informative)
Older than October 2004? (Score:3, Interesting)
But what about programs like Badaboom that already use GPU acceleration in their encoding? Patents confuse me to no end.
The patent application was received in October of 2004, according to the article. So I assume Badaboom would have to precede that, or produce some form of prior art preceding that date, to defend themselves should Microsoft resort to litigation after failing to agree on a licensing deal with Badaboom's creators. Regardless, a cursory glance proves that Microsoft could out-lawyer them whether they are right or not, so I believe with 98% confidence that Badaboom is facing some serious liabilities.
Re: (Score:3, Insightful)
The annoying
Re: (Score:2)
If the patent process is working as intended, which limits my comment to an imaginary world, the only explanation I can think of is that Microsoft applied for this patent before the prior art (Badaboom, MediaCoder, et al.) was developed.
In the real world, however, the patent office probably just dropped the ball on prior art.
Re:Badaboom? (Score:5, Informative)
Re: (Score:2)
I was going to go through the claims in detail, but you said it very succinctly. Slashdot never lets things like "facts" get in the way of bashing a patent even though the patent is actually pretty narrow in scope and does not cover modern GPU encoding techniques.
Re:Badaboom? (Score:5, Insightful)
What is confusing? Microsoft does something. Microsoft applies for a patent on that thing. A patent lawyer who knows very little about the tech in question, has about 600 applications he's currently supposed to be processing, has been instructed by his boss that he can't work overtime this week but also that he is too far behind on his portfolio and needs to catch up, and who doesn't make nearly as much as his buddies from law school do to begin with, looks at it. He thinks "I don't even know what half of these words *mean*", then notices that Microsoft filed the patent. Through his haze of pain and frustration he dimly remembers that Microsoft is an "innovative and economy-driving company" and says "fuck it." He hits the "Approve" button.
His boss is happy because his numbers are better this week, and there is no real penalty for approving patents that later get overturned. Even assuming that Microsoft ever attempts to defend the patent rather than just threatening small companies with it in hopes that they'll cave without a court battle.
The things currently wrong with the patent system which this story demonstrates:
1) Patent attorneys often don't understand the tech they are expected to review. This is less of a problem with "real" patents, since the device being patented is just that: a device. If it does what it says it does, in the way it says it does, understanding why isn't all that important. Software is essentially algorithms. If you don't understand them, then judging their uniqueness is difficult.
2) The reviewers in the patent office are phenomenally overworked right now. There are literally tens of thousands of applications backed up. I saw some patent official at the end of the Bush administration say that if all applications stopped, right then, he could maybe catch up in a year or two. I don't imagine it's gotten better. Both Bush and Obama have authorized more reviewers, but it seems to be like emptying the ocean with a teaspoon.
3) Patent reviewers make a fraction of what patent attorneys in private practice make. This means that they're always looking to get out and get into private firms. Probably not all of them, but like any rational human, most want to make more money and get more respect.
4) There is no real penalty for screwing up. Most patents never get defended in court, because the companies that own them mainly use them as bargaining chips, or to threaten smaller, defenseless companies. Even if a patent does go to court, it'll take years to invalidate, and no repercussions fall on the approver.
Eliminating software patents would, in one stroke, alleviate or eliminate two of these four problems. Probably the most serious two. It'd be awfully nice if it happened. The alternative is probably the whole system collapsing under its own weight eventually.
Re: (Score:2)
This is interesting. I was under the impression that they were mostly patent lawyers and were supposed to call in expert help if they needed it. Thanks for clarifying that for me.
Re: (Score:2)
Leeloo: ...jella boom!
Korben Dallas: Boom. Yeah! I understand boom.
Leeloo: Bada boom.
Korben Dallas: Big... yeah, big bada boom.
Leeloo: Big! Bada big boom! Big! BOOM!
Korben Dallas: Yeah! Big bada boom!
Leeloo: Bada boom!
Korben Dallas: Yeah-hahaha! Big boom! Big bada boom!
2004 (Score:4, Informative)
As far as I know, no one was doing this in 2004.
Re:2004 (Score:5, Informative)
Still not enough information. Patent claims can change between the original filing and the version that gets granted, by amending the patent application. It's done by trolls... get a patent application going and keep it in limbo. Someone else comes to market with something cool, and you add it to your patent application. If it gets approved, voila! You have a patent that is "back-dated" years before the invention hit the market.
I'd be curious to see exactly what the patent said in 2004 and compare that to the then state-of-the-art.
Re: (Score:3, Insightful)
All I have to say is... a GPU is like a CPU. We're gonna get patents for everything twice now, just like with handheld devices?
Re: (Score:3, Insightful)
All I have to say is... a GPU is like a CPU. We're gonna get patents for everything twice now, just like with handheld devices?
Just wait until they start putting GPUs in handheld devices and we're going to see a *hat trick for each patent.
* I would have used 'three-peat' or '3-peat' instead of 'hat trick' but I believe those are trademarked.
Re: (Score:3, Funny)
I was using a GPU to encode video (to S-Video format) long before that.
Re: (Score:2)
Video games have been doing this for as long as graphics accelerator cards have been in existence. Using tools to specify the exact parameters of what is being rendered and saving the output to a file instead of simply displaying it on the screen does, in my opinion, constitute "bleeding obvious". Additionally, while 3DStudio Max, Maya, or After Effects might not have used GPU acceleration prior to 2004, the Amiga had all kinds of video-oriented add-in cards. Hell, Matrox RT-2000 units did real-time 3D effects on video.
Re: (Score:2)
NVidia's PureVideo product was in development in 2003. See the GeForce 6 series page [wikipedia.org] for some information. I don't know if it was doing encoding on the GPU, but it was definitely doing post-processing, and I believe video decoding, on the GPU. This line was released in mid-2004, and features of PureVideo were released throughout that year, so it must have been in development since sometime in 2003.
Re: (Score:3, Insightful)
Running code on a coprocessor is not novel in any way. It's what graphics coprocessors are for. This is like patenting a lemonade stand because it is located on the corner rather than in the middle of the block.
Re: (Score:2)
Google has no references for "GPU accelerated video encoding" before 2005 [google.com]
I guess the companies that were working on these types of things didn't have products available before 2005. That certainly doesn't exclude graphics card vendors from working on them in their labs - or discussing them at conferences - and graphics card vendors are exactly the group of people this type of thing would be 'obvious' to.
Re: (Score:2)
Alter the search terms [google.com]. No one used the term "GPU" back then.
Re: (Score:3, Informative)
Try again with "GPU motion compensation" [google.com]. Plenty.
Smaller Companies (Score:2, Insightful)
This will make it almost impossible for smaller companies to make fast video encoding applications. They will have to start paying royalties if they want to encode video using the GPU in applications such as FRAPS or any video converter. Their products will either have to become more expensive or remain inferior to products made by larger companies.
Re: (Score:2)
Hold on a mo', bro'.
How can a patent that covers offloading the motion compensation part of encoding video to the GPU NOT apply to them?
Re: (Score:2)
This will make it almost impossible for smaller companies to make fast video encoding applications. They will have to start paying royalties if they want to encode video using the GPU in applications such as FRAPS or any video converter. Their products will either have to become more expensive or remain inferior to products made by larger companies.
I think that's the whole purpose of filing patents. The greater purpose of patents is to provide an incentive to innovate, but for the patent filer, it's to gain a temporary monopoly on an idea, which gives a competitive advantage.
Enforceable? (Score:5, Interesting)
The patent was filed in 2004, and there must be loads of prior art. Companies such as Nvidia and ATI have had GPU-accelerated video encoders for years now.
Regardless, this patent should never have been granted. It's all because of the patent office's massive backlog, and their decision to accept every random patent to reduce it.
Re: (Score:2)
Which begs the question of why it took 6 years to grant the patent. Since 2004, tons of software has done this. If a patent takes more than a month to grant, something is wrong, because technology can change radically in 6 years.
Re: (Score:2)
I'm 100% sure that Nvidia already has GPU-accelerated video encoding products on the market, and I know they've been around for years (but I don't know how long exactly). A quick search reveals that the GeForce 6 may have had GPU video encoding capabilities (released April 2004)...
Whatever the case, it's really shitty for Nvidia: they already have products, and Microsoft will come along and ask them for money, even though Microsoft has nothing of the sort except a piece of paper that says they invented it.
Prior art is EASILY provable -- this patent is BS. (Score:3, Interesting)
I can PERSONALLY prove prior art on this patent.
I have custom code from a project back in 1997-1998 for a Chromatic Research MPACT video card that used it to offload either the MPEG-1 or the H.263 video encoding process to the card.
I also have code from the same era that offloads both H.263 and/or MPEG-1 encoding to a video card based around a combination of a Trident 9xx-series video chip and an 8x8 VCP.
So, I can PROVE I have WORKING code, written in 1998 or earlier, that does what this patent covers.
... says the Anonymous Coward (Score:2)
Re: (Score:2)
It would be cool if you contacted Microsoft and told them you're prepared to license the rights to them. I'd love to see them wriggle, and to see their reply.
3dfx anyone (Score:2, Interesting)
Listing of prior art (Score:2, Informative)
Crowd-sourced obliteration of this patent.
Let's list the prior art. If you know of a patent that is prior art, please list it here. If you know of a program, computer science paper, or article that is prior art, please list it here. Provide links if possible. If you review the patent and find a flaw, please list it here with your explanation of what the flaw is. If you find any part "obvious", please indicate why. Good hunting.
How can this patent be received by anyone? (Score:3, Insightful)
This patent is nothing more than a description of how to use a general-purpose processor to perform specific tasks. Beyond that, it describes a way to use computers to handle video. And using GPUs to do this work is fundamentally very old technology, as they are basically glorified vector processors. So how can such an obvious and overreaching patent pass, given how fundamental the technology is? Is this not an obvious application of this particular technology?
Re: (Score:2)
If Microsoft can patent the ability to "shut down" a computer, and the ability to have a "user interface with buttons," then why can't they patent the ability to "use a GPU for its stated purpose"? I'm just wondering how long it will be before Microsoft patents the ability to "use a computer to do computations." Did you know that the concept of "using a hash table to look up information" has also been patented?
I really love how patents on software and technology are awarded by lawyers and secretaries who have no technical background.
Re: (Score:2)
This patent is nothing more than a description of how to use a general-purpose processor to perform specific tasks.
Welcome to software patents, you must be new here.
Prior Art? (Score:3, Interesting)
I'd Be Pissed (Score:2)
At the risk of being a broken record ... (Score:2)
A) If you haven't read the patent, read it. ...
B) If you don't understand how to read a patent (and odds are you don't if you've never written one or aren't a lawyer, even if you think you do), recognize that your knee-jerk reaction to it may not be accurate.
Slashdot has a long and glorious record of flamefests because 99% of its readers don't understand patents or how to read them and think a well-written fairly-narrow patent is covering some broad obvious area.
Now I'm sure the flamefest will happen anyway.
Only covers motion estimation (Score:2)
If you actually read the claims of the patent, it only covers doing motion estimation on the GPU, with everything else done on the CPU. That seems like a rather obvious part to offload to 2004's marginally-programmable GPUs.
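Read that way, the claimed split looks roughly like the sketch below. This is a hedged illustration of the division of labor, not the patent's actual steps; the helper names are invented, and the GPU stage is simulated (it just returns zero motion vectors) to keep the example self-contained and runnable:

    import numpy as np

    def gpu_motion_estimate(cur, ref, block=16):
        # Stand-in for the offloaded stage: a real implementation would
        # search for the best vector per block on the GPU. Here every
        # block gets the zero vector.
        return {(y, x): (0, 0) for y in range(0, cur.shape[0], block)
                               for x in range(0, cur.shape[1], block)}

    def cpu_encode(cur, ref, mvs, block=16, q=16):
        # CPU side of the claimed split: motion-compensate using the
        # GPU's vectors, then coarsely quantize each residual block.
        out = {}
        for (y, x), (dx, dy) in mvs.items():
            pred = ref[y+dy:y+dy+block, x+dx:x+dx+block].astype(np.int32)
            out[(y, x)] = (cur[y:y+block, x:x+block].astype(np.int32) - pred) // q
        return out

    def encode_frame(cur, ref):
        return cpu_encode(cur, ref, gpu_motion_estimate(cur, ref))

Everything downstream of the motion search (compensation, transform, quantization, entropy coding) stays on the CPU under the claims as described.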
Silly but not obvious (Score:2)
Another meat cutting patent! (Score:2, Insightful)
Not novel (Score:2)
Look, we already went through the era of patenting obvious, well-known process X "with a computer". Then we went through patenting X "on the internet". We look back on that now, and we say, "Duh, putting it on a computer or on the internet didn't make it novel."
Is putting it on a GPU any better? No.
Re: (Score:2)
Isn't one requirement of a patent for it to be non-obvious?!!!
Hey, there, little fella. You must've been trapped down there a long time. Your information packet should be arriving shortly.
Re: (Score:3, Interesting)
Isn't one requirement of a patent for it to be non-obvious?!!!
One of the strongest justifications for patent protection is when you create something that becomes ridiculously obvious once you create it. This is pretty much the most perfect definition of
promote the Progress of Science and useful Arts
that you will ever find.
It was mentioned in another comment that the patent was applied for in 2004 and that, as far as that poster knew, nobody else was doing it. So... Were people contemplating this back in 2004? The idea of video cards being used for general purpose computing is not very old. Transcoding user-generated video from one format to another was not very common until YouTube got popular.
Re: (Score:3, Interesting)
How does it promote the progress of science and useful arts to grant a monopoly on using a processor to encode video? Now Microsoft is the only one who can sell a product that uses a GPU to encode video, a situation that doesn't give me a great deal of hope for the progress of anything.
Re: (Score:3, Interesting)
"Were people contemplating this back in 2004? The idea of video cards being used for general purpose computing is not very old. Transcoding user-generated video from one format to another was not very common until YouTube got popular."
Encoding isn't transcoding, though they're similar. People have been encoding for a long time, always looking to offload any part of the task onto some other processor. You live in a time when some encoding can be done in real time with a general-purpose CPU. In the 90's, that simply wasn't the case.
Re: (Score:3, Insightful)
One of the strongest justifications for patent protection is when you create something that becomes ridiculously obvious once you create it.
This idiotic meme needs to die. People are perfectly capable of assessing whether an idea is obvious or not after the fact. Having more information available doesn't mystically make people stupid.
This meme is mainly just a rationalization by those who prefer to replace thinking with soundbites.