NVIDIA Responds To Intel Suit
MojoKid writes "NVIDIA and Intel have always had an interesting relationship, consisting of a dash of mutual respect and a whole lot of under-the-collar disdain. And with situations such as these, it's easy to understand why.
NVIDIA today has come forward with a response to a recent Intel court filing in which Intel alleges that the 'four-year-old chipset license agreement the companies signed does not extend to Intel's future generation CPUs with "integrated" memory controllers, such as Nehalem.' NVIDIA CEO Jen-Hsun Huang had this to say about the whole ordeal: 'We are confident that our license, as negotiated, applies. At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.'"
Re:Decaying CPU business? (Score:5, Insightful)
Define "sell." If you mean bundle for virtually free with CPUs (or in some cases cheaper than just a CPU; gotta love a monopoly), then yes, they do.
If you mean as an actual product someone would intentionally seek out, then Intel sells 0 GPUs.
In fact, they count sales of chipsets with integrated graphics as a graphics sale for market-share purposes, even if that computer also has a discrete graphics card. So if you buy something with an NVIDIA or ATI card and a 945G chipset, that counts as a graphics sale for Intel even though the integrated graphics chip is never used.
Their integrated graphics actually benchmarks slower than Microsoft's software DirectX 10 implementation (running on a Core i7). If people were more aware of just how poorly Intel's integrated chips perform, they'd probably sell even fewer.
Sadly, most people aren't aware of the vast difference in performance, and just assume their computer is slow when Aero, The Sims, Spore or Google Earth run poorly.
Until Intel ships Larrabee we won't really know whether they can ship a GPU, and that still looks to be over a year away.
Re:Decaying CPU business? (Score:5, Insightful)
Not everybody particularly cares about 3D graphics performance. If you ask the average Joe, they probably care more about video performance than 3D performance, as people typically watch videos on their PCs more often than they play 3D games.
That being said, Intel integrated graphics tends to do quite well with video rendering, especially HD video.
Somebody who cares about 3D graphics performance, because they want to play the latest and greatest games, is going to buy discrete graphics regardless; it doesn't matter whether the integrated graphics is made by NVIDIA, ATI, etc.
Re:Decaying CPU business? (Score:5, Insightful)
It is not about bundling. It is about the fact that the vast majority of PC sales are to business customers who want to put desktops under the desks of their employees and don't give a damn about GPU performance. To those customers, spending the premium for an NVIDIA GPU is absurd. Hence, they buy inexpensive machines with GPUs that suck at rendering 3D but are fully functional when it comes to running Office or email applications. This, btw, is in my opinion the real reason AMD bought ATI: AMD wanted a solution for that high-volume market, and seemed to think they needed to own ATI to get it.
Many of the people who put together high-end machines for gaming and/or other 3D applications---the people who buy and value what NVIDIA has to offer---frequently forget that the type of machine they love is a very tiny percentage of the desktop market...
Re:DX10? That Vista thing? (Score:4, Insightful)
Dude. Even I know GPUs are optimised for compositing. Ray tracing is a way different thing. It has to have a way different system. Pretending it doesn't will not help you here.
You didn't just write the above, did you? You show your ignorance. A long time ago GPUs did just compositing, but that was back in the VGA controller days.
Then they evolved to do fixed function rasterization, but those days are over (unless you're Intel doing integrated stuff).
GPUs are MUCH more programmable, and getting more so with each generation. You can do pretty much any floating-point math you want now. Go look up CUDA and OpenCL; they basically let you write C code for the GPU.
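To give a concrete idea of what "C code for the GPU" looks like, here's a minimal CUDA sketch (a standard SAXPY example, not taken from the linked paper): plain floating-point math on a million elements, nothing to do with drawing triangles. It assumes the CUDA toolkit is installed (compile with nvcc) and a CUDA-capable GPU is present.

```cuda
// Minimal CUDA sketch: y = a*x + y (SAXPY) computed on the GPU.
// Illustrative only; assumes nvcc and a CUDA-capable device.
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread handles one array element -- ordinary C, run in parallel.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];  // arbitrary float math, no rasterization involved
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    // Unified memory visible to both CPU and GPU.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // each element is now 3*1 + 2 = 5
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

The same structure scales to things like ray tracing: replace the arithmetic in the kernel with ray-triangle intersection tests and the GPU happily runs that instead.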
Sure, GPUs might not do so well when it comes to branching, but you'll see that GPUs are being used for more than just rasterization. Sure, rasterization is an important target for NVIDIA/ATI, but that doesn't mean the hardware can only draw triangles.
If you look at the paper I linked (which you obviously didn't), it describes how they wrote a ray tracer using NVIDIA CUDA and EXISTING GPUs. If the hardware gets more programmable, as NVIDIA seems to be targeting, then it will only get easier to write ray tracers that run on the GPU.
If you want proof that GPUs do more than rasterization, go check out how NVIDIA's GPU tech is now in the Tsubame supercomputer.
Even Intel is getting into the GPU business with Larrabee, I bet they plan to write a ray tracer for that.
"Ouch and double ouch"? (Score:3, Insightful)
You can plug an NVIDIA GPU card into an Intel motherboard (I did just that for the computer I am using).
I have no idea why Intel wouldn't want NVIDIA to make chipsets for Core i7. For some reason, even years after AMD bought ATI, the only Intel mainboards that support two linked graphics cards do so through Crossfire. So if NVIDIA doesn't make chipsets supporting Core i7, Intel would be forcing hardcore gamers to either (a) buy AMD's video chips to use Crossfire or (b) buy AMD's CPUs to use NVIDIA SLI.