
NVIDIA Responds To Intel Suit

Posted by samzenpus
from the chip-wars dept.
MojoKid writes "NVIDIA and Intel have always had an interesting relationship, consisting of a dash of mutual respect and a whole lot of under-the-collar disdain. With situations such as these, it's easy to understand why. NVIDIA today came forward with a response to a recent Intel court filing in which Intel alleges that the four-year-old chipset license agreement the companies signed does not extend to Intel's future-generation CPUs with 'integrated' memory controllers, such as Nehalem. NVIDIA CEO Jen-Hsun Huang had this to say about the whole ordeal: 'We are confident that our license, as negotiated, applies. At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.'"
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Creative Labs? (Score:3, Insightful)

    by DigiShaman (671371) on Thursday February 19, 2009 @01:37AM (#26913359) Homepage

    What's next, Creative starts bitching too because their APUs (Audio Processing Units) are being snuffed out by nVidia and Intel?

    Hey you two, STFU. Your technologies are forever joined at the hip in modern computing. Stop the bitch slapping and grow up.

  • by Jthon (595383) on Thursday February 19, 2009 @02:05AM (#26913505)

    Define sell. If you mean bundle for virtually free with CPUs (or in some cases cheaper than just a CPU; go Monopoly) then yes, they do.

    If you mean as an actual product someone would intentionally seek out then Intel sells 0 GPUs.

    In fact they count sales of chipsets with integrated graphics as a graphics sale for market share even if that computer also has a discrete graphics card. So if you buy something with an NVIDIA or ATI card and a 945G chipset that counts as graphics sale for Intel even though the graphics chip is never used.

    Their integrated graphics actually benchmark slower than Microsoft's software DirectX 10 implementation (running on a Core i7). If people were more aware of just how poorly Intel's integrated chips perform, they'd probably sell even fewer.

    Sadly, most people aren't aware of the vast difference in performance, and just assume their computer is slow when Aero, The Sims, Spore or Google Earth run poorly.

    Until Intel ships Larrabee we won't really know if they can ship a GPU, and that looks to be still over a year away.

  • by _avs_007 (459738) on Thursday February 19, 2009 @02:18AM (#26913577)

    Not everybody particularly cares about 3D graphics performance. If you ask the common joe, they probably care more about video performance than 3D performance, as people typically watch videos on their PCs more often than play 3D games.

    That said, Intel integrated graphics tend to do quite well with video rendering, especially HD video.

    Somebody who cares about 3D graphics performance, because they want to play the latest and greatest games, is going to buy discrete graphics regardless; it doesn't matter whether the integrated graphics is made by nVidia, ATI, etc.

  • by Klintus Fang (988910) on Thursday February 19, 2009 @02:49AM (#26913715)

    It is not about bundling. It is about the fact that the vast majority of PC sales are to business customers who want to put desktops under the desks of their employees and don't give a damn about GPU performance. To those customers, spending the premium for an nVidia GPU is absurd. Hence, they buy inexpensive machines that have GPUs which suck at rendering 3D but are fully functional when it comes to running Office or email applications. This, btw, is in my opinion the real reason AMD bought ATI. AMD wanted to work toward having a solution for that high-volume market, and seemed to think they needed to own ATI to do it.

    Many of the people who put together high-end machines for gaming and/or other 3D application purposes---the people who buy and value what nVidia has to offer---frequently forget that the type of machine they love is a very tiny percentage of the desktop market...

  • by Anonymous Coward on Thursday February 19, 2009 @03:20AM (#26913889)
    Because it gives them control of the entire platform. nVidia are the only people left with a GPU and chipset but no CPU of their own: Intel, AMD & Via have all three.
  • by Jthon (595383) on Thursday February 19, 2009 @04:31AM (#26914281)

    Dude. Even I know GPUs are optimised for compositing. Ray tracing is a way different thing. It has to have a way different system. Pretending it doesn't will not help you here.

    You didn't just write the above did you? You show your ignorance. A long time ago they did just compositing, but that was back in the VGA controller days.

    Then they evolved to do fixed function rasterization, but those days are over (unless you're Intel doing integrated stuff).

    GPUs are MUCH more programmable, and getting more so with each generation. You can do pretty much any floating-point math function you want now. Go look up CUDA and OpenCL; they let you basically write C code for the GPU.

    Sure, GPUs might not do so well when it comes to branching, but you'll see that GPUs are being used to do more than just rasterization. Sure, rasterization would be an important target for NVIDIA/ATI, but that doesn't mean a GPU can only draw triangles.

    If you look at the paper I linked (which you obviously didn't) it describes how they wrote a ray tracer using NVIDIA CUDA and EXISTING GPUs. If stuff gets more programmable as NVIDIA seems to be targeting, then it will only get easier to write ray tracers which run on the GPU.

    If you want proof GPUs do more than rasterization go check out how NVIDIA's GPU tech is now in the Tsubame super computer.

    Even Intel is getting into the GPU business with Larrabee, I bet they plan to write a ray tracer for that.
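    The "write C code for the GPU" point above can be sketched with a minimal CUDA program (illustrative only; assumes the CUDA toolkit, an NVIDIA card, and compilation with nvcc). It runs arbitrary per-element floating-point math on the GPU, with no triangles or rasterization anywhere in sight:

    ```cuda
    #include <cstdio>

    // SAXPY (y = a*x + y): one GPU thread per array element.
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            y[i] = a * x[i] + y[i];  // plain C-style floating-point math
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        // Unified memory, accessible from both CPU and GPU.
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // Launch enough 256-thread blocks to cover all n elements.
        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);  // expect 3*1 + 2 = 5
        cudaFree(x);
        cudaFree(y);
        return 0;
    }
    ```

    The kernel body is just C; nothing about it is specific to graphics, which is the whole point of the GPGPU argument.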

  • by seeker_1us (1203072) on Thursday February 19, 2009 @05:05AM (#26914457)
    More like strawman and double strawman. Jen-Hsun Huang talks about GPUs. Intel is talking about chipsets.

    You can plug an NVIDIA GPU card into an Intel motherboard (I did just that for the computer I am using).

    I have no idea why Intel wouldn't want Nvidia to make chipsets for Core i7. For some reason, even years after AMD bought ATI, the only Intel mainboards which support two linked graphics cards do so through Crossfire. So if Nvidia doesn't make chipsets to support Core i7, Intel would be forcing hardcore gamers to either (a) buy AMD's video chips to use Crossfire or (b) buy AMD's CPUs to use NVidia SLI.

  • by nanoflower (1077145) on Thursday February 19, 2009 @10:16AM (#26916735)
    Who says Intel isn't willing to license it? Certainly not Intel, as they have stated that they were working with Nvidia on a new deal. The problem is that Nvidia thinks the new designs are covered by the old agreement, and therefore Nvidia doesn't have to pay any more to use the technology of the new Intel CPU. Intel thinks there is enough change in the i7 design that the old license agreement doesn't apply. That doesn't mean that Intel is unwilling to work out a new agreement with Nvidia for licensing the technology in the i7. The fact that they've been talking with Nvidia for so long suggests that Intel is willing to work with Nvidia, but they expect to be paid for it, and Nvidia thinks they shouldn't have to pay. As to who is right and who is wrong? I have no idea. It sounds like Intel has a good argument for the new design being different enough to fall outside the license, but without reading the actual contract there's no way to know. So it's going to take the courts to figure this out.
  • by Klintus Fang (988910) on Thursday February 19, 2009 @01:12PM (#26919423)
    I agree. AMD likely wants generic "platform" solutions (where they provide the customer with multiple components of the platform rather than just the CPU) for as many markets as they can hit, rather than just the basic business segment. That probably is why they chose ATI as opposed to something cheaper.

