
NVIDIA Responds To Intel Suit

MojoKid writes "NVIDIA and Intel have always had an interesting relationship, consisting of a dash of mutual respect and a whole lot of under-the-collar disdain. And with situations such as these, it's easy to understand why. NVIDIA today has come forward with a response to a recent Intel court filing in which Intel alleges that the four-year-old chipset license agreement the companies signed does not extend to Intel's future-generation CPUs with 'integrated' memory controllers, such as Nehalem. NVIDIA CEO Jen-Hsun Huang had this to say about the whole ordeal: 'We are confident that our license, as negotiated, applies. At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.'"
This discussion has been archived. No new comments can be posted.

  • by Libertarian001 ( 453712 ) on Thursday February 19, 2009 @02:17AM (#26913237)
    WTF? Does Intel sell more CPUs than NVIDIA sells GPUs?
  • by 0123456 ( 636235 ) on Thursday February 19, 2009 @02:25AM (#26913299)

    WTF? Does Intel sell more CPUs than NVIDIA sells GPUs?

    Doesn't Intel sell more GPUs (admittedly crappy integrated ones) than Nvidia does?

  • by Anonymous Coward on Thursday February 19, 2009 @03:01AM (#26913489)

    By locking all competitors out of the chipset business, a company can boost margins (and thus boost profit), as opposed to living with decaying margins and lower profitability due to commoditization.

    As standalone CPUs get commoditized, the margins and profitability decay.

    Also if you sell crappy integrated GPUs, you can protect the GPUs from competition and the CPUs from commoditization by bundling them and locking out competitors.

    Intel didn't get to where they are today by not knowing how to play the game. They wouldn't be walking away from their standalone CPU business and moving to integrated CPU/GPU if they didn't think their old standalone CPU business would suffer from decaying margins. As they move into this space, it also only makes sense to put up barriers to competitors who might be trying to screw up your future business strategy. Remember how Intel made AMD go off and execute Slot A when before that they made pin-compatible chips? This seems like a very similar strategy to try to kick Nvidia out of the Intel ecosystem.

  • Typical bluster (Score:4, Interesting)

    by CajunArson ( 465943 ) on Thursday February 19, 2009 @03:05AM (#26913513) Journal

    Jen-Hsun Huang has never been one to keep his trap shut when given the chance... even though Nvidia is in the red right now. Lesson one: when a CEO comes out and tries to use a legal dispute over a contract as a pulpit for a religious sermon, he knows he's wrong. See Darl McBride and Hector Ruiz for other examples of dumbass CEOs who love to see themselves in magazines but don't want to be bothered with pesky details like turning a profit or actually competing.
        Intel is #1 in graphics when it comes to shipments... now I'm not saying I'd want to play 3D games on their chips, but guess what: despite what you see on Slashdot, very few users want to play those games. Further, I've got the crappy Intel integrated graphics on my laptop, and Kubuntu with KDE 4.2 is running quite well thanks to the 100% open source drivers that Intel has had its own employees working on for several years. I'm not saying Intel graphics will play Crysis, but they do get the job done without binary blobs.
        Turning the tables on Huang, the real "fear" here is of Larrabee... this bad-boy is not going to even require "drivers" in the conventional sense, it will be an open stripped-down x86 chip designed for massive SIMD and parallelism... imagine what the Linux developers will be able to do with that not only in graphics but for GPGPU using OpenCL. Will it necessarily be faster than the top-end Nvidia chips? Probably not... but it could mean the end of Nvidia's proprietary driver blobs for most Linux users who can get good performance AND an open architecture... THAT is what scares Nvidia.
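    For the curious, the GPGPU model being talked about looks roughly like the sketch below. The parent mentions OpenCL on Larrabee; CUDA is used here only because it is compact, and the sizes and values are made up for illustration. The idea is the same either way: one tiny kernel run across millions of elements at once.

    // Minimal data-parallel (SAXPY) sketch of the GPGPU model discussed above.
    // Illustration only; OpenCL exposes the same concept with different syntax.
    #include <cstdio>
    #include <cuda_runtime.h>

    // Each GPU thread handles one element: y[i] = a * x[i] + y[i]
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;                      // a million elements (arbitrary)
        float *hx = new float[n], *hy = new float[n];
        for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

        float *dx, *dy;
        cudaMalloc(&dx, n * sizeof(float));
        cudaMalloc(&dy, n * sizeof(float));
        cudaMemcpy(dx, hx, n * sizeof(float), cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, n * sizeof(float), cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);

        cudaMemcpy(hy, dy, n * sizeof(float), cudaMemcpyDeviceToHost);
        printf("y[0] = %f (expect 5.0)\n", hy[0]);

        cudaFree(dx); cudaFree(dy);
        delete[] hx; delete[] hy;
        return 0;
    }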

  • Re:Typical bluster (Score:5, Interesting)

    by CajunArson ( 465943 ) on Thursday February 19, 2009 @03:11AM (#26913537) Journal

    I hate to respond to myself, but: yeah, the market share of Linux is not huge, and Nvidia is probably not terrified of losing sales to Larrabee on some desktop Linux boxes (high-end supercomputing apps could be an interesting niche they might care about, though). However, Nvidia is afraid that OEMs will be interested in Larrabee as a discrete card, where Intel never had a solution before. Given the problems Nvidia has had with execution over the last year, and the fact that Intel knows how to keep suppliers happy, THAT is what Nvidia is really afraid of.

  • by Ostracus ( 1354233 ) on Thursday February 19, 2009 @04:01AM (#26913771) Journal

    "This, btw, is in my opinion the real reason AMD bought ATI. AMD wanted to work toward having a solution for that high volume market, and seemed to think they needed to own ATI to do it."

    I think you're partially right. If they had just wanted entry into the business graphics market, Matrox would have been a better purchase. But ATI makes better GPUs, and they wanted entry there as well. It's easier to scale down a high-end GPU than it is to raise up a low-end GPU.

  • by GigaplexNZ ( 1233886 ) on Thursday February 19, 2009 @04:09AM (#26913831)

    If you mean as an actual product someone would intentionally seek out then Intel sells 0 GPUs.

    I actively seek out Intel graphics when looking at laptops due to the lower power requirements and better driver support (I hate it when NVIDIA and ATI drivers don't install in Windows as I have to contact the OEM for an older version, and I've always had more issues with the same brands on Linux). I know the performance is abysmal in comparison, but I don't care. You don't want Intel graphics, that's fine and I understand why, but that doesn't mean no-one intentionally seeks them out.

  • by Jthon ( 595383 ) on Thursday February 19, 2009 @04:23AM (#26913909)

    NVIDIA has their laptop drivers on their website so you no longer have to get outdated ones from your OEM. (Took them long enough.)

    As for battery life, have you checked out NVIDIA integrated vs Intel integrated? The discrete systems do suck more power, but I think the integrated chips for NVIDIA/ATI are still better and don't consume more power than Intel integrated.

    Apple is picky about battery life, and they recently switched to all NVIDIA on their laptop line, including the MacBook Air.

    Don't just assume that because it's NVIDIA it's a power hungry monster. Sure the high end graphics cards need their own power substation, but they can do some nice low power stuff when they need to (9400M, Tegra).

  • by Louis Savain ( 65843 ) on Thursday February 19, 2009 @04:35AM (#26913979) Homepage
    Not just any single chip but a homogeneous multicore processor, that is, one that has multiple processing cores. There is a difference between a CPU and a multicore processor.
  • by Jthon ( 595383 ) on Thursday February 19, 2009 @05:04AM (#26914131)

    It's not just about games; there are business uses for GPU acceleration. Presentation software could use the GPU to be more dynamic and render complicated graphs more smoothly. Some complicated PowerPoint presentations get slow, so why not use a GPU to accelerate them?

    Perhaps Excel or Matlab could use a GPU to crunch numbers to speed up calculations. Or even use the GPU to make the charts more interactive.

    Perhaps MS has some overhaul of their display system coming which would allow it to use the GPU to render Word documents with better anti-aliasing and let large documents scroll faster. Adobe Acrobat actually supports some GPU acceleration (not on by default, I think) which makes PDFs render faster. I know turning on PDF acceleration actually makes me more productive, since I can read documents without having to wait for redraw.

    Maybe we can do GPU accelerated vector graphics, for web site and UI rendering. Who knows what could be done to improve the business experience if the option is there.

    NVIDIA expects to change the way people USE the GPU so it's NOT just for rendering 3D pictures anymore.

    Some improvements to business experience might be small, but still give a small boost in productivity.

    All that said, there will always be people who just use a very basic word processor. But those people don't need Intel's next Core i7 quad mega CPU either. They'd be fine with their P2 running Windows 95, if only the hardware didn't eventually break down.

    The whole point is that NVIDIA wants to innovate on the GPU so that businesses and people can use it in new ways, to do stuff they couldn't before. Intel wants to do the same, but requires you to buy a bigger CPU. Instead you could get a cheap integrated GPU and CPU combo and get the same productivity boost you used to get by buying a bigger CPU.
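    As a rough illustration of the kind of offload described above: summing a huge "column" of numbers, the sort of bulk arithmetic a spreadsheet does, maps naturally onto the GPU. This is only a sketch with made-up sizes, not how Excel or Matlab actually work.

    // Hypothetical sketch: reduce (sum) a large column of numbers on the GPU.
    #include <cstdio>
    #include <cuda_runtime.h>

    // Each block reduces its chunk of the column into one partial sum.
    __global__ void partial_sums(const float *col, float *block_sums, int n) {
        __shared__ float scratch[256];
        int tid = threadIdx.x;
        int i = blockIdx.x * blockDim.x + tid;
        scratch[tid] = (i < n) ? col[i] : 0.0f;
        __syncthreads();
        // Tree reduction within the block.
        for (int stride = blockDim.x / 2; stride > 0; stride /= 2) {
            if (tid < stride) scratch[tid] += scratch[tid + stride];
            __syncthreads();
        }
        if (tid == 0) block_sums[blockIdx.x] = scratch[0];
    }

    int main() {
        const int n = 1 << 22;                 // ~4 million "cells" (arbitrary)
        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;

        float *col, *block_sums;
        cudaMalloc(&col, n * sizeof(float));
        cudaMalloc(&block_sums, blocks * sizeof(float));

        float *host_col = new float[n];
        for (int i = 0; i < n; ++i) host_col[i] = 1.0f;
        cudaMemcpy(col, host_col, n * sizeof(float), cudaMemcpyHostToDevice);

        partial_sums<<<blocks, threads>>>(col, block_sums, n);

        // Finish the reduction on the CPU (one value per block).
        float *host_partials = new float[blocks];
        cudaMemcpy(host_partials, block_sums, blocks * sizeof(float),
                   cudaMemcpyDeviceToHost);
        double total = 0.0;
        for (int b = 0; b < blocks; ++b) total += host_partials[b];
        printf("column sum = %.0f (expect %d)\n", total, n);

        cudaFree(col); cudaFree(block_sums);
        delete[] host_col; delete[] host_partials;
        return 0;
    }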

  • by Jthon ( 595383 ) on Thursday February 19, 2009 @05:20AM (#26914219)

    There's more to GPU acceleration than gaming.

    What does your wife do? Does she just send e-mail? Then beyond some UI improvements there's not much for her (but those UI improvements could be cool).

    Does she encode music or videos for an iPod? That can be accelerated with the GPU. You can encode movies faster than real time on current GPUs, something you can't do with current CPUs.

    Does she watch YouTube? I saw a demo of a program that runs some fancy filters on the GPU over low-quality, YouTube-like video and spits out something that looks pretty good. It was something that couldn't be done in real time on a CPU, but a mid- to low-range GPU could do it.

    Does she do graphic design? GPU features in the new Photoshop make the program much more responsive when editing images, and large filters complete in fractions of a second.

    In the simplest cases a better GPU might increase UI responsiveness, and make the experience "smoother". But long term changes will likely change WHAT you do with the GPU.

    NVIDIA at least is trying to change it so GPU acceleration isn't just about gaming. They want the GPU to be a massively parallel processor that your desktop uses when it needs more processing power.
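    For a sense of why that per-pixel video filtering can run in real time, here is a very rough sketch. The filter math here is trivial and made up (real enhancement filters are far more involved); the point is that one thread per pixel finishes a frame in well under the ~33 ms budget of 30 fps video.

    // Hypothetical sketch of a per-pixel GPU video filter (illustration only).
    #include <cstdio>
    #include <cuda_runtime.h>

    // One GPU thread per pixel: apply a trivial contrast/brightness tweak.
    __global__ void enhance(unsigned char *frame, int num_pixels,
                            float contrast, float brightness) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < num_pixels) {
            float v = frame[i] * contrast + brightness;
            frame[i] = (unsigned char)(v < 0 ? 0 : (v > 255 ? 255 : v));
        }
    }

    int main() {
        const int width = 640, height = 360;    // a low-res "YouTube" frame
        const int num_pixels = width * height;

        unsigned char *frame;
        cudaMalloc(&frame, num_pixels);
        cudaMemset(frame, 100, num_pixels);     // dummy grey frame

        enhance<<<(num_pixels + 255) / 256, 256>>>(frame, num_pixels, 1.2f, 10.0f);
        cudaDeviceSynchronize();

        unsigned char first;
        cudaMemcpy(&first, frame, 1, cudaMemcpyDeviceToHost);
        printf("first pixel after filter: %d\n", first);
        cudaFree(frame);
        return 0;
    }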

  • by GigaplexNZ ( 1233886 ) on Thursday February 19, 2009 @05:24AM (#26914239)

    NVIDIA has their laptop drivers on their website so you no longer have to get outdated ones from your OEM. (Took them long enough.)

    Only for some models. My old 6600 Go (a very powerful laptop chip for its time) is still unsupported.

    As for battery life, have you checked out NVIDIA integrated vs Intel integrated? The discrete systems do suck more power, but I think the integrated chips for NVIDIA/ATI are still better and don't consume more power than Intel integrated.

    I have, and they aren't particularly appealing. Their performance isn't sufficiently better such that I can perform tasks that I otherwise wouldn't be able to, so the gains are effectively worthless to me. The driver support isn't fixed switching to NVIDIA/ATI integrated either (and is sometimes worse). Battery life is probably comparable, but it would need to be clearly superior for me to consider them.

    Don't just assume that because it's NVIDIA it's a power hungry monster. Sure the high end graphics cards need their own power substation, but they can do some nice low power stuff when they need to (9400M, Tegra).

    I don't assume that, but from what information I have gathered I feel the Intel chips are currently a better fit for my requirements.

  • by GigaplexNZ ( 1233886 ) on Thursday February 19, 2009 @05:59AM (#26914425)

    Intel needs them for high-end graphics

    Not according to Larrabee. [wikipedia.org]

    Nvidia needs them for the CPU

    While that is mostly true, it isn't the whole story. They could rely solely on AMD CPUs (which could either cripple NVIDIA or boost AMD, or both), or they could try to weasel around the patent issues and make their own CPU.

  • by DrYak ( 748999 ) on Thursday February 19, 2009 @07:22AM (#26914787) Homepage

    While Intel is trying to lock nVidia and ATI/AMD out of the chipset business by bundling the CPU and the chipset and bridging them with an interconnect - QuickPath - which they won't license to nVidia,
    nVidia, for their part, has tried to do exactly the same, locking Intel and ATI/AMD out of the chipset business by tying multi-GPU support to their own chipsets with a technology they won't sub-license either: nVidia's SLI.

    nVidia has tried to be the only chipset in town able to do SLI.
    Intel is currently trying to be the only chipset in town usable with the Core i7.

    Meanwhile, I'm quite happy with ATI/AMD, which use an open standard* that doesn't require licensing between the CPU and the chipset (HyperTransport), and another industry standard for multiple GPUs that requires no special licensing (plain PCIe).

    Thus any component in an Athlon/Phenom + 7x0 chipset + Radeon HD stack could be replaced with any other compatible component (although currently there aren't that many HT-powered CPUs to pick from).

    *: Plain, normal HyperTransport is open. AMD has made proprietary extensions for cache coherency in multi-socketed servers, but regular CPUs should work with plain HyperTransport too.

  • by iamhigh ( 1252742 ) on Thursday February 19, 2009 @10:14AM (#26915871)
    If the CPU dies and the GPU takes over, can't we just rename the GPU a CPU? You know, since it will be the "Central" Processing Unit?
