NVIDIA Responds To Intel Suit

MojoKid writes "NVIDIA and Intel have always had an interesting relationship, consisting of a dash of mutual respect and a whole lot of under-the-collar disdain. And with situations such as these, it's easy to understand why. NVIDIA today has come forward with a response to a recent Intel court filing in which Intel alleges that the 'four-year-old chipset license agreement the companies signed does not extend to Intel's future generation CPUs with "integrated" memory controllers, such as Nehalem.' NVIDIA CEO Jen-Hsun Huang had this to say about the whole ordeal: 'We are confident that our license, as negotiated, applies. At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.'"
  • by Libertarian001 ( 453712 ) on Thursday February 19, 2009 @01:17AM (#26913237)
    WTF? Does Intel sell more CPUs than NVIDIA sells GPUs?
    • by 0123456 ( 636235 ) on Thursday February 19, 2009 @01:25AM (#26913299)

      WTF? Does Intel sell more CPUs than NVIDIA sells GPUs?

      Doesn't Intel sell more GPUs (admittedly crappy integrated ones) than Nvidia does?

      • by Anonymous Coward on Thursday February 19, 2009 @02:01AM (#26913489)

        By locking all competitors out of the chipset business, a company can boost margins (and thus boost profit), as opposed to living with decaying margins and lower profitability due to commoditization.

        As standalone CPUs get commoditized, the margins and profitability decay.

        Also if you sell crappy integrated GPUs, you can protect the GPUs from competition and the CPUs from commoditization by bundling them and locking out competitors.

        Intel didn't get to where they are today by not knowing how to play the game. They wouldn't be walking away from their standalone CPU business and move to integrated CPU/GPU if they didn't think their old standalone CPU business would suffer from decaying margins. As they move into this space, it also only makes sense to try to put up barriers to your competitors who might be trying to screw up your future business strategy. Remember how Intel made AMD go try to execute "Slot A" when before that they made pin-compatible chips. This seems like a very similar strategy to try to kick Nvidia out of the Intel ecosystem.

        • They wouldn't be walking away from their standalone CPU business and move to integrated CPU/GPU if they didn't think their old standalone CPU business would suffer from decaying margins.

          You seem to be asserting that they would only change business plans if the current plans are losing ground. This is not true. Companies are always looking for ways to make more money and could simply look for something with more potential even if their current approach is still going strong.

          • by morcego ( 260031 )

            If you are talking about self-flanking, your analysis is incomplete.

            In this case, a company doesn't shift because of "ways to make more money". They shift because if they don't do it, some other company will adopt that business practice, and virtually kill their current business model. Sometimes this shift even carries a lower profit margin. See more on Gillette vs. Bic, regarding the shaving business.

            This is a classic example of an important marketing strategy, well illustrated in several books (including t

        • by DrYak ( 748999 ) on Thursday February 19, 2009 @06:22AM (#26914787) Homepage

          While Intel is trying to lock nVidia and ATI/AMD out of the chipset business by bundling the CPU and the chipset and bridging them with an interconnect - QuickPath - which they won't license to nVidia,
          nVidia, for their part, has tried to do exactly the same, locking Intel and ATI/AMD out of the chipset business by bundling their chipsets with the GPU and bridging them with a technology that they won't sub-license either: nVidia's SLI.

          nVidia has tried to be the only chipset in town able to do SLI.
          Intel is currently trying to be the only chipset in town usable with Core i7.

          Meanwhile, I'm quite happy with ATI/AMD which use an open standard* which doesn't require licensing between the CPU and the chipset (HyperTransport) and another industry standard for multiple GPU requiring no special licensing (plain PCIe).

          Thus any component on an Athlon/Phenom + 7x0 chipset + Radeon HD stack could be replaced with any other compatible component (although currently there aren't that many HT-powered CPUs to pick from).

          *: The plain simple normal HyperTransport is open. AMD has made proprietary extensions for cache coherency in multi-socketed servers. But regular CPUs should work with plain HyperTransport too.

      • by Jthon ( 595383 ) on Thursday February 19, 2009 @02:05AM (#26913505)

        Define sell. If you mean bundle for virtually free with CPUs (or in some cases cheaper than just a CPU, go Monopoly) then yes, they do.

        If you mean as an actual product someone would intentionally seek out then Intel sells 0 GPUs.

        In fact they count sales of chipsets with integrated graphics as a graphics sale for market share even if that computer also has a discrete graphics card. So if you buy something with an NVIDIA or ATI card and a 945G chipset that counts as graphics sale for Intel even though the graphics chip is never used.

        Their integrated graphics actually benchmark slower than Microsoft's software DirectX 10 implementation (running on a Core i7). If people were more aware of just how poor Intel integrated chips are, they'd probably sell even fewer.

        Sadly, most people aren't aware of the vast difference in performance, and just assume their computer is slow when Aero, The Sims, Spore or Google Earth run poorly.

        Until Intel ships Larrabee we won't really know if they can ship a GPU, and that looks to be still over a year away.

        • by _avs_007 ( 459738 ) on Thursday February 19, 2009 @02:18AM (#26913577)

          Not everybody particularly cares about 3D graphics performance. If you ask the common joe, they probably care more about video performance than 3D performance, as people typically watch videos on their PCs more often than play 3D games.

          With that being said, Intel integrated graphics tend to do quite well with video rendering, especially HD video.

          Somebody that cares about 3D graphics performance, because they want to play the latest and greatest games, is going to buy discrete graphics regardless, doesn't matter if the integrated graphics is made by nVidia, ATI, etc.

          • Not everybody particularly cares about 3D graphics performance. If you ask the common joe, they probably care more about video performance than 3D performance, as people typically watch videos on their PCs more often than play 3D games.

            Yea, and then ask more clued up friends why a game they just got runs like crap. Just because they don't care does not mean they don't use it.

            By the way, both desktop machines at my place have unused integrated gfx.

            • by Jthon ( 595383 ) on Thursday February 19, 2009 @02:57AM (#26913745)

              Yea, and then ask more clued up friends why a game they just got runs like crap. Just because they don't care does not mean they don't use it.

              That's exactly what I was getting at. I have friends who aren't die hard gamers who have no idea what a GPU is. But they still like to play games occasionally.

              They go out and buy games like The Sims 2, Spore, or even World of Warcraft (yeah casual people play this) and get frustrated that it runs so poorly.

              I hate to tell them that because they have a low end Intel integrated chip they're just screwed (especially friends with laptops where an upgrade is unheard of). Heck even the lowest end NVIDIA or ATI INTEGRATED chip is over 10 times faster than Intel, and honestly costs only a couple $ more.

              Sure the NVIDIA/ATI integrated GPUs aren't top of the line, but at least with those the game is playable. I know someone who was trying to play some games on their Intel chipset and textures and some other effects are just missing.

              • At least the "Intel Integrated" desktop PCs normally CAN be upgraded with a dedicated graphics card.

                You should see Via's approach: "What graphics slot?"
                A PCI nVidia 5600 was actually an upgrade...

              • by CAIMLAS ( 41445 )

                No joke: aside from the more advanced support for DirectX and the like, an old Fujitsu laptop I've got with an ATI Radeon chipset has better performance with Compiz than an MSI Wind PC with i9x0 graphics (Atom). The old Fujitsu clocks at 500MHz; it's a Celeron.

              • What's really going on here is that in the future (after Core i7) "GPUs" are being viewed as the next big thing.

                Believe it or not, the talks I'm hearing are that "GPUs" are going to be everywhere soon, Intel knows this, and who makes the best GPU? Not Intel.

          • by Jthon ( 595383 )

            True, people aren't looking to play Crysis, but simple programs like Spore and even Google Earth benefit from going to an NVIDIA or ATI integrated GPU. There's a visual quality upgrade on even these "casual" games as the Intel chips do such a poor job.

            Also both the ATI and NVIDIA integrated chipsets do a better job of decoding HD video with less CPU usage. (Check out reviews of the ATI 790G and NVIDIA 9400M chipsets if you don't believe me.)

            Plus with OpenCL we might start to see more regular applic

          • by DrSkwid ( 118965 )

            I would say that at least an order of magnitude more computers just run email, browsers, spreadsheets and word processors; that's Intel's market.

          • by CAIMLAS ( 41445 )

            Intel graphics are crap at basic 2D desktop manipulation, as well. 7-year-old Nvidia cards outperform the latest, greatest from Intel without breaking a sweat. Same for integrated ATI stuff from the same era (or earlier - ATI Radeon smokes i9x0 in a significantly older laptop, both 2d and with compiz). It's like the difference between a Matrox G200 and anything else from that era.

            And that doesn't even consider the fact that Vista, by default, needs 3D acceleration AND a fast CPU to work well. This isn't suc

        • by Klintus Fang ( 988910 ) on Thursday February 19, 2009 @02:49AM (#26913715)

          It is not about bundling. It is about the fact that the vast majority of PC sales are to business customers who want to put desktops under the desks of their employees and don't give a damn about GPU performance. To those customers, spending the premium for an nVidia GPU is absurd. Hence, they buy inexpensive machines that have GPUs which suck at rendering 3D but are fully functional when it comes to running Office or Email applications. This, btw, is in my opinion the real reason AMD bought ATI. AMD wanted to work toward having a solution for that high volume market, and seemed to think they needed to own ATI to do it.

          Many of the people who put together high end machines for gaming and/or other 3D application purposes---the people that buy and value what nVidia has to offer---frequently forget that the type of machine they love is a very tiny percentage of the desktop market...

          • "This, btw, is in my opinion the real reason AMD bought ATI. AMD wanted to work toward having a solution for that high volume market, and seemed to think they needed to own ATI to do it."

            I think you're partially right. If they had only wanted entry into the business graphics market, Matrox would have been a better purchase. But ATI makes better GPUs and they wanted entry there as well. It's easier to scale down a high-end GPU than it is to raise up a low-end GPU.

            • by Jthon ( 595383 )

              If all AMD wanted was the low end business market they could have passed on ATI, and just licensed a cheapo core from someone like SiS, or they could have even acquired S3.

              They probably wanted some of the chipset design expertise of the ATI side to create a "Centrino"-like platform. That, and they thought the CPU would want to incorporate some of the parallel features of the GPU (Google their Fusion CPU project).

            • by CAIMLAS ( 41445 )

              They most certainly wanted more than just the 2D chipset. If they'd wanted the 2D chipset, they'd have been very well off to have just picked up the Matrox company for a fraction of the price; Matrox always had very fast, well-made 2D cards (and drivers), and their early 3D stuff wasn't half bad. They were also the best cards for Linux for a long time - and may even still be, for that matter (but I haven't seen any of their stuff in so long I wouldn't know).

              AMD could probably have cornered the "hobbyist Lin

              • Re: (Score:2, Insightful)

                I agree. AMD likely wants generic "platform" solutions (where they provide the customer with multiple components of the platform rather than just the CPU) for as many markets as they can hit rather than just the basic business segment. That probably is why they chose ATI as opposed to something cheaper.
          • This, btw, is in my opinion the real reason AMD bought ATI. AMD wanted to work toward having a solution for that high volume market, and seemed to think they needed to own ATI to do it.

            Almost, but I think the real issue is that even traditional business desktops are beginning to need 3D, just for window compositing and "downloading..." animations. With Vista rating the entire computer based on the lowest score of a number of tests, and one of those tests being 3D performance, Intel were forced to up their

        • by GigaplexNZ ( 1233886 ) on Thursday February 19, 2009 @03:09AM (#26913831)

          If you mean as an actual product someone would intentionally seek out then Intel sells 0 GPUs.

          I actively seek out Intel graphics when looking at laptops due to the lower power requirements and better driver support (I hate it when NVIDIA and ATI drivers don't install in Windows as I have to contact the OEM for an older version, and I've always had more issues with the same brands on Linux). I know the performance is abysmal in comparison, but I don't care. You don't want Intel graphics, that's fine and I understand why, but that doesn't mean no-one intentionally seeks them out.

          • Re: (Score:3, Interesting)

            by Jthon ( 595383 )

            NVIDIA has their laptop drivers on their website so you no longer have to get outdated ones from your OEM. (Took them long enough.)

            As for battery life, have you checked out NVIDIA integrated vs Intel integrated? The discrete systems do suck more power, but I think the integrated chips for NVIDIA/ATI are still better and don't consume more power than Intel integrated.

            Apple is picky about battery life, and they recently switched to all NVIDIA on their laptop line, including the Macbook Air.

            Don't just assume t

            • Re: (Score:3, Interesting)

              by GigaplexNZ ( 1233886 )

              NVIDIA has their laptop drivers on their website so you no longer have to get outdated ones from your OEM. (Took them long enough.)

              Only for some models. My old 6600 Go (a very powerful laptop chip for its time) is still unsupported.

              As for battery life, have you checked out NVIDIA integrated vs Intel integrated? The discrete systems do suck more power, but I think the integrated chips for NVIDIA/ATI are still better and don't consume more power than Intel integrated.

              I have, and they aren't particularly appealing. Their performance isn't sufficiently better such that I can perform tasks that I otherwise wouldn't be able to, so the gains are effectively worthless to me. The driver support isn't fixed switching to NVIDIA/ATI integrated either (and is sometimes worse). Battery life is probably comparable, but it would need to be clearly superior for me to consider them.

              Don't just assume that because it's NVIDIA it's a power hungry monster. Sure the high end graphics cards need their own power substation, but they can do some nice low power stuff when they need to (9400M, Tegra).

              I do

              • by Jthon ( 595383 )

                Only for some models. My old 6600 Go (a very powerful laptop chip for its time) is still unsupported.

                That's annoying. I guess it looks like it's mostly just newer stuff up on there so far, and even not all their shipping products are supported.

                I'll have the keep that in mind next time I go laptop shopping.

            • NVIDIA has their laptop drivers on their website so you no longer have to get outdated ones from your OEM. (Took them long enough.)

              This seems like a good time to mention that Mobile Quadro owners can now install the desktop Quadro driver if they feel a need to get that recent. Microsoft seems to have kicked out a pretty recent driver in *Update which appears to be performing well, so I'm leaving it alone for now. HP last updated my last laptop's Quadro driver softpaq at least a year ago, maybe two...

            • by CAIMLAS ( 41445 )

              Definitely! If you don't think Nvidia can produce low-power GPUs, look at their supposedly-upcoming Tegra ARM-based SoC: its power use is a pittance.

        • by rbanffy ( 584143 )

          Since I don't run Aero, The Sims, or Spore, and only occasionally play with Google Earth, I don't really care about 3D performance.

          On the other hand, since I only run Windows under VirtualBox (and don't play games under it - BTW, since when /. became a gamer site?), I do care about compatibility and Intel has given me, for the last couple years, the least headaches when it comes to 3D acceleration under Linux. While I would have to think hard and test a lot before buying a new computer with ATI or Nvidia graphics

        • I only need very little 3D graphics, so Intel is actually very good... I had an Intel chipset in my old laptop and I never had driver issues of any kind...
          In my new laptop I've got an ATI card, which I regret... I have nothing but problems with it... And there are NO open drivers for it... Which is in fact the only reason I went ATI and not nVidia or Intel.
          And unless ATI starts actually delivering drivers for their mobile HD series then I'll be looking at Intel for my next laptop, that's for sure...
        • by Nursie ( 632944 )

          "If you mean as an actual product someone would intentionally seek out then Intel sells 0 GPUs."

          False.

          They have very good support for linux, to the extent that unless dual-boot and 3d games are your thing, they are pretty much the best option. Until AMD/ATI start making progress that is.

          Also for business use they are cheap, reliable and a lot less power-hungry than the other two big players. For business desktop/workstation they make a lot of sense.

        • 1. NVidia sells integrated GPUs too, and they too count crappy integrated GPUs as GPUs sold. And yes, even if you later go and buy an ATI 4870, Nvidia still counts it as a GPU sold.

          So it seems to me like the GP's basic point still stands: Intel sells more GPUs than Nvidia. By a metric Nvidia too uses when they willy-wave about their market share being larger than ATI's.

          2. You seem to assume that it's some inescapable misfortune for the users, or that that's somehow not included in the choice to buy this comp

        • by TheLink ( 130905 )
          "If you mean as an actual product someone would intentionally seek out"

          To be honest, in many cases I'd actually pick the Intel integrated GPU over, say, the Nvidia 8400.

          Because in recent times, Nvidia has had a higher than normal failure rate with a whole bunch of their GPUs.

          While the Intel GPUs are crap in performance terms, and have 3D rendering bugs, they appear to be very reliable in hardware terms.

          Whereas I've had Nvidia GPUs dying on me personally (not just reports from other people).

          Thus if you don't g
        • I've deliberately bought several Intel integrated-graphics motherboards; they display an 80x25 text mode fine as you install Debian, and my compute servers only have a monitor attached when they're failing to boot.

          I wish Intel didn't oblige you to buy a super-SLI-gamer motherboard just to run a Core i7 CPU; I bought the chip for the main-memory bandwidth.

    • Is the GPU a commodity yet? No.

      What about the CPU? Probably.

      But I do think nvidia are reaching a bit on this one.

  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Thursday February 19, 2009 @01:37AM (#26913359)
    Comment removed based on user account deletion
    • by TheLink ( 130905 )
      Actually, Creative are snuffing themselves out.

      Just look at the number of people having problems with Creative's stuff.

      Bad drivers. Bad hardware. Annoying/Bad software.

      For example, we've a SB Live 24 (USB) where if you press the mute on the device, the mute works, but to unmute you have to reset the whole thing! And the recording volume is messed up. Maybe I'm doing something wrong, but I don't have many complaints about the built-in Realtek or Via Audio.

      Then there was that friend who bought one of those au
  • by Ostracus ( 1354233 ) on Thursday February 19, 2009 @01:46AM (#26913397) Journal

    "At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.""

    Sounds like he reads slashdot.

    • by theheadlessrabbit ( 1022587 ) on Thursday February 19, 2009 @01:55AM (#26913457) Homepage Journal

      "At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.""

      Sounds like he reads slashdot.

      it sounds more like he NEVER reads slashdot.
      He said nothing about welcoming overlords, Natalie Portman, or hot grits. the phrase IANAL never came up, and no car analogies were used.
      how does he sound like a slashdotter?

      • by chaboud ( 231590 )

        Has it ever bothered anyone else that IANAL looks so much like "I anal," and is that an invitation?

        I just noticed this today, oddly, but it's bugging the hell out of me.

        More on the point, there was no racist ripping, no your/you're/yor or their/there/I-like-my-cat confusion, so, yeah, this isn't very slashdotty. That said, those things are the things that people put out when they *post* on slashdot. I'd wager that readership is 20:1, at least.

      • You must not read Slashdot either, because you completely missed the obvious meme of all:

        "The CPU is dying -- Netcraft confirms it!"

      • In Soviet Russia, meme forgets you!

  • Yeesh, does he ever say anything in public that doesn't sound like drug-addled desperate bluster...? He's like the Ken Kutaragi of the PC world...

  • Typical bluster (Score:4, Interesting)

    by CajunArson ( 465943 ) on Thursday February 19, 2009 @02:05AM (#26913513) Journal

    Jen-Hsun Huang has never been one to keep his trap shut when given the chance... even though Nvidia is in the red right now. Lesson one: When a CEO comes out and tries to use a legal dispute related to a contract as a pulpit to make a religious sermon, he knows he's wrong. See Darl McBride and Hector Ruiz as other examples of dumbass CEOs who love to see themselves in magazines but don't want to be bothered with pesky details like turning a profit or actually competing.
        Intel is #1 in graphics when it comes to shipments... now I'm not saying I'd want to play 3D games on their chips, but guess what: despite what you see on Slashdot, very few users want to play these games. Further, I've got the crappy Intel integrated graphics on my laptop, and Kubuntu with KDE 4.2 is running quite well thanks to the 100% open source drivers that Intel has had its own employees working on for several years. I'm not saying Intel graphics will play Crysis, but they do get the job done without binary blobs.
        Turning the tables on Huang, the real "fear" here is of Larrabee... this bad-boy is not going to even require "drivers" in the conventional sense, it will be an open stripped-down x86 chip designed for massive SIMD and parallelism... imagine what the Linux developers will be able to do with that not only in graphics but for GPGPU using OpenCL. Will it necessarily be faster than the top-end Nvidia chips? Probably not... but it could mean the end of Nvidia's proprietary driver blobs for most Linux users who can get good performance AND an open architecture... THAT is what scares Nvidia.

    • Re:Typical bluster (Score:5, Interesting)

      by CajunArson ( 465943 ) on Thursday February 19, 2009 @02:11AM (#26913537) Journal

      I hate to respond to myself but: Yeah the market share of Linux is not huge, Nvidia is probably not terrified of losing sales to Larrabee on some desktop Linux boxes (high end supercomputing apps could be an interesting niche they might care about though). However, it is afraid that OEMs will be interested in Larrabee as a discrete card where Intel never had a solution before. Given the problems that Nvidia has had with execution over the last year, and the fact that Intel knows how to keep suppliers happy, THAT is what Nvidia is really afraid of.


    • You are insane if you think driver source code availability makes a measurable difference in the business of either chipmaker.

    • Turning the tables on Huang, the real "fear" here is of Larrabee... this bad-boy is not going to even require "drivers" in the conventional sense, it will be an open stripped-down x86 chip designed for massive SIMD and parallelism...

      What do you mean no drivers? If it doesn't work with existing games it'll be a total failure, so it has to be compatible with DirectX and OpenGL. Which surely means drivers.

      Besides, Nvidia doesn't give a damn about Linux in comparison to games. From that perspective it really doesn't look too imposing. These are my impressions from its Wikipedia entry:

      Performance: 25 cores at 1 GHz gives you Gears of War at 60 FPS at 1600x1200 with no anti-aliasing. That places it in the mid-high range of existing G

    • by Maudib ( 223520 )

      "now I'm not saying I'd want to play 3D games on their chips, but guess what: despite what you see on Slashdot, very few users want to play these games"

      Dude, World of Warcraft alone has 12 million subscribers. It's the highest grossing media ever. Do you think Sony and Microsoft have spent billions on gaming because it's a niche market?

      2007 Hollywood box office revenue was $9.6 billion. Total revenue in '08, including DVDs, TV rights, foreign and domestic, was around $50 billion.
      2008 video game sales in the U.S. al

  • by Shag ( 3737 )

    Intel, Nvidia and AMD helped Apple formulate the original proposal for OpenCL... Intel makes Apple's CPUs, Nvidia increasingly makes the GPUs (sometimes 2 in a single laptop). So there's bound to be some smack-talking about CPUs vs. GPUs and all that.

    I think Apple will be the first to have OpenCL support in an OS, and as others follow suit and we see more CPUs and GPUs in machines, this little tiff might conceivably end up meaning... something.

    • Having just seen a friend's presentation on the rise of GPU computing I'm looking forward to seeing OpenCL come out. Just on his test rig of a single Nvidia Tesla with 128 cores, a few hundred Gigaflops of number crunching was very interesting. He was also cheap and didn't throw in 3 more cards to get up to a few teraflops of processing. OpenCL could be a game changer for personal computers.
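      For a rough idea of what that kind of OpenCL number crunching looks like in practice, here is a minimal, illustrative vector-add sketch in C. It is only a sketch: it assumes the standard OpenCL 1.0 host API and a single available GPU device, and it skips all error checking.

      /* Minimal OpenCL vector-add sketch (illustrative; assumes <CL/cl.h> and one GPU device). */
      #include <stdio.h>
      #include <CL/cl.h>

      /* Kernel source: each work-item adds one pair of elements. */
      static const char *src =
          "__kernel void vadd(__global const float *a, __global const float *b, __global float *c) {"
          "    int i = get_global_id(0);"
          "    c[i] = a[i] + b[i];"
          "}";

      int main(void) {
          enum { N = 1024 };
          float a[N], b[N], c[N];
          for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

          /* Pick the first platform and GPU device, then create a context and command queue. */
          cl_platform_id plat; cl_device_id dev;
          clGetPlatformIDs(1, &plat, NULL);
          clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
          cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
          cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

          /* Copy inputs into device buffers and allocate the output buffer. */
          cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
          cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
          cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

          /* Build the kernel and bind its arguments. */
          cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
          clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
          cl_kernel k = clCreateKernel(prog, "vadd", NULL);
          clSetKernelArg(k, 0, sizeof da, &da);
          clSetKernelArg(k, 1, sizeof db, &db);
          clSetKernelArg(k, 2, sizeof dc, &dc);

          /* Launch N work-items, one per element, then read the result back. */
          size_t global = N;
          clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
          clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

          printf("c[42] = %f (expected %f)\n", c[42], 3.0f * 42);
          return 0;
      }

      The same pattern scales from a laptop's integrated GPU to a Tesla card; only the device picked by clGetDeviceIDs changes, which is a big part of why people expect OpenCL to matter on ordinary PCs.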
    • by tyrione ( 134248 )

      Intel, Nvidia and AMD helped Apple formulate the original proposal for OpenCL... Intel makes Apple's CPUs, Nvidia increasingly makes the GPUs (sometimes 2 in a single laptop). So there's bound to be some smack-talking about CPUs vs. GPUs and all that.

      I think Apple will be the first to have OpenCL support in an OS, and as others follow suit and we see more CPUs and GPUs in machines, this little tiff might conceivably end up meaning... something.

      Yes, it could mean Apple moves to an AMD/ATI combo of integrated and dedicated graphics, seeing as AMD will be OpenCL-compliant by the time Snow Leopard is here.

  • by seeker_1us ( 1203072 ) on Thursday February 19, 2009 @05:05AM (#26914457)
    More like strawman and double strawman. Jen-Hsun Huang talks about GPUs. Intel is talking about chipsets.

    You can plug an NVIDIA GPU card into an Intel motherboard (I did just that for the computer I am using).

    I have no idea why Intel wouldn't want Nvidia to make chipsets for Core i7. For some reason, even years after AMD bought ATI, the only Intel mainboards which support two linked graphics cards do so through Crossfire. So if Nvidia doesn't make chipsets to support Core i7, Intel would be forcing the hardcore gamers to either (a) buy AMD's video chips to use Crossfire or (b) buy AMD's CPUs to use NVidia SLI.

  • I'm always uncomfortable when a CEO goes on a crusade like this:

    At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business.

    Errrrrr, I think you'll find it's the other way around, mate. That is, after all, why you're making comments like this?

  • It's exactly this kind of bullshit that has killed PC gaming, and driven most of the video game development towards consoles.

    Absent the "Next-Gen" feel of a PC platform (3D performance has always felt one generation ahead of consoles) players have no incentive to upgrade their PCs every 18 months like they did at the turn of the century.

    In five years, the only games you'll be playing on your PC will be independent titles that can't get licensing agreements on XBL or WiiWare, and flash/browser games.

    Thank yo

    • Boy isn't it just! [steampowered.com]

      That's a peak of 1.6 million concurrent users in the last 48 hours alone. For comparison Microsoft were boasting about Xbox Live reaching 1.5 million concurrent users back in January, the highest peak they'd ever had.

      Your argument about the 'next-gen' concept seems self-refuting. The consoles are already behind PCs a bit in terms of graphics, and in five years they won't be any better but the PC will have moved on considerably. They can't even move on in complexity because they're s
  • Comment removed based on user account deletion
    • In my experience GeForce cards are usually seriously CPU-limited. Or at least, they can be, and they would be if you paired them with an Atom. On the other hand, maybe nVidia could do a major process shrink on an old GeForce?
