Final Words

Today's launch is strange. I tried to convince NVIDIA to release more information about Fermi but was met with staunch resistance from the company. NVIDIA claims that pre-announcing Fermi's performance levels would seriously hurt its existing business. It's up to you whether or not you want to believe that.

Last quarter the Tesla business unit made $10M. That's not a whole lot of money for a company that, at its peak, grossed $1B in a single quarter. NVIDIA believes that Fermi is when that will all change. To borrow a horrendously overused phrase, Fermi is the inflection point for NVIDIA's Tesla sales.

By adding support for ECC, enabling C++, and offering easier Visual Studio integration, NVIDIA believes that Fermi will open its Tesla business up to a group of clients that would previously not so much as speak to NVIDIA. ECC is the killer feature there.
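
To make the C++ point concrete, here's a minimal sketch (my illustration, not NVIDIA's code, with hypothetical names) of the kind of device-side C++ that Fermi-class hardware and its CUDA toolchain are meant to allow: virtual functions and new/delete inside a kernel, things earlier GPUs couldn't do.

```
// Hypothetical sketch: device-side C++ with virtual dispatch and new/delete,
// features that require Fermi-class (compute capability 2.0+) hardware.
// The names (Op, Square, demo) are illustrative only.
#include <cstdio>
#include <cuda_runtime.h>

struct Op {
    __device__ virtual float apply(float x) const { return x; }
    __device__ virtual ~Op() {}
};

struct Square : public Op {
    __device__ float apply(float x) const { return x * x; }
};

__global__ void demo(float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // Objects created and destroyed on the device, with a virtual call:
    // the kind of C++ Fermi enables in kernel code.
    Op* op = new Square();
    out[i] = op->apply(static_cast<float>(i));
    delete op;
}

int main()
{
    const int n = 256;
    float* d_out = 0;
    cudaMalloc(&d_out, n * sizeof(float));

    demo<<<(n + 127) / 128, 128>>>(d_out, n);
    cudaDeviceSynchronize();

    float h_out[n];
    cudaMemcpy(h_out, d_out, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("out[10] = %.1f\n", h_out[10]);  // expect 100.0

    cudaFree(d_out);
    return 0;
}
```

Whether that alone wins over HPC shops is another question, but it's the capability NVIDIA is selling alongside ECC.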

While the bulk of NVIDIA's revenue today comes from 3D graphics, NVIDIA believes that Tegra (mobile) and Tesla are the future growth segments for the company. This hints at a very troubling future for GPU makers - are we soon approaching the Atom-ization of graphics cards?

Will 2010 be the beginning of good enough performance in PC games? Display resolutions have pretty much stagnated, and PC games are developed on consoles first; console hardware is inferior, so GPU requirements don't climb as quickly. The fact that NVIDIA is looking to Tegra and Tesla to grow the company is very telling. Then again, perhaps a brand new approach to graphics is what we'll need to re-invigorate PC game development. Larrabee.

If the TAM for GPUs in HPC is so big, why did NVIDIA only make $10M last quarter? If you ask NVIDIA, it has to do with focus and sales.

According to NVIDIA, its Tesla sales efforts over the past couple of years have been scattered. The focus was on selling to any customer that could potentially see a speedup, just to gain some traction for the Tesla business.

Jen-Hsun did some yelling and now NVIDIA is a bit more focused in that department. If Tesla revenues increase linearly from this point, that's simply not going to be enough. I asked NVIDIA if exponential growth for Tesla was in the cards and, if so, when it would happen. The answer was yes, and with Fermi.

We'll see how that plays out, but if Fermi doesn't significantly increase Tesla revenues then we know that NVIDIA is in serious trouble.

The architecture looks good; Fermi just needs to be priced right. Oh, and the chip needs to hurry up and come out.

415 Comments

  • SiliconDoc - Thursday, October 1, 2009 - link

    The R600 was great, you idiot.
    Of course, when hating NVIDIA is your real gig, I don't expect you to do anything but parrot someone else's text, get the idea wrong, and repeat it incorrectly.
    -
    The R600 was and is great, and has held up a long time, like the G80. Of course if you actually had a clue, you'd know that, and be aware that you refuted your own attempt at a counterpoint, since the R600 was "great on paper" and also "in gaming machines".

    It's a lot of fun watching so many fools disprove their own point while trying to do anything other than scream like lunatics.

    Great job: you put down a really good ATI card, and slapped yourself and your point in doing it. It's pathetic, but I can't claim it's not SOP, so you have plenty of company.

  • papapapapapapapababy - Wednesday, September 30, 2009 - link

    Because both MS and Sony are copying Nintendo...

    That means the next consoles will be a minuscule speed bump, a low price, and (lame) motion control attached. All this tech is useless with no real killer app EXCLUSIVE FOR THE PC! But hey, who cares, let's play PONG at 900 fps!
  • Lonyo - Wednesday, September 30, 2009 - link

    Did you even read the article?
    The point of this tech is to move away from games, so the killer app for it won't be games, but HPC programs.
  • SiliconDoc - Thursday, October 1, 2009 - link

    I think the point is - the last GT200 was ALSO TESLA -- and so of course...
    It's the SECOND TIME the red roosters can cluck and cluck and cluck "it won't be any good" and "it's not for gaming".
    LOL
    Wrong before, wrong again, but never able to learn from their mistakes, the barnyard animals.
  • Zingam - Thursday, October 1, 2009 - link

    The last time I bought the most expensive GPU available was the Riva TNT!
    Sorry, but even if they offer this for gamers I won't be able to buy it. It is well above my budget.

    I'd buy based on quality/price/features, not based on who has the better card on paper in year 20xx.
  • SiliconDoc - Thursday, October 1, 2009 - link

    Well, for that, I am sorry in a sense, but on the other hand I find it hard to believe, depending on your location in the world.
    Better luck if you're stuck in a bad place, and good luck keeping your internet connection in that case.
  • ClownPuncher - Thursday, October 1, 2009 - link

    Or maybe he has other priorities besides being an asshole.
  • SiliconDoc - Thursday, October 1, 2009 - link

    Being unable, and choosing not to, are two different things.

    And generally speaking, ATI users are unable, and therefore cannot choose to, because they sit on that thing you talk about being.

    Now that's how you knock out a clown.
  • Lord 666 - Wednesday, September 30, 2009 - link

    That actually just made my day: seeing a VP of Marketing speak their mind.
  • Cybersciver - Friday, October 2, 2009 - link

    Yeah, that was cool.
    Don't know about you guys, but my interest in GPUs is gaming @ 1920x1200. From that POV it looks like NVIDIA's about to crack a coconut with a ten-ton press.
    My 280 runs just about everything flat-out (except Crysis, naturally) and the 5850 beats it. So why spend more? Most everything's a console port these days, and the consoles aren't slated for an upgrade till 2012, at least last I heard.
    Boo hoo.
    Guess that's why multi-screen gaming is starting to be pushed.
    No way Jose.
