Putting this PhysX Business to Rest

Let me put things in perspective. Remember our Radeon HD 4870/4850 article that went up last year? It was a straight crown-robbing on ATI’s part; NVIDIA had no competitively priced response at the time.

About two hours before the NDA lifted on the Radeon HD 4800 series, we got an urgent call from NVIDIA. The purpose of the call? To persuade us to weigh PhysX and CUDA support as major benefits of GeForce GPUs. A performance win by ATI shouldn’t matter, the argument went, because ATI can’t accelerate PhysX in hardware and can’t run CUDA applications.

The argument NVIDIA gave us was preposterous: the global economy was weakening, and NVIDIA cautioned us against recommending a card that would prove to be the wrong choice 12 months later, because new PhysX titles and new CUDA applications were supposedly right around the corner.

The tactics obviously didn’t work, and history showed that despite NVIDIA’s doomsday warnings, Radeon HD 4800 series owners didn’t live to regret their purchases. Yes, the global economy did take a turn for the worse, but no, NVIDIA’s PhysX and CUDA support did nothing to incite buyer’s remorse in anyone who purchased a 4800 series card. The only thing those users got was higher frame rates. (If you did buy a Radeon HD 4870/4850 and severely regret your purchase due to the lack of PhysX/CUDA support, please post in the comments.)

This wasn’t a one-time thing. NVIDIA has delivered the same tired message at every opportunity. Its latest attempt was to punish reviewers who hadn’t been sold on the PhysX/CUDA message by not sending them GeForce GTS 250 cards for review. The plan seemed to backfire thanks to one vigilant Inquirer reporter.

More recently we had our briefing for the GeForce GTX 275. The presentation was 53 slides long; the length wasn’t bothersome, but look at how the content broke down:

Slides About...                              Number of Slides in NVIDIA's GTX 275 Presentation
The GeForce GTX 275                          8
PhysX/CUDA                                   34
Miscellaneous (DX11, Title Slides, etc.)     11

You could argue that NVIDIA truly believes PhysX and CUDA support are the strongest features of its GPUs. You could also argue that NVIDIA is trying to justify a premium for its much larger GPUs rather than having to sell them as cheaply as possible to stand up to an unusually competitive ATI.

NVIDIA’s stance is that when you buy a GeForce GPU, it’s about more than just how well it runs games. It’s about everything else you can run on it, whether that means in-game GPU-accelerated PhysX or CUDA applications.

Maybe we’ve been wrong this entire time. Maybe instead of just presenting you with bar charts of which GPU is faster, we should be penalizing ATI GPUs for not being able to run CUDA code or accelerate PhysX. Self-reflection is an important human trait, so let’s see whether NVIDIA is truly right about the value of PhysX and CUDA today.


294 Comments

  • piesquared - Thursday, April 2, 2009 - link

    Must be tough trying to write a balanced review when you clearly favour one side of the equation. Seriously, you toe NV's line without hesitation, including soon to be extinct physx, a reviewer released card, and unreleased drivers at the time of your review. And here's the kicker: you ignore the OC potential of AMD's new card, which, as you know, is one of its major selling points.

    Could you possibly bend over any further for NV? Obviously you are perfectly willing to do so. F'n frauds
  • Chlorus - Friday, April 3, 2009 - link

    What?! Did you even read the article? They specifically say they cannot really endorse PhysX or CUDA and note the lack of support in any games. I think you're the one toeing a line here.
  • SiliconDoc - Monday, April 6, 2009 - link

    The red fanboys have to chime in with insanities so the reviewers can claim they're fair because "both sides complain".
    Yes, red rooster whiner never read the article, because if he had he would remember the line that neither overclocked well, and that overclocking would come in a future review (in other words, they were rushed again, or got a chum card and knew it - whatever).
    So, they didn't ignore it; they failed on execution and delayed it for later, so they say.
    Yeah, red rooster boy didn't read.
  • tamalero - Thursday, April 9, 2009 - link

    jesus dude, you have a strong persecution complex, right?
    it's like "ohh noes, they're going against my beloved nvidia, I MUST STOP THEM AT ALL COSTS".
    I wonder how much nvidia pays you? (if not, you're sad..)
  • SiliconDoc - Thursday, April 23, 2009 - link

    That's interesting, not a single counterpoint, just two whining personal attacks.
    Better luck next time - keep flapping those red rooster wings.
    (You don't have any decent counterpoints to the truth, do you, flapper?)
    Sometimes things are so out of hand someone has to say it - I'm still waiting for the logical rebuttals - but you don't have any, neither does anyone else.
  • aguilpa1 - Thursday, April 2, 2009 - link

    All these guys talking about how irrelevant physx is and how not so many games use it don't get it. The power of physx is bringing the full strength of those GPUs to bear on everyday apps like CS4 or Badaboom video encoding. I used to think it was kind of gimmicky myself until I bought the "very" inexpensive badaboom encoder and wow, how awesome was that! I forgot all about the games.
  • Rhino2 - Monday, April 13, 2009 - link

    You forgot all about gaming because you can encode video faster? I guess we are just 2 different people. I don't think I've ever needed to encode a video for my ipod in 60 seconds or less, but I do play a lot of games.
  • z3R0C00L - Thursday, April 2, 2009 - link

    You're talking about CUDA, not Physx.

    Physx is useless as HavokFX will replace it as a standard through OpenCL.
  • sbuckler - Thursday, April 2, 2009 - link

    No, physx has the market; HavokFX is currently demoing what physx did 2 years ago.

    What will happen is the moment HavokFX becomes anything approaching a threat nvidia will port Physx to OpenCL and kill it.

    As far as ATI users are concerned the end result is the same - you'll be able to use physics acceleration on your card.
  • z3R0C00L - Thursday, April 2, 2009 - link

    You do realize that Havok Physics is used in more games than Physx, right (including all the Source engine based games)?

    And that Diablo 3 makes use of Havok Physics, right? Just thought I'd mention that to give you time to change your conclusion.
