Mirror’s Edge: Do we have a winner?

And now we get to the final test. Something truly different: Mirror’s Edge.

This is an EA game. Ben had to leave before we got to this part of the test (he does have a wife and kid, after all), so I went at this one alone.

I’d never played Mirror’s Edge. I’d seen the videos, and it looked interesting. You play as a girl, Faith, a runner. You run across rooftops and through buildings; it’s all very parkour-like. You’re often being pursued by “blues”, police officers, as you run through the game. I won’t give away any plot details here, but this game, I liked.

GPU accelerated PhysX affects things like how glass shatters and adds destructible cloth. We posted a video of what the game looks like with NVIDIA GPU accelerated PhysX enabled late last year:

"Here is the side by side video showing better what DICE has added to Mirror's Edge for the PC with PhysX. Please note that the makers of the video (not us) slowed down the game during some effects to better show them off. The slow downs are not performance related issues. Also, the video is best viewed in full screen mode (the button in the bottom right corner)."


In Derek’s blog about the game he said the following:

“We still want to really get our hands on the game to see if it feels worth it, but from this video, we can at least say that there is more positive visual impact in Mirror's Edge than any major title that has used PhysX to date. NVIDIA is really trying to get developers to build something compelling out of PhysX, and Mirror's Edge has potential. We are anxious to see if the follow through is there.”

Well, we have had our hands on the game and I’ve played it quite a bit. I started with PhysX enabled. I was looking for the SSD-effect. I wanted to play with it on then take it away and see if I missed it. I played through the first couple of chapters with PhysX enabled, fell in lust with the game and then turned off PhysX.

I missed it.

I actually missed it. What did it for me was the way the glass shattered. When I was being pursued by blues and they were firing at me as I ran through a hallway full of windows, the hardware accelerated PhysX version was more believable. I felt more like I was in a movie than in a video game. Don’t get me wrong, it wasn’t hyper realistic, but the effect was noticeable.

I replayed a couple of chapters and then played some new ones with PhysX disabled, before turning it back on and repeating the test.

The impact of GPU accelerated PhysX was noticeable. EA had done it right.

The Verdict?

So am I sold? Would I gladly choose a slower NVIDIA part because of PhysX support? Of course not.

I enjoyed GPU accelerated PhysX in Mirror’s Edge because it’s a good game to begin with. The implementation is subtle, but it augments an already visually interesting title. It makes the gameplay experience slightly more engrossing.

It’s a nice bonus if I already own an NVIDIA GPU; it’s not a reason to buy one.

The fact of the matter is that Mirror’s Edge should be the bare minimum requirement for GPU accelerated PhysX in games. The game has to be good to begin with and the effects should be the cherry on top. Crappy titles and gimmicky physics aren’t going to convince anyone. Aggressive marketing on top of that is merely going to push people like us to call GPU accelerated PhysX out for what it is. I can’t even call the overall implementations I’ve seen in games half baked, the oven isn’t even preheated yet. Mirror’s Edge so far is an outlier. You can pick a string of cheese off of a casserole and like it, but without some serious time in the oven it’s not going to be a good meal.

Then there’s the OpenCL argument. NVIDIA won’t port PhysX to OpenCL, at least not anytime soon. But Havok is being ported to OpenCL, which means that by the end of this year any game using Havok could get GPU accelerated physics on any OpenCL compliant video card (NVIDIA, ATI and, when Larrabee comes out, Intel).

While I do believe that NVIDIA and EA were on to something with the implementation of PhysX in Mirror’s Edge, I do not believe NVIDIA is strong enough to drive the entire market on its own. Cross platform APIs like OpenCL will be the future of GPU accelerated physics; they have to be, simply because NVIDIA isn’t the only game in town. The majority of PhysX titles aren’t accelerated on NVIDIA GPUs, and I suspect it won’t take long for OpenCL accelerated Havok titles to equal that number once the port is ready.

Until we get a standard for GPU accelerated physics that all GPU vendors can use or until NVIDIA can somehow convince every major game developer to include compelling features that will only be accelerated on NVIDIA hardware, hardware PhysX will be nothing more than fancy lettering on a cake.

You wanted us to look at PhysX in a review of an ATI GPU, and there you have it.


294 Comments


  • Psyside - Thursday, April 2, 2009 - link

    Can anyone tell me about the testing method, average or maximum fps? Thanks.
  • Jamahl - Thursday, April 2, 2009 - link

    Some sites have the gtx275 clearly winning at all games, all resolutions.
  • helldrell666 - Thursday, April 2, 2009 - link

    You can't trust every site you check, especially since most of those sites don't post their funders' names on their main page. You must've heard of HardOCP's Kyle, who was fired by nvidia because he mentioned that the gtx250 is a renamed 9800gtx.
  • 7Enigma - Thursday, April 2, 2009 - link

    I think this is due to Nvidia shooting themselves in the leg with the 185 drivers. With the performance penalty at the normal resolutions, anyone testing with the 185's is going to get lower results than someone testing with the previous drivers. And I'm sure you could find 10 games that all perform better on ATI/NVIDIA. That's the problem with game selection and the only real answer is what types of games you play and what engines you think will be used heavily for the next 2 years.
  • SiliconDoc - Monday, April 6, 2009 - link

    Well the REAL ANSWER is - if you play at 2650, or even if you don't, and have been a red raging babbling lying idiot red rooster for 6 months plus pretending along with Derek that 2650x is the only thing that matters, now you have a driver for NVidia that whips the ati top dog core...
    If you're ready to reverse 6 months of red ranting and raving for 2560X ati wins it all, just keep the prior NV driver, so the red roosters screaming they now win because they suddenly are stuck at the LOWER REZ tier to claim a win, can be blasted to pieces anyway- at that resolution.
    So - NVidia now has a driver choice - the new for the high rez crown they took from the red fanboy ragers, and the prior driver which SPANKS THE RED CARD AGAIN at the lower rez.
    Make sure to collude with all the raging red roosters to keep that as hush hush as possible.
    1. spank the 790 at lower rez with the older Nvidia driver
    2. spank the 790 at the highest rez with the new driver
    _______________________

    Don't worry if you can't understand just keep hopping around flapping those little wings and clucking so that red gobbler jounces around - don't worry soft PhysX can display that flabby flapper !
  • The0ne - Tuesday, April 7, 2009 - link

    Can someone ban this freaking idiot. The last few posts of his have been nothing but moronic, senseless rants. Jesus Christ, buy a gun and shoot yourself already.
  • SiliconDoc - Tuesday, April 7, 2009 - link

    Ahh, you don't like the points, so now you want death. Perhaps you should be banned, mr death wisher.
    If you don't like the DOZENS of valid points I made, TOO BAD - because you have no response - now you sound like krz1000 and his endless list of names, the looney red rooster that screeches the same thing you just did, then posts a link to youtube with a freaky slaughter video.
    If I wasn't here, the endless LIES would go unopposed, now GO BACK and respond to my points LIKE MAN, if you have anything, which no doubt, you do not.
  • helldrell666 - Thursday, April 2, 2009 - link

    According to xbitlabs, the 4890 beats the gtx285 at 1920x1200 with 4x AA in CoD5, Crysis Warhead, Stalker CS and Fallout 3, and loses in Far Cry 2. Here, the 4890 matches it in Far Cry 2 and CoD5, with slightly lower fps than the gtx285 in Crysis Warhead.

    Strange....
  • 7Enigma - Thursday, April 2, 2009 - link

    That is crazy. There is no way variations should be that huge between the 2 tests, regardless of the area they chose to test in the game. Anandtech has it as essentially a wash, while Xbit has the 4890 20% faster!?! (COD:WaW)
  • 7Enigma - Thursday, April 2, 2009 - link

    Just looked closer at the Xbitlabs review. The card they used was an OC variant that had 900MHz core instead of the stock 850MHz. In certain games that are not super graphically intensive I'm willing to bet at 1920X1200 they may still be core starved and not memory starved so a 50MHz increase may explain the discrepancy.

    I've got to admit you need to take the Xbitlabs article with a grain of salt if they are using the OC variant as the base 4890 in all of their charts....that's pretty shady...
