Left 4 Dead Analysis

Based on Valve's Source engine, Left 4 Dead runs fairly smoothly on every card we tested, at every resolution, with maximum settings. The game is definitely not bad looking either, so getting playable framerates on the Radeon HD 4850 at 2560x1600 is no small feat. We test with a custom demo built around a heavy swarm of zombies in an outdoor area, and though the performance impact is as heavy as we could make it in our benchmark, it may still be possible to hit situations at very high resolution where the lowest end cards stutter with lots of enemies around.
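As an aside on reading these numbers, an average framerate can look healthy while a sudden swarm still produces visible hitching. Below is a minimal sketch of that distinction (this is not our benchmark harness, and the frame times in it are purely hypothetical):

#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical per-frame render times in milliseconds from a demo run;
    // the two long frames stand in for a sudden zombie rush.
    std::vector<double> frameMs = {16.2, 15.8, 16.5, 17.0, 16.1,
                                   48.7, 52.3, 16.4, 16.0, 15.9};

    double totalMs = 0.0;
    for (double ms : frameMs) totalMs += ms;

    // Average framerate over the whole run.
    double avgFps = 1000.0 * frameMs.size() / totalMs;

    // Worst single frame -- this is what the player feels as a hitch.
    double worstMs = *std::max_element(frameMs.begin(), frameMs.end());

    std::printf("average: %.1f fps, worst frame: %.1f ms (%.1f fps)\n",
                avgFps, worstMs, 1000.0 / worstMs);
    return 0;
}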




[Performance charts: 1680x1050 / 1920x1200 / 2560x1600]


At 1680x1050 and 1920x1200, CrossFire and single ATI cards tend to do better than their NVIDIA counterparts. The GTX 295 does hang out near the top, though. Oddly, 9800 GTX+ SLI does better than the two-card GT200 solutions. If it were just the single-card GTX 295 outperforming the two-card options, we would suspect a bus bandwidth or latency issue; instead, it seems something else is limiting the performance of the NVIDIA GT200 SLI options. Of course, with framerates this high, we aren't exactly complaining. We recommend turning on triple buffering for this game, both to eliminate tearing and to minimize the input latency that comes with plain vsync.
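For those curious about what triple buffering actually changes, the sketch below shows the idea in generic Direct3D 9 terms (this is illustrative only, not Source engine code, and the resolution and format values are placeholders): with two back buffers plus the front buffer, the GPU can start on the next frame while a finished one waits for the refresh, which is where the latency advantage over plain double-buffered vsync comes from.

#include <windows.h>
#include <d3d9.h>

// Fills a present-parameters block for full-screen rendering with vsync and
// triple buffering. Resolution and format are placeholder values.
void FillPresentParams(D3DPRESENT_PARAMETERS& pp, HWND hwnd)
{
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed             = FALSE;
    pp.BackBufferWidth      = 2560;
    pp.BackBufferHeight     = 1600;
    pp.BackBufferFormat     = D3DFMT_X8R8G8B8;
    pp.BackBufferCount      = 2;                       // two back buffers = triple buffering
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow        = hwnd;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // wait for vertical refresh (vsync)
}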




[Multi-GPU scaling charts: 1680x1050 / 1920x1200 / 2560x1600]


Performance at 2560x1600 shifts the playing field, putting SLI and CrossFire on more equal footing. While NVIDIA's GTX 260 options lead the competing 4870 class options, the 4850 does very well against its competition. There are no real disappointments in this game with any multi-GPU solution, though.

When it comes to scaling, at lower resolutions CPU/system-limited performance caps the improvement that multiple GPUs can provide. The 9800 GTX+ and 4850 are the only cards that see any real improvement at 1680x1050, and it isn't much better at 1920x1200. Moving up to 2560x1600, we finally see most of the players above 50% scaling. The exceptions are the setups built on the fastest single GPUs in this test: the GTX 280 and GTX 285.
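For reference, the scaling number we're talking about is just the percentage gain the second GPU delivers over a single card. A quick sketch with made-up framerates (not measured results):

#include <cstdio>

// Percentage improvement of a dual-GPU setup over a single card.
double ScalingPercent(double singleGpuFps, double dualGpuFps)
{
    return (dualGpuFps / singleGpuFps - 1.0) * 100.0;
}

int main()
{
    // Hypothetical numbers: a CPU/system-limited low-resolution run versus
    // a GPU-limited 2560x1600 run.
    std::printf("1680x1050: %.0f%% scaling\n", ScalingPercent(120.0, 130.0));
    std::printf("2560x1600: %.0f%% scaling\n", ScalingPercent(45.0, 72.0));
    return 0;
}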




[Performance per dollar charts: 1680x1050 / 1920x1200 / 2560x1600]


Between the two "low" resolutions we test, there's no change in the value lineup even though there are shifts in the performance lineup. At these resolutions, multi-GPU options tend not to be a good investment because of the CPU/system limit. The exceptions are 4850 CrossFire and 9800 GTX+ SLI, because their single-card counterparts are GPU limited and two GPUs deliver better scaling for the money. At higher resolution, value compresses more; the 4870 1GB drops in value while NVIDIA hardware pushes up a bit.
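The value metric here is simply framerate divided by price. The rough sketch below, with invented prices and framerates, shows why dual-GPU value looks poor when the CPU is the limit and why the gap narrows once the single card becomes GPU limited at 2560x1600:

#include <cstdio>

struct Config {
    const char* name;
    double fps;       // average framerate (hypothetical)
    double priceUsd;  // total price of the configuration (hypothetical)
};

int main()
{
    // Invented numbers: at a CPU-limited resolution the second GPU adds cost
    // without adding much performance; at 2560x1600 the fps-per-dollar gap
    // between one and two GPUs narrows.
    const Config configs[] = {
        {"single card, 1680x1050", 110.0, 180.0},
        {"CrossFire,   1680x1050", 120.0, 360.0},
        {"single card, 2560x1600",  40.0, 180.0},
        {"CrossFire,   2560x1600",  68.0, 360.0},
    };

    for (const Config& c : configs) {
        std::printf("%-24s %5.1f fps / $%.0f = %.3f fps per dollar\n",
                    c.name, c.fps, c.priceUsd, c.fps / c.priceUsd);
    }
    return 0;
}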

Comments

  • kmmatney - Monday, February 23, 2009 - link

    Especially at the 1920x1200 resolution - that resolution is becoming a sweet spot nowadays.
  • just4U - Monday, February 23, 2009 - link

    I disagree. I see people finally moving away from their older 17-19" flat panels directly into 22" widescreens. 24" panels and 1920x1200 resolutions are nowhere near the norm.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Correct, but he said sweet spot because his/her wallet is just getting bulgy enough to contemplate a move in that direction... so - even he/she is sadly stuck at "the end user resolution"...
    lol
    Yes, oh well. I'm sure everyone is driving a Maserati until you open their garage door... or golly, that "EVO" just disappeared... must have been stolen.
  • DerekWilson - Monday, February 23, 2009 - link

    The 1GB version should perform very similarly to the two 4850 cards in CrossFire.

    The short answer is that the 1GB version won't have what it takes for 2560x1600 but it might work out well for lower resolutions.

    We don't have a 1GB version, so we can't get more specific than that, though this is enough data to make a purchasing decision -- just look at the 4850 CrossFire option and take into consideration the cheaper price on the 1GB X2.
  • politbureau - Tuesday, June 1, 2010 - link

    I realize this is an older article, however I always find it interesting to read when upgrading cards.

    While I find it admirable that Derek has compared the 'older' GTX 280 SLI scaling, it is unfortunate that he hasn't pointed out that it should perform identically to the GTX 285s if the clocks were the same.

    This was also passed over in the "worthy successor" article, where it does not compare clock-for-clock numbers - an obvious test if we want to discover the full value of the die shrink.

    I recently 'upgraded' to 3 GTX 285s from 3 GTX 280s through a warranty program with the manufacturer, and there is little to no difference in performance between the two setups. While cabling is more convenient (no 6-to-8-pin adapters), the 285s won't clock any better than my 280s would, Vantage scores are within a couple hundred points of each other at the same clocks (the 280s actually leading), and the temperature and fan speed of the new cards haven't improved.

    I think this is a valuable point in an article that compares performance per dollar, and while slightly outside the scope of the article, I think it's a probative observation to make.
