Conclusion

So now that we have seen what both ATI and NVIDIA could do with their respective drivers in the previous generation, and how this is apparently influenced by console development, let's first talk a bit about NVIDIA specifically on the subject of PC-native games.

Overall, the performance improvements with the 6800 Ultra are a bit disappointing. With a new architecture, we had hoped NVIDIA would have been able to pull out some of their famous performance increases, and this did not happen. This is not to discount the 6800 Ultra, as the entire NV40-based family are all quite strong cards. However, part of competing in this market is working out performance improvements over a card's lifetime in order to win the mindshare of buyers by showing them that they're going to get more out of their purchases over time, and thus create better positioning for future products. With only a handful of significant (>10%) performance improvements, it's hard to say NVIDIA has really done this well.

Accordingly, NVIDIA doesn't end up following all of the trends we previously identified for ATI. In general, NVIDIA was able to provide a small general performance improvement with each of their drivers rather than relying solely on one-time performance boosts, but the most promising time period for a performance boost is still shortly after a game comes out. In fact, the biggest performance improvements NVIDIA achieved appear to have come from debugging and optimizing the drivers early in the life of the hardware. As with ATI's Catalyst drivers, there seems to be little reason to do a ForceWare driver upgrade unless there is a specific change in a driver you are interested in, and this likely isn't going to change with the NV40-derived G70 series.

On the positive side, we reiterate how impressed we are with NVIDIA's lack of performance improvements in 3DMark. Whether this is because they didn't devote resources to it, or because they simply couldn't pull it off, is something we may never know, but it's nice to see that their greatest improvements were in real games, not synthetic benchmarks. Similarly, we are glad to see over the last year and a half that they haven't made the same mistake as ATI in shoehorning users into a bloated toolset to manipulate their cards. The NVIDIA control panel is lean and mean, and that's the way we like it.

As for the burning question of performance improvements with drivers, ATI has clearly shown that they can work magic with their drivers if they need to. This in turn once again reiterates just how much of an impact drivers can have on overall performance, and ultimately on purchasing choices. Although even the largest of these performance changes likely would not have changed our initial recommendations for what to purchase, some come close to that threshold.

It is clear that just looking at a card once at its introduction is not enough; the performance improvements and corrections offered by later drivers are just too much to ignore. Yet at the same time, short of re-reviewing a dozen cards every time a new driver is released - and even then we can't very well tell the future - we have no way of really measuring this impact except with hindsight, which is by its very nature too late. We can offer guesses at what kind of performance improvements might be in the pipeline from NVIDIA and ATI, and certainly they will make every effort to tell the world how proud they are whenever they do manage to come up with such an improvement. The reality is that many of the proclaimed improvements are for uncommon settings - i.e. running a budget card at high resolutions with AA/AF enabled - and judging by the results of our regression testing, buyers should be wary of the marketing hype that's going on.

At the same time, the increase in games being ported from or developed alongside console versions paints a very different picture. With the current generation of consoles, NVIDIA has benefited from a nice performance lead from what we've seen today, as Xbox-only titles have been much harder on ATI than on NVIDIA. Since NVIDIA is effectively the de facto leader in console GPUs for the current generation (ATI only has a chip in the Gamecube, of which few titles are ported over to the PC), there appears to be little reason to expect major performance improvements out of NVIDIA's drivers on Xbox ports, while ATI appears to be able to work out (and need) performance improvements for Xbox-ported games.

That said, the number of games that will still be ported from current generation consoles to the PC is dwindling, which brings us to the discussion of the next generation. ATI powers the Xbox 360 with an R500-like chip and will power the Wii with what we believe to be an R300-like successor to the Gamecube GPU. Meanwhile, NVIDIA will be powering the PlayStation 3 with a straight-up G70-generation chip. If our results from the Xbox ports are any indication, it seems like there will be a natural favoritism in the case of games that are exclusive to one console or another. Ultimately, this may mean that the home team for a game will not need to improve driver performance for those games due to an inherent lead, while the visiting team will need to try and make some optimizations to catch up. This is in stark contrast to the way PC-native games work, and cross-platform titles are still up in the air entirely. This facet of performance will be an interesting subject to watch over the next few years as all the next generation consoles come out.

In the meantime, getting back to ATI, NVIDIA, and the PC, what have we learned after today? We still can't predict the future of PC games, drivers, or hardware, but after looking at these results and their track records, it certainly seems that, with the introduction of the X1900 and 7900 series, ATI may be in a better position to offer more substantial performance improvements than NVIDIA. As we've seen, NVIDIA hasn't been able to work a great deal of improvements out of the NV40/G70 architecture in our tests. With the R500 being quite different from the venerable R300, we're interested in seeing what sort of improvements ATI can get from their new architecture. If ATI stays ahead in the driver game, they may very well have what it takes to beat out NVIDIA in the majority of games running on the current generation architectures.

Comments

  • Schugy - Thursday, May 11, 2006 - link

    nVIDIA has done quite a good job with linux drivers. It's simple to run this script and start your favourite game afterwards. nVIDIA's reward will be that I'll buy a GF7800GT for AGP when it comes out. I wonder why the last anandtech linux story is more than 10 months old.
  • KickusAssius - Sunday, May 14, 2006 - link

    I have owned the 9000, 9500 PRO and 9700 PRO, and I just hate ATI's drivers. I had problems with at least half of the games I played. Gradually they fixed certain problems, but other problems were never resolved even after contacting ATI directly via email, and I was not the only one.

    Prior to this, I owned a Geforce 256 and Geforce 4 MX and Nvidia's drivers were nothing spectacular, but they always worked.

    Now, I have owned the 6800GT, and the 7800GTX and the drivers have been excellent for a long time now. The only problems I ever had were in CSS, but Nvidia fixed that problem relatively quickly. Also, I have never had a system crash as a result of Nvidia drivers, but several times this happened with my 9700 PRO, (still loved the card though especially when drivers got better). I think that now both sides have excellent drivers, though ATI's control panel is simply annoying.
  • johnsonx - Thursday, May 11, 2006 - link

    The image for Doom3 with the 60.72 driver can't be right; in the text you say there are major rendering issues with the 60.72 driver, but both images (normal and 'moused-over') look fine to me; the image presented as the 60.72 image looks identical to the 61.76 image presented below it.

  • ozzimark - Thursday, May 11, 2006 - link

    the pictures are too small for me to really even see anything
  • johnsonx - Thursday, May 11, 2006 - link

    oh, nevermind, i see it now
  • JarredWalton - Thursday, May 11, 2006 - link

    The rendering error isn't "major" -- that's why we kept the 60.72 results in the graph. You could certainly play the game with the 60.72 drivers and never realize there was a "problem". For anyone else trying to find the anomaly, look for the vertical lines right in the center of the screenshot.
  • synque - Thursday, May 11, 2006 - link

    I think the conclusion the article draws is completely meaningless. Nothing can be said about NVIDIA because they either failed to improve the drivers, or the drivers were close to "optimal" from the start.

    One could speculate that the ATI drivers weren't optimal because they could be improved. But even that'd be guessing, because the driver team most likely optimizes for popular games after they are released (which could lead to special optimizations).

    So I know exactly as much as I knew before reading the article. Weird.
  • z3R0C00L - Thursday, May 11, 2006 - link

    Umm not true,

    You forget that the x8x0 line are essentially built on an improved R300 design. They're not an all new part. This means ATi have gradually worked to improve their drivers. They were optimal to begin with since they're essentially the same driver SDK that reaches back to the time of the 9700 Pro.

    I have a feeling that if a 9800 Pro were tested you'd see the same improvements in non video card bottlenecked situations.

    This proves that the Catalyst driver team is superior to the Forceware driver team. But most of us have known this since Catalyst 3.0. Heck, even Microsoft has stated this, as well as 3rd party driver analysts. ATi's cards have fewer issues than nVIDIA cards and carry fewer unresolved issues from previous releases. There's a simple way of checking this: read the driver release notes from both ATi and nVIDIA. You'll notice FAR more unresolved issues with nVIDIA drivers than ATi drivers. Many of them major issues.

    One thing nVIDIA is better at, and this is a fact, is supporting older hardware (GeForce2,3,4). ATi's 8500 support is lackluster at best. I can't remember the last time a driver release focused on fixing issues that have existed since Catalyst 3.0 on the 8500 series.
  • Redofrac - Thursday, May 11, 2006 - link

    Since Catalyst 3.0? I'm going to have to disagree with that.
    From firsthand experience owning a 9700 Pro with the earlier catalysts, I have to say that they were crap. Having to try multiple releases of the driver to find one that was actually stable isn't quite what I'd use for the mark of a good driver team. Every once in a while they'd manage to get out a stable release, and I'd stick with that one until the next, lest random games start crashing on me or glitching.

    I can't speak for ATI's current drivers, which I'll assume are much better (dealing with drivers for a 9700 somewhat turned me away from ATI) but I find it a bit hard to say that 3.0 drivers were stable with a straight face.
  • LoneWolf15 - Friday, May 12, 2006 - link

    And I owned a Radeon 9700(non-pro) and had no problems at all; in fact, I'd daresay it's the best card I've ever owned in terms of performance/stability/longevity. But, one experience does not equal all.

    I'm not debating your experience, but what if you had a borderline power supply at the time, for example? That could easily cause some issues.

    Neither of us is a representative example of how the Radeon 95xx/97xx cards worked by ourselves. By and large though, the enthusiast community had very few issues with this series of cards.
