Conclusion

Now that we have seen what both ATI and NVIDIA could do with their respective drivers in the previous generation, and how console development apparently influences this, let's first talk about NVIDIA specifically on the subject of PC-native games.

Overall, the performance improvements with the 6800 Ultra are a bit disappointing. With a new architecture, we had hoped NVIDIA would be able to pull out some of their famous performance increases, but this did not happen. That is not to discount the 6800 Ultra, as the entire NV40-based family consists of quite strong cards. However, part of competing in this market is working out performance improvements over a card's lifetime: doing so wins the mindshare of buyers by showing them that they will get more out of their purchases over time, and thus creates better positioning for future products. With only a handful of significant (>10%) performance improvements, it's hard to say that NVIDIA has done this well.

Accordingly, NVIDIA doesn't end up sticking to all of the trends we previously listed and that ATI followed. In general, NVIDIA was able to provide a small across-the-board performance improvement with each driver release instead of focusing solely on one-time performance boosts, but the most promising window for a performance boost is still shortly after a game comes out. In fact, the biggest performance gains NVIDIA achieved appear to have come from debugging and optimizing the drivers early in the life of the hardware. As with ATI's Catalyst drivers, there seems to be little reason to upgrade ForceWare drivers unless a specific release contains a change you are interested in, and this likely isn't going to change with the NV40-derived G70 series.

On the positive side, we reiterate how impressed we are with NVIDIA's lack of performance improvements in 3DMark. Whether this is because they didn't devote resources to it or because they simply couldn't pull it off is something we may never know, but it's nice to see that their greatest improvements came in real games, not synthetic benchmarks. Similarly, we are glad to see that over the last year and a half they haven't made the same mistake as ATI in shoehorning users into a bloated toolset to manipulate their cards. The NVIDIA control panel is lean and mean, and that's the way we like it.

As for the burning question of performance improvements with drivers, ATI has clearly shown that they can work magic with their drivers when they need to. This in turn reiterates just how much of an impact drivers can have on overall performance, and ultimately on purchasing choices. Although even the largest of these performance changes would likely not have altered our initial purchasing recommendations, they come close to that threshold.

It is clear that just looking at a card once, at its introduction, is not enough; the performance improvements and corrections offered by later drivers are too significant to ignore. Yet at the same time, short of re-reviewing a dozen cards every time a new driver is released - and even then we can't very well tell the future - we have no real way of measuring this impact except in hindsight, which is by its very nature too late. We can offer guesses at what kind of performance improvements might be in NVIDIA's and ATI's pipelines, and certainly both companies will make every effort to tell the world how proud they are whenever they do manage such an improvement. The reality, however, is that many of the proclaimed improvements apply only to uncommon settings - e.g. running a budget card at high resolutions with AA/AF enabled - and judging by the results of our regression testing, buyers should be wary of the marketing hype.
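
For readers curious what this sort of regression bookkeeping looks like in practice, here is a minimal sketch in Python of how one might flag significant (>10%) driver-to-driver performance changes for a single game and settings combination. The driver versions and frame rates below are illustrative placeholders, not measured results from this article; the threshold is simply the cutoff used above.

    THRESHOLD = 0.10  # flag changes of more than 10%, the cutoff used in this article

    # Hypothetical averaged benchmark results: driver version -> average FPS.
    # These numbers are placeholders, not measurements from our testing.
    DRIVER_RESULTS = {
        "ForceWare 61.77": 52.3,
        "ForceWare 66.93": 53.1,
        "ForceWare 71.89": 58.9,
        "ForceWare 81.98": 59.4,
    }

    def flag_significant_changes(results: dict[str, float], threshold: float) -> None:
        """Print the FPS delta between consecutive driver releases,
        marking any change whose magnitude exceeds the threshold."""
        versions = list(results)  # dicts preserve insertion order (Python 3.7+)
        for prev, curr in zip(versions, versions[1:]):
            change = (results[curr] - results[prev]) / results[prev]
            marker = "  <-- significant" if abs(change) > threshold else ""
            print(f"{prev} -> {curr}: {change:+.1%}{marker}")

    flag_significant_changes(DRIVER_RESULTS, THRESHOLD)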

At the same time, the increase in games being ported from or developed alongside console versions paints a very different picture. With the current generation of consoles, NVIDIA has enjoyed a nice performance lead in what we've seen today, as Xbox-only titles have been much harder on ATI than on NVIDIA. Since NVIDIA is effectively the de facto leader in console GPUs for the current generation (ATI only has a chip in the GameCube, whose titles are rarely ported to the PC), there is little reason to expect major performance improvements from NVIDIA's drivers on Xbox ports, while ATI appears able to work out performance improvements for Xbox-ported games - and needs to.

That said, the number of games still being ported from current-generation consoles to the PC is dwindling, which brings us to the next generation. ATI powers the Xbox 360 with an R500-like chip and will power the Wii with what we believe to be an R300-like successor to the GameCube GPU, while NVIDIA will power the PlayStation 3 with a straight-up G70-generation chip. If our results from the Xbox ports are any indication, there will be a natural favoritism in games that are exclusive to one console or the other. Ultimately, this may mean that the home team for a game will not need to improve driver performance thanks to an inherent lead, while the visiting team will need to optimize to catch up. This is in stark contrast to the way PC-native games work, and how cross-platform titles will behave is entirely up in the air. This facet of performance will be an interesting subject to watch over the next few years as the next-generation consoles arrive.

In the meantime, getting back to ATI, NVIDIA, and the PC, what have we learned today? We still can't predict the future of PC games, drivers, or hardware, but after looking at these results and track records, it certainly seems that with the introduction of the X1900 and 7900 series, ATI may be in a better position than NVIDIA to offer substantial performance improvements. As we've seen, NVIDIA hasn't been able to wring a great deal of improvement out of the NV40/G70 architecture in our tests. With the R500 being quite different from the venerable R300, we're interested in seeing what sort of improvements ATI can get from their new architecture. If ATI stays ahead in the driver game, they may very well have what it takes to beat out NVIDIA in the majority of games running on the current-generation architectures.

Comments

  • LoneWolf15 - Thursday, May 11, 2006

    "Currently they are, yes. But some years back, they sucked."

    That's true. However, "some years back" is around the time of the Radeon 8500, far before the 9xxx line or the X800 line. This issue is no longer relevant, and yet people who haven't used ATI cards in years flog this dead horse over and over again.

    ATI isn't perfect; their multimedia cards (i.e. TV tuners) still need work in the software department. However, it's been a long time since ATI has had serious driver issues, and many who haven't had an ATI card since Rage128/Radeon/Radeon 8500 days talk as if things haven't changed.
  • Powermoloch - Thursday, May 11, 2006

    I've been using ATI's drivers for quite some time, and I noticed a gradual increase in performance from my experience. Especially on the 3DMark scores lol.
  • MrKaz - Thursday, May 11, 2006

    What’s the problem with Control Panel?

    I like it a lot. ATI dropped it in 5.11, but I keep it installed with driver 6.4 and have no problems.
  • poohbear - Thursday, May 11, 2006

    have u even owned an ATI card? i'm currently running a 6800gt, but my experience w/ the 9800pro was great and i dont know what u're talking about w/ your driver instability comment. maybe u should read the article again, it praises ati's driver team quite a bit.
