Final Words

For a quick overview:

* Day of Defeat shows that the new (non-public) beta 5.12 driver offers a small but consistent lead at every resolution and setting we tested.

* FarCry shows solid performance gains from dual core even without the driver enhancements (though the 5.12 driver does add a little performance of its own). At 1600x1200 with 4xAA, the performance gain becomes negligible.

* Battlefield 2 shows that none of the AA modes perform as well with 5.12 as with 5.11, and only the 800x600 mode shows a performance gain worth writing home about.

* Quake 4 shows no significant gains or losses due to the 5.12 driver or dual core.

The quick and dirty conclusions we can draw are that Day of Defeat and FarCry have the most to gain from additional CPU performance here, and that the way the new driver handles AA is less efficient in Battlefield 2. We will also venture a guess that ATI has not focused on optimizing their OpenGL support for dual core: every other game showed some sign of significant improvement at 800x600, but Quake 4 didn't gain anything from the new driver. Of course, we will have to do more testing to really solidify our guess about OpenGL, and adding a few more games to our analysis will definitely be a good thing.

We are in the midst of testing the driver on older ATI hardware, midrange and low-end parts, Intel dual-core CPUs, and more. It's going to take us a little while to get this done, but hopefully this quick look is enough to tide everyone over and give a good idea of what's going on.

Our conclusion after these brief tests is that the new driver is a first step in the right direction, but that the benefit to end users is minimal at best at this stage. The real benefit will come when game developers start parallelizing their code as much as possible. With FarCry, we've seen that this can have an impact on performance, and, like we said, every little bit adds up.
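
To make the parallelization point a bit more concrete, here is a minimal, hypothetical sketch of the kind of producer/consumer split an engine or driver thread might use: the main thread keeps pushing work items while a second thread, free to run on the other core, drains and executes them. None of the names or structure below come from ATI's driver or from any of the games we tested; it is purely illustrative.

```cpp
// Hypothetical sketch: a main "game" thread hands work to a worker thread
// so a second CPU core can chew on it. Names and structure are illustrative,
// not taken from ATI's driver or from any of the games tested above.
#include <condition_variable>
#include <functional>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>

class WorkQueue {
public:
    // Called by the producing (game / driver front-end) thread.
    void push(std::function<void()> job) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            jobs_.push(std::move(job));
        }
        cv_.notify_one();
    }

    // Called by the worker thread running on the second core.
    void run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();  // e.g. build vertex buffers, validate state, submit commands
        }
    }

    void shutdown() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            done_ = true;
        }
        cv_.notify_all();
    }

private:
    std::queue<std::function<void()>> jobs_;
    std::mutex mutex_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    WorkQueue queue;
    std::thread worker(&WorkQueue::run, &queue);  // runs on the "spare" core

    // The main loop stays responsive while the expensive work drains in parallel.
    for (int frame = 0; frame < 3; ++frame) {
        queue.push([frame] { std::cout << "processed frame " << frame << "\n"; });
    }

    queue.shutdown();
    worker.join();
    return 0;
}
```

The locking around the queue also hints at why the deltas shrink once a setting becomes GPU-bound: handing work to a second core only pays off when there is spare CPU work to hand over, which fits the pattern of the biggest gains showing up at 800x600.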

Comments

  • mbhame - Sunday, December 4, 2005 - link

    Who wrote this article?
  • stephenbrooks - Monday, December 5, 2005 - link

    Derek Wilson
  • PrinceGaz - Sunday, December 4, 2005 - link

    I have an X2 4400+ and, like many other people, have been forced to revert to the 7x.xx Forceware drivers because the new dual-core drivers cause certain well-known OpenGL applications (3DS Max and PaintShop Pro, for instance) to hang when trying to start them. If you haven't heard of this problem, just try googling and you'll get plenty of hits.

    I'd rather have nVidia fix bugs before adding new performance-enhancing features, but sadly it is all about getting a few extra percent over ATI in the latest games, it seems.
  • hondaman - Monday, December 5, 2005 - link

    Nvidia claims that their drivers have DC optimisations, although I haven't seen any review that shows one way or the other whether they really do.

    I personally found this "review" to be quite interesting, and hope anandtech does the same for nvidia and their newest drivers.
  • mmp121 - Sunday, December 4, 2005 - link

    Derek,

    Do the drivers show any improvement while using a single core CPU w/HT enabled? Is it supposed to? How does it affect previous generation hardware? Are the tweaks only good for the X1000 hardware? You asked for suggestions, I gave some. Hope to see some of em answered.
  • stephenbrooks - Monday, December 5, 2005 - link

    ^^^ above are good questions
  • johnsonx - Sunday, December 4, 2005 - link

    Seems to me ATI had best get to the bottom of the single-core performance deficit in these 5.12 drivers before they come out of beta. All the fanbois would get their panties in a wad if the new driver hurts performance in the top-end FX-57 gaming rigs. If nothing else, they could include regular and DC-optimized versions of the key driver files and install them based on detecting 1 or 2(+) cores.

    Actually, what might be even better from a marketing point of view is if they have a 'regular' driver that works fine for all systems, and a separate 'dual-core optimized' driver. Nothing gives users the warm fuzzies like being told 'oh, for YOU we have a special, better driver.' Later on, once dual-core is almost universal in new systems, they could just unify the driver again.
  • wien - Sunday, December 4, 2005 - link

    Though a good idea, I fear the changes they have made to the driver to "parallelize" it can't be plugged in and out that easily. And if they can't, ATI would have to keep two separate code trees (single and dual core) for their drivers, and update them both every time they come up with an improvement. What would probably end up happening is that the single-core version would be more or less stagnant in terms of development (but with version numbers increasing, of course), and the DC version would get the actual improvements. (Or the other way around... for now at least.)
  • Pannenkoek - Sunday, December 4, 2005 - link

    The effort to optimize their dual-core drivers to mitigate the single-core performance loss is far less than keeping two parallel branches of their drivers in development. This is beta software; it's not as tuned as it can be. We won't know what the performance will be like until the driver is actually released.
  • mlittl3 - Sunday, December 4, 2005 - link

    That's a good idea.
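
As an aside on johnsonx's suggestion above of a single installer that picks regular or DC-optimized driver files, the detection half of that could be as simple as a logical-processor count. The sketch below is purely hypothetical: the file names are invented, and the `hardware_concurrency` approach is just one way to illustrate the idea, not how ATI's installer actually works.

```cpp
// Hypothetical sketch of the core-count check johnsonx describes above.
// The driver file names are invented; only the detection logic is the point.
#include <iostream>
#include <string>
#include <thread>

int main() {
    // Number of logical processors the OS reports (0 if it cannot tell).
    unsigned int logicalCores = std::thread::hardware_concurrency();

    // Fall back to the regular files when detection fails or only one core exists.
    std::string driverFile =
        (logicalCores >= 2) ? "ati_driver_dualcore.dll" : "ati_driver_regular.dll";

    std::cout << "Detected " << logicalCores << " logical core(s); would install "
              << driverFile << "\n";
    return 0;
}
```

One wrinkle: a logical-processor count also picks up HyperThreading, so a single-core CPU with HT enabled would report two logical processors, which circles back to mmp121's question about how the optimizations behave on single core w/HT.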
