The Future of Anti-Aliasing Settings in Question: NVIDIA Discussion
by Derek Wilson on March 15, 2007 8:23 AM EST
Posted in: GPUs
The Increasing Complexity of AA
As we mentioned, the first major difficulty with antialiasing is compatibility. Generally the burden is on the game developer to assess the capabilities of the hardware the game is running on and expose options to users based on what is available. Problems arise because game developers aren't able to look into the future and program for hardware that doesn't exist yet. For example, X1000 Series and GeForce 8 Series hardware can run Oblivion with both HDR and AA enabled, but GeForce 8 hardware, at least, wasn't available when Oblivion launched, so the developers didn't test for the capability to antialias floating point surfaces and simply disabled the AA option when HDR is enabled.
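For reference, Direct3D 9 already lets a developer probe for this capability at runtime. The sketch below is our own illustration of such a check (the helper name and structure are hypothetical, not taken from any shipping game):

```cpp
// Sketch: asking D3D9 whether a floating point (FP16) surface can be
// multisampled before exposing an HDR+AA option to the user.
// Hypothetical helper, for illustration only.
#include <d3d9.h>

bool CanAntialiasFP16(IDirect3D9* d3d, UINT adapter)
{
    DWORD qualityLevels = 0;
    // D3DFMT_A16B16G16R16F is the FP16 format commonly used for HDR
    // render targets. GeForce 7 class hardware fails this check, while
    // X1000 and GeForce 8 class hardware passes it.
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        adapter, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,        // the HDR surface format
        FALSE,                       // fullscreen
        D3DMULTISAMPLE_4_SAMPLES,    // 4x MSAA
        &qualityLevels);
    return SUCCEEDED(hr);
}
```

A game that performs a test like this at startup can offer HDR plus AA on hardware that supports the combination without having known about that hardware in advance.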
When games that could benefit from AA on current hardware don't offer the option, we have no choice but to look to the driver for support. Of course, we have bigger problems on the horizon. Some developers are currently choosing techniques such as deferred rendering for their games. Current implementations make use of multiple render targets (MRTs) to render objects or effects which are later combined to form a final image. MSAA does not play well with this approach, as one of its basic requirements is knowing which surfaces overlap a single pixel on the screen, and that information is gone by the time the intermediate buffers are combined (the sketch after this list illustrates the problem). Forcing AA on in the driver can cause problems in games where MSAA simply will not work. Current examples of this can be seen in these games:
- Ghost Recon: Advanced Warfighter
- Rainbow Six: Vegas
- S.T.A.L.K.E.R.
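To see why driver-forced MSAA breaks down here, consider how a deferred renderer's intermediate buffers are created in Direct3D 9. The sketch below is our own illustration (the formats and names are hypothetical); the key point is that D3D9 render target textures are always single-sampled, so subpixel coverage is lost before lighting is ever computed:

```cpp
// Sketch: a typical D3D9 G-buffer setup for a deferred renderer.
// Illustration only; names and formats are hypothetical.
#include <d3d9.h>

void CreateGBuffer(IDirect3DDevice9* device, UINT width, UINT height,
                   IDirect3DTexture9* gbuffer[3])
{
    D3DFORMAT formats[3] = {
        D3DFMT_A8R8G8B8,      // albedo
        D3DFMT_A16B16G16R16F, // normals + depth
        D3DFMT_A8R8G8B8       // material parameters
    };
    for (int i = 0; i < 3; ++i)
    {
        // Note there is no multisample parameter here: textures created
        // with D3DUSAGE_RENDERTARGET are always single-sampled in D3D9.
        device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                              formats[i], D3DPOOL_DEFAULT,
                              &gbuffer[i], NULL);
    }
}
```

By the time the driver could apply MSAA, the geometry edges it would need to smooth exist only as ordinary texels in these buffers.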
With the functionality of driver-enabled AA dependent on the game being run, graphics hardware makers are not able to guarantee that the user will get the results he or she desires. This means that the driver setting amounts to asking the hardware to enable AA if possible. This uncertainty about the behavior of the feature can cause problems for end users of both AMD and NVIDIA hardware.
As for NVIDIA specifically, its new CSAA (Coverage Sample Antialiasing) technology adds another layer to the complexity of antialiasing settings in the driver. Now, rather than just selecting a desired level of antialiasing, we need to decide to what degree we want to either enhance or override the application. Enhancing only works when AA is enabled in-game as well, and override won't override games that make use of technology that is incompatible with MSAA. While the features function as they should, even some hardcore gamers out there may not know what they are getting when they enable AA in the control panel.
At AnandTech, we have avoided driver AA settings as much as possible ever since the initial release of Far Cry, which produced inconsistent results between graphics hardware manufacturers when AA was set through their respective control panels. Those specific problems were worked out in later driver and game updates, but we find it more reliable to depend on the game developer for consistency between common hardware features. Where there is an in-game setting, we use it. For us, other than disabling vsync, driver settings are a last resort.
It is safe to say that AMD and NVIDIA feel the same way. The only way they currently have to inform their users about the lack of support for AA in specific games is through their release notes. No one wants the end user to have a bad experience through glitchy performance.
One of the best ways to make sure gamers stick with in-game settings is to make sure developers offer clearly defined, well documented, and complete settings for features such as AA. In order to better enable this, NVIDIA has been working with Microsoft to enable CSAA through DirectX. With the in-game option for CSAA, users won't have to wade through the driver options and can directly select the type and degree of AA they want applied to their game.
In DirectX 9 and 10, requesting AA on a surface involves specifying the level (number of subsamples) of AA and the quality of AA. Most games just set quality to 0, as previous hardware didn't really do anything with this field. To request CSAA in-game, developers set the AA level to either 4 or 8 and then set the quality to 8 or 16 (2 or 4 in DX9, as quality levels there are limited to 7 or less). This functionality is exposed in NVIDIA's 100 series driver.
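As a rough illustration of that convention, here is how a Direct3D 10 application might fill out its multisample description to ask for 16x CSAA. This is our own sketch based on the mapping described above, not NVIDIA sample code:

```cpp
// Sketch: requesting 16x CSAA via the standard D3D10 multisample
// fields, per the level/quality convention described above.
// Illustration only; a shipping engine would handle more formats.
#include <d3d10.h>

void FillSampleDesc(ID3D10Device* device, DXGI_SAMPLE_DESC* sd)
{
    UINT levels = 0;
    device->CheckMultisampleQualityLevels(
        DXGI_FORMAT_R8G8B8A8_UNORM, 4, &levels);

    sd->Count = 4;                        // 4 true color/depth samples
    // On a 100 series driver, quality 16 at 4 samples selects 16x CSAA;
    // quality 0 would request plain 4x MSAA.
    sd->Quality = (levels > 16) ? 16 : 0;
}
```

The appeal of this approach is that it reuses fields every DirectX application already sets, so no new API is needed for a game to expose CSAA in its own options menu.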
This has had some unwanted side effects, though. Some developers detect the highest quality level available and select it when enabling AA in-game; in the past this didn't matter, because the extra quality levels didn't do anything. When paired with NVIDIA's 100 series driver, these games inadvertently enable 16x CSAA when 4x AA is selected (see the sketch after this list). Currently the games that exhibit this behavior are:
- Battlefield 2
- Battlefield 2142
- Sin Episodes
- Half-Life 2
- Half-Life 2: Lost Coast
- Dark Messiah of Might and Magic
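The pattern at fault is easy to picture. Before CSAA, the quality field was effectively a no-op, so code along the lines of the following sketch (our illustration, not taken from any of the games above) was harmless; with CSAA exposed through the quality field, the top quality level at 4 samples now means 16x CSAA:

```cpp
// Sketch: the "just pick the highest quality level" habit that now
// inadvertently selects CSAA. Hypothetical helper, for illustration.
#include <d3d9.h>

DWORD PickMaxAAQuality(IDirect3D9* d3d, UINT adapter)
{
    DWORD qualityLevels = 0;
    d3d->CheckDeviceMultiSampleType(
        adapter, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, FALSE,
        D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);
    // Valid quality values run from 0 to qualityLevels - 1. Returning
    // the maximum used to change nothing; on a 100 series driver it
    // silently upgrades a 4x MSAA request to 16x CSAA.
    return (qualityLevels > 0) ? qualityLevels - 1 : 0;
}
```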
This is only an issue on Vista for now, but 100 series drivers will be coming to XP soon. It isn't that either NVIDIA or these game developers are doing anything wrong; it's just how things ended up working out. In our minds, the ability to enable CSAA in games outweighs these minor issues. We hope to see the number grow, but currently only two games support enabling CSAA in-game:
- Half-Life 2: Episode 1
- Supreme Commander
So with the 100 series driver, future games will be able to enable all of NVIDIA's AA modes in-game. Setting AA levels in-game is safer than using the hardware makers' driver overrides and more convenient than game-specific profiles. Aside from heavily encouraging developers to enable in-game AA settings when possible, NVIDIA is exploring other options to make gamers aware of the caveats associated with driver settings and to encourage the use of override AA as a last resort.
50 Comments
chizow - Thursday, March 15, 2007 - link
Such a headache over a feature that should be streamlined and integrated. Sums up the nature of PC gaming though really. Not enough standards and guidelines, so there's no consistency.

Not sure what the best approach would be, but from an end-user standpoint, I'd like to see a dedicated GUI similar to the 3D model in the control panel that allows you to adjust image quality settings. Only I'd like to see it reflect actual game performance and available AA settings for any particular game.
I don't know if NV is willing to undertake that level of support, but it would certainly make it easier for the end-user. Either have pre-configured .inf-like profiles for games or the ability to scan and assess games on any given machine and demo/benchmark them.
Maybe the easiest implementation would be a timedemo of sorts. Like you could enable the NV CP to run a game in stress test mode where it would just cycle through the different AA settings while you're playing the game, then report a summary of relative image quality and performance. It's not perfect of course, but right now testing is either subjective or a huge PITA.
I'd love to see something like setting a target FPS and then allowing the drivers to enable the highest level of AA that still meets that target FPS. Right now the only way to really do that is to run a lot of tests (or reference reviews) and spend a lot of time changing settings, which takes time away from what you should be doing: playing and enjoying the game.
michal1980 - Thursday, March 15, 2007 - link
couldn't nvidia just provide profiles with some settings set for games? if there's a game that doesn't support something, then have that feature disabled.

BF2, which I play a lot, I set AA in-game to 4x (though I wish it could go further), and then set transparency AA in the driver game profile, because the game has no option for it.
soydios - Thursday, March 15, 2007 - link
I really wouldn't mind having a user-customizable performance profile in the graphics driver for each game. It would give me more control over the game, which I never complain about. For instance, I sometimes use ATi CCC to enable antialiasing in older games, but then I have to delve back into the driver to change it to "Application Preference" when I boot up BF2 or other newer games. It would be far more convenient for me if the driver would automatically load that game's profile when I start the game. The profile should include 3D settings for sure, and maybe color, brightness, even overclocks.

poohbear - Thursday, March 15, 2007 - link
i personally set my AA setting to "let the application decide" in the CCC for my x1900xt, and in-game i usually choose 4x or the "AA high" option. most games allow u to specify how much AA to use, so that's fabulous; otherwise, if a game doesn't have AA at all, then i force it through CCC. It's kinda annoying, but MOST games support it so i hardly ever have to worry about forcing it through CCC. :)

nefariouscaine - Thursday, March 15, 2007 - link
I also use modded drivers most of the time that include the coolbits reg hack in them from the start.

nefariouscaine - Thursday, March 15, 2007 - link
well, I myself when it comes to such settings usually play the hit or miss game on each app I use. 9 outta 10 times I tweak the AA settings in the drivers, as a number of games don't go up to 8x AA but the drivers do, and they have "forced" multisampling and super-sampling options as well.

I've not really had too many issues involving this, but more so in enabling AF in the drivers. That caused me many a crash in BF2 until I figured it out. It would be great to see more developers including higher level advanced options for graphics in games as the level of hardware continues to increase. I'm a firm believer that hardware shouldn't be bottlenecked by the software it's running (i'm talking games). There aren't too many games out that tax an 8800GTX and I'd love to see that happen, soon...
I say make some warnings that pop up when you enable such changes - that might help some, but won't be perfect. I'm happy with the layout of the "classic" nvidia driver settings, but the new one gets a big thumbs down from me as it's too clumsy to find the advanced settings.
munky - Thursday, March 15, 2007 - link
Modern games not supporting AA are a minority, and I don't see a reason to disable driver-override AA settings.

DigitalFreak - Thursday, March 15, 2007 - link
If it's available, I always use in-game AA settings. However, games that have this option are few and far between. Considering how poor Nvidia's driver update schedule has been the last 6 months for anything other than Vista/8800 series, I think Coolbits is the way to go.

VIAN - Thursday, March 15, 2007 - link
I hate being bothered with constantly going into the control panel and changing settings. So I usually leave the control panel at the highest quality settings, but leave AA, AF, and Vsync as an application preference.

I love using in-game settings for these. I won't go into the control panel unless I really feel the need to in a game that doesn't support a setting. Because I also anticipate compatibility issues when forcing something the game doesn't support, I seldom go into the control panel, and I acknowledge it as a limitation of the game.
mostlyprudent - Thursday, March 15, 2007 - link
I am at best a casual gamer. It has been quite some time since I tweaked driver settings. However, I have been unhappy with the limitations of in-game settings in several games. If I had the time, I would dig deeper into driver settings to maximize the gaming experience, and I think it would be a mistake to limit users to the settings provided by the developer.

I don't think there is a problem with having a complicated driver structure (from the user's perspective) as long as there is a relatively simple set of settings adjustments in the game. Those who want more control will have to pay the price of a steep learning curve - as long as there is a good payoff in the end.
Basically, I don't really have a preference as to the approach the driver developer takes, as long as it doesn't eliminate the ability to tweak.