Image Quality Analysis Fall 2003: A Glance Through the Looking Glass
by Derek Wilson on December 10, 2003 11:14 PM EST - Posted in GPUs
Final Words
Making useful sense of all this information is tricky at best. There is no real-time 3D engine or hardware in existence that renders everything the "right way" all the time. To achieve real-time 3D rendering, trade-offs must be made. Only when those trade-offs become highly perceptible do they become a problem.
It is the GPU makers' responsibility to implement optimizations in ways that don't negatively impact the image quality of a scene, but there is no purely quantitative way to judge any given optimization. What is acceptable to one person may not be acceptable to another, and it is a tough call to make.
One check on this is the end user community's perspective on these issues. If the people who own (or might buy) a particular card decide that an optimization shouldn't be done, it is in the GPU maker's best interest to make some changes.
It is in game developers' best interest to work with GPU makers to keep image quality top notch for their games' sake. In fact, rather than concentrating on getting raw frame rate to the end user, IHVs should focus on getting powerful and easy-to-use features to the developer community. So far, ATI has a leg up on ease of use (developers have said that programming goes quickly and smoothly on ATI cards, with NVIDIA code paths taking longer to tweak), while NVIDIA's hardware offers more flexibility (NVIDIA allows much longer shader programs than ATI and offers functionality above the minimum of the current APIs). At this point, ATI is in a better position: it doesn't matter that NVIDIA offers more functionality if the only code that can take advantage of it runs incredibly slowly after taking a very long time to develop. Hopefully, we will see more flexibility from ATI, and fewer quirks in how programs need to be written for NVIDIA, in next year's hardware.
At this point, ATI uses a more visually appealing algorithm for antialiasing, while NVIDIA does a better job calculating texture LOD and does more alpha blending. The question we are asking now is whether or not these optimizations degrade image quality in any real way. We feel that NVIDIA needs to refine its antialiasing algorithms, and ATI needs to do a better job of rendering alpha effects. We are still looking into the real world effects of the distance calculations that ATI uses in determining LOD, but the problem definitely manifests itself in a more subtle way than the other two issues that we have raised.
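To make the LOD question more concrete, here is a minimal sketch in C of how the screen-space texel footprint feeds into mipmap LOD selection, and how approximating that footprint's length can shift the selected mip level. The "cheap" formula below is purely hypothetical, chosen only to illustrate the kind of shortcut involved; we are not claiming it is the exact math either vendor uses.

```c
/* Minimal sketch of mipmap LOD selection, illustrating how an
 * approximate length calculation can shift the chosen mip level.
 * The "approx" formula is hypothetical -- it is NOT a vendor's
 * actual hardware formula, just an example of a shortcut that
 * trades a little accuracy for cheaper math. */
#include <math.h>
#include <stdio.h>

/* Exact LOD: log2 of the larger screen-space texel footprint. */
static float lod_exact(float dudx, float dvdx, float dudy, float dvdy)
{
    float lx = sqrtf(dudx * dudx + dvdx * dvdx);
    float ly = sqrtf(dudy * dudy + dvdy * dvdy);
    return log2f(fmaxf(lx, ly));
}

/* Hypothetical cheap LOD: skip the square roots and use a
 * max-plus-half-min length approximation instead. */
static float lod_approx(float dudx, float dvdx, float dudy, float dvdy)
{
    float lx = fmaxf(fabsf(dudx), fabsf(dvdx)) + 0.5f * fminf(fabsf(dudx), fabsf(dvdx));
    float ly = fmaxf(fabsf(dudy), fabsf(dvdy)) + 0.5f * fminf(fabsf(dudy), fabsf(dvdy));
    return log2f(fmaxf(lx, ly));
}

int main(void)
{
    /* A surface viewed at an angle: the texel footprint is not
     * axis-aligned, which is where the two formulas diverge most. */
    float dudx = 3.0f, dvdx = 4.0f, dudy = 0.5f, dvdy = 0.5f;
    printf("exact LOD  = %.3f\n", lod_exact(dudx, dvdx, dudy, dvdy));
    printf("approx LOD = %.3f\n", lod_approx(dudx, dvdx, dudy, dvdy));
    return 0;
}
```

In this example the approximate length comes out slightly long, so a slightly blurrier mip level would be chosen on the obliquely viewed surface; that is the kind of subtle, angle-dependent difference we are talking about, and why it is harder to spot than the AA and alpha issues.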
The decision on what is acceptable is out of our hands, and we can't really declare a clear winner in the area of image quality. We can say that it appears from the tests we've done that, generally, NVIDIA hardware does more work than ATI. Honestly, it is up to the reader to determine what aspects of image quality are important, and how much of what we covered is relevant.
We really don't have a good way to compare pixel shader rendering quality yet. The possible issues with different shader implementations have yet to be seen in a game, and we hope they never are. It is a developer's responsibility to create a game that gives a consistent experience across the two most popular GPUs on the market, and both ATI and NVIDIA have the ability to produce very high quality shader effects. Each architecture has different limitations that require care when programming, and we will still have to wait and see whether or not there will be image quality differences when more DX9 games hit the shelves.
For now, we are committed to bringing to light as much information as possible about image quality and optimizations in graphics hardware. Armed with this information, individuals will be able to come to their own conclusions about which optimizations go too far and which serve their intended purpose. We hope that all the details that we have brought to light have served their purpose in helping our readers to make informed decisions about graphics hardware.
35 Comments
DerekWilson - Friday, December 12, 2003 - link
#23 - The FX cards support full DX9. In fact, they support more than the minimum required for DX9 and some features that people speculate will be in DX10 (fp32, longer shader programs, etc...).
Unfortunately, this doesn't make the FX cards worth any more.
Part of the reason ATI is able to lead so well in performance is that they don't support many of these features. And when games come out that can take advantage of them, it isn't likely that FX cards will run those features at very high framerates.
The only thing FX feature support does for NVIDIA is give them one more generation of experience in supporting those features. Of course, it remains to be seen what they will do with that.
ATI has proven that it can hit the nail on the head, leading the release of DX9 with full DX9 support. If they can do the same for DX10, they will be doing very well.
When DX10 emerges, things will get very interesting...
Pumpkinierre - Friday, December 12, 2003 - link
So does this mean NVidia FX series cards are now true DX9 cards, or DX9 compatible (whatever that means?), or partly DX9 but better at DX8.1 and hopefully DX10?
MOwings - Friday, December 12, 2003 - link
This article was excellent. The explanations of all the technologies used in these cards were very clear. I thought the NVidia screenshots were brighter (the flashlight pic in Halo, the first pic in UT on the front right of the scene). It seems NVidia is being more accurate in their methodologies. To me, correct lighting is more important than the antialiasing, so I would tend to prefer the NVidia. However, I doubt I would really notice any difference at full game speed, and so maybe it is better to get the card that is fastest (ATI), although both seem to be plenty fast enough with their current drivers. Tough call. If there are stability issues with ATI drivers that might swing me, although this is the first I have heard of problems with ATI's latest cards and drivers.
virtualgames0 - Thursday, December 11, 2003 - link
#17 - I'm in the exact same shoes as you are. I went from an ATI 9700pro to an nvidia geforcefx 5900 because of game incompatibility issues. Half my OpenGL games would crash. Tried every driver... only to find that other games crash while my old game is fixed. Switched to nvidia, not one problem.
However, if you are lucky and do not have problems, I would agree the 9700pro has far superior AA quality and takes a smaller performance hit doing AA, but nvidia is good for me since 2xAA is all I need when I use 1600x1200 resolution.
DerekWilson - Thursday, December 11, 2003 - link
In response to the article, Scali over at Beyond3D put up a thread about the alpha blending issues we observed on the ATI cards: http://www.beyond3d.com/forum/viewtopic.php?t=9421
He's written a program to test the accuracy of your card's alpha blending, which is kinda cool (and also confirms that ATI is a little off in the calculation).
The theory Scali has is that the problem is due to ATI substituting in a couple of shifts to save a division.
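To show what that kind of shortcut looks like, here is a minimal sketch in C comparing an exact divide-by-255 alpha blend against a shift-based approximation. It is only an illustration of the general technique being described, not a claim about how ATI's blending hardware actually works.

```c
/* Rough sketch: 8-bit alpha blending with an exact divide by 255
 * versus a cheaper shift-based approximation. Illustrative only --
 * not a description of any specific GPU's blender. */
#include <stdio.h>
#include <stdint.h>

/* Exact blend: result = (src*a + dst*(255 - a)) / 255, rounded. */
static uint8_t blend_exact(uint8_t src, uint8_t dst, uint8_t a)
{
    uint32_t sum = (uint32_t)src * a + (uint32_t)dst * (255 - a);
    return (uint8_t)((sum + 127) / 255);
}

/* Approximation: replace the divide by 255 with (x + (x >> 8)) >> 8,
 * which needs only shifts and an add but can be off by one. */
static uint8_t blend_shift(uint8_t src, uint8_t dst, uint8_t a)
{
    uint32_t sum = (uint32_t)src * a + (uint32_t)dst * (255 - a);
    return (uint8_t)((sum + (sum >> 8)) >> 8);
}

int main(void)
{
    /* Count how often the two methods disagree for one src/dst pair
     * across all alpha values. */
    int mismatches = 0;
    for (int a = 0; a < 256; a++) {
        if (blend_exact(200, 40, (uint8_t)a) != blend_shift(200, 40, (uint8_t)a))
            mismatches++;
    }
    printf("alpha values where the results differ: %d / 256\n", mismatches);
    return 0;
}
```

An off-by-one error like this is invisible in a single blend, but fog, smoke, and similar effects stack many translucent layers, which is where small per-blend errors can start to accumulate into something visible.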
We are definitely going to continue looking into the issue, and thanks, everyone, for your feedback.
Shinei - Thursday, December 11, 2003 - link
Exactly, Araczynski. I mean, I love my UT2k3 with 4xAA/8xAF (64-tap), I really do, but I don't NEED it to enjoy the game. As long as it plays fast and looks better than the games before it, it doesn't matter if I have to dial back my AA/AF; hell, Halo won't even run above 10fps if I don't knock my filtering down to bilinear, never mind whining about AF quality at 8x!! Sometimes people just need to realize that IQ is a secondary concern to getting the game to run in the first place; suffer in vain with a Ti4200 in Halo at 1024x768 and then tell me that the "lower" IQ for an FX makes it less worthwhile than a Ti or an ATI.
araczynski - Thursday, December 11, 2003 - link
Aside from the rest of the points, as far as I'm personally concerned, AA is a waste of time anyway. The only time I bother to use it is when I'm forced to run something in 800x600 or less (assuming it even supports D3D). Other than that, I will always choose a higher resolution over an AA'd lower one. Personally I prefer the sharpness/clarity of higher resolutions to the look of AA, but that's my opinion and my taste; I know others think the opposite, which is fine by me.
Point is, I wish reviewers would stop focusing on AA quality/speed as if EVERYBODY gives a rat's hiney about it.
Ineptitude - Thursday, December 11, 2003 - link
Having owned a few ATI and Nvidia cards, I can say that I agree with the article. Finding a difference in image quality between the current top models is mostly subjective. It is amazing to see the amount of bickering over the subject. What I don't understand is the failure to mention all of the driver problems people have experienced with ATI cards. At this point I will probably never buy an ATI product again due to the poor drivers. Nvidia drivers don't crash my machine.
I've got a 9800 going cheap if anybody wants it.
tyski - Thursday, December 11, 2003 - link
Everybody here who says the article 'concluded suddenly and without concluding anything' could not have read the article. The whole point of the article is that the author provided enough information for the reader to come to his/her own conclusions. The only conclusion that Derek made was that nVidia does more work to get approximately the same thing done. If you understand anything about real-time hardware, this is not a good thing.
Having read every article Derek has written so far, I think this is probably the best one. Unbiased throughout. And if you want a screenshot for every possible combination of AA, AF, and resolutions, then go buy the cards and see how long it really takes to perform this many benchmarks. There is such a thing as article deadlines.
Tyrel