Image Quality Analysis Fall 2003: A Glance Through the Looking Glass
by Derek Wilson on December 10, 2003 11:14 PM EST, posted in GPUs
Tomb Raider: Angel of Darkness
Tests in this game made it too tricky to capture closely matching frames on both cards for a difference image. We can see, however, that the ground on the ATI card has more lighting effects, and the same holds in the frame with anisotropic filtering and antialiasing turned on.
ATI 4xAA/8xAF (Click to enlarge.)
NVIDIA 4xAA/8xAF (Click to enlarge.)
Again, ATI does a better job of antialiasing in this game than the NVIDIA card.
Also, ATI has helped us track down the motion issue that we've been seeing on their cards. ATI supports using a separate filtering scheme for texture magnification (filtering when the screen pixels are smaller than texels) and minification (filtering when the screen pixels are larger than texels), while NVIDIA hardware requires that magnification use the same filtering method as minification. Since TRAOD only requests anisotropic filtering for texture minification, NVIDIA applies anisotropic filtering throughout, while ATI does not apply it to magnification. This difference in filtering methods causes the flickering effect that has bothered us. It could be fixed via a patch from EIDOS, though ATI reports that EIDOS is unwilling to provide one. ATI could fix the problem themselves through application detection (determining that TRAOD is running and then adjusting settings specific to that game), but ATI is unwilling to take this step, even though it would be valid and helpful, in order to avoid controversy.
35 Comments
Nate420 - Monday, December 22, 2003 - link
I think the people bitching about the article being biased to one side or another are biased themselves. Let go of your hangups with ATI and NVIDIA and read the article again. IMO the article was well written and about as unbiased as it could be. The fine folks at Anandtech are human, after all.
titus - Saturday, December 20, 2003 - link
Here's something no one pointed out: check the Jedi Knight: Jedi Academy screenshots, and notice how NVIDIA applies the light sabre alpha-blending almost in *post* processing, so that even the head and body (which obstruct our view of the light sabre) glow as much as the areas that aren't obstructed.
ATI's alpha blending works in that only visible areas 'glow'; however, where it does glow, it is as irregular as heck.
Point is, they both have faults with alpha blending.
valnar - Wednesday, December 17, 2003 - link
Derek (#30), I bought a Leadtek GeForce 4 Ti4400 some time ago based on reviews saying it was a speedy, stable, and cool card for DirectX games. It was all of those, but a piece of visual junk otherwise. DVDs looked horrible because of the lack of a gamma adjustment; I had to settle for brightness/contrast alone, which made them look washed out. The overall contrast between 0 black and 255 white in any photo program was nowhere near my current Radeon 9600, or even an older Matrox G400. That makes it hard to do any kind of calibration for photo work, let alone enjoy the occasional DVD.
My 2 cents...
DaveBaumann - Sunday, December 14, 2003 - link
“WRT dx 10, it still seems to me that nvidia is closer to ps3 support than ati. I will definitely give the Meltdown presentations some more in depth study time though.”
Derek – As I said, PS3.0 is already part of *DX9*; DX10 will be PS/VS4.0. Please read our DirectX Next article to gain an understanding of the directions DX10 is taking:
http://www.beyond3d.com/articles/directxnext/
We need to fill in the gap between DX9 PS/VS2.0 and DX10 PS/VS4.0, that being the update to DX9.0 that will allow hardware targets for PS/VS3.0. Regardless of whether NVIDIA is closer to PS3.0 or not, that doesn't mean ATI will have issues supporting it: ATI hadn't supported true multisampling FSAA before R300 and yet leapfrogged NVIDIA in this respect, and NVIDIA hadn't supported PS1.4 before the FX series and yet went to PS2.0 Extended. What they currently support doesn't necessarily have much bearing on what they are going to support.
“I know (as do many who were present when David Kirk was speaking about the issue) that the FX hardware supports Multiple Render Targets.”
The FX natively supports METs, not MRTs. The FX series would probably need the driver to support MRTs by packing and unpacking instructions.
I asked a developer about the difference between MRTs and METs, and this was his reply:
“MRTs are a lot more flexible: effectively up to 4 render targets can be rendered to simultaneously, and the only real restrictions are same size and, for some chipsets (ATI, but AFAIK not PowerVR), same format.
METs are special textures that are wider than usual (usual being up to 4 channels, i.e. RGBA); METs aren't really anything special, they're just >4-channel textures (RGBAXYZW). It may be possible in METs to have different formats for each 'logical' texture, i.e. 8-bit-per-channel RGBA and float16 XY.
MRTs are much better as you can mix and match standard render targets as you like. METs have to be special-cased, whereas MRTs can be used much more like any render target.
Both are handy, but MRT is the preferred route (AFAIK METs were added to DX9 to try to cope with the GFFX-style pack/unpack render targets). MRTs are much more the theoretical ideal: you can choose to render to multiple places at the same time.”
[Derek] “I'm not sure if flexible float buffers fall into the same category, but I will look into it.”
NVIDIA has some odd addressing restrictions with their float texture support, outside of DX’s standard requirements (for any revision), that restrict its use for output buffers; a conditional flag needs to be added to DX9 to support these restrictions on NV’s hardware. This flag is likely to be added to the update to DX9 that will also allow hardware PS/VS3.0 support next year.
DerekWilson - Sunday, December 14, 2003 - link
Dave (#27, #29), thanks for responding so thoroughly.
WRT dx 10, it still seems to me that nvidia is closer to ps3 support than ati. I will definitely give the Meltdown presentations some more in depth study time though.
I wasn't quite clear on a couple of things regarding NV supporting DX9... Not all of the features are exposed with current drivers, even though the hardware is capable of them. I know (as do many who were present when David Kirk was speaking about the issue) that the FX hardware supports Multiple Render Targets. We could not get any confirmation of a reason this feature is not yet enabled in software (and I will leave the speculation to the reader). I'm not sure if flexible float buffers fall into the same category, but I will look into it.
Of course, my original assessment that these facts don't matter except from an experience standpoint for NVIDIA still stands.
I really appreciate your comments Dave, please keep them coming.
And Valnar (#25):
I am really looking forward to doing just what you are asking for in future reviews. I don't feel comfortable doing such analysis on any current display technology, and am looking for ways to capture the output of cards over DVI. At that point I can use Photoshop for as much analysis as necessary to compare images (including spectrum/brightness and the crispness or blurriness of text).
Of course, this won't test the cards' RAMDACs, which are still very important. I'm hoping to also come up with some solid 2D benchmarks in the coming year that should include such analysis.
Please keep the comments and suggestions coming as I am very open to including the kinds of information you all want.
Thanks,
Derek Wilson
DaveBaumann - Saturday, December 13, 2003 - link
"So what about the missing 24-bit FPU in the FX series, which has 16- and 32-bit, the latter being very slow? Doesn't this mean that the FX series is not even compliant?"
No. The DirectX 9 specification says that the *minimum* precision requirement for "full precision" pixel shaders is FP24; this being the minimum requirement means that anything above it is also deemed full precision. FP16 is the minimum requirement for partial-precision shaders.
"And given the scarcity of DX9 games at present, inevitable that FXs will run into trouble in the future"
It's not an inevitability, but it may make developers' lives a little more difficult in the interim since, as NVIDIA themselves point out, developers will have to work a little harder to make games optimal for FX hardware. Whether that means some development houses will put off implementing serious use of DX9 for the time being is a real question.
However, I'd say that the DX9 talk doesn't tell the whole story, as ATI's shaders appear more effective in many cases even with DX8 shaders. For instance, look at these performance numbers from a test application:
http://www.beyond3d.com/forum/viewtopic.php?p=1991...
GF FX 5950, 52.16:
PS 1.1 - Simple - 938.785889M pixels/sec
PS 1.4 - Simple - 885.801453M pixels/sec
9800 PRO
PS 1.1 - Simple - 1489.901123M pixels/sec
PS 1.4 - Simple - 1489.822754M pixels/sec
Pumpkinierre - Saturday, December 13, 2003 - link
Thank you, #27, for explaining compliant versus full DX9. So what about the missing 24-bit FPU in the FX series, which has 16- and 32-bit, the latter being very slow? Doesn't this mean that the FX series is not even compliant? And given the scarcity of DX9 games at present, isn't it inevitable that FXs will run into trouble in the future?
DaveBaumann - Friday, December 12, 2003 - link
“The FX cards support full DX9. In fact, they support more than the minimum to be DX9 cards and support some features that people speculate will be in DX10 (fp32, longer shader programs, etc...).”
Derek, the FX series supports the full features required for DX9 “compliancy”; they do not support “full” DX9 – there are numerous optional features within DirectX specifications that IHVs can choose to support or not. Two such features within DX9 are the support of flexible float buffers and MRTs, neither of which the FX series supports.
The lack of float buffer support is causing some head scratching among developers, as it would generally make their lives easier (check with some to see if they would find it useful if all vendors supported it) – float buffers are already in use in some games, which means that the FX series can't support some options (check Tomb Raider). MRTs also have their uses, and we've published some research work from one developer on lighting which is made easier by the support of MRTs:
http://www.beyond3d.com/articles/deflight/
As for the “speculation of the FX’s support for what will be in DX10”, please check the Microsoft Meltdown presentations – they list the specification requirements for Pixel Shader 3.0 compliance, which is already spec’ed within DX9, and the FX series falls short of this requirement.
“Part of the reason ATI is able to lead so well in performance is that they don't support many of these features.”
And NVIDIA doesn’t support some features that are also within DX9 – I’m not sure you can claim that ATI leads in DX9 performance because of this, as each supports elements that the other does not.
BlackShrike - Friday, December 12, 2003 - link
I don't get it. You guys are content with sub-perfect products? When I buy something, I want it to be the best for my money. But you guys don't seem to care about getting the best possible image quality along with high frame rates. Isn't that why you buy these high-end cards?
Another thing: how come Anandtech doesn't do an article finding the minimum frame rates? Remember how we found out that ATI and NVIDIA sometimes have high average frame rates but very low minimum frame rates? Now that would be the deciding factor for me: image quality plus consistent frame rates.
Really guys, I expected more from Anandtech readers.
valnar - Friday, December 12, 2003 - link
I read the headline of this article and actually thought it was going to be about image quality. No shocker that it wasn't.
This article talked about rendering quality, not image quality. I'd really like somebody to bring out a color spectrum analyzer. I'd like to hear about the crispness and/or fuzziness of text displayed at 6 point. I'd like to hear about color accuracy and brightness/contrast/hue/saturation/gamma capabilities.
Oh well.