ATI's X1000 Series: Extended Performance Testing
by Derek Wilson on October 7, 2005 10:15 AM EST - Posted in GPUs
Doom 3 Performance
NVIDIA hardware just runs Doom 3 better than ATI hardware, and as we saw before, the case hasn't changed with the new R/RV5xx GPUs from ATI. The light and shadows in Doom 3's engine play a huge role in the game, and the algorithms and API (OpenGL) just tend to favor NVIDIA's architecture and drivers.
The 7800 GTX and 7800 GT both out-perform the X1800 XT across the board without AA enabled. The 6800 GT manages to run faster than the X1800 XL, and the 6600 GT leads the X1600 XT by huge margins. The X1300 Pro stops being playable after 1024x768, which really doesn't bode well for a $150 card.
Performance falls off faster with AA enabled, but that is to be expected. The 7800 GTX and 7800 GT just increase their ability to out-perform the X1800 series here, but the X1600 XT becomes more competitive with the 6600 GT this time around. Of course, neither one really does that well at 1024x768 with 4xAA - 44 FPS is playable, but just barely.
Enabling AA drops performance by a similar proportion on the X1800 and 7800 series parts at high resolutions, with low resolutions favoring NVIDIA hardware. In another twist that spits in the face of the trends that we have seen, the X1600 XT handles AA much better than the 6600 GT and shows a lower percent impact than most of the other cards in the test.
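The "percent impact" of enabling AA referred to above is just the relative drop in frame rate. As a quick illustration (the numbers below are hypothetical, not figures from this review), it can be computed like this:

```python
def aa_percent_impact(fps_no_aa, fps_aa):
    """Percent of performance lost when enabling antialiasing."""
    return (fps_no_aa - fps_aa) / fps_no_aa * 100.0

# Hypothetical example: a card dropping from 80 FPS to 60 FPS
# with 4xAA enabled loses 25% of its performance.
impact = aa_percent_impact(80.0, 60.0)
print(f"{impact:.1f}%")
```

A card with a lower percent impact, like the X1600 XT here, loses proportionally less performance when AA is turned on, even if its absolute frame rates are lower.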
93 Comments
View All Comments
nserra - Friday, October 7, 2005 - link
I agree. When doing an article, the site should say whether it is a preview, review, or overview.
Questar - Friday, October 7, 2005 - link
"High quality anisotropic filtering is definitely something that we have begged of NVIDIA and ATI for a long time and we are glad to see it, but the benefits just aren't that visible in first-person shooters and the like."So you like all the texture shimmering on a 7800?!?
DerekWilson - Friday, October 7, 2005 - link
We will absolutely be looking further in depth at the shimmering issue. But texture shimmering and the impact of ATI's new High Quality AF option aren't the same problem. Certainly angle-independent AF will help games where both ATI and NV have shimmering issues, but those instances occur less often, mostly in things like space and flight games.
I don't like shimmering, and I do like the option for High Quality AF. But I simply wanted to say that the option for High Quality AF is not worth the price difference.
PrinceGaz - Friday, October 7, 2005 - link
We're not talking about ATI's new angle-independent HQ AF option. It's NVIDIA's over-aggressive trilinear-filtering optimisations that all 7800 series cards are doing, almost to the point of it being bilinear filtering. They did that a couple of years ago and are doing it again now, but only on the 7800 series cards (6800 and under get normal filtering). If you want an example of this, just look at the transitions between mipmaps on the 7800 in the first review of the new ATI cards. I'm not talking about spikes at certain angles, but how the 7800 almost immediately jumps from one mipmap to the next, whereas ATI blends the transition far better. In fact, that is the main thing that struck me about those AF patterns in the review.
Over-aggressive trilinear optimisation is a problem even on 6800 series cards after supposedly disabling it in the drivers (disabling only reduces its impact). I just wish it could be turned off entirely, as some games need full true trilinear filtering to avoid shimmering.
DerekWilson - Saturday, October 8, 2005 - link
I know what you are talking about. The issue is that *I* was talking about the new HQ AF option in ATI hardware in the sentence Questar quoted in the original post in this thread.
He either thought I was talking about good AF in general or that the HQ AF has something to do with why ATI doesn't have a texture shimmering problem.
I just wanted to clear that up.
Also, the real problem with NVIDIA hardware is the combination of trilinear and anisotropic optimizations alongside the "rose" style angle-dependent AF. Their "brilinear" method of waiting until near the mipmap transition to blend textures is a perfectly fine solution if just using trilinear filtering (the only point of which is to blur the transition lines between mipmaps anyway).
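The trilinear-versus-"brilinear" distinction above can be sketched as a blend-weight function between two adjacent mipmap levels. This is a minimal illustration, not NVIDIA's actual driver logic, and the band width is an arbitrary value chosen for the example:

```python
def trilinear_blend(frac):
    # Full trilinear: the blend weight between two adjacent mipmap
    # levels ramps linearly from 0 to 1 across the entire range,
    # so the transition is smooth everywhere.
    return frac

def brilinear_blend(frac, band=0.2):
    # "Brilinear": snap to the nearest mip level for most of the
    # range and only blend inside a narrow band around the
    # transition point. band=0.2 is illustrative, not a real
    # driver setting.
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac <= lo:
        return 0.0   # lower mip level only (pure bilinear)
    if frac >= hi:
        return 1.0   # upper mip level only (pure bilinear)
    return (frac - lo) / (hi - lo)   # steep ramp across the band
```

The narrower the band, the less filtering work per pixel, but the more abrupt the visible jump between mipmap levels, which is exactly the transition behavior described in the comments above.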
TheInvincibleMustard - Friday, October 7, 2005 - link
Hard|OCP did some image quality comparisons between the 7800 GT and the X1800 XL in their "X1000" launch article, and there was a noticeable difference between ATi's HQ AF and nVidia's AF, and in an FPS no less. Add in the fact that they pretty much said you could enable HQ AF for hardly any performance drop, and that's a pretty nice point in ATi's favor. I think that AnandTech should look at an IQ comparison again, if they're not seeing any difference.
-TIM
nserra - Friday, October 7, 2005 - link
I agree. New image quality tests must be done. Or maybe NVIDIA cards with 2x the performance of ATI, but with XGI/SiS image quality, are OK?
I don’t think so.
S3 and XGI have been plagued by their texture quality (image quality). But no one cares when those problems come from an NVIDIA card.
X8xx was supposed to offer lower image quality than R3xx, but no one has really shown that.
bob661 - Friday, October 7, 2005 - link
I've never experienced image quality issues on NVidia or ATI cards. They both look the same to me. YMMV.
ChrisSwede - Friday, October 7, 2005 - link
I was wondering what card available now compares to my 9800 PRO? i.e., which card should I look for in reviews and equate to mine? Maybe none? :)
Thanks