ATI's New High End and Mid Range: Radeon X1950 XTX & X1900 XT 256MB
by Derek Wilson on August 23, 2006 9:52 AM EST - Posted in
- GPUs
A Faster, Cheaper High-End
While the X1900 XTX made its debut at over $600 USD, this new product launch sees a card with a bigger, better HSF and faster memory debuting at a much lower "top end" price of $450. Quite a few factors play into this, not the least of which is the relatively small performance improvement over the X1900 XTX. We never recommended the X1900 XTX over the X1900 XT because the performance gain was so small, but those small differences add up, and with ATI turning its back on the X1900 XTX for its replacement, we can finally say that there is a tangible difference between the top two cards ATI offers.
This refresh isn't as dramatic a change as some past refresh parts, but the price and performance are about right for what we are seeing. Until ATI brings out a new GPU, it will be hard for them to offer any volume of chips that run faster than the X1950 XTX. The R5xx series is a very large 384 million transistor slice of silicon that draws power like it's going out of style, but there's nothing wrong with using the brute force method every once in a while. The features ATI packed into the hardware are excellent, and now that the HSF is much less intrusive (and the price is right) we can really enjoy the card.
Speaking of the thermal solution, it is worth noting that ATI has put quite a bit of effort into improving the aural impact of its hardware. The X1900 XTX is not only the loudest card around, but it also possesses a shrill and quite annoying sound quality. In contrast, the X1950 XTX is not overly loud even during testing when the fan runs at full speed, and the sound is not as painful to hear. We are also delighted to find that ATI no longer spins the fan at full speed until the drivers load. After the card spins up, it remains quiet until it gets hot. ATI has also upgraded the onboard fan header to a 4-pin connection (following in the footsteps of NVIDIA and Intel), allowing more fine-grained control over fan speed.
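To illustrate what that 4-pin (PWM) header makes possible, here is a minimal sketch of a temperature-to-duty-cycle fan curve. The temperatures, duty cycles, and linear ramp are purely illustrative assumptions, not ATI's actual fan table; the point is simply that a PWM header lets the controller pick any duty cycle rather than a handful of coarse voltage steps.

```python
# Hypothetical illustration of fine-grained PWM fan control.
# A 4-pin header lets the controller request an arbitrary duty cycle (0-100%)
# instead of the coarse voltage steps typical of older 3-pin designs.
# All temperatures and duty cycles below are made-up example values.

def pwm_duty_cycle(gpu_temp_c: float) -> float:
    """Map GPU temperature (Celsius) to a fan duty cycle in percent."""
    idle_temp, max_temp = 50.0, 90.0      # assumed curve endpoints
    idle_duty, max_duty = 25.0, 100.0     # quiet at idle, full speed when hot
    if gpu_temp_c <= idle_temp:
        return idle_duty
    if gpu_temp_c >= max_temp:
        return max_duty
    # Linear ramp between the two endpoints.
    span = (gpu_temp_c - idle_temp) / (max_temp - idle_temp)
    return idle_duty + span * (max_duty - idle_duty)

for temp in (45, 60, 75, 90):
    print(f"{temp} C -> {pwm_duty_cycle(temp):.0f}% fan")
```

With a curve like this, the fan can sit near its quiet floor for typical desktop temperatures and only ramp up smoothly under sustained 3D load, which matches the behavior we observed.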
While the X1950 XTX is not as quiet as NVIDIA's 7900 GTX solution, it is absolutely a step in the right direction. That's not to say there aren't some caveats to this high end launch.
Even before the introduction of SLI, every NVIDIA GPU had the necessary components in silicon to support multiple GPU configurations. With the addition of an "over the top" SLI bridge connector, nearly every NVIDIA card sold is capable of operating in multi-GPU mode. While lower end ATI products don't require anything special to work in tandem, the higher end products have needed a special "CrossFire" branded card with an external connector and dongle capable of receiving data from a slave card.
While this isn't necessarily a bad solution to the problem, it is certainly less flexible than NVIDIA's implementation. In the past, running a high end multi-GPU ATI configuration required a CrossFire master card that was both lower clocked than ATI's fastest cards and more expensive. With the introduction of X1950 CrossFire, we finally have an ATI multi-GPU solution available at the highest clock speed ATI offers and at the same price as a non-CrossFire card.
While this may not be a problem for us, it might not end up making sense for ATI in the long run. Presumably, ATI will see higher margins on the non-CrossFire X1950 card, but the consumer sees no benefit from staying away from CrossFire. (Note that the CrossFire cable still offers a second DVI port.) In fact, the benefits of having the CrossFire version are fairly significant in the long run: as we mentioned, two CrossFire cards can be paired with no problem, and each card could later be used as a master in another system, offering greater flexibility and higher potential resale value down the road.
If the average consumer realizes the situation for what it is, we could see some bumps in the road for ATI. It's very likely that we will see lower availability of CrossFire cards, as the past has shown a lower demand for such cards. Now that ATI has taken the last step in making their current incarnation of multi-GPU technology as attractive and efficient as possible, we wouldn't be surprised if demand for CrossFire cards comes to completely eclipse demand for the XTX. If demand does go up for the CrossFire cards, ATI will either have a supply problem or a pricing problem. It will be very interesting to watch the situation and see which it will be.
Before we move on to the individual game tests, let's take a look at how the X1950 XTX stacks up against its predecessor, the X1900 XTX, at 1280x1024, 1920x1440, and 2048x1536.
For our 29% increase in memory clock speed, we gain at most an 8.5% performance increase in Splinter Cell: Chaos Theory. That actually isn't bad for a memory clock boost alone. Battlefield 2 without AA saw the smallest improvement, with a maximum of 2.3% at our highest resolution.
Our DirectX games seem to show a consistently higher performance improvement from the memory speed bump with AA enabled. This is in contrast to Quake 4 (our OpenGL title) and F.E.A.R., which show a fairly constant percentage improvement at each resolution with AA enabled, while their scaling without AA improves as resolution increases. Oblivion's improvement varies between 2% and 5%, but this is likely due to run-to-run variance in our benchmark.
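To put those percentages in perspective, here is a minimal sketch of the scaling arithmetic, assuming the commonly quoted effective memory clocks of roughly 1550 MHz for the X1900 XTX and 2000 MHz for the X1950 XTX (both cards use a 256-bit bus, so the clock gain equals the bandwidth gain). It simply works out what fraction of the added memory bandwidth shows up as frame rate in the best and worst cases quoted above.

```python
# Rough scaling arithmetic for the X1950 XTX memory clock bump.
# Clock figures are assumed reference values, not taken from this section.

def pct_gain(new, old):
    """Percentage improvement of `new` over `old`."""
    return (new - old) / old * 100

mem_clock_gain = pct_gain(2000, 1550)          # ~29% more memory bandwidth

# Measured frame rate gains quoted in the text above.
observed_gains = {
    "Splinter Cell: Chaos Theory": 8.5,        # best case in our tests
    "Battlefield 2 (no AA)": 2.3,              # worst case in our tests
}

for game, fps_gain in observed_gains.items():
    efficiency = fps_gain / mem_clock_gain     # fraction of the bandwidth gain realized
    print(f"{game}: {fps_gain:.1f}% faster, "
          f"{efficiency:.0%} of the {mem_clock_gain:.0f}% bandwidth increase")
```

Even in the best case, less than a third of the extra bandwidth translates into frames, which is why we describe the R5xx as being less memory bandwidth limited than one might expect.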
74 Comments
JarredWalton - Wednesday, August 23, 2006 - link
We used factory overclocked 7900 GT cards that are widely available. These are basically guaranteed overclocks for about $20 more. There are no factory overclocked ATI cards around, but realistically don't expect overclocking to get more than 5% more performance out of ATI hardware. The X1900 XTX is clocked at 650 MHz, which isn't much higher than the 625 MHz of the XT cards. Given that ATI just released a lower power card but kept the clock speed at 650 MHz, it's pretty clear that their GPUs are close to topped out. The RAM might have a bit more headroom, but memory bandwidth already appears to be less of a concern, as the X1950 isn't tremendously faster than the X1900.
yyrkoon - Wednesday, August 23, 2006 - link
I think it's obvious why ATI is selling their cards for less now, and that reason is a lot of 'tech savvy' users are waiting for Direct3D 10 to be released and want to buy a capable card. This is probably to try and entice some people into buying technology that will be 'obsolete' when Direct3D 10 is released. Supposedly Vista will ship with DirectX 9L and DirectX 10 (Direct3D 10), but I've also read to the contrary, that Direct3D 10 won't be released until sometime after Vista ships. Personally, I couldn't think of a better time to buy hardware, but a lot of people think that waiting, and just paying through the nose for a video card later, is going to save them money. *shrug*
Broken - Wednesday, August 23, 2006 - link
In this review, the test bed was an Intel D975XBX (LGA-775). I thought this was an ATI Crossfire only board and could not run two Nvidia cards in SLI. Are there hacked drivers that allow this, and if so, is there any penalty? Also, I see that this board is dual 8x pci-e and not dual 16x... at high resolutions, could this be a limiting factor, or is that not for another year?
DerekWilson - Wednesday, August 23, 2006 - link
Sorry about the confusion there. We actually used an nForce4 Intel x16 board for the NVIDIA SLI tests. Unfortunately, it is still not possible to run SLI on an Intel motherboard. Our test section has been updated with the appropriate information. Thanks for pointing this out.
Derek Wilson
ElFenix - Wednesday, August 23, 2006 - link
as we all should know by now, Nvidia's default driver quality setting is lower than ATi's, and makes a significant difference in the framerate when you use the driver settings to match the quality settings. your "The Test" page does not indicate that you changed the driver quality settings to match.
DerekWilson - Wednesday, August 23, 2006 - link
Drivers were run with default quality settings. Default driver settings between ATI and NVIDIA are generally comparable from an image quality standpoint unless shimmering or banding is noticed due to trilinear/anisotropic optimizations. None of the games we tested displayed any such issues during our testing.
At the same time, during our Quad SLI follow-up we would like to include a series of tests run at the highest possible quality settings for both ATI and NVIDIA -- which would put ATI ahead of NVIDIA in terms of anisotropic filtering or in Chuck patch cases, and NVIDIA ahead of ATI in terms of adaptive/transparency AA (which is actually degraded by their gamma correction).
If you have any suggestions on different settings to compare, we are more than willing to run some tests and see what happens.
Thanks,
Derek Wilson
ElFenix - Wednesday, August 23, 2006 - link
could you run each card with the quality slider turned all the way up, please? i believe that's the default setting for ATi, and the 'High Quality' setting for nvidia. someone correct me if i'm wrong. thanks!
michael
yyrkoon - Wednesday, August 23, 2006 - link
I think as long as all settings from both offerings are as close as possible per benchmark, there is no real gripe. Although some people seem to think it necessary to run AA at high resolutions (1600x1200+), I'm not one of them. It's very hard for me to notice jaggies even at 1440x900, especially when concentrating on the game, instead of standing still and looking with a magnifying glass for jaggies . . .
mostlyprudent - Wednesday, August 23, 2006 - link
When are we going to see a good number of Core 2 Duo motherboards that support Crossfire? The fact that AT is using an Intel made board rather than a "true enthusiast" board says something about the current state of Core 2 Duo motherboards.
DerekWilson - Wednesday, August 23, 2006 - link
Intel's boards are actually very good. The only reason we haven't been using them in our tests (aside from a lack of SLI support) is that we have not been recommending Intel processors for the past couple years. Core 2 Duo makes Intel CPUs worth having, and you definitely won't go wrong with a good Intel motherboard.