ATI's Late Response to G70 - Radeon X1800, X1600 and X1300
by Derek Wilson on October 5, 2005 11:05 AM EST - Posted in GPUs
Budget Performance
For budget performance, we feel that 1024x768 is the proper target resolution. People spending near the $100 mark can't expect to achieve good performance at high resolutions. But with current hardware, we can play games at moderate resolutions without losing any features.
The X1300 is targeted at the budget market, but we focused on testing our X1300 Pro against slightly higher performing parts because of its pricing. The X1300 does quite well versus the traditional low-end 6200 TC and X300 parts, but can't really compete with the 6600 GT, which is priced near the $149 MSRP of the X1300 Pro.
Under Doom 3 (and many OpenGL applications), NVIDIA holds a lead over ATI hardware. While it is understandable that the X1300 Pro isn't able to match performance with NVIDIA's $150 6600 GT, the $250 MSRP X1600 XT lags far behind as well. It is quite interesting to note that the X1600 closes that gap (and performs slightly better than the 6600 GT) when 4xAA and 8xAF are enabled at this resolution. But at such a low resolution, the better bet is to increase the setting to 1280x1024 with no AA, where the 6600 GT maintains about a 20% performance lead. Doom 3 is also a fairly low-contrast game, meaning that jagged edges are already hard to see.
Under Valve's Day of Defeat: Source, the latest resurrection of a past title by Valve (and also the first to feature HDR), the 6600 GT and X800 perform on par with what we would expect, while the more expensive X1600 XT lags behind and the X1300 looks to perform where a budget card should. Enabling 4xAA and 8xAF in this game closes the gap between the 6600 GT and X1600 XT: they both run at about 48 fps under this setting, followed by the X800 at nearly a 43 fps average.
Far Cry provides a victory for the X1600 XT over the 6600 GT, but we still have the X1300 Pro lagging its closest-cost competitor by a large margin.
Everquest II in very high quality mode shows the X1600 XT to lead this segment in performance. Current ~$100 parts are shown to perform horribly at this setting, scoring single-digit framerates. The X1300 Pro is definitely playable at very high quality at 1024x768 (which we would recommend over a lower quality setting at a higher resolution). Extreme quality still doesn't perform very well on any but the most expensive cards out there, and really doesn't offer that much more in terms of visual quality.
When testing Splinter Cell: Chaos Theory, the new X1000 series of cards give a very good performance. This time around, the X800 and 6600 GT don't perform equally, and it looks as though the additions to the RV5xx architecture can make quite a difference depending on the game being played.
To see the continuing saga of the X1600 XT, we will take a look at midrange performance numbers at 1280x960.
103 Comments
mlittl3 - Wednesday, October 5, 2005 - link
I'll tell you how it is a win. Take an architecture with 8 fewer pipelines, put it onto a brand new 90nm die shrink, clock the hell out of the thing, consume just a little more power, and add all the new features like SM3.0, and you equal the competition's fastest card. This is a win. So when ATI releases 1, 2, 3, etc. more quad pipes, they will be even faster. I don't see it, bob. Anandtech's review was a very bad one. ALL the other sites said this is a good architecture and is on par with, and a little faster than, nvidia. None of those conclusions can be drawn from the confusing graphs here.
Read the comments here and you will see others agree. Good job, ATI and Nvidia for bringing us competition and equal performing cards. Now bob, go to some other sites, get a good feel for which card suits your needs, and then go buy one. :)
bob661 - Wednesday, October 5, 2005 - link
I read the other sites as well as AT. Quite frankly, I trust AT before any of the other sites because their methodology and consistency are top notch. HardOCP didn't even test an X1800 XT, and if I was an avid reader of their site I'd be wondering where that review was. I guess I don't see it your way because I only look for bang for the buck, not which card could be better if it had this or had that. BTW, I just got some free money (no, I didn't steal it!) today, so I'm going to pick up a 7800GT. :)
Houdani - Wednesday, October 5, 2005 - link
One of the reasons for the card selections is due to the price of the cards -- and was stated as such. Just because ATI is calling the card "low-end" doesn't mean it should be compared with other low-end cards. If ATI prices their "low-end" card in the same range as a mid-range card, then it should rightfully be compared to those other cards which are at/near the price.
But your point is well taken. I'd like to see a few more cards tossed in there.
Madellga - Wednesday, October 5, 2005 - link
Derek, I don't know if you have the time for this, but a review at another website showed a huge difference in performance in the FEAR demo. ATI was in the lead with a substantial advantage in maximum framerates, but nearly even at the minimum. http://techreport.com/reviews/2005q4/radeon-x1000/...
As Fear points towards the new generation of engines, it might be worth running some numbers on it.
Also useful would be to report minimum framerates at the higher resolutions, as this relates to good gameplay experience if all goodies are cranked up.
Houdani - Wednesday, October 5, 2005 - link
Well, the review does state that the FEAR demo greatly favors ATI, but that the actual shipping game is expected not to show such bias. Derek purposefully omitted the FEAR demo in order to use the shipping game instead.
allnighter - Wednesday, October 5, 2005 - link
Is it safe to assume that you guys might not have had enough time with these cards to do your usual in-depth review? I'm sure you'll update for us to be able to get the full picture. I also must say that I'm missing the OC part of the review. I wanted to see how true it is that these chips can go sky high. Given the fact that they had 3 re-spins, it may well be true.
TinyTeeth - Wednesday, October 5, 2005 - link
...an Anandtech review.But it's a bit thin, I must say. I'm still missing overclocking results and Half-Life 2 and Battlefield 2 results. How come no hardware site has tested the cards in Battlefield 2 yet?
From my point of view, Doom III, Splinter Cell, Everquest II and Far Cry are the least interesting games out there.
Overall it's a good review, as you can expect from the absolute best hardware site there is, but I hope and expect there will be another, much larger review.
Houdani - Wednesday, October 5, 2005 - link
The best reason to continue benchmarking games which have been out for a while is that those are the games on which the older GPUs were previously benched. When review sites stop using the old benchmarks, they effectively lose the history for all of the older GPUs, and therefore we lose those GPUs in the comparison.
Granted, the reviewer is welcome to re-benchmark the old GPUs using the new games ... but that would be a significant undertaking, and frankly I don't see many (if any) review sites doing that.
But I will throw you this bone: While I think it's quite appropriate to use benchmarks for two years (maybe even three years), it would also be a good thing to very slowly introduce new games at a pace of one per year, and likewise drop one game per year.
mongoosesRawesome - Wednesday, October 5, 2005 - link
They have to retest whenever they use a different driver/CPU/motherboard, which is quite often. I bet they have to retest every other article or so. It's a pain in the butt, but that's why we visit and don't do the tests ourselves.
Madellga - Wednesday, October 5, 2005 - link
Techreport has Battlefield 2 benchmarks, as well as FEAR, Guild Wars and others. I liked the article and recommend that you read it too.