NVIDIA GeForce 8800 GTS 512 & GeForce 8800 GT 256MB: Playing with Memory and G92
by Anand Lal Shimpi on December 11, 2007 12:00 AM EST - Posted in GPUs
The 8800 GT 256MB: Here at Last
The 8800 GTS 512 isn't the only new NVIDIA card we'll be looking at today. Remember the 256MB 8800 GT that NVIDIA promised us for less than $200? We don't exactly have that, but we've got a $229 XFX GeForce 8800 GT 256MB, which is pretty close.
The Alpha Dog Edition XXX card we have runs slightly overclocked: a 650MHz core clock, a 1.6GHz shader clock, and 256MB of GDDR3 running at a 1.6GHz data rate. That's an 8% higher core clock, a 7% higher shader clock, and an 11% lower memory clock than a standard 512MB 8800 GT. The card is available and considerably cheaper than the $300 512MB cards floating around, so later in the review we'll look at whether losing 256MB of frame buffer actually matters.
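Those percentages check out against NVIDIA's reference clocks for the 512MB 8800 GT (600MHz core, 1.5GHz shader, 1.8GHz memory data rate); here's a minimal sketch of the arithmetic:

```python
# Clock deltas of the XFX Alpha Dog XXX 256MB card vs. a stock
# 512MB 8800 GT (600MHz core, 1.5GHz shader, 1.8GHz memory data rate).
stock = {"core": 600, "shader": 1500, "memory": 1800}  # MHz
xfx = {"core": 650, "shader": 1600, "memory": 1600}    # MHz

for domain, base in stock.items():
    delta = (xfx[domain] - base) / base * 100
    print(f"{domain:>6}: {delta:+.0f}%")  # core: +8%, shader: +7%, memory: -11%
```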
The Test
Note that the results from this article can't be compared to those from our 8800 GT and Radeon HD 3800 articles; we're using different hardware, updated drivers, and in some cases updated benchmarks to keep up with the latest game patches.
We aren't going to recap some of the basic performance comparisons we did in the two aforementioned reviews, so if you want to know how the 8800 GT stacks up against older cards or how the older GTSes perform, be sure to consult those articles.
Our test platform for this article is as follows:
Test Setup
CPU: Intel Core 2 Extreme QX6850
Motherboard: ASUS P5E3 Deluxe
Video Cards: AMD Radeon HD 3870, AMD Radeon HD 3850, NVIDIA GeForce 8800 Ultra, NVIDIA GeForce 8800 GTS 512, NVIDIA GeForce 8800 GT (512MB & 256MB)
Video Drivers: AMD Catalyst 7.11, NVIDIA 169.12
Hard Drive: Seagate 7200.9 300GB (8MB cache, 7200RPM)
RAM: 4x1GB Corsair XMS3 DDR3-1066 (7-7-7-20)
Operating System: Windows Vista Ultimate 32-bit
56 Comments
chizow - Tuesday, December 11, 2007 - link
This is probably the first time I've felt an AT review wasn't worth reading, and definitely the first time I've said a review done by Anand wasn't worth reading. The conclusion is correct, but for very different reasons. There is no 10-15% advantage (as many would consider that significant enough a reason to pay $50 more); there is NO advantage to getting a G92 GTS over a G92 GT. See the Firing Squad review: http://www.firingsquad.com/hardware/nvidia_geforce...
When looking over this review, pay special attention to:
Leadtek GeForce 8800 GT Extreme (680MHz core/1.0GHz memory)
vs.
XFX GeForce 8800 GTS 512MB XXX (678MHz core/986MHz memory)
Almost no difference at all in performance.
Acragas - Tuesday, December 11, 2007 - link
Did you read all the way to the end of the Firing Squad review? Because at the end, they seem to leave no doubt that the 8800 GTS 512 is certainly the superior card. I <3 EVGA's step-up program. They conclude:
Given the GeForce 8800 GTS 512MB’s outstanding performance though, this higher price tag is definitely justified. The 8800 GTS 512MB cards blazed through all of our benchmarks, with performance generally falling anywhere between the GeForce 8800 GT and the GeForce 8800 GTX, while a card that’s been overclocked can put up numbers that are higher than the GTX in some cases.
If you’ve got $400 to spend on a graphics upgrade this Christmas, the GeForce 8800 GTS 512MB is without a doubt the card we recommend. In fact, we wouldn’t be surprised if the GeForce 8800 GTS 512MB ends up stealing sales away from the GeForce 8800 GTX.
chizow - Tuesday, December 11, 2007 - link
Why would I need to read their conclusion when their data allows you to come to your own? I'm sure they were blinded by the stark contrast in their pretty graphs without realizing they showed virtually no difference in performance between the parts at the same clock speed. Granted, the dual-slot cooler would allow you to run at higher clock speeds, but for a $50-100 difference in price, are a better cooler and 16 SPs and 8 TMUs/TAUs that yield a 0-2% difference in performance worth it?
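On paper, here's what those extra units buy at the clocks quoted above (a rough sketch; the unit counts are the published G92 specs, the comparison itself is mine):

```python
# Theoretical texel throughput at near-identical core clocks.
# 8800 GTS 512: 128 SPs / 64 TMUs; 8800 GT: 112 SPs / 56 TMUs.
# This ignores the 16 ROPs and 256-bit bus both parts share, which is
# exactly why the paper gap barely shows up in the benchmarks.
cards = {
    "Leadtek 8800 GT Extreme (680MHz)": {"core_mhz": 680, "tmus": 56},
    "XFX 8800 GTS 512MB XXX (678MHz)": {"core_mhz": 678, "tmus": 64},
}
for name, c in cards.items():
    print(f"{name}: {c['core_mhz'] * c['tmus'] / 1000:.1f} Gtexels/s")
# ~38.1 vs. ~43.4 Gtexels/s: ~14% more on paper, 0-2% in practice.
```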
zazzn - Tuesday, December 11, 2007 - link
I foolishly also bought an 8800 GTS about 4 months ago, and now the GTs are out stomping them, for cheaper. I feel like a fool. XFX doesn't offer a step-up program; next time I buy it's an EVGA for sure... I'm so sour right now about the situation, considering I needed a new PSU (450W to 600W) which also cost me $150, and I most likely wouldn't have needed it if I'd bought the GT now since it requires less power.
How crap is that?
Can you post the results of an old 8800 GTS vs. a new 8800 GTS?
Kelly - Tuesday, December 11, 2007 - link
Isn't the power consumption of the 3870 vs. the 8800 GT 512 a bit odd compared to previous findings? Here are the numbers I am wondering about:
idle/load
8800GT: 146/269 (difference: 123)
3870: 122/232 (difference: 110)
Compare this to
http://www.anandtech.com/video/showdoc.aspx?i=3151...
8800GT: 165/209 (difference: 44)
3870: 125/214 (difference: 89)
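Spelled out, the delta arithmetic looks like this (a trivial sketch; I'm assuming these are total system power figures in watts):

```python
# Idle/load power as reported in the two articles (assumed watts).
this_review = {"8800GT": (146, 269), "3870": (122, 232)}
old_review = {"8800GT": (165, 209), "3870": (125, 214)}

for label, data in (("this review", this_review), ("older review", old_review)):
    for card, (idle, load) in data.items():
        print(f"{label}, {card}: load - idle = {load - idle} W")
```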
Or am I not doing the comparison correctly?
Thanks for a nice review as always!
Spoelie - Tuesday, December 11, 2007 - link
The original review's results were a bit strange: the gap between the 3850/3870 was way too great for a simple clock bump between them, and the 3870's GDDR4 should consume less power than GDDR3 anyway. So these values seem more right; the gap between idle and load is bigger because they used a quad-core CPU in this article and a dual-core in the previous one.
Khato - Tuesday, December 11, 2007 - link
Well, the load results from this article in comparison to the previous ones bring to light a disturbing fact. If the definition of 'load' is a game and we're never CPU limited, then the performance of the graphics card is going to scale the CPU's power usage accordingly, giving the impression that faster cards draw far more power than they actually do. On the flip side, if we're CPU limited (which might have been the case in the previous review), then CPU power is about constant and the high-end cards are idling more often, giving the impression that they're more efficient than they really are. It'd be interesting to see the % CPU utilization for each card.
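A toy model of that confound (every number below is invented purely for illustration):

```python
# Measured "load" power bundles a CPU term that grows when a faster
# GPU keeps the CPU busier. All figures here are hypothetical.
BASE_W, CPU_MAX_W = 100, 80  # platform idle draw + CPU power budget

def measured_load(gpu_w, cpu_util):
    """Wall power = platform base + CPU share + actual GPU draw."""
    return BASE_W + CPU_MAX_W * cpu_util + gpu_w

slow = measured_load(gpu_w=60, cpu_util=0.5)  # slower card, CPU half busy
fast = measured_load(gpu_w=80, cpu_util=0.9)  # faster card, CPU nearly pegged

print(f"apparent GPU delta: {fast - slow:.0f} W, true GPU delta: 20 W")
# 52W apparent vs. 20W true: the faster card looks far hungrier than
# it really is because the CPU's contribution grew along with it.
```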
trajan - Tuesday, December 11, 2007 - link
I promise I'm not paid to say this, but I feel like the new GTS plus EVGA's step-up program just saved me a load of cash. I (foolishly?) bought a superclocked EVGA 8800 GTS 640MB card almost 3 months ago, right before the 8800 GT came out. Yeah, bad timing. But when I checked online I still had 18 days left on my step-up. So, very ironically, I am upgrading from a $395 card to a $360 card, paying $10 in shipping both ways. I don't get a refund, so I'll essentially have paid $420 for a $360 part, but what a huge upside: I got a great card 3 months ago and am now getting a great upgrade almost for free.
I say "finally" in the subject because switching from the superclocked 8800 GTS 640 to an 8800 GT just didn't seem worth it, especially given how much money I'd be losing. I kept hoping something better would come along, even if it cost more, since I can upgrade to any sub-$400 card just by paying shipping.
Viditor - Tuesday, December 11, 2007 - link
My question is this... If an 8800 GT 512 is $300-$350 and 2 x HD3850s are a total of $358, how do they compare in performance? In other words, do the CrossFired 3850s outperform the 8800 GT 512, and if so, by how much?
chizow - Tuesday, December 11, 2007 - link
That's basically what it comes down to with the G92 vs. G80. Another big difference between the G80s and G92s that the review failed to mention is the 24 vs. 16 ROP advantage the G80 maintains over the G92, a lead which the increased clock speeds can't make up for. Anyway, it's pretty clear the G92 does better in shader-intensive games (newer ones) with its massive shader ops/sec advantage over the G80, but falls short when you enable AA or run at high resolutions, where fill rate becomes more important.
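Putting rough numbers on that ROP deficit (a back-of-envelope sketch using stock clocks and the published ROP counts):

```python
# Theoretical pixel fill rate = ROPs x core clock.
cards = {
    "8800 GTX (G80, 575MHz)": {"core_mhz": 575, "rops": 24},
    "8800 GTS 512 (G92, 650MHz)": {"core_mhz": 650, "rops": 16},
}
for name, c in cards.items():
    print(f"{name}: {c['core_mhz'] * c['rops'] / 1000:.1f} Gpixels/s")
# 13.8 vs. 10.4 Gpixels/s: the G92's ~13% clock advantage can't offset
# a third fewer ROPs, hence the G80 pulling ahead with AA at high res.
```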
In the end I think the result is better for gamers, but it doesn't look like there's a definitive winner this time around. It's basically trading performance at various settings/games, but for the end user, the benefit is that you can get great performance at a much better price by giving up the ultra-high-end settings (1920+ w/AA), which at this point are borderline playable anyway.