NVIDIA GeForce 8800 GT: The Only Card That Matters
by Derek Wilson on October 29, 2007 9:00 AM EST - Posted in GPUs
The First PCIe 2.0 Graphics Card
NVIDIA's 8800 GT is the "world's first consumer GPU to support PCI Express 2.0." Although AMD's Radeon HD 2400/2600 have PCIe 2.0 bandwidth, they don't implement the full spec, leaving the 8800 GT technically the first full PCIe 2.0 GPU. Currently, the only motherboard chipset out that could take advantage of this is Intel's X38. We have yet to run benchmarks on PCIe 2.0, but we don't expect any significant impact on current games and consumer applications. We aren't currently bandwidth limited by PCIe 1.1 with its 4GB/sec in each direction, so it's unlikely that the speed boost will really help. Game developers and NVIDIA confirm this sentiment, but if any of our internal tests show anything different we'll certainly put together a follow-up.
PCIe 2.0 offers double the per-lane speed of the original spec. This means pairing a x16 PCIe 2.0 GPU with a x16 electrical PCIe 2.0 slot on a motherboard will offer 8GB/sec of bandwidth in each direction (16GB/sec total). This actually brings us to an inflection point in the industry: the CPU now has a faster connection to the GPU than to main system memory (compared to 800MHz DDR2). When we move to 1066MHz and 1333MHz DDR3, system memory will again be faster, but for now most people will still be using 800MHz memory even with PCIe 2.0. PCIe 3.0 promises to double bandwidth yet again over version 2.0, which would likely put the graphics card ahead of memory in potential CPU I/O speed once more. Actual throughput will still be limited by the read and write speeds of the graphics card itself, which have traditionally left a lot to be desired; hopefully GPU makers will catch up and offer faster GPU memory read speeds as well.
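To put those figures in context, here's a quick back-of-the-envelope calculator (our own illustrative Python sketch, not anything shipped with the card) that derives per-direction PCIe bandwidth from lane count, signaling rate, and the 8b/10b encoding overhead used by PCIe 1.x and 2.0:

```python
# Illustrative PCIe bandwidth arithmetic (assumptions: published
# per-generation signaling rates and 8b/10b line encoding).

GIGA = 1e9
SIGNALING_GT_PER_SEC = {1: 2.5, 2: 5.0}  # transfers/sec per lane, by generation

def pcie_bandwidth_gb(lanes: int, gen: int) -> float:
    """Effective one-way bandwidth in GB/sec for a given link."""
    payload_fraction = 8 / 10  # 8b/10b: 10 wire bits carry 8 payload bits
    bits_per_sec = lanes * SIGNALING_GT_PER_SEC[gen] * GIGA * payload_fraction
    return bits_per_sec / 8 / GIGA  # bits -> bytes -> GB

for gen in (1, 2):
    one_way = pcie_bandwidth_gb(lanes=16, gen=gen)
    print(f"PCIe {gen}.x x16: {one_way:.0f}GB/sec each way, {2 * one_way:.0f}GB/sec total")

# PCIe 1.x x16 works out to 4GB/sec each way, and PCIe 2.0 to
# 8GB/sec each way (16GB/sec total), matching the figures above.
```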
For now, the key point is simply that the card supports PCIe 2.0, and moving forward in bandwidth before we need it is a terrific step in enabling developers, giving them the potential to make use of a feature before there is an immediate need. This is certainly a good thing, as massively parallel processing, multi-GPU rendering, physics on the graphics card, and other GPU computing techniques threaten to become mainstream. While we may not see applications that push PCIe 2.0 in the near term, moving over to the new spec is an important step, and we're glad to see it happening at this pace. But there are no real tangible benefits to the consumer right now either.
The transition to PCIe 2.0 won't be anything like the move from AGP to PCIe: the cards and motherboards are backward and forward compatible. PCIe 1.0 and 1.1 compliant cards can be plugged into a PCIe 2.0 motherboard, and PCIe 2.0 cards can be plugged into older motherboards, with the link simply training to the older signaling rate. This leaves us with zero impact on the consumer from PCIe 2.0, in more ways than one.
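Because a PCIe 2.0 card in an older slot simply negotiates down to 1.x signaling, it can be useful to verify what link a card actually trained to. Here's a minimal sketch, assuming a Linux system with lspci available (this helper is our own illustration, not an NVIDIA tool); it parses the LnkSta capability field, which typically requires root to read:

```python
# Report the negotiated PCIe link speed/width of each VGA device by
# parsing `lspci -vv` output. Illustrative sketch; run as root so the
# LnkSta field is visible.

import re
import subprocess

def vga_link_status():
    out = subprocess.run(["lspci", "-vv"], capture_output=True,
                         text=True, check=True).stdout
    results, device = [], None
    for line in out.splitlines():
        if line and not line[0].isspace():  # unindented line = new device header
            device = line if "VGA compatible controller" in line else None
        elif device and "LnkSta:" in line:
            m = re.search(r"Speed ([\d.]+GT/s), Width (x\d+)", line)
            if m:
                results.append((device, m.group(1), m.group(2)))
    return results

for dev, speed, width in vga_link_status():
    # 2.5GT/s signaling is PCIe 1.x; 5GT/s is PCIe 2.0
    print(f"{dev}\n  negotiated link: {speed} {width}")
```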
90 Comments
DukeN - Monday, October 29, 2007
This is unreal price to performance - knock on wood; playing Oblivion at 1920x1200 on a $250 GPU. Could we have a benchmark based on the Crysis demo please, showing how one or two cards would do?
Also, the power page pics do not show up for some reason (may be the firewall cached it incorrectly here at work).
Thank you.
Xtasy26 - Monday, October 29, 2007
Hey Guys, if you want to see Crysis benchmarks, check out this link:
http://www.theinquirer.net/gb/inquirer/news/2007/1...
The benches are:
1280 x 1024 : ~ 37 f.p.s.
1680 x 1050 : 25 f.p.s.
1920 x 1080 : ~ 21 f.p.s.
This is on a test bed:
Intel Core 2 Extreme QX6800 @2.93 GHz
Asetek VapoChill Micro cooler
EVGA 680i motherboard
2GB Corsair Dominator PC2-9136C5D
Nvidia GeForce 8800GT 512MB/Zotac 8800GTX AMP!/XFX 8800Ultra/ATI Radeon HD2900XT
250GB Seagate Barracuda 7200.10 16MB cache
Sony BWU-100A Blu-ray burner
Hiper 880W Type-R Power Supply
Toshiba's external HD-DVD box (Xbox 360 HD-DVD drive)
Dell 2407WFP-HC
Logitech G15 Keyboard, MX-518 rat
Xtasy26 - Monday, October 29, 2007
This game seems really demanding. If it is getting 37 f.p.s. at 1280 x 1024, imagine what the frame rate will be with 4X FSAA enabled combined with 8X Anisotropic Filtering. I think I will wait till Nvidia releases their 9800/9600 GT/GTS and combine that with Intel's 45nm Penryn CPU. I want to play this beautiful game in all its glory! :)
Spuke - Monday, October 29, 2007
Impressive!!!! I read the article but I saw no mention of a release date. When's this thing available?
Spuke - Monday, October 29, 2007
Ummm..... When can I BUY it? That's what I mean.
EODetroit - Monday, October 29, 2007
Now.
http://www.newegg.com/Product/ProductList.aspx?Sub...
poohbear - Wednesday, October 31, 2007
when do u guys think its gonna be $250? cheapest i see is $270, but i understand when its first released the prices are jacked up a bit.
EateryOfPiza - Monday, October 29, 2007
I second the request for Crysis benchmarks; that is the game that taxes everything at the moment.
DerekWilson - Monday, October 29, 2007
We actually tested Crysis... but there were issues... not with the game; we just shot ourselves in the foot on this one and weren't able to do as much as we wanted. We had to retest a bunch of stuff, and we didn't get to Crysis.
yyrkoon - Monday, October 29, 2007
Yes, I am glad that instead of purchasing a video card, I changed motherboard/CPU from AMD to Intel. I still like my AM2 Opteron system a lot, but the performance numbers and the effortless 1GHz OC on the ABIT IP35-E (at $90 USD!) were just too much to overlook.
I can definitely understand your 'praise', as it were, now that nVidia is lowering their prices, but this is where these prices should have always been. nVidia and ATI/AMD have been ripping us, the consumers, off for the last 1.5 years or so, so you will excuse me if I do not show too much enthusiasm when they finally lower their prices to where they should be. I do not consider this to be much different than the memory industry overcharging, and the consumer getting the shaft (as per your article).
I am happy though . . .