Gigabyte Dual GPU: nForce4, Intel, and the 3D1 Single Card SLI Tested
by Derek Wilson on January 6, 2005 4:12 PM EST - Posted in GPUs
Half-Life 2 Performance
Unfortunately, we were unable to test the Intel platform under Half-Life 2. We aren't quite sure how to explain the issue we were seeing, but when trying to run the game, the screen would flash between each frame, and there were other visual issues as well. Because of these issues, performance was not comparable to our other systems. Short of a hard crash, this was the worst possible kind of problem that we could have seen. It is very likely that this is an issue NVIDIA fixed on Intel systems somewhere between drivers 66.81 and 71.20, but we are unable to test any other driver at this time.

We are also not including the 6800 Ultra scores, as the numbers we were using as a reference (from our article on NVIDIA's official SLI launch) were run using the older version 6 of Half-Life 2 as well as older (different) versions of our timedemos.
Continuing our trend with Half-Life 2 graphics card tests, we've benched the game on five different levels at two resolutions, with and without 4xAA/8xAF enabled. We've listed the raw results in the tables below for those who are interested. For easier analysis, we've taken the average performance of each card across the five level tests and compared the results in our graphs below.
Half-Life 2 1280x1024 noAA/AF

| | at_c17_12 | at_canals_08 | at_coast_05 | at_coast_12 | at_prison_05 |
|---|---|---|---|---|---|
| NVIDIA GeForce 6600 GT | 76.1 | 71.3 | 115 | 94.9 | 80 |
| 2x NVIDIA 6600 GT (AMD SLI) | 77.2 | 98.8 | 118.5 | 117.5 | 116.6 |
| Gigabyte 2x6600GT 3D1 | 77.3 | 99.1 | 118.7 | 117.9 | 118.2 |

Half-Life 2 1600x1200 noAA/AF

| | at_c17_12 | at_canals_08 | at_coast_05 | at_coast_12 | at_prison_05 |
|---|---|---|---|---|---|
| NVIDIA GeForce 6600 GT | 61.1 | 55.7 | 91.5 | 69.3 | 57.6 |
| 2x NVIDIA 6600 GT (AMD SLI) | 73.5 | 85.8 | 110.8 | 104.9 | 92.9 |
| Gigabyte 2x6600GT 3D1 | 73.6 | 87 | 111.4 | 106 | 94.6 |

Half-Life 2 1280x1024 4xAA/8xAF

| | at_c17_12 | at_canals_08 | at_coast_05 | at_coast_12 | at_prison_05 |
|---|---|---|---|---|---|
| NVIDIA GeForce 6600 GT | 40.5 | 40.1 | 74.8 | 54.9 | 45.1 |
| 2x NVIDIA 6600 GT (AMD SLI) | 45.2 | 47.8 | 92.4 | 81 | 61 |
| Gigabyte 2x6600GT 3D1 | 45.7 | 47.8 | 94.1 | 82.8 | 62.3 |

Half-Life 2 1600x1200 4xAA/8xAF

| | at_c17_12 | at_canals_08 | at_coast_05 | at_coast_12 | at_prison_05 |
|---|---|---|---|---|---|
| NVIDIA GeForce 6600 GT | 27.3 | 27.2 | 43.3 | 32.7 | 28.1 |
| 2x NVIDIA 6600 GT (AMD SLI) | 33.8 | 35.3 | 58.1 | 49.2 | 39.8 |
| Gigabyte 2x6600GT 3D1 | 33.8 | 35.3 | 59.3 | 50.8 | 40.6 |
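The per-card averages used in our graphs come straight from the tables above. As an illustrative sketch (not our actual benchmarking harness), the numbers from the 1600x1200 4xAA/8xAF table can be averaged like so:

```python
# Average the five timedemo results per card, as done for the graphs.
# Numbers transcribed from the 1600x1200 4xAA/8xAF table above.
results = {
    "NVIDIA GeForce 6600 GT":      [27.3, 27.2, 43.3, 32.7, 28.1],
    "2x NVIDIA 6600 GT (AMD SLI)": [33.8, 35.3, 58.1, 49.2, 39.8],
    "Gigabyte 2x6600GT 3D1":       [33.8, 35.3, 59.3, 50.8, 40.6],
}

for card, fps in results.items():
    avg = sum(fps) / len(fps)
    print(f"{card}: {avg:.1f} fps average")
```

This makes the gap plain: the two SLI configurations average within a frame of each other, while both sit well ahead of the single card.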
The 3D1 averages about one half to one frame per second higher than two stock-clocked 6600 GTs in SLI mode, which amounts to no difference at all at frame rates near 100 fps. Even the overclocked RAM doesn't help the 3D1 here.
We see more of the same when we look at performance with anti-aliasing and anisotropic filtering enabled: the Gigabyte 3D1 performs on par with two 6600 GT cards in SLI. Given the 3D1's overclocked RAM, this suggests that the bottleneck under HL2 lies somewhere else when running in SLI mode; under single-GPU conditions, we've seen GPU and RAM speed impact HL2 performance fairly evenly.
43 Comments
johnsonx - Friday, January 7, 2005 - link
To #19: from page 1:
"....even if the 3D1 didn't require a special motherboard BIOS in order to boot video..."
In other words, the mainboard BIOS has to do something special to deal with a dual-GPU card, or at least the current implementation of the 3D1.
What NVidia should do is:
1. Update their drivers to allow SLI any time two GPUs are found, whether they be on two boards or one.
2. Standardize whatever BIOS support is required for the dual GPU cards to POST properly, and include the code in their reference BIOS for the NForce4.
At least then you could run a dual-GPU card on any NForce4 board. Maybe in turn Quad-GPU could be possible on an SLI board.
bob661 - Friday, January 7, 2005 - link
#19: I think the article mentioned that a special BIOS is needed to run this card. Right now, only Gigabyte has this BIOS.
pio!pio! - Friday, January 7, 2005 - link
#18 use a laptop

FinalFantasy - Friday, January 7, 2005 - link
Poor Intel :(

jcromano - Friday, January 7, 2005 - link

From the article, which I enjoyed very much: "The only motherboard that can run the 3D1 is the GA-K8NXP-SLI."
Why exactly can't the ASUS SLI board (for example) use the 3D1? Surely not just because Gigabyte says it can't, right?
Cheers,
Jim
phaxmohdem - Friday, January 7, 2005 - link
ATI Rage Fury MAXX. 'Nuff said... lol.

#6 I think you're on to something, though. Modern technology is becoming incredibly power hungry, and I think more steps need to be taken to reduce power consumption and heat production. However, with the current pixel-pushing slugfest we are witnessing, FPS has obviously displaced those two worries for our beloved video card manufacturers. At some point, when consumers refuse to buy the latest GeForce or Radeon card with a heatsink taking up four extra PCI slots, I think they will get the hint. I personally consider a dual-slot heatsink solution ludicrous.
Nvidia, ATI, Intel, AMD... STOP RAISING MY ELECTRICITY BILL AND ROOM TEMPERATURE!!!!
KingofCamelot - Friday, January 7, 2005 - link
#16 I'm tired of you people acting like SLI is only doable with an NVIDIA motherboard, which is obviously not the case. SLI only applies to the graphics cards; on motherboards, SLI is just a marketing term for NVIDIA. Any board with two 16x PCI-E connectors can pull off SLI with NVIDIA graphics cards. NVIDIA's solution is unique because they were able to split a 16x line and give each connector 8x bandwidth; other motherboard manufacturers are doing 16x and 4x.

sprockkets - Thursday, January 6, 2005 - link
I'm curious to see how all those lame Intel configs by Dell and others pulled off SLI long before this motherboard came out.

Regs - Thursday, January 6, 2005 - link
Once again, history repeats itself. Dual core SLI solutions are still a far reach from reality.

Lifted - Thursday, January 6, 2005 - link
Dual 6800 GTs???? Hahahahahehhehehehahahahah. Not laughing at you, but those things are so hot you'd need a 50-pound copper heatsink on the beast, with 4 x 20,000 RPM fans running full bore, just to prevent a China Syndrome.
Somebody say dual core? Maybe with GeForce 2 MX series cores.