New Ultra High End Price Point With GeForce 8800 Ultra
by Derek Wilson on May 2, 2007 9:00 AM EST - Posted in
- GPUs
The GeForce 8800 Ultra
Physically, the layout of the board is no different, but NVIDIA has put quite a bit of work into their latest effort. The first and most noticeable change is the HSF.
We have been very happy with NVIDIA's stock cooling solutions for the past few years. This HSF solution is no different, as it offers quiet and efficient cooling. Of course, this could be due to the fact that the only real changes are the position of the fan and the shape of the shroud.
Beyond cooling, NVIDIA has altered the G80 silicon. Though they could not go into the specifics, NVIDIA indicated that the layout has been changed to allow for higher clocks. They have also enhanced the 90nm process they are using to fab the chips. Adjustments targeted at improving clock speed and reducing power (which can sometimes work against each other) were made. We certainly wish NVIDIA could have gone into more detail on this topic, but we are left to wonder exactly what is different with the new revision of G80.
As far as functionality is concerned, no features have changed between the 8800 GTX and the 8800 Ultra. What we have, for all intents and purposes, is an overclocked 8800 GTX. Here's a look at the card:
While we don't normally look at overclocking with reference hardware, NVIDIA suggested that there is much more headroom available in the 8800 Ultra than on the GTX. We decided to put the card to the test, but we will have to wait until we get our hands on retail boards to see what end users can realistically expect.
Using nTune, we were able to run completely stable at 684MHz. This is faster than any of our 8800 GTX hardware has been able to reach. Shader clock increases with core clock when set under nTune. The hardware is capable of independent clocks, but currently NVIDIA doesn't allow users to set the clocks independently without the use of a BIOS tweaking utility.
We used RivaTuner to check out where our shader clock landed when setting core clock speed in nTune. With a core clock of 684MHz, we saw 1674MHz on the shader. Pushing nTune up to 690 still gave us a core clock of 684MHz but with a shader clock of 1728MHz. The next core clock speed available is 702MHz which also pairs with 1728MHz on the shader. We could run some tests at these higher speeds, but our reference board wasn't able to handle the heat and locked up without completing our stress test.
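To make the clock behavior above concrete, here is a minimal sketch of how requested clocks appear to snap to discrete hardware steps. The 18MHz core step and 54MHz shader step are purely our inference from the numbers we measured (684 vs. 702 core, 1674 vs. 1728 shader), not anything NVIDIA has confirmed:

```python
# Illustrative model of the clock snapping we observed with nTune/RivaTuner.
# Step sizes are assumptions inferred from our measurements, nothing more.
CORE_STEP_MHZ = 18    # assumption: inferred from the 684 -> 702 gap
SHADER_STEP_MHZ = 54  # assumption: inferred from the 1674 -> 1728 gap

def snap_to_step(requested_mhz: int, step_mhz: int) -> int:
    """Map a requested clock to the nearest available hardware step."""
    return round(requested_mhz / step_mhz) * step_mhz

# The core clocks we observed line up with 18MHz steps:
print(snap_to_step(684, CORE_STEP_MHZ))  # 684: applied as requested
print(snap_to_step(690, CORE_STEP_MHZ))  # 684: 690 is not a valid step
print(snap_to_step(702, CORE_STEP_MHZ))  # 702: the next step up

# The two shader clocks we measured sit exactly one 54MHz step apart:
print(1728 - 1674)  # 54
```

This would explain why asking nTune for 690MHz still yielded a 684MHz core clock: 690 falls between two valid steps, so the hardware settles on the nearest one.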
It is possible we could see some hardware vendors release 8800 Ultra parts with over 100MHz higher core clocks than stock 8800 GTX parts, which could start to get interesting at the $700+ price range. It does seem that the revised G80 silicon may be able to hit 700+ MHz core clocks with 1.73GHz shader clocks with advanced (read: even more expensive) cooling solutions. That is, if our reference board is actually a good indication of retail parts. As we mentioned, we will have to wait and see.
68 Comments
ssidbroadcast - Wednesday, May 2, 2007 - link
$300-$200 more for an overclock? That's it? For that much more money, buy a GTX, take off the stupid heatsink that takes up a whole slot, and spend the extra 200-300 on a decent water cooler.
nVidia is getting just plain arrogant now. C'mon, AMTi... pull it together!
MadBoris - Wednesday, May 2, 2007 - link
Obviously it's the best nvidia could do with the time they had to compete with the R600. They came out with something to maintain the performance crown; better than nothing. Obviously not worth the price...
Question is...HOW DOES THIS CARD OVERCLOCK?
How fast can this card really go with core and mem?
sxr7171 - Thursday, May 3, 2007 - link
Yeah, it is a trophy card. They had to do it for good PR. Now if anyone actually buys it, I guess that's a bonus for Nvidia. There are a good number of people in the world with more money than they know what to do with. This is for those people who buy $300 shirts regularly or don't think twice before dropping $20,000 on a sofa.
bob4432 - Wednesday, May 2, 2007 - link
this kind of b.s. from a company will mean i will probably never buy another nvidia gpu, and after my next build another m/b based on their chipset. this is a complete joke and i can't wait for amd/ati to put out something soon. the x1950x is a good card but needs a big brother that is dx10. this has got to be the dumbest thing i have seen since the killernic... this move should even make the fanboys question their allegiance.
sxr7171 - Thursday, May 3, 2007 - link
You are the fanboy for getting so riled up over this and thinking that you shouldn't buy Nvidia's currently superior products because of it. If you have $300 to spend on a video card, there is nothing that beats an 8800GTS now. What does Nvidia releasing a BS $830 card have to do with the excellent price/performance you have been able to get from their other products since late last year? The dumbest thing I have seen is people who will wait to buy something only from a specific company to get poorer performance for their dollar. It is the same kind of sucker who would buy the 8800GTX Ultra.
ss284 - Wednesday, May 2, 2007 - link
This kind of BS happens because ATI can't come out with anything to beat the 8800gtx, even 6+ months after it was released. Nvidia is price gouging because they really have no competition. The R600 is a complete joke; hopefully the coming reviews will shed some light on why.
defter - Wednesday, May 2, 2007 - link
Can you tell me what is the difference between $999 Q6700 and $530 Q6600? The price difference is huge, $470...
coldpower27 - Wednesday, May 2, 2007 - link
Unlocked multiplier and 266MHz more.
mlambert890 - Friday, May 4, 2007 - link
Also, the 266MHz more and the "Extreme" branding mean that the silicon tested higher. People seem to not want to place value on that, but then they get pissed if they're the one that buys the cheap part that will NOT o'clock. If you opt for the budget version, just realize that it's a gamble. The "Extreme" parts are essentially geared towards o'clocking and should o'clock. For example, getting my QX6700 to 3.2 was effortless and 3.46 required only a minor voltage bump. On water and with a bit more voltage I can do 3.7, but it gets hotter than I would like so I keep it at 3.46. Many people do better than I with the QX6700. This is ALL just multiplier also... NO FSB o'clock, so NO need for better RAM and a mobo that isn't picky with FSB o'clocking.
Personally, I think there is a LOT of value in all of that. People who don't can buy the cheaper part and feel the Extreme is a "ripoff".
The QX6700 vs. Q6600 is NOT analogous to this situation with the 8800GTX Ultra. NVidia is being ridiculous.
Staples - Wednesday, May 2, 2007 - link
This is just like CPUs the past few years: for a part that is 10% faster, the price is usually 50% or more higher. This is getting crazy without ATI in the market.