NVIDIA's GeForce 7800 GTX Hits The Ground Running
by Derek Wilson on June 22, 2005 9:00 AM EST - Posted in GPUs
No More Shader Replacement
The secret is all in compilation and scheduling. Now that NVIDIA has had more time to work with scheduling and profiling code on an already efficient and powerful architecture, they have an opportunity. This generation, rather than building a compiler to fit the hardware, they were able to take what they've learned and build their hardware to better fit a mature compiler already targeted at the architecture. All this leads up to the fact that the 7800 GTX, with current drivers, does absolutely no shader replacement. This is quite a big deal in light of the fact that, just over a year ago, thousands of shaders were stored in the driver, ready for replacement on demand, on NV3x and even NV4x. It's quite an asset to have come this far with hardware and software in the relatively short amount of time NVIDIA has spent working with real-time compilation of shader programs.

All these factors come together to mean that the hardware is busy more of the time. And getting more things done faster is what it's all about.
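To make the distinction concrete, here is a minimal hypothetical sketch of what driver-side shader replacement amounts to: fingerprint each shader an application submits and swap in a hand-tuned version if one shipped with the driver. The names, table, and hash choice are our own illustration, not NVIDIA's driver code; the 7800 GTX approach corresponds to always taking the compile-it-directly path.

```cpp
#include <cstdint>
#include <string>
#include <unordered_map>

// Hypothetical sketch of driver-side shader replacement: hash each shader an
// application submits and substitute a hand-tuned version if one shipped in
// the driver. The names and hash choice are ours, not NVIDIA's.
static const std::unordered_map<uint64_t, std::string> kHandTunedShaders = {
    // { hashOfOriginalShader, handOptimizedReplacement } pairs, baked into
    // the driver at build time -- thousands of them in the NV3x era.
};

// FNV-1a, a simple 64-bit hash, stands in for whatever fingerprint a real
// driver might use to recognize a known shader.
uint64_t HashShader(const std::string& source) {
    uint64_t h = 1469598103934665603ULL;
    for (unsigned char c : source) {
        h ^= c;
        h *= 1099511628211ULL;
    }
    return h;
}

const std::string& SelectShader(const std::string& appShader) {
    auto it = kHandTunedShaders.find(HashShader(appShader));
    // Replacement path: return the canned substitute when one exists.
    // The 7800 GTX approach is equivalent to always taking the else branch
    // and trusting the compiler with the application's own shader.
    return (it != kHandTunedShaders.end()) ? it->second : appShader;
}
```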
So, NVIDIA is offering a nominal increase in clock speed to 430MHz, a little more memory bandwidth (a 256-bit memory bus running at a 1.2GHz data rate), 1.33x the vertex pipelines (8 versus 6), 1.5x the pixel pipelines (24 versus 16), and various increases in efficiency. These all work together to give us as much as double the performance in extreme cases. If that performance increase can actually be realized, we are looking at a pretty decent speed increase over the 6800 Ultra. Obviously, in the real world, we won't be seeing a threefold performance increase in anything but a bad benchmark. In cases where games are CPU limited, we will likely see a much smaller gain, but performance double that of the 6800 Ultra is entirely possible in very shader-limited games.
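A quick back-of-the-envelope calculation shows where these peak numbers come from. The sketch below assumes only the raw pipeline counts and core clocks mentioned above and deliberately ignores the efficiency improvements, which is what lets shader-limited cases stretch beyond the ~1.6x raw pixel throughput gain toward 2x:

```cpp
#include <cstdio>

// Back-of-the-envelope scaling for the 7800 GTX vs. the 6800 Ultra, using
// only the pipeline counts and core clocks discussed above. The scheduling
// and efficiency improvements are deliberately not modeled here.
int main() {
    const double gtxClock = 430.0, ultraClock = 400.0;          // MHz
    const double gtxPixelPipes = 24.0, ultraPixelPipes = 16.0;  // 1.5x
    const double gtxVertexUnits = 8.0, ultraVertexUnits = 6.0;  // 1.33x

    const double pixelGain  = (gtxPixelPipes  * gtxClock) / (ultraPixelPipes  * ultraClock);
    const double vertexGain = (gtxVertexUnits * gtxClock) / (ultraVertexUnits * ultraClock);

    printf("Peak pixel shading gain: %.2fx\n", pixelGain);  // ~1.61x
    printf("Peak vertex gain:        %.2fx\n", vertexGain); // ~1.43x
    return 0;
}
```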
In fact, Epic reports that in certain Unreal Engine 3 tests, they currently see 2x to 2.4x improvements in framerate over the 6800 Ultra. Of course, UE3 is not finished yet, and there won't be games based on the engine for a while. We don't usually like reporting performance numbers from software that hasn't been released, but even if these numbers are higher than what we will see in a shipping product, it seems that NVIDIA has at least gotten it right for one developer's technology. We are very interested in seeing how next-generation games will perform on this hardware. If we can trust these numbers at all, it looks like the performance advantage will only get better for the GeForce 7800 GTX until Windows Graphics Foundation 2.0 comes along and inspires new techniques beyond SM3.0 capabilities.
Right now, for every triangle fed through the vertex pipeline, there are many pixels inside that triangle that need the attention of the pixel pipelines.
Bringing It All Together
Why didn't NVIDIA build a part with unified shaders?
Every generation, NVIDIA evaluates alternative architectures, but at this time, they don't feel that a unified architecture is a good match for the current PC landscape. We will eventually see a unified shader architecture from NVIDIA, but likely not until DirectX itself is designed around one. At this point, vertex hardware doesn't need to be as complex or intricate as the pixel pipeline. As APIs develop more and more complex functionality, it will become advantageous for hardware developers to move toward a more generic, programmable shader unit that can easily adapt to any floating point processing need.
Since pixel processing is currently more important than vertex processing, NVIDIA is separating the two in order to focus attention where it is due. Making hardware more generic usually comes at a cost in speed, while explicitly targeting a specific task can often improve performance a great deal.
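A rough worked example makes the imbalance clear. The scene figures below are assumptions for illustration only, not measurements from any game:

```cpp
#include <cstdio>

// Rough illustration of the pixel-to-vertex workload ratio. The scene
// figures are assumptions for illustration, not measurements from any game.
int main() {
    const double pixelsPerFrame    = 1600.0 * 1200.0; // one full-screen pass, no overdraw
    const double trianglesPerFrame = 250000.0;        // assumed scene complexity
    const double verticesPerFrame  = trianglesPerFrame * 3.0; // assumes no vertex reuse

    printf("Pixels shaded per vertex processed: ~%.1f\n",
           pixelsPerFrame / verticesPerFrame); // ~2.6 with these numbers
    return 0;
}
```

With overdraw, multiple render passes, and ever longer pixel shaders, the real ratio in shipping games skews even further toward pixel work, which is why a 24-to-8 split between pixel and vertex hardware is a sensible allocation today.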
When WGF 2.0 comes along and geometry shaders are able to generate vertex data dynamically inside the GPU, we will likely see an increased burden on vertex processing as well. Being able to generate vertex data programmatically will help remove the burden on the rest of the system of supplying all the model data to the GPU.
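As a sketch of the burden being described: today, the CPU has to expand coarse geometry into full vertex buffers and push all of it across the bus every frame, while a WGF 2.0 geometry shader could amplify a small set of primitives on the GPU instead. Everything below, including the placeholder subdivision, is invented for illustration:

```cpp
#include <vector>

struct Vertex { float x, y, z; };

// Pre-WGF 2.0: the CPU expands coarse geometry into a full vertex buffer and
// pushes every element across the bus each frame. The subdivision here is a
// placeholder (it just duplicates vertices); real code would interpolate.
std::vector<Vertex> TessellateOnCpu(const std::vector<Vertex>& coarse, int level) {
    std::vector<Vertex> fine;
    fine.reserve(coarse.size() * level);
    for (const Vertex& v : coarse) {
        for (int i = 0; i < level; ++i) {
            fine.push_back(v); // placeholder for a real interpolation step
        }
    }
    return fine; // all coarse.size() * level vertices must be uploaded
}

// With WGF 2.0 geometry shaders, the application would upload only the
// coarse mesh and let the GPU amplify each primitive, so the bus carries
// coarse.size() vertices instead of coarse.size() * level.
```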
127 Comments
Johnmcl7 - Wednesday, June 22, 2005 - link
If they're too busy for the article, that's fair enough; the point is that they should put it up when they've had time to check it over, rather than rush out an article that isn't ready to be published.

John
IronChefMoto - Wednesday, June 22, 2005 - link
Regarding the "shame on Anandtech" comments -- y'all ever think they were too busy sh*tting themselves at the performance of this card to really pay that much attention to the article? ;-)

IronChefMorimoto
Johnmcl7 - Wednesday, June 22, 2005 - link
The prices I've seen here in the UK for the 7800s are around 400 pounds, while the 6800 Ultras are currently around 300 pounds. So it's quite an increase over the NV40s, but not unacceptable given the performance. I'm sure they'll come down in price once the early adopters have had their fill.

John
yacoub - Wednesday, June 22, 2005 - link
#26 - You must be new to the market, relatively speaking. I remember quite well the days when high-end new videocards were at MOST $400, usually $350 or less when they debuted. It was more than a year or two ago though, so it might have been before your time as a PC gamer.

rimshot - Wednesday, June 22, 2005 - link

Not sure why the price is so high in North America; here in Aus you can get a 7800GTX for the same price as a 6800GT ($850AU).

nitromullet - Wednesday, June 22, 2005 - link
"What no Crossfire benchies? I guess they didn't wany Nvidia to loose on their big launch day."Ummm... maybe because CrossFire was paper launched at Computex, and no one (not even AT) has a CrossFire rig to benchmark? nVidia is putting ATI to shame with this launch and the availability of the cards. Don't you think if ATI had anything worth a damn to put out there they would?
All that aside... I was as freaked out as the rest of you by these benchmarks at first (well, more so than some, actually, because I just pulled the $600 trigger last night on an eVGA 7800GTX from the egg). However, these graphs are clearly messed up, and some appear to have already been fixed. I guess someone should have cut Derek off at the launch party yesterday.
blckgrffn - Wednesday, June 22, 2005 - link
Very disappointed in the fit and finish of this article. Anandtech is supposed to have the best one, not a half-baked one :( I even liked HardOCP's better, even with their weird change-the-levels-of-everything approach - at least it has a very good discussion of the differences between MS and SS AA and shows some meaningful results at high res as well.

Shame on Anandtech :(
fishbits - Wednesday, June 22, 2005 - link
Good release. Can we get a couple of screenshots with the transparency AA?
"Maybe this one of the factors that will lead to the Xbox360/PS3 becoming the new gaming standard as opposed to the Video Card market pushing the envelope."
Yeah, because the graphics components in consoles don't require anything but three soybeans and a snippet of twine to make. They're ub3r and free! Wait, no, you pay for them too eventually even if not in the initial console purchase price. Actually I think the high initial price of next gen graphics cards is a sign of health for PC gaming. There are some folks not only willing to pay high dollars for bleeding edge performance, they're willing to pay even higher dollars than they were in the past for the top performers. Spurs ATI/Nvidia to keep the horsepower coming, which drives game devs to add better and better graphics, etc.
"They only reveresed a couple of labels here and there, chill out. It's still VERY OBVIOUS which card is which just by looking at the performance!"
Eh, I use benchmarks to learn more about a product than what my pre-conceived notions tell me it "ought" to be. I don't use my pre-conceived notions to accept and dismiss scientific benchmarks. If the benches are wrong, it is a big deal. Doesn't require ritual suicide, just fixing and maybe better quality control in the future.
Thresher - Wednesday, June 22, 2005 - link
2x 6800GT costs almost the same amount as this single card and gives up nothing in performance.

The price of this thing is ridiculous.
rubikcube - Wednesday, June 22, 2005 - link
Just wanted to say thanks for starting your benchmarks at 1600x1200. It really makes a difference in the usability of the benchmarks.