Microsoft's Xbox 360, Sony's PS3 - A Hardware Discussion
by Anand Lal Shimpi & Derek Wilson on June 24, 2005 4:05 AM EST
Posted in: GPUs
PlayStation 3’s GPU: The NVIDIA RSX
We’ve mentioned countless times that the PlayStation 3 has the more PC-like GPU out of the two consoles we’re talking about here today, and after this week’s announcement, you now understand why.
The PlayStation 3’s RSX GPU shares the same “parent architecture” as the G70 (GeForce 7800 GTX), much in the same way that the GeForce 6600GT shares the same parent architecture as the GeForce 6800 Ultra. Sony isn’t ready to unveil exactly what is different between the RSX and the G70, but based on what’s been introduced already, as well as our conversations with NVIDIA, we can gather a few items.
Despite the fact that the RSX comes from the same lineage as the G70, there are a number of changes to the core. The biggest change is that RSX supports rendering to both local and system memory, similar to NVIDIA’s Turbo Cache enabled GPUs. Obviously rendering to/from local memory is going to be a lot lower latency than sending a request to the Cell’s memory controller, so much of the architecture of the GPU has to be changed in order to accommodate this higher latency access to memory. Buffers and caches have to be made larger to keep the rendering pipelines full despite the higher latency memory access. If the chip is properly designed to hide this latency, then there is generally no performance sacrifice, only an increase in chip size thanks to the use of larger buffers and caches.
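To put some rough numbers on why the buffers have to grow, the amount of data a pipeline must keep in flight scales with bandwidth multiplied by latency. The sketch below uses made-up latency figures purely for illustration - they are not RSX specifications.

```python
# Rough sketch of latency hiding: data that must be in flight to keep a
# memory link busy (Little's law). Latency values are assumptions, not specs.
def in_flight_bytes(bandwidth_gb_s, latency_ns):
    # GB/s multiplied by ns conveniently works out to bytes (1e9 * 1e-9 = 1)
    return bandwidth_gb_s * latency_ns

local_mem  = in_flight_bytes(22.4, 100)  # hypothetical ~100ns access to local GDDR3
system_mem = in_flight_bytes(15.0, 400)  # hypothetical ~400ns round trip through the Cell's memory controller

print(f"Local memory:  ~{local_mem:.0f} bytes in flight")   # ~2,240 bytes
print(f"System memory: ~{system_mem:.0f} bytes in flight")  # ~6,000 bytes
```

Even at lower bandwidth, the longer trip through the Cell demands several times more outstanding data, which is exactly what larger buffers and caches provide.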
The RSX has only about 60% of the local memory bandwidth of the G70, so in many cases it will most definitely have to share the Cell's memory bus in order to achieve its performance targets.
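For reference, that figure follows directly from the published memory specs: a 128-bit GDDR3 bus at 1.4GHz effective for the RSX versus a 256-bit bus at 1.2GHz effective for the 7800 GTX. A quick sketch of the math:

```python
# Local memory bandwidth from published specs (effective GDDR3 data rates).
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return (bus_width_bits / 8) * effective_clock_mhz / 1000  # bytes per transfer * GT/s

g70 = bandwidth_gb_s(256, 1200)  # GeForce 7800 GTX -> 38.4 GB/s
rsx = bandwidth_gb_s(128, 1400)  # RSX -> 22.4 GB/s
print(f"G70: {g70:.1f} GB/s, RSX: {rsx:.1f} GB/s, ratio: {rsx / g70:.0%}")  # ~58%
```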
One peculiarity hasn't been fully resolved: transistor counts. Both the G70 and the RSX share the same estimated transistor count of approximately 300.4 million. The RSX is built on a 90nm process, so in theory NVIDIA could pack more onto the die without increasing chip size at all - but if the transistor counts are identical, that points to more similarity between the two cores than NVIDIA has led us to believe. So is the RSX nothing more than the G70? It's highly unlikely that the GPUs are identical, especially considering that the addition of Turbo Cache alone would drive up the transistor count quite a bit. So how do we explain that the two GPUs are different and have the same transistor count, yet one is supposed to be more powerful than the other? There are a few possible explanations.
First and foremost, you have to keep in mind that these are not exact transistor counts - they are estimates. Transistor count is determined by looking at the number of gates in the design, and multiplying that number by the average number of transistors used per gate. So the final transistor count won’t be exact, but it will be close enough to reality. Remember that these chips are computer designed and produced, so it’s not like someone is counting each and every transistor by hand as they go into the chip.
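A trivial example of how such an estimate comes together - the gate count and transistors-per-gate figure below are hypothetical, just to show why the quoted number is an approximation rather than an exact tally:

```python
# Hypothetical illustration of a transistor-count estimate (not NVIDIA's real figures).
gate_count = 75_000_000           # assumed number of gates in the design
avg_transistors_per_gate = 4.0    # assumed average; varies with logic mix, SRAM, etc.
estimate = gate_count * avg_transistors_per_gate
print(f"Estimated transistor count: ~{estimate / 1e6:.1f}M")  # ~300.0M, in the ballpark of the quoted 300.4M
```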
So it is possible that NVIDIA's estimates are slightly off for the two GPUs, but at roughly 10 million transistors per pixel pipe, it doesn't seem very likely that the RSX features more than the 24 pixel rendering pipelines of the GeForce 7800 GTX. Yet NVIDIA claims it is more powerful than the GeForce 7800 GTX. How can that be? There are a couple of options:
The most likely explanation comes down to nothing more than clock speed. Remember that the RSX, being built on a 90nm process, is supposed to run at 550MHz - a 28% increase in core clock speed over the 430MHz, 110nm GeForce 7800 GTX. That clock speed increase alone would account for a good boost in GPU performance, which would make the RSX "more powerful" than the G70.
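The math behind that 28% figure is simple enough - it falls straight out of the two published core clocks:

```python
# Core clock comparison from announced specs.
g70_clock_mhz = 430  # GeForce 7800 GTX core clock
rsx_clock_mhz = 550  # RSX target core clock (90nm)
print(f"Clock speed advantage: {rsx_clock_mhz / g70_clock_mhz - 1:.0%}")  # ~28%
```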
There is one other possibility, one that is more far-fetched but worth discussing nonetheless: NVIDIA could offer a chip with the same transistor count as the desktop G70 but significantly more power if the RSX features no vertex shader pipes and instead uses that die space for additional pixel shading hardware.
Remember that the Cell host processor has an array of 7 SPEs that are very well suited for a number of non-branching tasks, including geometry processing. Also keep in mind that current games favor creating realism through more pixel operations rather than creating more geometry, so GPUs aren’t very vertex shader bound these days. Then, note that the RSX has a high bandwidth 35GB/s interface between the Cell processor and the GPU itself - definitely enough to place all vertex processing on the Cell processor itself, freeing up the RSX to exclusively handle pixel shader and ROP tasks. If this is indeed the case, then the RSX could very well have more than 24 pipelines and still have a similar transistor count to the G70, but if it isn’t, then it is highly unlikely that we’d see a GPU that looked much different than the G70.
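As a sanity check on whether the link could actually carry all of that geometry, here's a back-of-the-envelope estimate. The scene complexity and per-vertex size are assumptions we've picked for illustration, not anything Sony or NVIDIA has disclosed:

```python
# Back-of-the-envelope: post-transform vertex traffic over the Cell -> RSX link.
# Scene complexity and vertex size are assumptions for illustration only.
vertices_per_frame = 2_000_000   # hypothetical scene complexity
bytes_per_vertex   = 64          # hypothetical post-transform attributes (position, normal, UVs, color)
frames_per_second  = 60

vertex_gb_s = vertices_per_frame * bytes_per_vertex * frames_per_second / 1e9
print(f"Vertex traffic: ~{vertex_gb_s:.1f} GB/s of the 35 GB/s FlexIO interface")  # ~7.7 GB/s
```

Even a fairly heavy scene leaves headroom on the Cell-to-GPU link itself; the real cost shows up on the local memory bus, as discussed next.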
The downside to the RSX using the Cell for all vertex processing is pretty significant. Remember that the RSX only has 22.4GB/s of local memory bandwidth, which is less than 60% of the memory bandwidth of the GeForce 7800 GTX. In other words, it needs the additional memory bandwidth from the Cell's memory controller to be able to handle more texture-bound games. If a good portion of the 15GB/s downstream link from the Cell processor is used for traffic between the Cell's SPEs and the RSX, the GPU will be texture bandwidth limited in some situations, especially at resolutions as high as 1080p.
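To get a feel for how quickly 22.4GB/s disappears at 1080p, consider the framebuffer alone. The overdraw figure and per-pixel costs below are assumptions chosen for illustration, before a single texture fetch is counted:

```python
# Rough framebuffer traffic at 1080p (illustrative assumptions: 32-bit color +
# 32-bit Z, read-modify-write per pixel touched, 60fps, overdraw of 4, no AA).
width, height, fps = 1920, 1080, 60
bytes_per_pixel = (4 + 4) * 2   # color + depth, one read and one write each
overdraw = 4                    # hypothetical average depth complexity
fb_gb_s = width * height * bytes_per_pixel * overdraw * fps / 1e9
print(f"Framebuffer alone: ~{fb_gb_s:.1f} GB/s of the 22.4 GB/s local bus")  # ~8.0 GB/s
```

That leaves only about two-thirds of the local bus for texture fetches, which is why losing part of the 15GB/s link from the Cell to vertex traffic starts to hurt.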
This explanation is by far the more far-fetched of the two, but it is possible; only time will tell what the shipping configuration of the RSX will be.
93 Comments
MDme - Friday, June 24, 2005 - link
now i know what to buy :)
SuperStrokey - Friday, June 24, 2005 - link
lol, thats funny
bldckstark - Friday, June 24, 2005 - link
Having a PS2 and an XBOX I was not even thinking about buying a PS3 since the XBOX kicks the PS2's ace. (IMHO). After reading this article I have much more respect for the PS3 and now I don't have any idea which one I will buy. My wife may force me to buy the PS3 if the 360 isn't as backward compatible as most want it to be. Maybe I will just use my unusually large brain to create a PS360 that will play everything. Oooh, wait, I gotta get a big brain first. Then a big p3nis. Or maybe just a normal one.
Furen - Friday, June 24, 2005 - link
#37: supposedly yes. Since it will have to be through hardcore emulation there will be issues (but of course). It won't be fully transparent like the PS2, but rather you'll have profiles saved on your hard drive which will tell the system how to run the games.
SuperStrokey - Friday, June 24, 2005 - link
I haven't been following the 360 too much (I'm a self-admitted Nintendo fanboy), but will it be backward compatible too? I heard it was still up in the air, but as the PS3 is going to be and the Revolution is going to be (bigtime), I would assume that the 360 will be too, right?
ZobarStyl - Friday, June 24, 2005 - link
#32 is right: how many games get released for all 3 consoles with only minor, subtle differences between them? Most of the time, first-party stuff is the only major difference between consoles. Very few 3rd party games are held back from the 'slower' consoles; most are just licensing deals (GTA:SA on PS2, for example). And if you look back at the first-party game lineups, the XBox didn't have the most compelling library, in my opinion.
yacoub - Friday, June 24, 2005 - link
imo, the revolution will be a loser in more than just hardware. i can't remember the last time i actually wanted to play any of the exclusive nintendo games. actually, i think for about one day i considered a gamecube for metroid but then i saw it in action at a friend's place and was underwhelmed by the gameplay. forget mario and link, give me splinter cell or gran turismo or forza or... yeah you get the idea.
nserra - Friday, June 24, 2005 - link
#27: If you read the article carefully, you will see that since they are "weaker" pipelines, the 48 will perform like 24 "complete" ones.
I think with this new ATI design, there will be games where the performance will be much better, equal, or worse.
But that's the price to pay for completely new designs.
On paper, ATI's design is much more advanced; in fact, it reminds me of the Voodoo2 design, where more than one chip shares the work. I think I prefer a very fancy graphics design over an easy "double everything" solution.
Taracta - Friday, June 24, 2005 - link
With 25.5 GB/s of bandwidth to memory, is OoO (out-of-order processing) necessary? Aren't OoO and its ilk bandwidth-hiding solutions? I have an issue with Anandtech's outlook on the SPEs of the CELL processor (I could be wrong). I consider the SPEs to be full-fledged vector processors and not just a fancy implementation of MMX, SSE, Altivec, etc., which seems to be Anandtech's outlook. As full-fledged vector processors they are orders of magnitude more flexible than that, and as vector processors, comparing them to scalar processors is erroneous.
Another thing: RISC won the war! Don't believe it? What do you call a processor with a RISC core and a CISC hardware translator around it? CISC? I think not, it's a RISC processor. x86 did win the processor war, but not by beating them - by joining them, and by extension CISC lost. Just needed to clear that up. The x86 instruction set won, but the old x86 CISC architecture lost. The x86 instruction set will always win, fortunately for AMD, because the Itanium was to have been their death. No way could they have copied the Itanium in this day and age, which, come to think of it, is very unfortunate.
As long as you have the processor that runs x86 the best, you will always win. Unless you can get a toehold in the market with something else, such as LINUX and CELL!
CuriousMike - Friday, June 24, 2005 - link
If it's a 3rd party game, it won't matter (greatly) which platform you pick, because developers will develop to the least common denominator. In the current generation, about the best one could hope for is slightly higher-res textures and a better framerate on the XBOX over the PS2/GC.
IMO, pick your platform based on first-party games/series you're looking forward to. Simple as that.