AMD's Quad FX: Technically Quad Core
by Anand Lal Shimpi on November 30, 2006 1:16 PM EST - Posted in CPUs
Imagine for a moment that you're at the decision-making table at AMD: you are at least a year away from introducing an updated micro-architecture to refresh your now-aging K8 design, and your chief competitor has just introduced CPUs faster and cooler than anything in your lineup. To make matters worse, this very same competitor enjoys a manufacturing advantage and has announced that it will begin the transition to quad-core even earlier than originally expected, starting at the end of 2006. The earliest you can even hope to release a quad-core CPU is the middle of 2007. What do you do?
AMD's first move made sense, and that was to dramatically reduce the pricing of its entire lineup to remain competitive. Most computer components are not things you buy and sell on emotion alone, and thus something that performs worse must cost less. Through the price drops AMD actually ended up with a fairly attractive dual-core lineup, although similarly aggressive pricing from Intel meant that the most attractive AMD CPUs were the cheapest ones.
But what was AMD to do about the quad-core race? Even though Intel would release its first quad-core CPUs this year, less than 1% of all its shipments would feature four cores, and it won't be until the end of 2007 that more than 5% of Intel's shipments are quad-core CPUs. But would the loss in mindshare be too great if Intel jumped ahead in the race to more cores?
Manufacturing a quad-core Athlon 64 or Opteron on AMD's current 90nm process simply isn't feasible; AMD would end up with a chip that is too big and too hot to sell, not to mention that it would put an even greater strain on AMD's manufacturing which is already running at capacity.
With the 90nm solution not being a very good one, there's always the "wait until 2007" option, which honestly seemed like a very good one to us. We just mentioned that Intel wasn't going to be shipping many of these quad-core CPUs, and the majority of users, even enthusiasts who are traditionally early adopters, will stay away from quad-core until 2007 at the earliest anyway.
Then there's the third option, the one AMD ended up taking; instead of building quad-core on 90nm or waiting until next year, around April/May of 2006 AMD decided that it had a better solution. AMD would compete in the quad-core race by the end of 2006, but with two dual-core CPUs running in a desktop motherboard.
Of course dual-core, dual-socket is nothing new, as AMD has been offering that on Opteron platforms for quite a while now. But the difference is that this new platform would be designed for the enthusiast, meaning it would come equipped with a performance tuned (and tweakable) BIOS, tons of USB ports, support for SLI, etc... Most importantly, unlike an Opteron system, this dual socket desktop platform would run using regular unbuffered DDR2 memory.
Back then the platform was called 4x4, and honestly it was about as appealing as a pickup truck. The platform has since matured and thanks to a very impressive chipset from NVIDIA and aggressive pricing from AMD, what's now known as Quad FX may actually have some potential. Today we're here to find out if AMD's first four-core desktop platform is a viable competitor to Intel's Kentsfield, or simply an embarrassing knee-jerk reaction.
88 Comments
Nighteye2 - Thursday, November 30, 2006 - link
I'm interested in that as well. NUMA will be an important part of 4x4 performance - so why isn't NUMA used in the benchmark, or at least mentioned? NUMA is the advantage of having 2 sockets - having NUMA disabled in this benchmark by using an OS that does not support it unfairly cripples the 4x4 performance.
Viditor - Thursday, November 30, 2006 - link
Agreed...I think that one of the reasons that AMD delayed release of this so long is that they wanted to show it on Vista instead of WinXP. It seems to me that there would be a substantial difference between the 2...
Viditor - Thursday, November 30, 2006 - link
As a follow-up on just how important NUMA is for 4x4, check out this review (http://babelfish.altavista.com/babelfish/trurl_pag...), which actually compares the 2... There is a DRASTIC difference between performance on XP and Vista!
Accord99 - Friday, December 1, 2006 - link
Most of the difference is running in 64-bit mode. The extra bandwidth didn't help the FX-74 in the megatasking bench. They didn't do any game benchmarks, but based on past reviews of NUMA, the FX-74 will probably keep on losing to the FX-62 in games.
Viditor - Friday, December 1, 2006 - link
I'm not sure I agree...there's a 22.5% increase in performance there, and I haven't seen anything like that on the 64 bit version of 3DS Max before...
Not to mention that Vista isn't known as a real speed demon (quite the opposite) for these apps...
What the 64bit version does is allow for larger scene use and stability, not so much faster rendering.
photoguy99 - Friday, December 1, 2006 - link
Sorry totally wrong -
64-bit can make a big difference in performance depending on the app. Remember you can process 64 bits of data in a typical instruction instead of 32, so theoretically twice as much pixel data at a time for rendering.
Some apps may not show the full benefit it depends on how they are coded and compiled, but it's definitely a real potential for speedup.
Bottom line is 64-bit could easily account for a bigger performance increase than NUMA.
Kiijibari - Friday, December 1, 2006 - link
You can see that he already refers to 3DS Max... I have not investigated this, but if he refers to it, then I trust him on that one...
Furthermore, I miss synthetic Sandra memory bandwidth benchmarks... these should easily show what is going on there...
Anyways, a 4x4 review without mentioning the XP NUMA problem is just not worth reading... Sorry Anand...
cheers
Kiijibari
Anand Lal Shimpi - Friday, December 1, 2006 - link
The performance deficit seen when running latency-sensitive single- and dual-threaded applications exists even in a NUMA-aware OS (I've confirmed this under Vista). I'm still running tests under Vista, but as far as I can see, running in a NUMA-aware OS doesn't seem to change the performance picture at all.
Take care,
Anand
Kiijibari - Saturday, December 2, 2006 - link
Hi Anand, first of all, thanks for your reply.
Then, if there is really no performance difference, I would double-check the BIOS to make sure you have really disabled node interleave.
Furthermore, there seems to be a BIOS bug with the SRAT ACPI tables, which are necessary for NUMA. It would be nice if you could dig up some more information on that topic.
Clearly, that would not be your fault, but AMD's.
cheers
Kiijibari
Anand Lal Shimpi - Saturday, December 2, 2006 - link
From what I can tell, the Node Interleave option in the BIOS is doing something. Disabling it (enabling NUMA) results in lower latencies than leaving it enabled, but still not as low as running with a single socket. CPU-Z reports the following latencies for the three configurations:
2S, NUMA On: 168 cycles
2S, NUMA Off: 205 cycles
1S: 131 cycles
From my discussions with AMD last week, this behavior is expected. I will do some more digging to see if there's anything else I'm missing, though.
Take care,
Anand