AMD's 65nm Preview Part 2 - The Plot Thickens (Updated with Information from AMD)
by Anand Lal Shimpi on December 21, 2006 12:12 AM EST- Posted in
- CPUs
Gaming Performance & Power Usage
Quake 4 was the first application that really showed the performance penalty you incur when moving to Brisbane; here the older core is about 4% faster. Keep in mind that we're looking at performance at 1600 x 1200 with a GeForce 8800 GTX: in more GPU-limited situations you're unlikely to notice the difference, but in more CPU-limited situations the delta could grow beyond 4%.
Because the Core 2 processors push much more data to the GPU than their competitors, their average power consumption is generally much higher - the cost of greater performance in this case. The performance-per-watt charts take both factors into account and give you a better breakdown of efficiency. Despite the decrease in performance, the reduction in power consumption gives the new Brisbane cores an efficiency advantage over most of their predecessors.
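The performance-per-watt metric described above is simply throughput divided by average power draw. A minimal sketch of the calculation, using made-up illustrative numbers rather than measured values from the review, shows how a slightly slower chip can still come out ahead on efficiency if its power draw drops by a larger proportion:

```python
def perf_per_watt(fps: float, avg_power_w: float) -> float:
    """Return frames per second per watt of average system power."""
    return fps / avg_power_w

# Illustrative (hypothetical) numbers: the newer core is ~4% slower
# but draws proportionally less power, so its efficiency is higher.
old_core = perf_per_watt(fps=100.0, avg_power_w=180.0)  # older 90nm part
new_core = perf_per_watt(fps=96.0, avg_power_w=160.0)   # 65nm Brisbane

print(f"old: {old_core:.3f} fps/W, new: {new_core:.3f} fps/W")
```

A 4% performance drop paired with an 11% power reduction yields a net efficiency gain, which is the pattern the charts in this section show.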
Oblivion didn't show a real performance impact from the slower Brisbane cores, but it clearly favors Intel's Core 2 architecture over AMD's.
52 Comments
View All Comments
OcHungry - Wednesday, January 3, 2007 - link
I was wondering if there was a way to PM or email you. I would like to bring to your attention a few concerns regarding the forum and the moderators that you need to be aware of. Is there an email, or can I PM you in any way?
I would appreciate your response.
You should know this.
Thank you.
SilverMirage - Saturday, December 23, 2006 - link
I understand that we are looking at limiting games by the CPU and not the graphics card, but let's be a little realistic... some of those performance-per-watt figures aren't what an end user is going to see when comparing his options.
Wwhat - Saturday, December 23, 2006 - link
Perhaps a weird thought, but AMD bought a license to that new Z-RAM process, which could theoretically put a huge cache in a small space, with the flaw of so-so latency; later they licensed version 2 of the same technology, but so far they haven't used it. Now, what if this is an experiment with that technology, or a preamble to it? At some point you'd expect them to start using the stuff they licensed. Granted, at the time of the Z-RAM announcement people were projecting use in the very distant future, which might not be what AMD has in mind, or the performance of current versions of Z-RAM might have changed.
What do you think - any link to it?
Tujan - Friday, December 22, 2006 - link
If you look, you'll see a story here on Anandtech.com covering the first AMD X2s, dated June 2005. The processor there was top notch - something unheard of in the processor industry - and it did all that with DDR400 memory. What is strange in these processor scenarios: there is Moore's law, and there is the "business quarter" law. The article I am commenting on is the first to detail the "new processors" - the 65nm processors. The AMD X2s were also "new" when the 2005 article was published. Today, as of reading this article, the "old" (and then top-notch) parts are no longer being carried forward - or at least no longer keep the same cache sizes. And correct me if I'm wrong, but the "old" processors are no longer going to be manufactured.
I have usually figured that in this cycle of "new and old" technology, once something is 18 months past, I can actually afford it. This changes a little with the new Intel setups, since the Intel lineups finally seem to break the cycle I just described.
For an entire year I saw article after article using AMD's top-notch part as the baseline for performance. No user could draw a meaningful connection between what performed well and what they could afford. AMD's "top notch" was an "industry standard" which of course nobody could afford. If somebody was passing the buck, it wasn't happening with me.
Anyway, I would just like to wish AMD good luck. Yet I'm not ashamed to say that I can now put together a system for less than $1000 with the same parts as that "top notch" industry standard seen for the better part of 2005.
Am I better off for thinking that way? 18 months have passed, and my dollars are spendable but the parts are no longer supported? I don't think anybody could consider an AMD setup a low-end setup if, for example, the 4800+ with 1MB L2 can be had for $230.
The inevitable consequence is the final exhaustion of the supply of the component. Yet I could say that I have a "top notch" 2005 version of AMD technology - with 2007 parts! Being on the exhausted end, I don't know who could feel better about this.
I wish AMD luck. Still, given their record, I should say I don't look forward to exhaustion arriving at the same time as extinguishment - as in having support for what one pays for take a break.
mino - Thursday, December 21, 2006 - link
Just my two cents into the fire:
X2 4200+EE & GF6150 board (MSI K9NGM2-FID)
$240 (170+75)
E6300 & G965 board (ASUS P5B-VM)
$285 (185+100)
Conclusion:
Anything cheaper is K8 vs. Netburst, so Intel is no contender.
Anything more expensive is K8 vs. C2D clock-for-clock, so AMD is no contender (the 65nm 4800+ is more expensive than the E6400 it matches in performance).
For decent IGP-free boards the difference is comparable.
So, going for
stock performance the choice is simple:
<$300 for CPU+MB combo go AMD X2
>$300 for CPU+MB combo go Intel C2D
for overclocking:
<$240 for CPU+MB combo go AMD X2
$240-$285 -> wait and then Intel C2D
>$285 for CPU+MB combo go Intel C2D
for power consumption (i.e. PC's for office use):
AMD X2 3800+EE to 5000+EE (anything above or below is a waste of money in this case)
for single core:
<$190 for CPU + MB combo -> go AMD A64
$190-$230 for CPU + MB combo -> go AMD X2 at $240
>$240 see dual-core recommendations
That IMO sums up the whole Desktop PC market as of now.
mino - Thursday, December 21, 2006 - link
$190-$230 for CPU + MB combo -> go AMD X2 at $240
should be:
$190-$230 for CPU + MB combo -> go AMD X2 3800+ at $200
mino - Thursday, December 21, 2006 - link
The prices are Newegg based.
Shintai - Thursday, December 21, 2006 - link
Anand, can you test the 65nm K8 at lower resolutions and with a broader selection of games so we can more truly see the difference?
http://www.firingsquad.com/hardware/amd_athlon_64_...
If these numbers hold water, then the 65nm K8 is a disaster in terms of gaming performance.
mongoosesRawesome - Thursday, December 21, 2006 - link
I'd be interested in seeing these same performance/watt graphs using the 965 chipset. The 680i is a power hog.
mino - Thursday, December 21, 2006 - link
The 590 SLI is the same kind of power hog for AMD. Actually, 965 vs. RD580 would hurt Intel even more... So, go figure.