Intel Core 2 Extreme X6800 Preview from Taiwan
by Anand Lal Shimpi & Gary Key on June 6, 2006 7:35 PM EST - Posted in
- CPUs
A few months have passed since our original foray into the world of Conroe, and official naming has been announced for the processor. What we've been calling Conroe is now known as Core 2 Duo, with the Extreme Edition being called Core 2 Extreme. Initial availability of the Core 2 Duo and Core 2 Extreme processors remains unchanged from Intel's original estimates of "early Q3".
At this year's Spring IDF Intel made the unusual move of allowing us and other press to spend some quality time benchmarking its upcoming Conroe processor. Unfortunately we were only allowed to benchmark those games and applications that Intel loaded on the system, and while we did our due diligence on the system configuration we still prefer to benchmark under our own terms.
We're happy to report that we gathered enough parts to build two systems while in Taiwan for Computex. We managed to acquire a Socket-AM2 motherboard equipped with an Athlon 64 FX-62 and a P965 motherboard equipped with a Core 2 Extreme X6800 2.93GHz at our hotel, along with two sets of 2x1GB of DDR2-800 (only 5-5-5-12 modules though), a pair of Hitachi 7K250 SATA hard drives, and two NVIDIA GeForce 7900 GTXes (one for each system) - it helps that all the major players have offices in Taiwan. Of course we happened to pack some power supplies, monitors, keyboards and mice in our carry-on luggage, as well as copies of Windows XP, Quake 4, F.E.A.R., Battlefield 2, SYSMark 2004 and Winstone 2004.
When faced with the choice of testing Conroe or sleeping, we stayed up benchmarking (we'll blame it on the jet lag later). The stage was set: Intel's Core 2 Extreme vs. AMD's recently announced FX-62, and while it's still too early to draw a final verdict we can at least shed more light on how the battle is progressing. Keep in mind that we had a very limited amount of time with the hardware, so as not to alert anyone that it was missing and being used for things it shouldn't be (not yet at least), so we weren't able to run our full suite of tests. We apologize in advance and promise we'll have more when Conroe launches, but for now enjoy.
The Test
In case we weren't clear: we acquired, built, installed and tested these two test systems entirely on our own and without the help of Intel.
Test configuration (AMD system / Intel system):
CPU: AMD Athlon 64 FX-62 (2.80GHz) / Intel Core 2 Extreme X6800 (2.93GHz)
Motherboard: NVIDIA nForce 590 SLI Socket-AM2 / Intel P965
Chipset: NVIDIA nForce 590 SLI / Intel P965
Chipset Drivers: nForce 9.34 Beta / Intel 7.3.3.1013
Hard Disk: Hitachi Deskstar T7K250
Memory: DDR2-800 5-5-5-12 (1GB x 2)
Video Card: NVIDIA GeForce 7900 GTX
Video Drivers: NVIDIA ForceWare 91.28 Beta
Desktop Resolution: 1280 x 1024 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
Comments
IntelUser2000 - Tuesday, June 6, 2006 - link
So... when the Pentium 4 came out with the Willamette core, it was better since it was new, right? I am sure that before the Conroe benchmarks, people would have thought a 20% performance gain from the CPU alone was crazy, like the person claiming it was from another planet.
bob661 - Tuesday, June 6, 2006 - link
Hmmm... no mention of better here. Maybe a wormhole sucked that word right off of my page and transported it to the year STFU.
Maybe 10 years ago, but nowadays everything but Celerons is fast. Where did I say better?
saratoga - Tuesday, June 6, 2006 - link
20% gain from the CPU is nothing. You get that every couple of months, usually. Anyway, you're missing the point. AM2 was not meant to improve performance, it was meant to cut costs. DDR1 has passed the inflection point relative to DDR2, and AMD needs to get off of it before it sinks. AM2 allows this to happen. Essentially, it maintains the status quo.
It'll be the K8L that saves AMD (well, assuming it ever comes out).
IntelUser2000 - Tuesday, June 6, 2006 - link
You're definitely not understanding me.
20% before = architecture + clock speed.
Core's 20-30% is architecture alone; clock speed will add on top of that. And that's over the FASTEST CPU out there now. Core is 50-70% faster than the higher-clocked Pentium D. Nothing has had that much of an improvement.
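To make the multiplication concrete, here's a back-of-the-envelope sketch in Python. The 25% per-clock figure is just an assumption drawn from the 20-30% range cited above, not a measured result:

```python
# Illustrative only: how a per-clock (architecture) gain and a clock-speed
# gain compound multiplicatively. The 1.25 IPC factor is an assumption taken
# from the 20-30% range discussed in this thread, not a benchmark result.
ipc_gain = 1.25            # assumed ~25% more performance per clock
clock_gain = 2.93 / 2.80   # X6800 (2.93GHz) vs. FX-62 (2.80GHz)

total = ipc_gain * clock_gain
print(f"Combined speedup: {total:.2f}x (~{(total - 1) * 100:.0f}%)")
```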
JarredWalton - Tuesday, June 6, 2006 - link
Realistically, no... FSB performance has some impact overall, but generally not more than 5-10%, especially once you get past a certain point. FSB-533 to FSB-800 showed reasonable increases in quite a few applications. 800 to 1066 didn't help all that much, and I would wager 1333 is not truly necessary. Of course, Intel needs the higher FSB speeds due to the CPU-to-NB-to-RAM pathway, whereas AMD connects to the RAM directly and has a separate connection to the NB. The only real question now is: when will K8L arrive, and how much will it help? I can't answer the latter, but the former looks to be late 2006/early 2007 AFAIK.
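The point about the FSB pathway is easy to check with peak-bandwidth math. A quick sketch (standard bus arithmetic, not benchmark data):

```python
# Peak bandwidth of Intel's quad-pumped, 64-bit front-side bus at various
# effective transfer rates, versus dual-channel DDR2-800 sitting behind the
# northbridge. Straight datasheet math, not measured throughput.
def peak_gbps(mt_per_s, width_bytes=8, channels=1):
    """MT/s x bus width (bytes) x channels, in GB/s."""
    return mt_per_s * width_bytes * channels / 1000

for fsb in (533, 800, 1066, 1333):
    print(f"FSB-{fsb}: {peak_gbps(fsb):.1f} GB/s")

print(f"Dual-channel DDR2-800: {peak_gbps(800, channels=2):.1f} GB/s")
```

Even FSB-1333 (10.7 GB/s) is narrower than dual-channel DDR2-800 (12.8 GB/s), which is why the CPU-to-NB-to-RAM hop matters for Intel and why AMD's on-die memory controller sidesteps the issue.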
classy - Tuesday, June 6, 2006 - link
The scores aren't quite telling the whole story. I bet with AA/AF the FX-62's bandwidth starts to flex its muscle some. The FX is clocked a little slower as well, not to mention this is the very best from Intel, while I'll be buying a cheaper one :). Truth is Core 2 is damn nice but far from distancing itself from AMD. It looks like with 65nm alone AMD may be able to challenge for the crown. Something many thought earlier wouldn't be possible.
Calin - Wednesday, June 7, 2006 - link
AA and AF are entirely and completely video card dependent. As you increase graphics quality, the processor will wait more and more for the video card. Benchmarking how fast a processor waits isn't so interesting.
Also, the FX-62 is the very best from AMD, and even at 65nm AMD would need to increase the clock speed by 25% to equal the top Conroe.
IntelUser2000 - Tuesday, June 6, 2006 - link
Same BS over and over and over and over again for the doubters/skeptics. You guys will never learn.
This is a CPU test. If you want to see graphics benchmarks, don't get the highest end CPU, get the Semprons and the Celeron D's with X1900XT Crossfire.
Games at lower resolutions are used exactly to show that CPU performance is CONSISTENT across a variety of applications.
Plus, the people who really care about gaming and play competitively (even somewhat) will see that the CPU matters a lot for performance, since they play at 1024x768 so they don't notice lag spikes. I have seen competitive gamers wonder why they have lag spikes, and they are pissed off about it when they are getting 80+ fps in benchmarks.
classy - Tuesday, June 6, 2006 - link
Hahahaha, I won't get into what my online name was when I was gaming; I've been in slight retirement. Let's just say I was one of the best damn railers on the net, and it is clear you have no f'in clue what you're talking about. Just so you know, most configs limit fps, and most mainstream CPUs can easily supply plenty of power, so your babble about CPUs is a joke. Lag is almost always related to RAM or the video card except at the highest resolutions. The point is that, more than likely, in real world use you probably won't see a difference at all. And if you find a gamer today more concerned about the CPU than his graphics, it is clear he should step away from the keyboard and mouse.
IntelUser2000 - Tuesday, June 6, 2006 - link
Yes, I do know what I am talking about. It's you who doesn't. None of the real-world people I have seen use low-latency memory; they all use generic Samsung memory. There is a pattern I've noticed: the ones who know about hardware aren't really good gamers, and the ones who are hardcore gamers, good enough to win prizes, don't know much about hardware. I guess they don't have time for both.
The CPU will matter for a competitive gamer simply because they will run at low resolutions (comparatively speaking; not 640x480 low) to avoid lag. Lag in competition = bad, so they do anything to avoid it. As I said, my friend has a Dell M170 laptop with a Pentium M 2.0GHz (533MHz FSB, 2MB L2), 1GB DDR2, a 5400RPM drive, and a GeForce Go 7800 GTX 256MB. He runs at resolutions so low that some newer games don't look better than older games. He doesn't turn on graphics effects like Bloom since it interferes with his play. And he does play GOOD, in every first-person shooter.
They think it's the graphics card that matters, but if they notice lag on a laptop that good, it won't get much better with an X1900 or whatever the top end is now, as those top-end GPUs aren't much faster at 1024x768 anyway; it becomes a CPU bottleneck.
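The resolution argument both sides are circling reduces to a simple model: the frame rate you see is roughly the lower of what the CPU and the GPU can each deliver. A minimal sketch with made-up numbers:

```python
# Toy model of the CPU-vs-GPU bottleneck: observed frame rate is roughly the
# slower of the two limits. All fps values below are hypothetical.
def observed_fps(cpu_limit, gpu_limit):
    return min(cpu_limit, gpu_limit)

cpu_limit = 120  # frames/sec the CPU can prepare, roughly resolution-independent
gpu_limits = {"1024x768": 250, "1600x1200 + 4xAA": 90}  # GPU limit falls with load

for res, gpu_limit in gpu_limits.items():
    bound = "CPU" if cpu_limit < gpu_limit else "GPU"
    print(f"{res}: {observed_fps(cpu_limit, gpu_limit)} fps ({bound}-bound)")
```

At low resolution the GPU has headroom and the CPU sets the frame rate; turn up the resolution and AA/AF and the GPU becomes the limit, which is what each side of this argument is describing.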