The Dark Knight: Intel's Core i7
by Anand Lal Shimpi & Gary Key on November 3, 2008 12:00 AM EST - Posted in CPUs
The Chips
With a new microarchitecture comes a new naming system, and while it makes sense for Intel to ditch the Duo/Quad suffixes, that's about the only sensible thing we get with Nehalem's marketing. The new name has already been announced: Nehalem is officially known as the Intel Core i7 processor. Model numbers are back, of course, and the three chips Intel is announcing today are the 965, 940 and 920. The specs break down like this:
| Processor | Clock Speed | QPI Speed (GT/sec) | L3 Cache | Memory Speed Support | TDP | Unlocked? | Price |
|---|---|---|---|---|---|---|---|
| Intel Core i7-965 Extreme Edition | 3.20GHz | 6.4 | 8MB | DDR3-1066 | 130W | Yes | $999 |
| Intel Core i7-940 | 2.93GHz | 4.8 | 8MB | DDR3-1066 | 130W | No | $562 |
| Intel Core i7-920 | 2.66GHz | 4.8 | 8MB | DDR3-1066 | 130W | No | $284 |
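As a quick sanity check on the table, here's a back-of-the-envelope sketch of the theoretical bandwidths these numbers imply. The assumptions (not stated in the article) are QPI's usual 16 data bits per direction per transfer and 8 bytes per transfer for each 64-bit DDR3 channel:

```python
# Rough theoretical bandwidth figures implied by the spec table.
# Assumptions: QPI moves 2 bytes per transfer per direction;
# each 64-bit DDR3 channel moves 8 bytes per transfer.

def qpi_bandwidth_gbps(gt_per_sec):
    """Unidirectional QPI bandwidth in GB/s (2 bytes/transfer)."""
    return gt_per_sec * 2

def ddr3_bandwidth_gbps(mt_per_sec, channels):
    """Peak memory bandwidth in GB/s (8 bytes/transfer per channel)."""
    return mt_per_sec * 8 * channels / 1000

print(qpi_bandwidth_gbps(6.4))       # i7-965's 6.4 GT/s link: 12.8 GB/s each way
print(ddr3_bandwidth_gbps(1066, 3))  # triple-channel DDR3-1066: ~25.6 GB/s peak
```

So even the 920's 4.8 GT/s QPI link (9.6 GB/s per direction) is comfortably faster than the front-side bus it replaces, and the triple-channel memory controller's peak is well beyond anything a Core 2 platform could feed.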
Obviously there's no changing Intel's naming system now, but I'd just like to voice my disapproval of it: it simply doesn't sound very good.
These chips aren't launching today; Intel is simply letting us talk about them. You can expect an official launch, with availability, by the end of the month.
The Socket
By moving the memory controller on-die, Intel dramatically increased the pin count of its processors. While AMD's Phenom featured a 940-pin package, Intel's previous Core 2 processors had only 775 contact pads on their underside. With three 64-bit DDR3 channels, however, Intel's Core i7 ballooned to 1366 pads, making both the chip and the socket physically larger:
The downside to integrating a memory controller is that any change in memory technology or in the number of memory channels requires a new socket. Sometime in 2009 Intel will introduce a cheaper Nehalem derivative with only a 2-channel memory controller, most likely to compete at the < $200 CPU price points. Those CPUs will use an LGA-1156 socket, but future 8-core versions of Nehalem will use LGA-1366, like the CPUs we're reviewing here today.
The larger socket also requires a bigger heatsink; here's a look at the new Intel reference cooler:
From left to right: 45nm Core 2 Duo cooler, 45nm Core 2 Quad cooler, 45nm Core i7 Cooler
73 Comments
Kaleid - Monday, November 3, 2008 - link
http://www.guru3d.com/news/intel-core-i7-multigpu-...
bill3 - Monday, November 3, 2008 - link
Umm, seems the guru3d gains are probably explained by them using a dual-core Core 2 Duo versus a quad-core i7... quad cores run multi-GPU quite a bit better, I believe.
tynopik - Monday, November 3, 2008 - link
What about those multi-threading tests you used to run, with 20 tabs open in Firefox while running an AV scan while compressing some files while converting something else, etc.? This might be more important for daily performance than the standard desktop benchmarks.
D3SI - Monday, November 3, 2008 - link
So the low-end i7s are OC'able?
What the hell is Tom's Hardware talking about, lol
conquerist - Monday, November 3, 2008 - link
Concerning x264, Nehalem-specific improvements are coming as soon as the developers are free from their NDA. See http://x264dev.multimedia.cx/?p=40.
Spectator - Monday, November 3, 2008 - link
Can they do some CUDA optimizations? I'm guessing that video hardware has more processors than a quad-core Intel :P If all this i7 is new news and does stuff xx faster with 4 cores, how does 100+ core video hardware compare?
Yes, I'm messing, but giant Intel wants $1k for the best i7 CPU, when the likes of NVIDIA make bigger transistor-count silicon on a lesser process, and others manufacture the rest of the video card, for $400-500?
Where is the value for money in that? Chuckle.
gramboh - Monday, November 3, 2008 - link
The x264 team has specifically said they will not be working on CUDA development, as it is too time-intensive to basically start over from scratch in a more complex development environment.
npp - Monday, November 3, 2008 - link
CUDA optimizations? I bet you don't completely understand what you're talking about. You can't just optimize a piece of software for CUDA; you MUST write it from scratch for CUDA. That's the reason you don't see too much software for nVidia GPUs, even though the CUDA concept was introduced at least two years ago. You have the BadaBOOM stuff, but it's far from mature, and the reason is that writing a sensible application for CUDA isn't exactly an easy task. Take your time to look at how it works and you'll understand why. You can't compare the 100+ cores of your typical GPU with a quad core directly; they are fundamentally different in nature, with your GPU "cores" being rather limited in functionality. GPGPU is nice hype, but you simply can't offload everything onto a GPU.
As a side note, top-notch hardware always carries a price premium, and Intel has had this tradition with high-end CPUs for quite a while now. There are plenty of people who need absolutely the fastest hardware around and won't hesitate to pay for it.
Spectator - Monday, November 3, 2008 - link
Some of us want more info.
A) How does the integrated thermal sensor work at -50°C and below?
B) Can you circumvent the 130W max load sensor?
C) What are all those connection points on the top of the processor for?
lol. Where do I put the 2B pencil to join that sht up, so I don't have to worry about multiplier settings or temp sensors or wattage sensors?
Hey, don't shoot the messenger, but those top-side chip contacts seem very curious and obviously must serve a purpose :P
Spectator - Monday, November 3, 2008 - link
Wait, NO. I have thought about it... The contacts on the top side could be for programming the chip's default settings.
You know it makes sense. Perhaps it's adjustable SRAM-style, rather than burning connections.
Yes, some technical peeps can look at that, but I still want the fame for suggesting it first. lmao.
Have fun. But it does seem logical to build in some scope for alteration; it's a lot easier to manufacture one solid item, then mod your stock to suit the market when you feel it's necessary.
Spectator.