Intel Atom D510: Pine Trail Boosts Performance, Cuts Power
by Anand Lal Shimpi on December 21, 2009 12:01 AM EST - Posted in CPUs
3dsmax 9 - SPECapc 3dsmax CPU Rendering Test
Today's desktop processors are more than fast enough for professional-level 3D rendering at home. To look at 3dsmax performance, we ran the SPECapc 3dsmax 8 benchmark (CPU rendering tests only) under 3dsmax 9 SP1. The results reported are the rendering composite scores:
You shouldn't even think about doing serious 3dsmax work on an Atom, but for what it's worth, the D510 is about 6% faster here than the older Atom 330.
Cinebench R10
Created by the makers of Cinema 4D, Cinebench is a popular 3D rendering benchmark that gives us both single-threaded and multi-threaded rendering results.
POV-Ray 3.73 beta 23 Ray Tracing Performance
POV-Ray is a popular, open-source ray-tracing application that also doubles as a great tool for measuring CPU floating-point performance.
I ran the SMP benchmark in beta 23 of POV-Ray 3.73. The number reported is the final score, in pixels per second.
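If you want to reproduce a rough pixels-per-second figure on your own machine, a small timing harness like the sketch below will do it. This is a minimal example rather than our actual test setup: it assumes a povray binary on the PATH and uses a placeholder scene file name, along with POV-Ray's standard +I/+W/+H/+A switches plus +WT, the 3.7-era option for setting the SMP worker thread count.

    import subprocess
    import time

    WIDTH, HEIGHT = 512, 512
    THREADS = 4  # the D510 exposes 4 hardware threads (2 cores + Hyper-Threading)

    start = time.time()
    subprocess.run(
        ["povray",
         "+Iscene.pov",                # input scene (placeholder file name)
         f"+W{WIDTH}", f"+H{HEIGHT}",  # output resolution
         "+A",                         # enable antialiasing
         f"+WT{THREADS}",              # SMP worker thread count
         "-D"],                        # don't open a display preview
        check=True,
    )
    elapsed = time.time() - start

    print(f"{WIDTH * HEIGHT / elapsed:.1f} pixels per second")

Note that a harness this simple folds parse time into the total, so treat the output as a ballpark number rather than a score directly comparable to the official benchmark.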
Blender 2.48a
Blender is an open-source 3D modeling application. Our benchmark here simply times how long it takes to render a character model that ships with the application.
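Blender renders happily from the command line, so a timed render like this is easy to script yourself. A minimal sketch, with the .blend file name as a placeholder (the bundled character scene may be named differently): -b runs Blender headless, -o sets the output path, and -f renders a single frame.

    import subprocess
    import time

    start = time.time()
    subprocess.run(
        ["blender",
         "-b", "character.blend",  # -b: run in background (no GUI); placeholder file name
         "-o", "//render_",        # output path, relative to the .blend file
         "-f", "1"],               # render frame 1 and exit
        check=True,
    )

    print(f"Render time: {time.time() - start:.1f} seconds")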
41 Comments
Shadowmaster625 - Monday, December 21, 2009 - link
Why doesn't AMD just take one of their upcoming mobile 880 series northbridges and add a memory controller and a single Athlon core? It would be faster than Atom, more efficient than Ion, and could be binned for low power. Instead they just stand there with their thumbs up their butts while Intel shovels this garbage onto millions of unsuspecting consumers at even higher profit margins.

JarredWalton - Monday, December 21, 2009 - link
The problem is that even single-core Athlons are not particularly power friendly. I'm sure they could get 5-6 hours of battery life if they tried hard, but Atom can get twice that.

Hector1 - Monday, December 21, 2009 - link
Do you want some whine with that? Where were you when chipsets were created by taking a bunch of smaller ICs on the motherboard and putting them all together into one IC? PCs became cheaper and faster. We thought it was great. Do you know anything about L2 cache? It used to be separate on the motherboard as well, until it was integrated into the CPU. PCs became cheaper and faster and we thought it was great. Remember when CPUs were solo? They became Duo & Quad, making PCs faster and dropping price/performance. AMD & Intel integrated the memory controller and, whoa, guess what? Faster & lower price/performance and, yes, we thought it was great. It's called Moore's Law, and it's all been part of the semiconductor revolution that's been going on since the '60s. GPUs are no different. They're still logic gates made out of transistors, and with new 32nm technology, then 22nm and 16nm, the graphics logic will be integrated as well. Seriously, what did you think would happen?

TETRONG - Monday, December 21, 2009 - link
Do you understand that Moore's Law is not a force of nature? Intel has artificially handicapped the low-voltage sector in order to force consumers to purchase Pentiums. Right where they wanted you all along.
Since when is it OK for Intel to dictate what type of systems are created with their processors?
First it was the 1GB RAM limitation; now you can't have a dual core. When does it end?
"We have a mediocre CPU combined with a below-average GPU. According to our amortization schedule you could very well have it in the year 2013 (after the holidays, of course), by which time we should have our paws all over the video encoding and browsing standards, which we'll be sure to make as taxing as possible. Official release of USB 3.0 will be right up in a jiff!"
Voldenuit - Monday, December 21, 2009 - link
The historical examples you cite are not analogous, because Intel bundling their anemic GPUs onto the package makes performance *worse*, and bundling the two dies onto a single package (they're not on the same chip, either, so there is no hard physical limitation) makes competing IGPs more expensive, since you now have to pay for a useless Intel IGP as well as a third-party one if you were going to buy an IGP system. And just because a past course of action was embraced by the market does not mean it was not anti-competitive.
bnolsen - Saturday, December 26, 2009 - link
Performance is worse? As far as I can see the bridge requires no heat sink and the CPU can be cooled passively. Power use went way down. For this platform, that is improved performance. My current Atom netbooks do fine playing Flash on their screens and just fine playing 720p H.264 MKV files.
If you want a bunch of power use and heat, just skip the Ion platform and go with a Core 2-based system.
Hector1 - Monday, December 21, 2009 - link
You need to re-read the tech articles. Pineview does integrate both the graphics and memory controller into the CPU. It's the ICH that remains separate. Even if it didn't, what do you think will happen when this goes to 32nm, 22nm and 16nm? As for performance, Anand says in the title "Pine Trail Boosts Performance, Cuts Power," so that's good enough for me. Intel obviously created the Atom for a low-cost, low-power platform and they're delivering. It'll continue to be fine-tuned with more integration to lower costs. The market obviously wants it. SoC (System on a Chip) designs are coming too, for even lower costs. Not the place for high-performance graphics, I think.
This is really about Moore's Law marching on. It's driven down prices, increased performance and lowered power more than anything else on the planet. Without it, we'd still be paying the $5000 I paid for my first PC in 1980 -- an Apple II Plus. What you're saying, whether you know it or not, is that we should stop advancing process technology and stop Moore's Law. Personally, I'd like to see us not stop at 45nm but keep going.
kaoken - Monday, December 21, 2009 - link
I agree that progress should be made, but bundling an Intel CPU and IGP into a chip is anti-competitive. I wouldn't mind, though, if there were an Intel CPU and an ATI/NVIDIA GPU on a chip.

JonnyDough - Tuesday, December 22, 2009 - link
Hector is right in one respect, and that is that if Intel is going to be dumb, we don't have to purchase their products. I especially like the sarcastic cynicism in the article when mentioning all the things that Intel's chip CAN'T do. They just don't know how to make a GPU without patent infringement. If they can't compete, they'll try using their big market share to hurt competition. Classic Intel move. They never did care about innovation, only about market share and money. But I guess that's what happens when you're a mega corp with lots of stockholder expectations and pressure. I'll give my three cheers to the underdogs!

overvolting - Monday, December 21, 2009 - link
Hear, hear!