NVIDIA GeForce 8600: Full H.264 Decode Acceleration
by Anand Lal Shimpi on April 27, 2007 4:34 PM EST - Posted in GPUs
The Interpreter (H.264)
Our second H.264 test is The Interpreter, which we've used in the past. Although it's not nearly as stressful as Yozakura, it still eats up almost all of our Core 2 Duo CPU at peak.
The BSP engine of the 8600 proves its worth once more, as average CPU utilization again drops to around 20%.
Maximum CPU utilization is a bit higher but still less than 30%. In a reversal from Yozakura, note how the 8600 GTS now has slightly lower CPU utilization than the 8600 GT in PowerDVD.
WinDVD 8 tells a similar story: H.264 offload is absolutely necessary for good Blu-ray/HD-DVD playback.
64 Comments
JarredWalton - Saturday, April 28, 2007 - link
The peak numbers may not be truly meaningful other than indicating a potential for dropped frames. Average CPU utilization numbers are meaningful, however. Unlike SETI, there is a set amount of work that needs to be done in a specific amount of time in order to successfully decode a video. The video decoder can't just give up CPU time to lower CPU usage, because the content has to be handled or frames will be dropped. The testing also illustrates the problem with ATI's decode acceleration on their slower cards, though: the X1600 XT is only slightly faster than doing all the work on the CPU in some instances, and in the case of VC1 it may actually add more overhead than acceleration. Whether that's due to ATI's drivers/hardware or the software application isn't clear, however. Looking at the WinDVD vs. PowerDVD figures, the impact of the application used is obviously not negligible at this point.
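Jarred's point about decode being a fixed, real-time workload can be illustrated with a back-of-the-envelope frame-budget calculation. This is a minimal sketch with hypothetical numbers, not measurements from the article: at a fixed frame rate, each frame must finish decoding within its display interval or it gets dropped, which is why the decoder cannot simply yield CPU time.

```python
# Sketch: why video decode has a hard real-time budget.
# At a fixed frame rate, each frame must be decoded within its
# display interval or it is dropped. All numbers are hypothetical.

def frame_budget_ms(fps: float) -> float:
    """Time available to decode one frame, in milliseconds."""
    return 1000.0 / fps

def utilization(decode_ms_per_frame: float, fps: float) -> float:
    """Fraction of the frame interval spent decoding; >1.0 means dropped frames."""
    return decode_ms_per_frame / frame_budget_ms(fps)

if __name__ == "__main__":
    fps = 24.0  # typical film frame rate
    # Hypothetical per-frame decode costs in milliseconds:
    for decode_ms in (8.0, 35.0, 50.0):
        u = utilization(decode_ms, fps)
        status = "drops frames" if u > 1.0 else "keeps up"
        print(f"{decode_ms:5.1f} ms/frame -> {u:4.0%} utilization, {status}")
```

At 24 fps the budget is roughly 41.7 ms per frame; a decoder that averages well under that shows low average CPU utilization, while one that exceeds it must drop frames no matter how the OS schedules it.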
BigLan - Saturday, April 28, 2007 - link
Does the 8600 also accelerate x264 content? It's looking like x264 will become the successor to xvid, so if these cards can, they'll be the obvious choice for HD-HTPCs. I guess the main question would be if windvd or powerdvd can play x264. I suspect they can't, but nero showtime should be able to.
MrJim - Tuesday, May 8, 2007 - link
Accelerating x264 content would be great, but I don't know what the big media companies would think about that; maybe ATI or Nvidia will lead the way, hopefully.
Xajel - Saturday, April 28, 2007 - link
I'm just asking why those enhancements are not in the higher 8800 GPUs? I know the 8600 will be used more in HTPCs than the 8800, but that's just not a good reason to leave them out!
Axbattler - Saturday, April 28, 2007 - link
Those cards came out 5 months after the 8800. Long enough for them to add the tech, it seems. I'd expect them in the 8900 (or whatever nVidia names their refresh) though. Actually, it would be interesting to see if they add it to the 8800 Ultra.
Xajel - Saturday, April 28, 2007 - link
I don't expect the Ultra to have them; AFAIK the Ultra is just a tweaked version of the GTX with higher MHz for both core and RAM... I can see it in the successor to my 7950GT, though.
Spacecomber - Friday, April 27, 2007 - link
I'm not sure I understand why Nvidia doesn't offer an upgraded version of their decoder software, instead of relying on other software companies to get something put together to work with their hardware.
thestain - Friday, April 27, 2007 - link
http://www.newegg.com/product/product.asp?item=N82...
All this tech jock sniffing with the latest and greatest, but this old reliable is a better deal, isn't it? For watching movies... for the ordinary non-owner of the still-expensive HD DVD players and HD DVDs... for standard-definition content... even without the nice improvements Nvidia has made... it seems to me that the old tech still does a pretty good job.
What do you think of this ole 6600 compared to the 8600 in terms of price paid for the performance you are going to see and enjoy in reality?
DerekWilson - Saturday, April 28, 2007 - link
the key line there is "if you have a decent cpu" ... which means c2d e6400. for people with slower cpus, the 6600 will not cut it and the 8600gt/8500gt will be the way to go.
the aes-128 step still needed to be done on older hardware (as the GPU needs to decrypt the data stream sent to it by the CPU), but using dedicated hardware rather than the shader hardware to do this should help save power or free up resources for other shader processing (post-processing like noise reduction, etc).
Treripica - Friday, April 27, 2007 - link
I don't know if this is too far off-topic, but what PSU was used for testing?