HD Video Playback: H.264 Blu-ray on the PC
by Derek Wilson on December 11, 2006 9:50 AM EST - Posted in GPUs
The Test
As we previously indicated, we need at least a Core 2 Duo E6400 to avoid dropping frames while testing graphics card decode acceleration under X-Men: The Last Stand. Because we also wanted an accurate picture of how much GPU decode acceleration really helps, we needed a CPU powerful enough to avoid dropping frames even under the most stressful load without GPU assistance. We therefore chose the Core 2 Duo X6800 for our tests. Using this processor, we can more accurately see how the graphics cards compare to one another and how much each is able to assist the CPU.
We tested CPU utilization by using perfmon to record data while we viewed a section of X-Men: The Last Stand. PowerDVD's bookmark feature was a big help, allowing us to jump straight to the specific scene we wanted to test in Chapter 18. In this scene, the Golden Gate Bridge is being torn apart and people are running everywhere. This is one of the most stressful scenes in the movie, reaching a bitrate of over 41 Mbps at one point.
Unfortunately, we haven't found a feature in PowerDVD or any other utility that will let us count dropped frames, which means we can't really compare what happens to video quality when the CPU is running at 100%. In lieu of dropped frames, we will stick with CPU overhead as our performance metric.
For reference, we recorded average and maximum CPU overhead while playing back our benchmark clip with no GPU acceleration enabled.
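For readers who want to reproduce this kind of measurement, perfmon can log the "% Processor Time" counter to a CSV file during playback, and the average and maximum figures can be pulled out afterwards. Below is a minimal sketch of that post-processing step; the file name "playback_log.csv" and the two-column layout (timestamp, counter value) are assumptions, so adjust them to match your own counter log.

```python
import csv

def summarize_cpu_log(path):
    """Compute average and peak CPU utilization from a perfmon CSV export.

    Assumes the log captured a single "% Processor Time" counter, so each
    data row is (timestamp, value). The file name used below is hypothetical.
    """
    samples = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row perfmon writes
        for row in reader:
            try:
                samples.append(float(row[1]))  # column 1 holds the counter value
            except (IndexError, ValueError):
                continue  # skip blank or malformed rows
    if not samples:
        raise ValueError("no samples found in log")
    return sum(samples) / len(samples), max(samples)

if __name__ == "__main__":
    avg, peak = summarize_cpu_log("playback_log.csv")
    print(f"Average CPU: {avg:.1f}%  Maximum CPU: {peak:.1f}%")
```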
Here is the rest of our test system:
Performance Test Configuration
CPU: | Intel Core 2 Duo X6800
Motherboard: | ASUS P5B Deluxe
Chipset: | Intel P965
Chipset Drivers: | Intel 7.2.2.1007
Hard Disk: | Seagate 7200.7 160GB SATA
Memory: | Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Cards: | Various
Video Drivers: | ATI Catalyst 6.11, NVIDIA ForceWare 93.71, NVIDIA ForceWare 97.02
Desktop Resolution: | 1920x1080 - 32-bit @ 60Hz
OS: | Windows XP Professional SP2
86 Comments
Renoir - Tuesday, December 12, 2006 - link
Which is exactly the reason why I've been waiting so long for an article like this!
redpriest_ - Tuesday, December 12, 2006 - link
Have you guys tried this configuration out? I have 2 GeForce 8800 GTXs in SLI, and using either the 97.02 or 97.44 driver and a 30" Dell monitor with a resolution capability of 2560x1600, I found I cannot play Blu-ray content at anything higher than a desktop resolution of 1280x800 (exactly half that resolution because of the way the dual-DVI bandwidth is set up). This means I cannot even experience full 1080p! Try anything higher than that and CyberLink BD complains and says to set your resolution to less than 1920x1080. This sucks. I hope there is a fix on the way.
redpriest_ - Tuesday, December 12, 2006 - link
I should add I found this on nvidia's website. "The Dell 3007WFP and Hewlett Packard LP3065 30" LCD monitors require a graphics card with a dual-link DVI port to drive the ultra high native resolution of 2560x1600 which these monitors support. With the current family of NVIDIA GeForce 8 & 7 series HDCP capable GPUs, playback of HDCP content is limited to single-link DVI connection only. HDCP is disabled over a dual-link DVI connection. The highest resolution the Dell 30" 3007WFP supports in single-link DVI mode is 1280x800 and therefore this is the highest resolution at which HDCP playback is supported in single-link DVI mode on current GeForce 8 & 7 series HDCP capable GPUs. On other 3rd party displays with native resolutions of 1920x1200 and below, the graphics card interfaces with the monitor over a single-link DVI connection. In this case, playback of content protected Blu-ray and HD-DVD movies is possible on HDCP capable GeForce 8 & 7 series GPUs."
Someone needs to tip off nvidia and other graphics card manufacturers that this is unacceptable. If I shell out $4000 ($2000 monitor, $1400 for two 8800 GTXs in SLI, and $600 Blu-ray drive) IT SHOULD WORK.
DerekWilson - Tuesday, December 12, 2006 - link
agreed, but don't blame NVIDIA -- blame the MPAA ... HDCP was designed around single link DVI and HDMI connections and wasn't made to work with dual link in the first place. I wouldn't be surprised if the problem NVIDIA is having has absolutely nothing to do with their hardware's capability.
in addition, dell's design is flawed -- they only support resolutions above 12x8 with dual link dvi. it may have taken a little extra hardware, but there is no reason that they should not support up to at least 1920x1080 over a single link.
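To put rough numbers on the single-link limit being debated here: single-link DVI tops out at a 165 MHz pixel clock, and whether a mode fits comes down to total pixels (active plus blanking) times refresh rate. A quick illustrative check in Python, using approximate CVT reduced-blanking totals (the exact timings any given monitor uses will differ somewhat):

```python
# Rough check of which modes fit on a single DVI link (165 MHz pixel clock).
# Total timings below are approximate CVT reduced-blanking figures; they are
# illustrative assumptions, not the exact timings of any particular monitor.
SINGLE_LINK_MAX_MHZ = 165.0

modes = {
    # name: (total horizontal pixels, total vertical lines, refresh Hz)
    "1280x800@60":  (1440, 823, 60),
    "1920x1080@60": (2080, 1111, 60),
    "1920x1200@60": (2080, 1235, 60),
    "2560x1600@60": (2720, 1646, 60),
}

for name, (h_total, v_total, hz) in modes.items():
    clock_mhz = h_total * v_total * hz / 1e6
    verdict = "fits" if clock_mhz <= SINGLE_LINK_MAX_MHZ else "needs dual link"
    print(f"{name}: ~{clock_mhz:.0f} MHz pixel clock -> {verdict}")
```

By this arithmetic 1920x1200 (~154 MHz) squeezes onto one link with reduced blanking while 2560x1600 (~269 MHz) cannot, which is consistent with the point that 1280x800 is far below what a single link can actually carry.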
ssiu - Wednesday, December 13, 2006 - link
I would blame the 30" monitors -- they should at least support 1080p in single-link DVI mode, just like 24" monitors do.
Renoir - Tuesday, December 12, 2006 - link
Wasn't blaming anyone in particular (although I'm always happy to bash the MPAA), just noting how stupid the situation is. Supporting a max of 12x8 over single link is inexcusable as far as I'm concerned.
DerekWilson - Thursday, December 14, 2006 - link
then the problem you have is specifically with Dell.
Renoir - Tuesday, December 12, 2006 - link
That is ridiculous! That's the problem with tech these days: you can't take anything for granted. Things that seem obvious and sensible often turn out not to be as they seem. What a joke!
sorry if this has been answered already... is PowerDVD multithreaded? is your CPU utilization balanced across both cores? what effect does a quad-core chip have on CPU utilization?
thx in advance
Renoir - Tuesday, December 12, 2006 - link
poisondeathray, you read my mind! I have exactly the same question.