AnandTech Tests GPU Accelerated Flash 10.1 Prerelease
by Anand Lal Shimpi on November 19, 2009 12:00 AM EST - Posted in GPUs
Flash/Hulu on ION: Nearly Perfect
I dusted off ASRock’s ION system based on the Intel Atom 330 (dual-core, 1.6GHz) processor for the first part of today’s testing. It had a copy of Windows Vista x64 installed, so I stuck with that. The integrated GeForce 9300/9400M chipset supports DXVA/DXVA2 and should be able to offload much of the video decode from the sluggish CPU to the integrated GPU.
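For reference, the CPU utilization numbers below are averages taken while each clip plays. One way to capture that kind of figure is to sample system-wide CPU usage at a fixed interval during playback and average the samples; the sketch below shows that approach, assuming Python with the psutil package (a hypothetical illustration, not the tool actually used for these tests).

```python
# Minimal sketch: sample and average system-wide CPU utilization during playback.
# Assumes Python with the psutil package; purely illustrative, not the tool used
# to generate the numbers in this article.
import time
import psutil

def average_cpu_utilization(duration_seconds=60, interval=1.0):
    """Sample CPU usage every `interval` seconds for `duration_seconds` and average it."""
    samples = []
    end = time.time() + duration_seconds
    while time.time() < end:
        # cpu_percent(interval=...) blocks for the interval and returns
        # the system-wide utilization over that window (0-100%).
        samples.append(psutil.cpu_percent(interval=interval))
    return sum(samples) / len(samples)

if __name__ == "__main__":
    # Start the Flash clip first, then run this for the length of the test.
    print("Average CPU utilization: %.1f%%" % average_cpu_utilization(60))
```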
As you can see from the results below, CPU utilization drops significantly when going from Flash 10.0.32.18 to 10.1.51.45. Not only do the numbers drop, but playback performance (number of dropped frames) improves significantly. I’d say that all of the tests below were totally playable on the Ion system thanks to Flash 10.1.
| Windowed Average CPU Utilization | Flash 10.0.32.18 | Flash 10.1.51.45 |
|---|---|---|
| Hulu Desktop - The Office - Murder | 70% | 30% |
| Hulu HD 720p - Legend of the Seeker Ep1 | 75% | 52% |
| Hulu 480p - The Office - Murder | 40% | 23% |
| Hulu 360p - The Office - Murder | 20% | 16% |
| YouTube HD 720p - Prince of Persia Trailer | 60% | 12% |
| YouTube - Prince of Persia Trailer | 14% | 7% |
These are awesome improvements. The Hulu HD results were a bit high but the YouTube HD test showed a drop from 60% CPU utilization down to 12%. Most impressive. Now on to the full screen Hulu tests:
| Full Screen 1920 x 1200 Average CPU Utilization | Flash 10.0.32.18 | Flash 10.1.51.45 |
|---|---|---|
| Hulu Desktop - The Office - Murder | 70% | 55% |
| Hulu HD 720p - Legend of the Seeker Ep1 | 83% | 68% |
| Hulu 480p - The Office - Murder | 70% | 70% |
| Hulu 360p - The Office - Murder | 70% | 70% |
The biggest difference I saw was running Hulu Desktop in full screen mode (1920 x 1200). While CPU usage wasn’t at 100%, the latest episode of The Office was completely unwatchable in the previous version of Flash. Updating to 10.1 not only dropped CPU utilization, but it made full screen Hulu Desktop watchable on a ~1080p display with the Ion system. I can’t believe it took this long to happen, but it finally did.
The one anomaly I encountered was CPU utilization not dropping while watching Hulu in a maximized IE8 window. I’ve brought it up with NVIDIA and we’re trying to figure out what’s going on.
There is some additional funniness that happens with certain NVIDIA GPUs and some Flash video content. Some YouTube videos use an 854-pixel-wide resolution and default to software decoding on NVIDIA ION and GeForce 8400GS (G98) GPUs. To fix this problem you have to do one of two things, depending on your browser. Under IE8, NVIDIA recommends the following:
With Internet Explorer, you may not be able to enter GPU-accelerated playback mode on many clips that naturally start in 854x mode. As a workaround, append “&fmt=22” to the end of 720p clip URLs and “&fmt=37” to the end of 1080p clip URLs. The videos will then play in GPU-accelerated HD mode.
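In practice the workaround just means tacking an extra query parameter onto the clip’s URL. Here is a minimal sketch of a helper that does this (the fmt values come from NVIDIA’s note above; the function itself is only a hypothetical illustration):

```python
# Minimal sketch of NVIDIA's suggested URL workaround (hypothetical helper, not an NVIDIA tool).
# fmt=22 requests the 720p stream and fmt=37 the 1080p stream, so the clip starts
# in GPU-accelerated HD mode instead of the 854-pixel-wide software-decoded mode.
def force_hd_fmt(url, resolution="720p"):
    fmt = {"720p": "22", "1080p": "37"}[resolution]
    separator = "&" if "?" in url else "?"
    return "%s%sfmt=%s" % (url, separator, fmt)

if __name__ == "__main__":
    print(force_hd_fmt("http://www.youtube.com/watch?v=EXAMPLE", "720p"))
    # -> http://www.youtube.com/watch?v=EXAMPLE&fmt=22
```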
Firefox 3.5.5 users have to follow a separate set of instructions:
Before running a YouTube HD clip, please go to Firefox menus and select Tools/Clear Recent History. Ensure the Cookies checkbox is checked, and do the clear. Next, go to Tools/Options/Privacy and select “Never Remember History”.
The above procedure will ensure an HD clip is first loaded in SD mode with 640x horizontal resolution, and then you select the HD button and get GPU-accelerated playback at 1280x HD mode. If you do not first delete Cookies and then turn off history, you may enter an 854x SD horizontal resolution upon starting up an HD clip, which is not GPU-accelerated today. If starting in 854x SD mode, when you switch to the HD version, it will still be non-GPU-accelerated.
These limitations apply only to ION and GeForce 8400GS-based GPUs; the rest of NVIDIA’s supported GPUs accelerate all content regardless of resolution. NVIDIA expects this behavior to be fixed either by updated NVIDIA drivers or an updated version of Flash.
135 Comments
Autisticgramma - Tuesday, November 17, 2009
I saw all this happening long ago, when adobe aquired flash to begin with. Adobe used to just make Acrobat reader, it sucked then it sucks now, its just so embedded in any corperate high-wire act its stoopid. Not to mention all the memory space want on start up, leaves in memory ect sloppy from day one.
Macromedia was the company that created flash (at least to my memory). When macromedia owned it, it wasn't bloated crap ware. And then again we weren't streaming whole shows, and 720I 1080P were not the buzzwords of the day.
I realize homestarrunner and illwillpress are not fully transmitted/encoded video, they are created in flash for flash.
But I don't see how this is enough to require gpu acceleration, isn't there a way to streamline this? Why doesn't other video kill everything else with such efficency? Are we sure they're not just accelerating how fast my computer can be exploited, this is a net application.
I'm not a coder, or some software guru, just a dude that works on computers. Could some one explain, or link me to something, that explains how this isn't an incoding issue, and a NEEDZ M0r3 PoWA issue? Adobe on my GPU - Sounds like "Sure I need some nike xtrainers for my ears?
cosmotic - Tuesday, November 17, 2009
Flash originally came from FutureSplash. You really need to work on your spelling. =/
Video decode is extremely CPU intensive. This is why most video decode now happens (at least partially) on the GPU.
PrinceGaz - Tuesday, November 17, 2009
Video decode is quite CPU intensive, but nowhere near as heavy as video encoding with decent quality settings. Also, within a few years, once six- and eight-core or higher CPUs are mainstream, the CPU will be able to handle all current HD video formats on its own.
The situation we are in currently regarding HD video playback of MPEG4 AVC type video is rather like the mid-to-late 1990s with DVD MPEG2 video, where hardware assistance was required for the CPUs of the day (typically around 200-400MHz) and you could even buy dedicated MPEG2 decoder cards. Within a few years, the CPU was doing all of the important decoding work with the only assistance being from graphics cards for some later steps (and even that was not necessary, as the CPU could do it easily if required). The same will apply with HD video in due course, especially as the boundary between CPU and GPU narrows.
bcronce - Tuesday, November 17, 2009
I can watch 1080p 1920x1080 HD videos from Apple's site with 10% CPU, silky smooth. Now that is 80% of one of my logical CPUs, but that's also some crazy nice graphics. A Core i5 dual core should handle full HD videos with sub-25% CPU usage.
Autisticgramma - Tuesday, November 17, 2009
Thanks for that. Misspellers Untie! Engrish is strictly a method of conveying information/ideas.
If ya get the gist the rest is irrelevant, at least to me.
johnsonx - Tuesday, November 17, 2009
Flash has always had a Hardware Acceleration checkbox, at least in 9 & 10. What did it do?
KidneyBean - Wednesday, November 18, 2009
For video, I think it allowed the GPU to scale the screen size. So now you can maximize or resize the video without it taking up extra CPU resources.
SanLouBlues - Tuesday, November 17, 2009
Adobe is kinda right about Linux, but we're getting closer: http://www.phoronix.com/scan.php?page=article&...
phaxmohdem - Tuesday, November 17, 2009
I'm still rocking my trusty 8800GTX card. My heart sunk a little bit when I read that G80 cards are not supported. This is the first time since I bought the ol' girl years ago that she has not been able to perform.
However, I also have an 8600GT that runs two extra monitors in my workstation, and I always do my Hulu watching on one of those monitors anyway, so things may still work out between us for a while longer.
CharonPDX - Tuesday, November 17, 2009
I have an original early 2006 MacBook Pro (2.0GHz Core Duo, 2GB RAM, Radeon X1600) running Snow Leopard 10.6.2. I not only don't see any difference, but I think something was wrong with your Mac Pro. Hulu 480p and YouTube 720p videos have been fully watchable on my system, in full screen on a 1080p monitor, all along.
When playing your same Hulu video (The Office - Murder, 480p, full screen) with both versions of Flash, I get a nice stable full frame rate (I don't know how to measure frame rate on OS X, but it looks the same as when I watch it on broadcast TV), with 150% CPU usage. (Average; varies from 130% to 160%, but seems to hover in the 148-152 range the vast majority of the time.)
And Legend of the Seeker, episode 1 in HD skips a few frames, but is perfectly watchable.