AnandTech Tests GPU Accelerated Flash 10.1 Prerelease
by Anand Lal Shimpi on November 19, 2009 12:00 AM EST
ATI and Intel Update, 11/19/2009:
After uninstalling Flash 10.1, reinstalling, rebooting, and switching to the High Performance power profile (instead of Balanced), some of the Hulu problems noted on the previous page cleared up slightly. We had already tested with the latest Intel drivers, so that wasn't the issue. Additional testing revealed that if you disable GPU acceleration in 10.1 (and restart your browser), the Hulu 480p problems disappear, but with GPU acceleration enabled we continue to have difficulties with Hulu 480p playback on the GMA 4500MHD on every video we've tested. The 360p videos work without any problems. Here are the updated results, including results from the Gateway NV52 HD 3200 laptop using the Catalyst 9.11 drivers. We've also added data for 10.1 with GPU acceleration disabled as a point of reference.
Intel GMA 4500MHD (Gateway NV58)
Updated Gateway NV58 (GMA 4500MHD) Full Screen 1366x768 Performance

| Test | Flash 10.0 | Flash 10.1 (GPU) | Flash 10.1 (No GPU) |
|------|------------|------------------|---------------------|
| Hulu 720p - CPU | 61% | 37% | 69% |
| Hulu 720p - FPS | 26.3 | 24.7 | 25.3 |
| Hulu 480p - CPU | 58% | 56% | 68% |
| Hulu 480p - FPS | 35.9 | 10.9 | 33.9 |
| YouTube 720p - CPU | 32% | 24% | 37% |
| YouTube 720p - FPS (Dropped) | 26.5 (0) | 24.0 (0) | 19.5 (104) |
Starting with Intel, the results have only changed slightly. We can now use Flash 10.1 in all cases, but we have to disable GPU acceleration for certain videos. This may be similar to NVIDIA's statement that ION has problems with YouTube HD videos that are 854 pixels wide; hopefully it will be cleared up with driver and/or Flash updates. HD Flash content, on the other hand, definitely benefits from the GPU acceleration and DXVA in Flash 10.1. CPU usage on the Hulu HD Legend of the Seeker video drops 24 percentage points, while the 720p Prince of Persia trailer on YouTube uses 8 points less CPU. Hulu's 480p The Office does shave 2 points off CPU usage, but frame rates drop from 30+ FPS to only 10 FPS.
Turning off GPU acceleration in Flash 10.1 shows where and how much the 4500MHD is helping. The YouTube HD trailer drops to around 20 FPS, with occasional dropped frames causing noticeable stuttering, and CPU usage jumps 13 points. Hulu HD playback remains smooth, but CPU usage jumps 32 points, so the DXVA acceleration clearly helps a lot in this instance. Standard Hulu videos like The Office return to a smooth frame rate, but CPU usage is 10 points higher than Flash 10.0. Overall, since the Intel GMA 4500MHD with a T6500 CPU manages to handle Flash video up to 720p in full screen mode using Flash 10.0, the 10.1 update isn't critical right now. If you're using a CULV processor (or a display with a higher resolution), Flash 10.1 may be more beneficial. We'll look at that scenario in a future article.
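As an aside on methodology, the CPU figures in these tables are whole-system CPU utilization sampled during playback. The sketch below is a hypothetical illustration only (not the tool used to generate these results): a minimal Windows program that polls GetSystemTimes once per second while a video plays.

```cpp
// cpu_sampler.cpp -- a minimal sketch of sampling overall CPU usage on
// Windows during Flash playback. Illustrative only; the article's numbers
// could just as easily come from Task Manager or Perfmon.
#include <windows.h>
#include <cstdio>

// Convert a FILETIME to a 64-bit tick count (100 ns units).
static unsigned long long ToTicks(const FILETIME &ft) {
    ULARGE_INTEGER u;
    u.LowPart  = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    return u.QuadPart;
}

int main() {
    FILETIME idle0, kernel0, user0;
    GetSystemTimes(&idle0, &kernel0, &user0);

    for (int i = 0; i < 60; ++i) {      // one sample per second for a minute
        Sleep(1000);
        FILETIME idle1, kernel1, user1;
        GetSystemTimes(&idle1, &kernel1, &user1);

        // Kernel time already includes idle time, so busy = total - idle.
        unsigned long long idle   = ToTicks(idle1)   - ToTicks(idle0);
        unsigned long long kernel = ToTicks(kernel1) - ToTicks(kernel0);
        unsigned long long user   = ToTicks(user1)   - ToTicks(user0);
        unsigned long long total  = kernel + user;

        if (total > 0)
            printf("CPU: %.1f%%\n",
                   100.0 * (double)(total - idle) / (double)total);

        idle0 = idle1; kernel0 = kernel1; user0 = user1;
    }
    return 0;
}
```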
ATI HD 3200 (Gateway NV52)
Gateway NV52 (ATI HD 3200) Full Screen 1366x768 Performance

| Test | Flash 10.0 | Flash 10.1 (GPU) | Flash 10.1 (No GPU) |
|------|------------|------------------|---------------------|
| Hulu 720p - CPU | 76% | 56% | 76% |
| Hulu 720p - FPS | 13.2 | 24.5 | 24.5 |
| Hulu 480p - CPU | 72% | 62% | 73% |
| Hulu 480p - FPS | 12.7 | 34.9 | 31.3 |
| YouTube 720p - CPU | 53% | 22% | 42% |
| YouTube 720p - FPS (Dropped) | 26.0 (0) | 24.0 (0) | 21.3 (103) |
With the updated Catalyst 9.11 drivers, our results were a lot better than before. Previously, using Flash 10.0 we were unable to view either of the Hulu videos (720p or 480p) in full screen mode without severe stuttering; YouTube HD, on the other hand, worked fine with 0 dropped frames. Moving to Flash 10.1 with DXVA GPU acceleration, we now see smooth frame rates on all Hulu content and lower CPU usage for both Hulu and YouTube videos. YouTube CPU usage on the Prince of Persia trailer drops 31 percentage points, Hulu's Legend of the Seeker drops CPU use 20 points while nearly doubling the frame rate (i.e. from dropping half the frames to showing everything), and 480p Hulu drops CPU usage 10 points with frame rates almost tripling (from ~13 FPS to over 30 FPS for what appears to be 30 FPS video content).
Disabling the GPU acceleration in Flash 10.1 still results in a better experience at Hulu than Flash 10.0, with roughly the same CPU load but no stuttering. YouTube HD is similar to the GMA 4500MHD in this case, with a frame rate of 21 FPS and slight stuttering. Unlike the Intel platform, if you have an ATI card and a moderate CPU it appears that Flash 10.1 is a clear win.
135 Comments
cosmotic - Thursday, November 19, 2009
Most control panels are crutches for poor operating system and application developers. I have never touched ~95% of the settings in such control panels, and AA settings belong in the application, if anywhere. These sorts of things should be non-issues. Same with sub cutoffs: that belongs on the amp/receiver. Even if I'm completely off base, there's no reason the travesty that is any current control panel should look the way it does. Honestly, skins in Catalyst? What - the - fuck. The ONLY three things I ever do in a graphics card control panel are 1) adjust flat panel scaling, which should never be on anything but maintain aspect ratio, 2) adjust black levels, which I should never have to do with a properly calibrated monitor, and 3) adjust multi-monitor options, because for whatever reason nVidia and ATI insist on using their own ultra shitty implementation instead of the only somewhat shitty Windows implementation.
You got me on the C2Q playing back video, although if you start doing anything else CPU intensive, that won't last. Imagine playing back TWO videos?!
Cerb - Saturday, November 21, 2009
Sorry, but no. Having settings in the application means that if the application wasn't designed to use those features, you can't turn them on. Any features implemented purely by the driver and hardware should be controlled entirely by the driver and hardware; the user should then have a front-end to control those settings. Games should have the minimal amount of settings for anything not programmed into the game, and the driver should have everything else. AA is one that should especially not be controlled by the game itself. AA controlled by the game means the game can turn it off, which should never happen. I would return a game that did that in a heartbeat (I've had AA on ever since Quake 2 and Tribes on a GF2 GTS--which wouldn't even be possible if it were a game setting--and I'm not turning it off until pixel size becomes at least 1/16 of what it is now on LCDs).
The game should never get to decide what settings can or cannot be used. Likewise, audio applications shouldn't have to know about my speaker setup; they should just send data to the driver API. Video should be the same way: define the stream, send it to a demuxer, and have that send the video portion to a black-box decoder API. It doesn't always end up working out perfectly, but that's why default settings tend to be the most widely compatible ones.
Not all such control panels are well thought out, but they have their place. In some cases they are OS/app crutches (multi-monitor and color tuning, for instance), but they are quite useful beyond that, and they wouldn't disappear even if all the crutch-like settings could be done away with.
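As an illustration of the layering Cerb describes, a hypothetical demuxer/decoder interface might look like the C++ sketch below, with the decoder free to use software or DXVA behind the scenes. All names here are invented for illustration; this is not a real API.

```cpp
// A sketch of the "black box" layering described above: the application
// defines the stream and moves buffers; whether decoding happens on the CPU
// or via DXVA on the GPU is hidden behind the decoder interface.
#include <cstddef>

struct CompressedPacket {
    const unsigned char *data;  // compressed bitstream bytes
    size_t               size;
    long long            pts;   // presentation timestamp
};

// The demuxer splits a container (FLV, MP4, ...) into elementary streams.
class Demuxer {
public:
    virtual ~Demuxer() {}
    virtual bool ReadVideoPacket(CompressedPacket *out) = 0;
};

// The decoder is a black box: callers never learn whether frames were
// decoded in software or by fixed-function GPU hardware.
class VideoDecoder {
public:
    virtual ~VideoDecoder() {}
    virtual bool SubmitPacket(const CompressedPacket &pkt) = 0;
    virtual bool GetFrame(void **opaqueSurface) = 0;  // handle for the renderer
};

// The playback loop only shuffles packets and surfaces around.
void PlaybackLoop(Demuxer &demux, VideoDecoder &decoder,
                  void (*present)(void *)) {
    CompressedPacket pkt;
    while (demux.ReadVideoPacket(&pkt)) {
        decoder.SubmitPacket(pkt);
        void *surface = 0;
        while (decoder.GetFrame(&surface))
            present(surface);  // hand the decoded surface to the renderer
    }
}
```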
JarredWalton - Saturday, November 21, 2009
The problem is that AA doesn't work properly with some rendering methods unless it is built into the game engine. Never mind that with pixel and compute shaders, it is now possible to do AA within the game code and not have it impact performance as much (i.e. DX10.1 enabled games). I don't think you can make a case for either extreme: it shouldn't ALWAYS be controlled by the game, and likewise it shouldn't NEVER be controlled by the game. Similarly, trying to control it in the driver won't always work (but it sometimes will). IMO, the only reason we have the setting in drivers is because games are often not forward thinking, limiting what settings they will or won't support. Assassin's Creed, for example, decided that any resolution above 1680x1050 shouldn't be allowed to run AA. Stupid. Older games were made before AA was even a consideration. All new titles should look at implementing AA internally, in an optimal manner, with the ability for the user to turn it on or off. Thankfully, most games are doing exactly that.
Another case for why AA should be in the game/application and not the driver: say game X runs perfectly well at 2560x1600 with 4xAA, but game Y can't handle more than 1920x1200 with 0xAA, another game can run 2560x1600 0xAA, and yet another 2560x1600 2xAA... you get the point. If you control the setting within each game, you set it once and forget about it. If it's a driver-only setting, every time you decide to play a different game you have to enter the driver control panel and change the setting.
Saying AA should be only in the control panel is just one step up from saying games shouldn't even be able to specify what resolution to run at. I think we can all see how ludicrous that would be, and by extension forcing the driver to tell games what AA (and HDR, etc.) to use is equally limiting. The application knows what to do best, and the drivers are just an interface that talks to the hardware and interprets common function calls.
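To make the "implement AA in the game" argument concrete, here is a minimal hypothetical sketch (error handling trimmed) of the Direct3D 10 capability query a game could run at startup to discover which MSAA modes the hardware supports, so it can expose exactly those choices in its own options menu:

```cpp
// msaa_probe.cpp -- enumerate the MSAA modes the hardware supports so a
// game can offer them in its own settings screen rather than relying on a
// driver control panel override. Illustrative sketch only.
#include <d3d10.h>
#include <cstdio>
#pragma comment(lib, "d3d10.lib")

int main() {
    ID3D10Device *device = NULL;
    HRESULT hr = D3D10CreateDevice(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL,
                                   0, D3D10_SDK_VERSION, &device);
    if (FAILED(hr)) return 1;

    // Ask the driver which sample counts the back-buffer format supports.
    for (UINT samples = 2; samples <= 16; samples *= 2) {
        UINT quality = 0;
        device->CheckMultisampleQualityLevels(DXGI_FORMAT_R8G8B8A8_UNORM,
                                              samples, &quality);
        if (quality > 0)
            printf("%ux MSAA supported (%u quality levels)\n",
                   samples, quality);
    }

    device->Release();
    return 0;
}
```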
davepermen - Friday, November 20, 2009
Playing back full HD videos without hardware decoding actually works even on a C2D; no problem so far. Just not with Flash, as it's horribly inefficient.
JarredWalton - Friday, November 20, 2009
A reasonable Core 2 Duo with GMA 4500 or better graphics should be able to handle Flash HD at up to 1680x1050 without trouble, even with Flash 10.0. The problem is that a lot of netbooks, nettops, and entry-level laptops don't have even that.
ProDigit - Tuesday, November 17, 2009
Flash sucks big time, and is largely unwanted on the internet! Instead of displaying movies in Flash, websites would have done better streaming DivX/XviD or OGG/MP3/WMA or the like.
Flash is as much an internet hog as REAL was in the Real Networks days.
Flash videos are low in quality, require high CPU usage, and deliver few FPS.
I'd prefer the internet to become like the mobile internet: without ads, and with just the minimal info necessary to do the most basic internet tasks (like this article, a few .jpgs, and a non-Java-based forum or thread underneath where we can type).
Many websites wouldn't exist if it wasn't for the ads, you may say, but I'll reply to them:
"So where is all that money going that they get or pay to keep their website online? Who's at the head of the chain? The government?"
The internet was supposed to be a free thing; the only ones who should charge are the companies who lay the cables that carry the signals, and then only for the upkeep of the servers.
We're living in 2009; back in the '80s, it could cost you quite a few bucks to have 20MB of online server space!
Nowadays they still charge quite a bit, while giving you 50MB of web space! I mean, what is that?
It costs a company today $90 to get a 1TB HD!
And if they got rid of Flash altogether, internet pages wouldn't take up more than a few hundred kilobytes.
A 1TB hard drive would be enough to give 10,000 customers 100MB of web space each for $10 per person, which is almost free.
But no, if you have a big website, they charge you hundreds of dollars per year, sticking a big fat bonus in their paychecks, because a server doesn't cost a company $100,000 anymore. Nowadays they can easily be built for 1/10th of that price, yet they still charge too much.
That's why the world must be terrorized by flash ads.
I'm glad someone had the sense to create an ad blocker for my browser, because it not only eases my reading of the page, it also reduces my overall network traffic, lowers CPU usage, and therefore increases battery life on my notebook, and it keeps me safer from hackers trying to get into people's computers through annoying popup ads that exploit the weaknesses of Flash.
The world would be a better place if flash was never invented.
Even the little tasks you do in Flash, like playing FarmVille on Facebook, would perform much better as a downloadable executable instead of a Flash game!
bcronce - Tuesday, November 17, 2009
"It costs a company today $90 to get a 1TB HD! "My employerr had to pay $120k for a new 16TB SAN cabinet and that doesn't include local back-ups or off site storage/back-ups
I wish storage was only $90/TB, I could get a raise.
snarfbot - Monday, March 8, 2010
Well, did he really have to? Seems like a bad time to invest that much cash when the entire storage industry is transitioning to solid state.
autoboy - Tuesday, November 17, 2009
Do you have any clue how the world works? People provide services. In exchange for those services, they get paid. Their pay allows them to buy your services. People who provide services people actually want get paid, while those services that people don't want fail. Government steals 50% of your pay to build roads, police crime, and fund perpetually failing social programs.
So, the internet is supposed to be free with no business model?
Sound good to you, Anand?
fic2 - Tuesday, November 17, 2009
A 1TB hard drive might cost YOU $90, but for a server company it costs quite a bit more. Or do you and the 10,000 other customers not care if the 1TB is RAIDed and backed up? I'm not sure what else goes into it since I don't do that, but I would guess they also have to pay for space/electricity/AC/people/internet/etc. Oh, I'll add that I hate Flash, too. Especially the idiot websites that think their front page has to be Flash based.