NVIDIA and ATI HDCP Compatible Graphics Cards Roundup
by Josh Venning on November 16, 2006 12:00 AM EST - Posted in GPUs
CPU Utilization
Of course, gaming performance is only part of the equation with these HDCP compliant cards; the other major aspect is CPU utilization during high definition movie playback. Today we're only able to provide a small subset of HD movie playback performance, as we're only testing with an MPEG-2 encoded Blu-ray title. We're still waiting on a PC HD-DVD player that will let us test VC1 and H.264 decode performance as well, so for now we can only look at high bitrate MPEG-2 content. VC1 and H.264 encoded content will put more stress on both the CPU and GPU, but we'll unfortunately have to wait a little longer before testing it.
Just as graphics cards first became important for offloading 3D processing with games like GLQuake, we are now in a transition period where it is becoming necessary to have cards that can handle video playback for us as well. ATI and NVIDIA products have offered video decode acceleration for the past couple of years, but it didn't become truly necessary until HD-DVD and Blu-ray arrived. The complex video formats these discs use require more processing power to decode, meaning slower processors won't be able to play them back without help from a graphics card.
Right now, since Blu-ray titles are predominantly MPEG-2, having lots of extra power in a graphics card to accelerate the decode process isn't critical, but we still want to see how much load these cards can take off the CPU. With this in mind we put together a benchmark, recording the average CPU utilization over roughly one minute of Blu-ray movie playback. The movie we used was Click, and we tested each card with the exact same one-minute segment of the movie. Audio was enabled for this test.
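For readers curious how a measurement like this is taken, here's a minimal sketch of the sampling loop. The polling interval, duration, and the reader callable are our own illustrative assumptions, not the exact tool we used; on a real system `read_percent` could be something like psutil's `cpu_percent`.

```python
import time

def sample_cpu(read_percent, duration_s=60.0, interval_s=0.5):
    """Poll a CPU-utilization source at a fixed interval during playback
    and return (average, minimum, maximum) in percent.

    read_percent is any callable returning the instantaneous system CPU
    load in percent -- e.g. psutil.cpu_percent on a real machine, or a
    Windows performance counter query.
    """
    samples = []
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        samples.append(read_percent())
        time.sleep(interval_s)
    # Reduce the raw samples to the three figures reported per card
    return round(sum(samples) / len(samples), 1), min(samples), max(samples)
```

In practice you'd start the movie segment, run the loop for the full minute, and log the three figures for each card.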
Here are the CPU utilization results from each of our cards.
CPU Utilization (%; lower is better)
Card | Avg | Min | Max |
NVIDIA Gigabyte GeForce 7600 GS | 51.5 | 41.4 | 58.2 |
NVIDIA ASUS GeForce EN7600 GT | 45.5 | 38.8 | 50.8 |
NVIDIA MSI GeForce NX7600 GT Diamond Plus | 46.9 | 38.3 | 52.9 |
NVIDIA MSI GeForce NX7600 GT | 45.8 | 39.1 | 51.6 |
NVIDIA Albatron GeForce 7900 GS | 45.8 | 36.7 | 54.7 |
NVIDIA EVGA e-GeForce 7900 GS KO | 44.5 | 37.5 | 52.3 |
NVIDIA Leadtek WinFast PX7900GS TDH Extreme | 44.8 | 36.7 | 51.6 |
NVIDIA MSI GeForce 7900 GS | 45.9 | 38.3 | 52.3 |
NVIDIA MSI GeForce NX7900 GT | 44.9 | 38.3 | 51.6 |
NVIDIA EVGA e-GeForce 7950 GT KO | 43.9 | 35.9 | 50.0 |
NVIDIA Gigabyte GeForce NX7950 GT | 44.4 | 36.7 | 51.6 |
NVIDIA PNY GeForce 7950 GT | 44.3 | 36.7 | 52.3 |
NVIDIA XFX GeForce 7950 GT HDCP | 44.1 | 35.2 | 53.1 |
NVIDIA Sparkle Calibre 7950 GT | 44.1 | 35.9 | 64.1 |
NVIDIA BFG GeForce 7950 GX2 | 46.3 | 36.7 | 53.1 |
NVIDIA EVGA e-GeForce 7950 GX2 | 46.2 | 39.8 | 53.1 |
NVIDIA GeForce 8800 GTX | 38.7 | 29.7 | 46.9 |
NVIDIA GeForce 8800 GTS | 39.8 | 31.2 | 48.8 |
ATI Powercolor Radeon X1600 PRO HDMI | 40.6 | 28.1 | 50.0 |
ATI Sapphire Radeon X1950 XTX | 36.3 | 28.9 | 44.5 |
ATI Radeon X1900XT 256 (reference) | 34.2 | 28.1 | 39.8 |
ATI Radeon X1650XT (reference) | 38.3 | 28.1 | 46.1 |
Video decode acceleration on NVIDIA GPUs is handled by the PureVideo processor, which runs at the core clock speed, so each card's CPU utilization reflects its clock. The end result is that an NVIDIA card with more pipelines and better 3D performance will not necessarily be better at video decoding. ATI's AVIVO decoding is also tied to the processing power of the card, but it isn't as closely tied to clock speed as NVIDIA's solution. We also found a bit of variance between multiple runs of the same tests, but these numbers give a general view of the CPU utilization with each of these cards.
We can see that the X1900 XT 256 achieves a very low average CPU utilization compared to the other cards. The 8800 GTX and 8800 GTS also offloaded more processing from the CPU than the other NVIDIA cards, which isn't surprising given that NVIDIA mentioned the PureVideo core is a bit faster in G80. For reference, we measured CPU utilization for the same Blu-ray playback benchmark with hardware acceleration disabled and got an average of 51.0%, which gives us an idea of how much work these graphics cards take off the CPU. The Gigabyte 7600 GS doesn't seem to help in this area at all, which makes sense when we consider that it's the slowest clocked NVIDIA card in the group; a 400MHz core clock apparently doesn't give PureVideo enough power to make a difference in CPU utilization.
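As a quick worked example of how to read the table: each card's effective offload is just the 51.0% software-decode baseline minus its accelerated average. The three cards below are pulled from the table above for illustration.

```python
BASELINE = 51.0  # average CPU utilization (%) with hardware acceleration off

def offload(avg_util, baseline=BASELINE):
    """Percentage points of CPU work the card removes vs. software decode."""
    return round(baseline - avg_util, 1)

# Average utilization figures (%) from the playback table above
cards = {
    "Gigabyte GeForce 7600 GS": 51.5,
    "GeForce 8800 GTX": 38.7,
    "Radeon X1900 XT 256": 34.2,
}

for name, avg in cards.items():
    print(f"{name}: {offload(avg):+.1f} points")
```

By this measure the 7600 GS actually comes out slightly negative, consistent with our observation that its PureVideo clock isn't helping.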
Even taking these results into account, CPU utilization isn't going to be the deciding factor among these cards. Until we can look at H.264 and VC1 decode performance, we will have to focus on other important considerations such as power, heat, and noise.
48 Comments
rnemeth - Friday, December 1, 2006 - link
I personally think this article was extremely on point with the direction of media components in general. I think there will be many people out there, like me, who are planning on the convergence of the Media Center PC & Gaming PC. Vista (Ultimate) will bring Media Center to the masses and why shouldn't you be able to play your favorite DX10 "Game for Windows" on your big HDTV as well?
This article is ahead of its time. HDCP/DRM/HDMI/DVI/BR/HD-DVD/HDTV is all in the early stages. In the future, you will be able to buy or rent your high-def movie by downloading it to your PC with DRM finally figured out. This review shows us that it is not all there yet, but gives us an idea who is doing what, and what we need to look for.
You mentioned in the beginning of the article that you were looking for feedback to see how interested your audience is on this subject... count me as 1.
thejez - Sunday, November 26, 2006 - link
HDCP is a joke... look at how hard it is to understand and get this stuff running... not to mention you have upgrade all your hardware?? lol and who watches movies on their PC anyway?? do you people really sit huddled over your keyboard watching movies?? Why not invest in some better equipment for the family room and watch movies they way they were intended....but the difficulty in setting all this up makes it even more important that hdcp has already been cracked... i wouldnt let the movie industry tell you your hardware isnt good enough to buy their content... just buy what you want and "work" around the issue later... the cracks are only getting better.... we'll see HD-DVD Shrink soon enough...
KalTorak - Monday, November 20, 2006 - link
Careful - it's not safe to assume that higher bitrate content is more computationally difficult to decode than lower bitrate content. [In fact, I suspect they're weakly correlated the other way - lower bitrate is harder.]
DerekWilson - Monday, November 20, 2006 - link
Ahh ... very interesting ... it would make sense to me to say that both are true.
In the case where low bitrate means more aggressive high quality encoding, I absolutely see your point. But low bitrate can also mean lower quality (less information) at the same level of encoding -- in these cases lower bitrate will be easier to decode.
Thanks for pointing this out.
Badkarma - Monday, November 20, 2006 - link
Hi Derek,
Can you comment on HD audio formats like Dolby TrueHD and DTS-HD for the HTPC? I know Bluray has yet to use these formats but how about HD-DVD on the HTPC. From what I understand, you could get these formats outputted via analog output on your soundcard, but I'm interested in HDMI. I know some of the HDMI video cards you reviewed have SPDIF passthrough via HDMI, however SPDIF cannot carry Dolby Digital +, TrueHD, or DTS-HD, it will only output DTS or DD. I'm holding back on a HDCP video card because HD audio is an important part of HD movies.
Thanks.
Ajax9000 - Sunday, November 19, 2006 - link
p.13 "Both the 8800 GTX and GTS are fully HDCP compatible, and HDCP is enabled through both DVI ports"
p.22 "Some of the cards, like the HDMI Gigabyte 7600 GS and ASUS EN7600 GT, were only able to play our Blu-ray movies over HDMI and not through the DVI port. Conversely, we found that with our MSI NX7600 GT Diamond Plus, the Blu-ray content wouldn't play through the HDMI connection but it would through the DVI port."
OK, so which cards (other than the 8800s) could do HDCP over both ports?
----
p.19 "The end result is that an NVIDIA card with more pipelines that is better at 3D performance will not necessarily be better at video decoding."
In other words overclocking (say) a 7600 is likely to give as good or better HD video results than using (say) a 7950GX2?
----
p.20 "In the future, we could see power consumption go down with acceleration enabled. As graphics hardware is better suited to processing video than a CPU, efficiency should go up when using hardware acceleration."
The results for the 7950 GTs already seem to show this: 147W average non-accelerated >> 142W for EVGA & PNY (although Gigabyte > 148W).
Adrian
chucky2 - Friday, November 17, 2006 - link
...can we get it added to these results just for comparison?
Also, you don't happen to know when that'd be, would you? :)
Chuck
BigLan - Friday, November 17, 2006 - link
I noticed that in the CPU utilisation tests you said it was around 51% for no acceleration - was this because the player software is single threaded and so only used one core?
Also, is Click encoded in h264, or mpeg2 like the initial bluray discs?
DigitalFreak - Friday, November 17, 2006 - link
Now that the Xbox 360 HD-DVD drive is available and is proven to work with a PC, any chance of doing another round-up "real soon now" using HD-DVD? I'd really love to see the numbers for VC1 and H.264 decoding.
Still amazed that the lowly X1600 card spanked all NVIDIA cards but the G80 boards in CPU utilization.
Good job guys.
DigitalFreak - Friday, November 17, 2006 - link
BTW, the new releases from Fox on Blu-Ray use the H.264 codec. Behind Enemy Lines, Fantastic 4, etc. I think Behind... is already out.