AMD's Radeon HD 5870: Bringing About the Next Generation Of GPUs
by Ryan Smith on September 23, 2009 9:00 AM EST - Posted in GPUs
AA Image Quality & Performance
With HL2 unsuitable for use in assessing image quality, we will be using Crysis: Warhead for the task. Warhead has a great deal of foliage in parts of the game, which creates an immense amount of aliasing and, along with the geometry of local objects, forms a good test for anti-aliasing quality. Look in particular at the leaves both to the left and through the windshield, along with the aliasing along the frame, windows, and mirror of the vehicle. We’d also like to note that since AMD’s SSAA modes do not work in DX10, this comparison is done in DX9 mode instead.
[Screenshot comparison - the interactive image viewer is not reproduced here. The modes compared for each card are:]
AMD Radeon HD 5870 | AMD Radeon HD 4870 | NVIDIA GTX 280
No AA | No AA | No AA
2X MSAA | 2X MSAA | 2X MSAA
4X MSAA | 4X MSAA | 4X MSAA
8X MSAA | 8X MSAA | 8X MSAA
2X MSAA + AAA | 2X MSAA + AAA | 2X MSAA + SSTr
4X MSAA + AAA | 4X MSAA + AAA | 4X MSAA + SSTr
8X MSAA + AAA | 8X MSAA + AAA | 8X MSAA + SSTr
2X SSAA | - | -
4X SSAA | - | -
8X SSAA | - | -
From an image quality perspective, very little has changed for AMD compared to the 4890. With MSAA and AAA modes enabled the quality is virtually identical. And while things are not identical when flipping between vendors (for whatever reason the sky brightness differs), the resulting image quality is still basically the same.
For AMD, the downside to this IQ test is that SSAA fails to break away from MSAA + AAA. We’ve previously established that SSAA is a superior (albeit brute force) method of anti-aliasing, but we have been unable to find any scene in any game that succinctly proves it. Shader aliasing should be the biggest difference, but in practice we can’t find any obvious shader aliasing in a DX9 game. Nor is Crysis: Warhead benefiting from the extra texture sampling here.
From our testing, we’re left with the impression that MSAA + AAA (or MSAA + SSTr for NVIDIA) is just as good as SSAA for all practical purposes. Much as with the anisotropic filtering situation, we know on a technical level that there is a better method, but it just isn’t making a noticeable difference here. If nothing else this is good from a performance standpoint, as MSAA + AAA is not nearly as hard on performance as outright SSAA. Perhaps SSAA is better suited for older games, particularly those locked at lower resolutions?
For our performance data, we have two cases. We will first look at HL2 on the 5870 alone, a test we ran before realizing the quality problem with Source-engine games. We believe the performance data is still correct in spite of the visual bug, and while we’re not going to use it as our only data, we will use it as an example of AA performance in an older title.
As a testament to the rendering power of the 5870, even at 2560x1600 and 8x SSAA, we still get a just-playable framerate on HL2. To put things in perspective, with 8x SSAA the game is being rendered at approximately 32MP, well over the size of even the largest possible single-card Eyefinity display.
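For a rough sense of that number, here is the back-of-the-envelope math behind it (a quick illustrative sketch; the resolution and sample count are simply the figures quoted above):

```python
# With supersampling, every on-screen pixel is rendered from N full samples,
# so the GPU is effectively drawing a framebuffer N times larger than the display.
width, height = 2560, 1600
ssaa_factor = 8

base_pixels = width * height                  # 4,096,000 pixels (~4.1 MP)
effective_pixels = base_pixels * ssaa_factor  # 32,768,000 pixels (~32.8 MP)

print(f"Native: {base_pixels / 1e6:.1f} MP")
print(f"With {ssaa_factor}x SSAA: {effective_pixels / 1e6:.1f} MP")
```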
Our second, larger performance test is Crysis: Warhead. Here we are again testing the game in DX9 mode, at a resolution of 1920x1200. Since this is a look at the impact of AA on various architectures, we will limit this test to the 5870, the GTX 280, and the Radeon HD 4890. Our interest here is in performance relative to no anti-aliasing, and whether different architectures lose the same amount of performance or not.
Starting with the 5870, moving from no AA to 4x MSAA incurs only a 20% drop in performance, while 8x MSAA increases that drop to 35%, leaving it at roughly 80% of the 4x MSAA performance. Interestingly, in spite of the heavy foliage in the scene, Adaptive AA carries virtually no performance hit over regular MSAA, returning essentially the same results. SSAA is of course the big loser here, quickly dropping to unplayable levels. As we discussed earlier, the quality of SSAA is no better here than that of MSAA + AAA.
Moving on, we have the 4890. While its overall performance is lower, the drop in performance from MSAA is interestingly not quite as large, at only 17% for 4x MSAA and 25% for 8x MSAA. That puts 8x MSAA at 92% of the 4x MSAA performance. Once again the performance hit from enabling AAA is minuscule, at roughly 1 FPS.
Finally we have the GTX 280. The drop in performance here is in line with that of the 5870: 20% for 4x MSAA and 36% for 8x MSAA, with 8x MSAA offering 80% of the 4x MSAA performance. Even enabling supersample transparency AA only knocks off 1 FPS, just like AAA on the 5870.
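As a side note, the "8x relative to 4x" figures above fall straight out of the quoted performance drops. Here is a minimal sketch of that arithmetic (the percentages are the ones quoted in the preceding paragraphs; small differences versus the in-text figures come from rounding of the underlying framerates):

```python
# Convert "% drop vs. no AA" figures into "8x MSAA performance relative to 4x MSAA".
def relative_8x_vs_4x(drop_4x: float, drop_8x: float) -> float:
    perf_4x = 1.0 - drop_4x  # fraction of no-AA performance retained at 4x MSAA
    perf_8x = 1.0 - drop_8x  # fraction of no-AA performance retained at 8x MSAA
    return perf_8x / perf_4x

# Performance drops quoted above for each card (4x MSAA, 8x MSAA):
cards = {
    "Radeon HD 5870":  (0.20, 0.35),
    "Radeon HD 4890":  (0.17, 0.25),
    "GeForce GTX 280": (0.20, 0.36),
}

for name, (d4, d8) in cards.items():
    print(f"{name}: 8x MSAA runs at {relative_8x_vs_4x(d4, d8):.0%} of 4x MSAA performance")
```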
What this leaves us with are very curious results. On a percentage basis the 5870 is no better than the GTX 280, which isn’t an irrational thing to see, but it does worse than the 4890. At this point we don’t have a good explanation for the difference; perhaps it’s a product of early drivers or the early BIOS? It’s something that we’ll need to investigate at a later date.
Wrapping things up, as we discussed earlier, AMD has been pitching the idea of better 8x MSAA performance on the 5870 compared to the 4800 series thanks to the extra cache. Although from a practical perspective we’re not sold on the idea that 8x MSAA is a big enough improvement over 4x MSAA to justify any performance hit, we can put to rest the idea that the 5870 is any better at 8x MSAA than prior cards. At least in Crysis: Warhead, we’re not seeing it.
327 Comments
SiliconDoc - Wednesday, September 30, 2009 - link
I was here before this site was even on the map let alone on your radar, and have NEVER had any other acct name.
I will wait for your APOLOGY.
ol1bit - Friday, September 25, 2009 - link
Goodbye 8800gt SLI... nothing has given me the bang for the buck upgrade that this card does!
I paid $490 for my SLI 8800Gt's in 11/07
$379 Sweetness!
Brazos - Thursday, September 24, 2009 - link
I always get nostalgic for Tech TV when a new gen of video cards comes out. Watching Leo, Patrick, et al. discuss the latest greatest was like watching kids on Christmas morning. And of course there was Morgan.
totenkopf - Thursday, September 24, 2009 - link
SiliconDoc, this is pathetic. Why are you so upset? No one cares about arguing the semantics of hard or paper launches. Besides, where the F is Nvidia's GT300 thingy? You post here more than amd fanboys, yet you hate amd... just hibernate until the GT300 launches and then you can come back and spew hatred again.
Seriously... the fact that you can't even formulate a cogent argument based on anything performance related tells me that you have already ceded the performance crown to amd. Instead, you've latched onto this red herring, the paper launch crap. Stop it. Just stop it. You're like a crying child. Please just be thankful that amd is now allowing you to obtain more of your nvidia panacea for even less money!
Hooray competition! EVERYONE WINS! ...Except silicon doc. He would rather pay $650 for a 280 than see ati sell one card. Ati is the best thing that ever happened to nvidia (and vice versa). Grow the F up and don't talk about bias unless you have none yourself. Hope you don't electrocute yourself tonight while making love to your nvidia card.
SiliconDoc - Thursday, September 24, 2009 - link
" Hooray competition! EVERYONE WINS! ...Except silicon doc. He would rather pay $650 for a 280 than see ati sell one card."And thus you have revealed your deep seated hatred of nvidia, in the common parlance seen.
Frankly my friend, I still have archived web pages with $500 HD2900XT cards from not that long back, that would easily be $700 now with the inflation we've seen.
So really, what is your red raving rooster point other than you totally excuse ATI that does exactly the same thing, and make your raging hate nvidia whine, as if "they are standalone guilty".
You're ANOTHER ONE, that repeats the same old red fan clichés, and WON'T OWN UP TO ATI'S EXACT SAME BEHAVIOR ! Will you ? I WANT TO SEE IT IN TEXT !
In other words, your whole complaint is INVALID, because you apply it exclusively, in a BIASED fashion.
Now tell me about the hundreds of dollars overpriced ati cards, won't you ? No, you won't. See, that is the problem.
silverblue - Friday, September 25, 2009 - link
If you think companies are going to survive without copying what other companies do, you're sadly mistaken.
Yes, nVidia has made advances, but so has ATI. When nVidia brought out the GF4 Ti series, it supported Pixel Shader 1.3 whereas ATI's R200-powered 8500 came out earlier with the more advanced Pixel Shader 1.4. ATI were the first of the two companies to introduce a 256-bit memory bus on their graphics cards (following Matrox). nVidia developed Quincunx, which I still hold in high regard. nVidia were the first to bring out Shader Model 3. I still don't know of any commercially available nVidia cards with GDDR5.
We could go on comparing the two but it's essential that you realise that both companies have developed technologies that have been adopted by the other. However, we wouldn't be so far down this path without an element of copying.
The 2900XT may be overpriced because it has GDDR4. I'm not interested in it and most people won't be.
"In other words, your whole complaint is INVALID, because you apply it exclusively, in a BIASED fashion. " Funny, I thought we were seeing that an nauseum from you?
Why did I buy my 4830? Because it was cheaper than the 9800GT and performed at about the same level. Not because I'm a "red rooster".
ATI may have priced the 5870 a little high, but in terms of its pure performance, it doesn't come too far off the 295 - a card we know to have two GPUs and costs more. In the end, perhaps AMD crippled it with the 256-bit interface, but until they implement one you'll be convinced that it's a limitation. Maybe, maybe not. GT300 may just prove AMD wrong.
SiliconDoc - Wednesday, September 30, 2009 - link
You have absolutely zero proof that we wouldn't be further down this path without the "competition".
Without a second company or a third or fourth or tenth, the monopoly implements DIVISIONS that compete internally, and without other companies, all the intellectual creativity winds up with the same name on their paycheck.
You cannot prove what you say has merit, even if you show me a stagnant monopoly, and good luck doing that.
As ATI stagnated for YEARS, Nvidia moved AHEAD. Nvidia is still ahead.
In fact, it appears they have always been ahead, much like INTEL.
You can compare all you want but "it seems ati is the only one interested in new technology..." won't be something you'll be blabbing out again soon.
Now you try to pass a lesson, and JARED the censor deletes responses, because you two tools think you have a point this time, but only with your deleting and lying assumptions.
NEXT TIME DON'T WAIL ATI IS THE ONLY ONE THAT SEEMS INTERESTED IN IMPLEMENTING NEW TECHNOLOGY.
DON'T SAY IT THEN BACKTRACK 10,000 % WHILE TRYING TO "TEACH ME A LESSON".
You're the one whose big fat red piehole spewed out the lie to begin with.
Finally - Friday, September 25, 2009 - link
The term "Nvidiot" somehow sprung to my mind. How come?
silverblue - Thursday, September 24, 2009 - link
You're spot on about his bias. Every single post consists of trash-talking pretty much every ATI card and bigging up the comparative nVidia offering. I think the only product he's not complained about is the 4770, though oddly enough that suffered horrific shortage issues due to (surprise) TSMC.
Even if there were 58x0 cards everywhere, he'd moan about the temperature or the fact it should have a wider bus or that AMD are finally interested in physics acceleration in a proper sense. I'll concede the last point but in my opinion, what we have here is a very good piece of technology that will (like CPUs) only get better in various aspects due to improving manufacturing processes. It beats every other single GPU card with little effort and, when idle, consumes very little juice. The technology is far beyond what RV770 offers and at least, unlike nVidia, ATI seems more interested in driving standards forward. If not for ATI, who's to say we'd have progressed anywhere near this far?
No company is perfect. No product is perfect. However, to completely slander a company or division just because he buys a competitor's products is misguided to say the least. Just because I own a PC with an AMD CPU, doesn't mean I'm going to berate Intel to high heaven, even if their anti-competitive practices have legitimised such criticism. nVidia makes very good products, and so does ATI. They each have their own strengths and weaknesses, and I'd certainly not be using my 4830 without the continued competition between the two big performance GPU manufacturers; likewise, SiliconDoc's beloved nVidia-powered rig would be a fair bit weaker (without competition, would it even have PhysX? I doubt it).
SiliconDoc - Thursday, September 24, 2009 - link
Well, that was just amazing, and you're wrong about me not complaining about the 4770 paper launch, you missed it.
I didn't moan about the temperature, I moaned about the deceptive lies in the review concerning temperatures, that gave ATI a complete pass, and failed to GIVE THE CREDIT DUE THAT NVIDIA DESERVES because of the FACTS, nothing else.
The article SPUN the facts into a lying cobweb of BS. Just like so many red fans do in the posts, and all over the net, and you've done here. Is it so hard to MAN UP and admit the ATI cards run hotter ? Is it that bad for you, that you cannot do it ? Certainly the article FAILED to do so, and spun away instead.
Next, you have this gem " at least, unlike nVidia, ATI seems more interested in driving standards forward."
ROFLMAO - THIS IS WHAT I'M TALKING ABOUT.
Here, let me help you, another "banned" secret that the red roosters keep to their chest so their minions can spew crap like you just did: ATI STOLE THE NVIDIA BRIDGE TECHNOLOGY, ATI HAD ONLY A DONGLE OUTSIDE THE CASE, WHILE NVIDIA PROGRESSED TO INTERNAL BRIDGE. AFTER ATI SAW HOW STUPID IT WAS, IT COPIED NVIDIA.
See, now there's one I'll bet a thousand bucks you never had a clue about.
I for one, would NEVER CLAIM that either company had the lock on "forwarding technology", and I IN FACT HAVE NEVER DONE SO, EVER !
But you red fans spew it all the time. You spew your fanboyisms, in fact you just did, that are absolutely outrageous and outright red leaning lies, period!
you: " at least, unlike nVidia, ATI seems more interested in driving standards forward...."
I would like to ask you, how do you explain the never before done MIMD core Nvidia has, and will soon release ? How can you possibly say what you just said ?
If you'd like to give credit to ATI going with GDDR4 and GDDR5 first, I would have no problem, but you people DON'T DO THAT. You take it MUCH FURTHER, and claim, as you just did, ATI moves forward and nvidia does not. It's a CONSTANT REFRAIN from you people.
Did you read the article and actually absorb the OpenCL information ? Did you see Nvidia has an implementation, is "ahead" of ati ? Did you even dare notice that ? If not, how the hell not, other than the biased wording the article has, that speaks to your emotionally charged hate Nvidia mindset :
"However, to completely slander a company or division just because he buys a competitor's products is misguided to say the least."
That is NOT TRUE for me, as you stated it, but IT IS TRUE FOR YOU, isn't it ?
---
You in fact SLANDERED Nvidia, by claiming only ATI drives forward tech, or so it seems to you...
I've merely been pointing out the many statements all about like you just made, and their inherent falsehood!
---
Here next, you pull the ol' switcharoo, and do what you say you won't do, by pointing out you won't do it! roflmao: " doesn't mean I'm going to berate Intel to high heaven, even if their anti-competitive practices have legitimised such criticism.."
Well, you just did berate them, and just claimed it was justified, cinching home the trashing quickly after you claimed you wouldn't, but have utterly failed to point out a single instance, unlike myself - I INCLUDE the issues and instances, pointing them out intimately and often in detail, like now.
LOL you: " I'd certainly not be using my 4830 without ...."
Well, that shows where you are coming from, but you're still WRONG. If either company dies, the other can move on, and there's very little chance that the company will remain stagnant, since then they won't sell anything, and will die, too.
The real truth about ATI, which I HAVE pointed out before, is IT FELL OFF THE MAP A FEW YEARS BACK AND ALTHOUGH PRIOR TO THAT TIME WAS COMPETITIVE AND PERHAPS THE VERY BEST, IT CAVED IN...
After it had its "dark period" of failure and despair, where Nvidia had the lone top spot, and even produced the still useful and amazing GTX8800 ultimate (with no competition of any note in sight, you failed to notice, even to this day - and claim the EXACT OPPOSITE - because you, a dead brained red, bought the "rebrand whine" lock stock and barrel), ATI "re-emerged", and in fact, doesn't really deserve praise for falling off the wagon for a year or two.
See, that's the truth. The big fat red fib, you liars can stop lying about is the "stagnant technology without competition" whine.
ATI had all the competition it could ever ask for, and it EPIC FAILED for how many years ? A couple, let's say, or one if you just can't stand the truth, and NVIDIA, not stagnated whatsoever, FLEW AHEAD AND RELEASED THE MASSIVE GTX8800 ULTIMATE.
So really friend, just stop the lying. That's all I ask. Quit repeating the trashy and easily disproved ati clichés.
Ok ?