AMD's Radeon HD 5870: Bringing About the Next Generation Of GPUs
by Ryan Smith on September 23, 2009 9:00 AM EST
Batman: Arkham Asylum
Batman: Arkham Asylum is another brand-new PC game, one that has been burning up the review charts. It's built on Unreal Engine 3, something that's not immediately obvious just from looking at it, which is rare for UE3 games.
NVIDIA has put a lot of marketing muscle into the game as part of their The Way It's Meant to Be Played program, and as a result it ships with PhysX and 3D Vision support. Unfortunately NVIDIA's influence extends to anti-aliasing too: the game's in-game selective AA only works on NVIDIA's cards. AMD's cards can still apply AA, but only via traditional full-screen anti-aliasing, which isn't nearly as efficient. As a result, this is the only game where we will not be using AA, since comparing two different AA modes would produce meaningless results.
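As an aside for readers wondering how a game can restrict a rendering feature by vendor: the display adapter's PCI vendor ID is trivially readable through DXGI, so gating a menu option on it takes only a few lines. The sketch below is a hypothetical illustration of such a gate, not Arkham Asylum's actual code; the vendor IDs themselves (0x10DE for NVIDIA, 0x1002 for AMD/ATI) are the real PCI-SIG assignments.

```cpp
// Hypothetical sketch of a vendor-ID gate, for illustration only --
// this is NOT the game's actual code. It reads the primary display
// adapter's PCI vendor ID via DXGI and "unlocks" a feature on NVIDIA.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    if (factory->EnumAdapters(0, &adapter) == S_OK) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        // 0x10DE = NVIDIA, 0x1002 = AMD/ATI (PCI-SIG vendor IDs).
        bool exposeInGameAA = (desc.VendorId == 0x10DE);
        printf("In-game selective AA option: %s\n",
               exposeInGameAA ? "shown" : "hidden");
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

On hardware that fails such a check, the only recourse is exactly what we describe above: forcing traditional full-screen AA instead, at a higher performance cost.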
Without the use of AA, the performance in this game is best described as "runaway". The 5870 turns in a score of 102fps, and even the GTS 250 manages 53fps. We also see the 5870's usual performance pattern maintained here: it beats the single-GPU cards and loses to the multi-GPU cards.
Comments
Ryan Smith - Wednesday, September 23, 2009 - link
We do have Cyberlink's software, but as it uses different code paths, the results are near-useless for a hardware review. Any differences could be the result of hardware differences, or it could be that one of the code paths is better optimized. We would never be able to tell.

Our focus will always be on benchmarking the same software on all hardware products. This is why we bent over backwards to get something that can use DirectCompute, as it's a standard API that removes code paths/optimizations from the equation (in this case we didn't do much better since it was an NVIDIA tech demo, but it's still an improvement).
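For the curious, here is a minimal sketch of what "one code path" means in DirectCompute terms. It is an illustrative stand-in, not AnandTech's actual benchmark harness; the kernel is a trivial placeholder, and buffer/UAV setup and error handling are omitted. The point is that the same HLSL source is compiled once and the resulting bytecode is handed to whatever DX11 driver is present, NVIDIA or AMD alike.

```cpp
// Minimal DirectCompute sketch: one kernel, one code path, any DX11 GPU.
// Illustrative only -- not AnandTech's harness; setup is abbreviated.
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstring>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3dcompiler.lib")

// Trivial stand-in kernel. This exact source (and its compiled bytecode)
// is what every vendor's driver consumes -- there is no NVIDIA path and
// no AMD path at the application level.
static const char* kKernel =
    "RWStructuredBuffer<float> data : register(u0);\n"
    "[numthreads(64, 1, 1)]\n"
    "void main(uint3 id : SV_DispatchThreadID) {\n"
    "    data[id.x] = data[id.x] * 2.0f;\n"
    "}\n";

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    // Creates a device on whatever hardware adapter is present --
    // AMD, NVIDIA, or anything else that speaks DX11.
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, &ctx);

    // Compile the kernel once, to vendor-neutral bytecode.
    ID3DBlob* blob = nullptr;
    D3DCompile(kKernel, strlen(kKernel), nullptr, nullptr, nullptr,
               "main", "cs_5_0", 0, 0, &blob, nullptr);

    ID3D11ComputeShader* cs = nullptr;
    device->CreateComputeShader(blob->GetBufferPointer(),
                                blob->GetBufferSize(), nullptr, &cs);
    ctx->CSSetShader(cs, nullptr, 0);
    // After binding a UAV for `data` (elided), a single call like
    // ctx->Dispatch(groupCount, 1, 1); runs the identical bytecode
    // regardless of whose silicon is underneath.
    return 0;
}
```

Any performance difference measured this way reflects the hardware and the driver's shader compiler, not divergent application-level optimizations.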
DukeN - Wednesday, September 23, 2009 - link
I have one of these and I know it outperforms the GTX 280, but not sure what it'd be like against one of these puppies.

dagamer34 - Wednesday, September 23, 2009 - link
I need my Dolby TrueHD/DTS-HD Master Audio bitstreaming!!! :)

ew915 - Wednesday, September 23, 2009 - link
I don't see this beating the GT300; for that, it would need to beat the GTX 295 by a great margin.

tamalero - Wednesday, September 23, 2009 - link
dood, you forgot the 295 is a DUAL CHIP?

SiliconDoc - Wednesday, September 23, 2009 - link
roflmao - Gee, no more screaming that the 4850X2 and the 4870X2 are best without pointing out the two GPUs needed to get there.
--
Nonetheless, this 5870 is EPIC FAIL, no matter what - as we see the disappointing numbers - we all see them, and it's not good.
---
Problem is, Nvidia has the MIMD multiple-instruction breakthrough technology never used before, which according to reports is an AWESOME advantage, plus they are moving to GDDR5 with a 512-bit bus!
--
So what is in the works is an absolute WHOMPING coming down on ati that BIG GREEN NVIDIA is going to deliver, and the poor numbers here, compared to what was hoped for and hyped (although even PREDICTED by the red fan Derek himself in one portion of one sorrowful and depressed sentence on this site), are just one step closer to that nail in the coffin...
--
Yes, I sure hope ati has something major up its sleeve, like a card with an increased 512-bit memory bus coming, the 5870Xmem ...
I find the speculation that ATI "mispredicted" the bandwidth needs to be utter nonsense. They are 2-3 billion in the hole from the last few years; with "all these great cards" they still lose money on every single sale, so either they cannot go to a higher bit width, or they don't want to, or they are hiding it for the next "strike at NVidia" release.
erple2 - Friday, September 25, 2009 - link
So you're comparing this product with a not-yet-released product and saying that the unreleased product is going to trounce it, without any facts to back it up? Do you have the hardware? If not, then you're simply ranting.

Will the GT300 beat out the 5870? I dunno, probably. If it didn't, that would imply that the move from GT200 to GT300 was a major disappointment for NVidia.
I think that EPIC FAIL is completely ludicrous. I can see "epic fail" applied to the GeForce FX series when it came out. I can also see "epic fail" for the Radeon MAXX back in the day. But I don't see the 5870 as "epic fail". If you look at the card relative to the 4870 (the card it replaces), it's quite good: a solid 30% increase. That's what I would expect from a generational improvement (it's what the GT200s did over the 9800s, and what the 8800 did over the 7900, etc.).
BTW, I'm seeing the 5870 as pretty good: it beats out every single-GPU NVidia card by a reasonable and measurable amount. Sounds like ATI has done well. Or are you considering anything less than 2x the performance of the NVidia cards "epic fail"? In that case, you may be disappointed with the GT300 as well. In fact, I'll say that the GT300 is a total fail right now. I mean jeez! It scores ZERO FPS in every benchmark! That's super-epic fail. And I have the numbers to back that statement up.
Since you are making claims about the epic fail nature of the 5870 based on yet to be released hardware, I can certainly play the same game, and epic fail anything you say based on those speculative musings.
SiliconDoc - Monday, September 28, 2009 - link
Well, the GT200 was a 60.96% average increase. AT says so: http://www.anandtech.com/video/showdoc.aspx?i=3334...
So I guess ati lost this round terribly, as NVidia's last generational jump beat them by more than double your 30%.
Great, EPIC FAIL is correct, I was right, and well...
Finally - Wednesday, September 23, 2009 - link
Team Green foams at the mouth. It's funny to watch.

SiliconDoc - Wednesday, September 23, 2009 - link
Glad you are having fun. Just let me know when you disagree, and why. I'm certain your fun will be "gone then", since reality will finally take hold, and instead of you seeing foam, I'll be seeing drool.