ATI Radeon HD 2900 XT: Calling a Spade a Spade
by Derek Wilson on May 14, 2007 12:04 PM EST
Final Words
What a long, strange journey it has been to this point. We have a very delayed launch from AMD featuring a part that consumes quite a bit of power and can't compete with NVIDIA's high end offering. At face value, this sounds a lot like NVIDIA's NV30 launch, but thankfully we wouldn't go so far as to call this NV30 Part 2: the R600 Story.
Even though AMD has not built a high end part, they have built a part that runs very consistently at its performance target (which could not be said about NV30). AMD is also not trying to pass this card off as something it's not: rather than price this card out of its class, the R600 will find a good home at a reasonable price.
Despite the delays, despite the quirks, and despite the lack of performance leadership, AMD has built a good part. It might not be as exciting as an ultra high end card, and it certainly isn't as power efficient as an 8800 GTX or Ultra, but it has quite a few positives that make it an interesting product, and more competition is always a good thing. The worst thing that could happen now is for NVIDIA to get as complacent as ATI did after R300 wiped the floor with the competition.
Let's break it down with something akin to a pro/con list. Here's what AMD did right:
- R600 features a tessellator, which offers an interesting option to geeks and game developers even if it doesn't offer much value to the average consumer.
- We've got full HD video decode acceleration for all the major codecs.
- There is a huge amount of processing power available for code and data that fits the structure of the hardware.
- Audio is integrated into the video stream and sent out over HDMI with a special adapter, allowing DVI and HDMI to coexist without the need to split the audio channel out from elsewhere.
- We like to see more options for antialiasing, and even if we don't necessarily like the tent filters, the edge detect AA is a really cool concept that looks pretty good (the sketch after this list illustrates the idea).
- And we absolutely love the level of architectural detail AMD has gone into with R600.
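For the curious, here is a minimal CPU-side sketch of the idea behind edge detect AA. Everything in it, from the luma threshold to the widened 12 sample footprint, is our own illustrative assumption rather than AMD's actual CFAA filter; the point is simply that the resolve pass can choose its filter width per pixel after looking at how much the subsamples disagree.

```c
/* CPU sketch of the idea behind edge detect AA: resolve with a plain box
 * filter where a pixel's subsamples agree, and widen the filter footprint
 * only where they disagree (a geometry edge crosses the pixel). */
#include <stdio.h>

#define W 4  /* framebuffer width in pixels    */
#define S 4  /* subsamples per pixel (4x MSAA) */

static float luma(const float c[3])
{
    return 0.299f * c[0] + 0.587f * c[1] + 0.114f * c[2];
}

/* samples: per-pixel subsample colors; out: resolved single-sample colors */
static void resolve(float samples[W][S][3], float out[W][3])
{
    for (int x = 0; x < W; ++x) {
        float acc[3] = {0.0f, 0.0f, 0.0f};
        float lmin = 1e9f, lmax = -1e9f;
        for (int s = 0; s < S; ++s) {
            float l = luma(samples[x][s]);
            if (l < lmin) lmin = l;
            if (l > lmax) lmax = l;
            for (int k = 0; k < 3; ++k) acc[k] += samples[x][s][k];
        }
        int n = S; /* plain 4x box resolve by default */
        if (lmax - lmin > 0.1f) {
            /* Edge detected: fold in the neighbors' subsamples too,
             * giving this pixel a wider (12 sample) filter. */
            for (int d = -1; d <= 1; d += 2) {
                int nx = x + d;
                if (nx < 0 || nx >= W) continue;
                for (int s = 0; s < S; ++s)
                    for (int k = 0; k < 3; ++k) acc[k] += samples[nx][s][k];
                n += S;
            }
        }
        for (int k = 0; k < 3; ++k) out[x][k] = acc[k] / n;
    }
}

int main(void)
{
    /* A 4-pixel strip: black on the left, white on the right, with the
     * edge crossing pixel 2 (half of its subsamples are covered). */
    float samples[W][S][3] = {{{0}}};
    for (int s = 0; s < S; ++s)
        for (int k = 0; k < 3; ++k) {
            samples[3][s][k] = 1.0f;
            samples[2][s][k] = (s < 2) ? 1.0f : 0.0f;
        }
    float out[W][3];
    resolve(samples, out);
    for (int x = 0; x < W; ++x)
        printf("pixel %d resolves to %.2f\n", x, out[x][0]);
    return 0;
}
```

Compiled and run, only pixel 2 (the one the edge crosses) pays for the wider filter; interior pixels get the ordinary box resolve.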
And here's what AMD did wrong:
- First, they refuse to call a spade a spade: this part was absolutely delayed, and admitting as much would serve them better than making excuses.
- Forcing MSAA resolve to run on the shader hardware is less than desirable, degrading both pixel throughput and shader horsepower compared to dedicated resolve hardware in the render back ends.
- Failing to follow through with high end hardware will hurt more than just lost margins.
- The thirst for wattage that R600 displays is not what we'd like to see from an architecture that is supposed to be about efficiency.
- Finally, attempting to extract high instruction level parallelism with a VLIW design, when something much simpler could have exploited the huge amount of thread level parallelism inherent in graphics, was not the right move (the toy model after this list illustrates the worst case).
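On that last point, a toy model helps show why relying on instruction level parallelism is risky. When every operation depends on the previous one, a 5-wide VLIW word carries just one useful op, while a scalar-issue design keeps its slots full with work from other threads. The slot counts and the fully dependent chain are our deliberate simplification, not the actual scheduling of R600 or G80.

```c
/* Toy model of the ILP vs. TLP tradeoff: issue-slot utilization for a
 * fully dependent chain of ops on a 5-wide VLIW unit versus a scalar
 * unit that covers the same latency by switching among many threads.
 * Deliberately simplified; not R600's or G80's actual scheduling. */
#include <stdio.h>

int main(void)
{
    const int vliw_width = 5;  /* ops packed per VLIW instruction word  */
    const int chain_len  = 8;  /* ops, each dependent on the previous   */
    const int threads    = 64; /* independent pixels/vertices in flight */

    /* VLIW: ops in one word issue together, so a dependent chain fills
     * only one of the five slots per word; the other four sit idle
     * unless the compiler finds independent work in the same thread. */
    long vliw_total  = (long)threads * chain_len * vliw_width;
    long vliw_useful = (long)threads * chain_len;
    printf("VLIW  : %ld of %ld slots useful (%.0f%%)\n",
           vliw_useful, vliw_total, 100.0 * vliw_useful / vliw_total);

    /* Scalar + TLP: every slot takes one op from whichever thread is
     * ready, so utilization stays full as long as threads remain. */
    long scalar_total = (long)threads * chain_len;
    printf("Scalar: %ld of %ld slots useful (100%%)\n",
           scalar_total, scalar_total);
    return 0;
}
```

In this worst case the VLIW machine shows 20% slot utilization; real shaders sit somewhere in between, which is exactly why the compiler's packing ability matters so much for R600.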
Maybe that's a lot to digest, but the bottom line is that R600 is neither perfect nor a failure. The HD 2900 XT competes well with the 640MB 8800 GTS, though the 320MB 8800 GTS has a price/performance advantage over both in all but the highest resolutions and AA settings under most current games. There are features we like about this hardware that we would love to see exploited. There is potential here, especially for Xbox 360 ports, to really shine... though console ports are often looked down upon in the PC market, particularly when they come late and offer little new to the platform.
Another big question is that we still haven't seen how either G80 or R600 handles DX10 based games. This unknown will persist for just a little while longer, as next month we should start seeing some titles support DX10. The first titles may not be representative of later DX10 titles, however, so this is something we will only be able to assess properly with time.
For now, R600 is a good starting place for AMD's DX10 initiative, and with a bit of evolution to their unified shader hardware it could eventually rise to the top. We aren't as excited about this hardware as we were about G80, and there are some drawbacks to AMD's implementation, but we certainly won't count them out of the fight. Power efficiency on 65nm remains to be seen, and there is currently a huge performance gap NVIDIA has left between the 8600 GTS and the 8800 GTS 320MB. If AMD is able to capitalize here with the HD 2600 series, they will certainly still have a leg to stand on. We will have to wait to see those performance results though.
In the meantime, we are just happy that R600 is finally here after such a long wait. Let's hope for AMD's sake that the next revision of this hardware doesn't take quite so long to surface and competes better with its rival's six month old products. We certainly hope we won't see a repeat of the R600 launch when Barcelona and Agena take on Core 2 Duo/Quad in a few months...
86 Comments
TA152H - Monday, May 14, 2007 - link
Fanboy? What a dork. I've had success with ATI, not with NVIDIA, and I know ATI's stuff a lot better, so it's just easier for me to work with. It's not an irrational like or dislike. I bought one NVIDIA card and it was a nightmare. Plus, I'm not as sure they'll be around for very long as I am about ATI/AMD, although NVIDIA had a good quarter and AMD surely had a dreadful one.
Selling discrete video cards alone might get a lot more difficult with the integration of CPUs and GPUs.
yyrkoon - Monday, May 14, 2007 - link
You are a fanboy, face it. 'I tried an nVidia card once...' How long ago was that? Who made the card? Did you have it configured properly? Once?! Details like this are important, and seemingly/conveniently left out. Anyhow, anyone claiming that nVidia cards are 'junk' has definite issues with assembling/configuring hardware. I say this because my current system uses an nVidia based card and is 100% rock solid. 'Person between the chair and keyboard' rings a bell. Ask any Linux user why they refuse to use ATI cards in their systems... You are also one of these people who claims ATI driver support is superior to nVidia's, I suppose? If you have truly been using ATI products for 20 years, then you know ATI has one of the worst reputations on the planet for driver support (and while it may have improved, it is still not as good as nVidia's).
Yeah, anyhow, ATI and nVidia can both have problems with their hardware; it is not based 100% on their architecture, as the OEMs releasing the products have a lot of effect here too. There are bad OEMs to buy from on both sides of the fence, and knowing who to stay away from is half the work when building a PC. That probably had a lot more to do with your alleged 'bad nVidia card', assuming you actually configured the card properly.
I also had a problem with an nVidia card once: I bought a brand new GF3 card about 7 years ago, and a few of the older games I had would not display properly with it. What did I do? I waited about a month for a new driver, and the problem was solved. I have also had issues with ATI cards, one of which drew too much power from the AGP slot and would cause the given system to crash 1-2 times a day. This was a design oversight on ATI's part (the card was made by Sapphire, who also makes ATI's own cards). What did I do? I replaced the card with an nVidia card, and the system has been stable since.
So you see, I too can skew things to make anyone look bad, and in the end it would only serve to make me look like the dork. But if you want to pay more for less, that is perfectly fine by me.
Pirks - Monday, May 14, 2007 - link
I've got all my problems and crappy drivers (especially Linux ones) only from ATI, while nVidia software was always much better in my experience. Power hungry noisy monsters made by whom? By ATI! As always :) Same shit as with their miserable power guzzling x1800/x1900 series.
Discrete video cards are not going away any time soon. Ever heard of integrated video used in games, besides ones from 2000 like old Quake 2? No? Then please continue your lovefest with ATI, but as for me, it looks like I'll pass on them this time again; since the Radeon 9800 Pro they went downhill and have continued in that direction. They MAY make a decent integrated CPU/GPU budget-oriented vendor in the future, for all those office folks playing simple 2D office games, but real stuff? Nope, ATI is still out of the game for me. Let's see if they manage to come back with a reincarnation of R300 in the future.
Ironically, AMD CPUs on the other hand have the best price/performance ratio, so Intel won't see me as their customer. I wish ATI 3D chips were as good as AMD CPUs in that regard (and overclockers, please shut up; I'm not bothering to OC my rig because I don't enjoy benchmark numbers, I enjoy REAL stuff like games, and Intel is out of the game for me as well, at least until their budget single core Conroes are out).
utube545 - Tuesday, May 22, 2007 - link
Get a clue, you fucking cretin.
dragonsqrrl - Thursday, August 25, 2011 - link
haha... lol, wow. facepalm.
dragonsqrrl - Thursday, August 25, 2011 - link
Damn, you're a fail noob of an ATI fanboy. Time has not been kind to the HD 2900 XT, and now you sound more ridiculous than ever... lol.
yzkbug - Monday, May 14, 2007 - link
Not a word about the new AVIVO HD and digital sound features?
DerekWilson - Wednesday, May 16, 2007 - link
we mentioned this... on the r600 overview page...
photoguy99 - Monday, May 14, 2007 - link
First, to be clear, I do not condone the title of this article; there's no need to bring racism into this.
But my point is that NVidia can and will react by making the 8800 GTS's performance per dollar competitive with the R600.
Once the prices are comparable, why buy a more power hungry part (the ATI)?
This is one disadvantage they can't correct until the next respin.
DrMrLordX - Monday, May 14, 2007 - link
Based on the benchmark results, the only reason I can see for getting a 2900 XT is if a) you don't care about power consumption and b) you want to run a Crossfire rig at a lower cost of entry than dual 8800 GTXs or 8800 Ultras.
As others have said, some more benchmarks in mature DX10 titles might show who the real winner here is performance-wise, and that holds true for multi-GPU scenarios as well.