Fall 2003 Video Card Roundup - Part 2: High End Shootout
by Anand Lal Shimpi & Derek Wilson on October 7, 2003 5:30 PM EST - Posted in GPUs
Final Words
If anyone actually made it this far without skipping around, please let me express my sincere appreciation for your dedication. This article has definitely been an entity with a mind of its own, and it continued to grow regardless of how much we hacked at it. There are benchmarks we had to leave out, and there is still so much more I want to do with these cards and games.
The 5950 hasn't been shown to perform much better than the 5900, but it delivers an acceptable performance increase for a Fall refresh product. So far, we like what we have seen from the 9800XT, and we are anxious to test out ATI's OverDrive feature.
The new 52.14 drivers are much better than either the 51.xx or the 45.xx series. The image quality issues from the 51.xx drivers are corrected, and a lot of speed has been eked out over the 45.xx drivers. We have actually been very impressed with the speed, image quality, and playability enhancements we have seen. As long as NVIDIA doesn't take a step backwards before the official 50 series drivers are released, we think everyone who owns a GeForce FX card will be very pleased with what they get. NVIDIA should never have pushed the press to benchmark with the 51.xx series; no one ended up using it for Half-Life 2, and in the end the bugs in those drivers did nothing more than tarnish NVIDIA's name. Regaining the credibility they have lost will definitely take NVIDIA some time.
If you made it all the way through the section on TRAOD, you'll remember the miniboss named compilers. The very large performance gains we saw in Halo, Aquamark3, X2, and Tomb Raider can be attributed to the enhancements of NVIDIA's compiler technology in the 52.xx series of drivers. Whether a developer writes code in HLSL or Cg, NVIDIA's goal is to be able to take that code and find the optimum way to achieve the desired result on their hardware. Eliminating the need for developers to spend extra time hand-optimizing code specifically for NVIDIA hardware is in everyone's best interest. If NVIDIA can continue to extract the kinds of performance gains from unoptimized DX9 code that they have with the 52.14 drivers (without sacrificing image quality), they will be well on their way to taking the performance crown back from ATI by the time NV40 and R400 drop. NVIDIA's GPU architecture is a solid one, but it just needs to be treated the right way. From our angle, at this point, compiler technology is NVIDIA's wildcard. Depending on what they are able to do with it, things could go either way.
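To give a rough feel for the kind of transformation a driver-side shader compiler performs, here is a toy instruction scheduler. It is a heavily simplified illustration, not NVIDIA's actual algorithm: it reorders independent instructions so that no instruction immediately consumes the result of the one before it, the sort of latency-hiding reordering that matters on a deeply pipelined architecture like NV3x. All names and the instruction format are hypothetical.

```python
# Toy "shader compiler" scheduling pass (illustrative only, not NVIDIA's
# actual compiler). Assumes SSA form: every destination register is
# written exactly once, so only read-after-write dependencies matter.

def schedule(instrs):
    """instrs: list of (dest, srcs) tuples in program order.
    Returns a dependency-preserving reordering that avoids
    back-to-back producer/consumer pairs where possible."""
    remaining = list(instrs)
    out = []
    while remaining:
        pending = {dest for dest, _ in remaining}
        # an instruction is ready once all of its sources have been emitted
        ready = [ins for ins in remaining
                 if not any(src in pending for src in ins[1])]
        prev_dest = out[-1][0] if out else None
        # prefer a ready instruction that does not read the previous result
        pick = next((ins for ins in ready if prev_dest not in ins[1]),
                    ready[0])
        remaining.remove(pick)
        out.append(pick)
    return out

# A dependent chain interleaved with an independent one:
prog = [
    ("r0", ("a", "b")),   # r0 = a * b
    ("r1", ("r0", "c")),  # r1 = r0 + c   (would stall right after r0)
    ("r2", ("d", "e")),   # r2 = d * e    (independent work)
    ("r3", ("r2", "f")),  # r3 = r2 + f
]
print([dest for dest, _ in schedule(prog)])  # ['r0', 'r2', 'r1', 'r3']
```

In the scheduled order, the independent multiply `r2` slots in between producing `r0` and consuming it, so no instruction waits on the result of its immediate predecessor. Doing this automatically in the driver is precisely what spares developers from hand-arranging their shader code per architecture.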
Right now NVIDIA is at a disadvantage; ATI's hardware is much easier to code for, and the performance of code generated by Microsoft's HLSL compiler clearly favors the R3x0 over the NV3x. NVIDIA has a long road ahead of them to improve their compilers to the point where game developers won't have to hand-code special NV3x codepaths, but for now ATI seems to have won the battle. Next year will be the year of DX9 titles, and it will be under the next generation of games that we will finally be able to crown a true DX9 winner. Until then, anyone's guess is fair game.
ATI is still our recommendation, but an NVIDIA card is not a bad one to have by any stretch of the imagination. We still urge our readers not to buy a card until the game they want to play shows up on the street. For those of you who need a card now, we'll be doing a value card roundup as part of this series as well.
Keep in mind that ATI's Catalyst 3.8 drivers are coming out this week, and rest assured that we will be doing a follow-up as quickly as possible to fill in the gaps. To say this has been a very interesting month in the graphics world would be a definite understatement. If this hasn't been an overload of information, stay tuned, because there is so much more to come.
117 Comments
Anonymous User - Wednesday, October 8, 2003 - link
Didn't anyone notice that ATI doesn't do dynamic glows in Jedi Academy with the 3.7 Cats!? Look at the lightsabre and it's clearly visible. They only work with the 3.6 Cats, and then they REALLY kill performance (it's barely playable in 800x600 here on my Radeon 9700 PRO).

Anonymous User - Wednesday, October 8, 2003 - link
Funny to see that ATI fanboys can't believe that NVIDIA can bring out drivers without cheats. And nobody talks about the issues in TRAOD with ATI cards, really very nice...

Anonymous User - Wednesday, October 8, 2003 - link
WTH did you benchmark one card with unreleased drivers (something you said you would never, ever do in the past) and use micro-sized pictures for IQ comparisons? You might as well have used 256 colors.
The Catalyst 3.8s came out today - the 51.75 drivers will not be available for an indeterminate amount of time. Yet you bench with the Cat 3.7s and use a set of unreleased and unavailable drivers for the competition.
I suggest you peruse this article:
http://www.3dcenter.org/artikel/detonator_52.14/
from 3DCenter (German) to learn just how one goes about determining how IQ differs at different settings with the NVIDIA 45s, 51s, and 52s.
Needless to say, everyone else who has compared full-sized frames in a variety of games and applications has found that the 5x.xx NVIDIA drivers (all of them) do selective rendering, especially of lighting.
And why claim the lack of shiny water in NWN is ATi's fault?
Bioware programmed the game using an nvidia exclusive instruction and did not bother to program for the generic case until enough ATI and other brand users complained.
This is the developer's fault, not a problem with the hardware or drivers.
Anonymous User - Wednesday, October 8, 2003 - link
Nice article. I like that you benched so many games. Unfortunately, you missed that the Det 52.14 driver does no real trilinear filtering in *any* DirectX game, regardless of whether you're using anisotropic filtering or not. This often can't be seen in screenshots, only in motion. Please have a look here:
http://www.3dcenter.de/artikel/detonator_52.14/
There is *NO* way for a GeForceFX user to enable full trilinear filtering when using Det52.14. No wonder the performance increased...
Anonymous User - Wednesday, October 8, 2003 - link
TR: AOD is a fine game, you just have to play it... Sure, there are some graphical issues on the later levels, but there's nothing wrong with the game as such, and considering that it has made its way into a lot of bundles (Sapphire and Creative Audigy 2 ZS to name two), I believe it will receive a fair share of gameplay.
Anonymous User - Wednesday, October 8, 2003 - link
You guys need to stop talking about Gabe Newell... for such a supposedly good programmer, he sure needs to learn about network security... We all know he's got his head up ATI's rear end. The funny part is that they are bundling HL2 with the 9800XT (a coupon) when it isn't coming out until April now. Who's to say who will have the better hardware then? Doom 3 will likely be out by then. In 4 months, when the new cards are out, you guys won't care who makes the better card; the 12-year-old fanboys will be up in arms in support of their company. I owned the 5900U and sold it on eBay after seeing the HL2 numbers. I then bought a 9800 Pro from Newegg, having first tried ordering the 9800XT from ATI, which said it was in stock. 2 days later they told me my order was on backorder and hassled me when I wanted to cancel. One thing I'd point out is that War3 looks much better on the 5900U than the 9800. It looks really dull on the 9800, where it's bright and cartoony (like it should be) on the GeForce. Either way, who knows what the future will hold for both companies, but let's hope they both succeed to keep our prices low...

Anonymous User - Wednesday, October 8, 2003 - link
i took over #41's original post... i didnt like his tone :|

Anonymous User - Wednesday, October 8, 2003 - link
The IQ part was crappy at best: small screenshots in open, not-so-detailed areas, and sometimes there was no option for a big one to check. You can call me what you want, but there are quite a few reviews out there that will disagree BIG time with what has been posted about IQ here. And it is impossible that all of them are wrong on this at the same time.
HomeWorld has shadow issues on ATI cards with Cat 3.7, yet that ain't shown there anyway... this goes both ways.
If you ask me, NVIDIA got its DX9 wrapper to work fine this time.
Anonymous User - Wednesday, October 8, 2003 - link
Um, what happened to post #41 where the guy detailed all the inconsistencies of the IQ comparisons? Please don't tell me you guys actually modded that post... I haven't had the chance to go through everything yet, but in those few I did, I definitely saw differences even in these minuscule caps (how about putting up some full-size links next time, guys?), particularly in the AA+AF ones. It's obvious there's still quite a difference in their implementations.
I was also surprised at the number of shots that weren't even of the same frame. Honestly, how can you do an IQ test if you aren't even going to use the same frames? A split-second difference is enough to change the output because of the data/buffer/angle differences, etc.
Personally, I wonder what happened to the old-school 400% zoom IQ tests that Anand was promising, and I'm fairly disappointed despite the number of games in this article.
That said, I am glad that Nvidia didn't botch up everything entirely and hopefully they'll have learned their lesson for NV4x.
Anonymous User - Wednesday, October 8, 2003 - link
Where can I get the 52.14 drivers?