Fall 2003 Video Card Roundup - Part 2: High End Shootout
by Anand Lal Shimpi & Derek Wilson on October 7, 2003 5:30 PM EST - Posted in GPUs
Final Words
If anyone actually made it this far without skipping around, please let me express my sincere appreciation for your dedication. This article has definitely been an entity with a mind of its own, and it continued to grow regardless of how much we hacked at it. There are benchmarks we had to leave out, and there is still so much more I want to do with these cards and games.
The 5950 hasn't been shown to perform much better than the 5900, but it delivers an acceptable performance increase for a fall refresh product. So far, we like what we have seen from the 9800XT, and we are anxious to test out ATI's OverDrive feature.
The new 52.14 drivers are much better than either the 51.xx or the 45.xx series. The image quality issues present in the 51.xx drivers have been corrected, and a lot of speed has been eked out over the 45.xx drivers. We have actually been very impressed with the speed, image quality, and playability enhancements we have seen. As long as NVIDIA doesn't take a step backwards before the official 50 series drivers are released, we think everyone who owns a GeForce FX card will be very pleased with what they get. NVIDIA should never have pushed the press to benchmark Half-Life 2 with the 51 series, as no one would actually have played the game on those drivers, and in the end their bugs did nothing more than tarnish NVIDIA's name. Regaining the credibility they have lost will definitely take NVIDIA some time.
If you made it all the way through the section on TRAOD, you'll remember the miniboss named compilers. The very large performance gains we saw in Halo, Aquamark3, X2, and Tomb Raider can be attributed to the enhancements in NVIDIA's compiler technology in the 52.xx series of drivers. Whether a developer writes code in HLSL or Cg, NVIDIA's goal is to take that code and find the optimal way to achieve the desired result on their hardware. Eliminating the need for developers to spend extra time hand-optimizing code specifically for NVIDIA hardware is in everyone's best interest. If NVIDIA can continue to extract the kinds of performance gains from unoptimized DX9 code that they have with the 52.14 drivers (without sacrificing image quality), they will be well on their way to taking the performance crown back from ATI by the time NV40 and R400 drop. NVIDIA's GPU architecture is a solid one; it just needs to be treated the right way. From our angle, at this point, compiler technology is NVIDIA's wildcard. Depending on what they are able to do with it, things could go either way.
Right now NVIDIA is at a disadvantage; ATI's hardware is much easier to code for, and the output of Microsoft's HLSL compiler clearly favors the R3x0 over the NV3x. NVIDIA has a long road ahead of them to improve their compilers to the point where game developers won't have to hand-code special NV3x codepaths, but for now ATI seems to have won the battle. Next year will be the year of DX9 titles, and it is under the next generation of games that we will finally be able to crown a true DX9 winner. Until then, it's anyone's guess.
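To make the compiler discussion a little more concrete, here is a hypothetical sketch of the kind of shader a developer might hand to the driver, written in plain DX9 HLSL. It is not NVIDIA's actual compiler input or output; the shader, its parameter names, and the comments are illustrative assumptions. The point is that a naive translation would burn one temporary register per intermediate value, and NV3x hardware is known to lose speed as the number of live registers grows, so a smart driver-level compiler can fuse the multiply and add into a single MAD instruction and reuse one register:

```hlsl
// Hypothetical DX9 pixel shader (ps_2_0 style) -- illustrative only.
// A literal translation allocates a temporary register for each
// intermediate value below; a good compiler collapses the MUL + ADD
// into one fused multiply-add (MAD) and reuses a single register,
// which matters on register-pressure-sensitive hardware like NV3x.
float4 main(float2 uv     : TEXCOORD0,
            uniform sampler2D baseMap,   // assumed texture sampler
            uniform float4 tint,         // assumed material constant
            uniform float4 bias) : COLOR // assumed material constant
{
    float4 base   = tex2D(baseMap, uv); // texture fetch
    float4 scaled = base * tint;        // naive: MUL into temp r0
    float4 result = scaled + bias;      // naive: ADD into temp r1
    return result;                      // optimized: single MAD, one temp
}
```

The gains NVIDIA reported from the 52.xx drivers come from exactly this class of transformation being applied automatically, instead of developers rewriting shaders by hand for each vendor.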
ATI still gets our recommendation, but an NVIDIA card is not a bad one to have by any stretch of the imagination. We still urge our readers not to buy a card until the game they want to play actually shows up on the street. For those of you who need a card now, we'll be doing a value card roundup as part of this series as well.
Keep in mind that ATI's Catalyst 3.8 drivers are coming out this week, and rest assured that we will be doing a follow-up as quickly as possible to fill in the gaps. To say that this has been a very interesting month in the graphics world would be a definite understatement. If this hasn't been an overload of information, stay tuned, because there is so much more to come.
117 Comments
Anonymous User - Tuesday, October 7, 2003 - link
You need to look at the FSAA each card employs... go back and look again at the screenies, this time looking at all the jaggies on each card... especially in F1, it doesn't even look like nVidia is using FSAA... while on the ATI, it's smooth... I don't think it's a driver comparison, just the fact that ATI's FSAA is far better at doing the job... At least I think that's what he's talking about. Hard to tell any IQ differences when the full size screenies are not working, but poor FSAA kinda jumps out at you (if you're used to smooth FSAA). Also worth noting, nVidia made great jumps in performance in DX9, but nothing that actually used PS2.0 shaders :(
Anonymous User - Tuesday, October 7, 2003 - link
#14 Blurred? Are you not wearing your glasses or something? Nice and sharp for me...
Anonymous User - Tuesday, October 7, 2003 - link
of course you do, you're a fanATIc...
Anonymous User - Tuesday, October 7, 2003 - link
I like the way he discounts Tomb Raider by saying it is just not a good game. That's a matter of opinion. It almost seems like he tries to undermine that game before revealing any benches. And the benches for that game are not done in FPS but in percentage lost with PS2.0.
On first inspection of the graphs, it appears that Nvidia is leading in Tomb Raider. But if you look at the blurred print on the graph, it does say "lower is better." Very clever!
Why no FPS in that game?
Nice information in this review, but it almost seems that he is going out of his way to excuse Nvidia.
I smell a rat in this review.
Anonymous User - Tuesday, October 7, 2003 - link
#3, #7: If you take the screens into Photoshop and observe the result of their 'difference', you'll see that there's a fairly significant difference between the 45's and 3.7's, but almost no difference whatsoever between the 52's and 3.7's. In most of those screenshots it's impossible to do this, since the shots aren't necessarily from the exact same position each time. Try the UT2K3 ones, for example. Also, these are JPEGs, so there'll be a little fuzz due to differences in compression. And if I need to take two screenshots into Photoshop to be able to discern any difference between them, that's really saying a lot. Since we can't refer to a reference software shot, it could be ATI's driver that's off, for all we know.
In any event I'm pleasantly surprised with nvidia. Their IQ has definitely caught up, and their performance is quickly improving. Hopefully the cat3.8's will pull a similar stunt.
Anonymous User - Tuesday, October 7, 2003 - link
No he's just a "fanny"
AgaBooga - Tuesday, October 7, 2003 - link
They must have a reason for choosing those drivers. Anandtech has been around long enough for that :)
The reason is probably that when they started this benchmarking, they did soooo many games, resolutions, AA and AF levels, times the number of different cards, etc. That takes quite some time. Had they waited for the newer ATI drivers, it may have delayed this article's publication by one or even two weeks. Also, they did mention that they will do a followup article with the new drivers, so patience is the key here.
Anonymous User - Tuesday, October 7, 2003 - link
#8 seems like a fanboy himself
dvinnen - Tuesday, October 7, 2003 - link
well #8, Nvidia was able to do it with the wonder driver, I don't see why Ati can't
Anonymous User - Tuesday, October 7, 2003 - link
LOL, the ATI fanboys are already coming out of the woodwork. Listen #3 and #7, it's a fact, there is no IQ difference at all between the 50 Dets and the 3.7 CATs. And if you honestly believe you're going to see much of a difference with the CAT 3.8's....you're just stupid.