Fall 2003 Video Card Roundup - Part 2: High End Shootout
by Anand Lal Shimpi & Derek Wilson on October 7, 2003 5:30 PM EST- Posted in
- GPUs
Final Words
If anyone actually made it this far without skipping around, please let me express my sincere appreciation for your dedication. This article has definitely been an entity with a mind of its own, and it continued to grow regardless of how much we hacked at it. There are benchmarks we had to leave out, and there is still so much more I want to do with these cards and games.
The 5950 hasn't been shown to perform much better than the 5900, but it does deliver an acceptable performance increase for a Fall refresh product. So far, we like what we have seen from the 9800XT, and we are anxious to test out ATI's OverDrive feature.
The new 52.14 drivers are much better than either the 51.xx or the 45.xx series. The image quality issues from 51.xx have been corrected, and a lot of speed has been eked out over the 45.xx drivers. We have been very impressed with the speed, image quality, and playability enhancements we have seen. As long as NVIDIA doesn't take a step backwards before the official 50 series drivers are released, we think everyone who owns a GeForce FX card will be very pleased with what they get. NVIDIA should never have pushed the press to benchmark Half-Life 2 with the 51 series, as no one else used it, and in the end the bugs in those drivers did nothing more than tarnish NVIDIA's name. Regaining the credibility they have lost will definitely take NVIDIA some time.
If you made it all the way through the section on TRAOD, you'll remember the miniboss named compilers. The very large performance gains we saw in Halo, Aquamark3, X2, and Tomb Raider can be attributed to the enhancements of NVIDIA's compiler technology in the 52.xx series of drivers. Whether a developer writes code in HLSL or Cg, NVIDIA's goal is to be able to take that code and find the optimum way to achieve the desired result on their hardware. Eliminating the need for developers to spend extra time hand-optimizing code specifically for NVIDIA hardware is in everyone's best interest. If NVIDIA can continue to extract the kinds of performance gains from unoptimized DX9 code that they have achieved with the 52.14 drivers (without sacrificing image quality), they will be well on their way to taking the performance crown back from ATI by the time NV40 and R400 drop. NVIDIA's GPU architecture is a solid one, but it needs to be treated the right way. From our angle, at this point, compiler technology is NVIDIA's wildcard. Depending on what they are able to do with it, things could go either way.
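To give a feel for the kind of rewrite a driver-side shader compiler performs, here is a small illustrative sketch. The instruction format and opcode names are our own invention for the example, not NVIDIA's actual intermediate code; fusing a multiply and a dependent add into a single MAD is a classic optimization of this sort:

```python
# Hypothetical sketch of a peephole pass in a driver shader compiler.
# Instructions are tuples: (opcode, dest, src0, src1[, src2]).
def fuse_mul_add(instructions):
    """Fuse a MUL whose result feeds the very next ADD into one MAD.

    A real compiler would also verify the MUL's destination register is
    not read anywhere later before dropping it; this sketch skips that.
    """
    out = []
    i = 0
    while i < len(instructions):
        cur = instructions[i]
        nxt = instructions[i + 1] if i + 1 < len(instructions) else None
        if (nxt is not None
                and cur[0] == "MUL" and nxt[0] == "ADD"
                and cur[1] in (nxt[2], nxt[3])):
            other = nxt[3] if nxt[2] == cur[1] else nxt[2]
            # MUL r0, a, b ; ADD r1, r0, c  ->  MAD r1, a, b, c
            out.append(("MAD", nxt[1], cur[2], cur[3], other))
            i += 2
        else:
            out.append(cur)
            i += 1
    return out

program = [
    ("MUL", "r0", "a", "b"),
    ("ADD", "r1", "r0", "c"),
    ("MOV", "oC0", "r1"),
]
print(fuse_mul_add(program))  # -> [('MAD', 'r1', 'a', 'b', 'c'), ('MOV', 'oC0', 'r1')]
```

On an architecture like NV3x, where instruction count and register pressure matter a great deal, rewrites of this kind are exactly where "free" performance comes from.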
Right now NVIDIA is at a disadvantage; ATI's hardware is much easier to code for, and code produced by Microsoft's HLSL compiler clearly favors the R3x0 over the NV3x. NVIDIA has a long road ahead to improve their compilers to the point where game developers won't have to hand-code special NV3x codepaths, but for now ATI seems to have won the battle. Next year will be the year of DX9 titles, and it is under that next generation of games that we will finally be able to crown a true DX9 winner. Until then, it's anyone's guess.
ATI is still our recommendation, but NVIDIA's card is not a bad one to have by any stretch of the imagination. We still urge our readers not to buy a card until the game they want to play actually shows up on the street. For those of you who need a card now, we'll be doing a value card roundup as part of this series as well.
Keep in mind that ATI's Catalyst 3.8 drivers are coming out this week, and rest assured that we will be doing a follow up as quickly as possible to fill in the gaps. To say this has been a very interesting month in the graphics world would be a definite understatement. If this hasn't been an overload of information, stay tuned, because there is so much more to come.
117 Comments
Anonymous User - Tuesday, October 21, 2003 - link
Why are they using Flash to do this? I can't see the performance charts (or whatever they are).
Anonymous User - Tuesday, October 14, 2003 - link
What a crap article this was. More games - sure. More data. But no brains to interpret it right, obviously.
Anonymous User - Monday, October 13, 2003 - link
#114: It's more than just a difference in visuals. By removing some of the visuals, the card will run faster. The NVIDIA drivers, for example, do not do trilinear filtering in D3D; they do some fake bilinear. That makes the card look better than it really is.
The whining is about how the reviewers missed all this stuff. People are not getting the true story here.
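To make the filtering dispute concrete, here is a 1D sketch of the difference between full trilinear filtering and the "pseudo-trilinear" (often called brilinear) behavior being alleged. The band width and function shapes are illustrative assumptions, not measurements of any actual driver:

```python
# Full trilinear blends the two nearest mip levels continuously across the
# whole LOD range; pseudo-trilinear only blends inside a narrow band around
# the mip boundary and uses plain bilinear (a single mip level) elsewhere.
def trilinear_weight(lod):
    """Blend factor toward the next-smaller mip level (full trilinear)."""
    return lod - int(lod)  # fractional LOD drives the blend everywhere

def pseudo_trilinear_weight(lod, band=0.125):
    """Only blend within +/- band of the mip switch point; otherwise snap."""
    frac = lod - int(lod)
    if frac < 0.5 - band:
        return 0.0         # pure bilinear from the larger mip
    if frac > 0.5 + band:
        return 1.0         # pure bilinear from the smaller mip
    # Narrow linear ramp across the boundary region.
    return (frac - (0.5 - band)) / (2 * band)

for lod in (3.1, 3.5, 3.9):
    print(lod, round(trilinear_weight(lod), 3), pseudo_trilinear_weight(lod))
```

The two look nearly identical in a still screenshot, because the only place they differ is a thin band near each mip transition; in motion, the snapped regions of the pseudo version can show the MIP banding that trilinear exists to hide.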
Anonymous User - Monday, October 13, 2003 - link
Okay, now I'm new here and all, but DAMN do some of you whine! You act like any visual differences between the NVIDIA cards and the ATI cards (of which I can't see any at all) are astronomically huge! They're not. This is the first and last time I post here; it looks like half of the people here are fanboys!
Anonymous User - Monday, October 13, 2003 - link
I don't understand why the obvious differences in IQ in the Aquamark 3 4xAA/8xAF shots, for example, are totally ignored by the reviewer. Just look at the fuzziness in the plants surrounding the explosion in the NVIDIA shot.
Anonymous User - Sunday, October 12, 2003 - link
Well, I hope it was worth it. You spend all that time on a review and you end up getting caught being in someone's back pocket.
Guess you can't have your cake and eat it too :(
Anonymous User - Saturday, October 11, 2003 - link
Here is part of an addendum to the 3DCenter article directly addressing this comparison: "AnandTech has published an extremely extensive article on the performance and image quality of the current high-end graphics cards, the Radeon 9800XT and GeForceFX 5950 Ultra (NV38). Besides benchmarks of 18 games, the image quality tests made with each of those games are particularly worth mentioning. AnandTech used the Catalyst 3.7 on the ATI side and the Detonator 52.14 on the NVIDIA side to compare image quality. In contrast to the findings of our most recent driver comparison, AnandTech did not notice any general image quality differences between the Detonator 52.14 and 45.23, and therefore praises the new driver somewhat to the skies.
This does not necessarily contradict our findings, however. NVIDIA's "optimization" of the anisotropic filter on texture stages 1 through 7 in Control Panel mode (only a 2x anisotropic filter is used, regardless of whether higher settings were selected) can only be found by searching for it deliberately; besides, most of AnandTech's image quality comparisons were made without the anisotropic filter, so it is impossible to find any differences in those pictures. Likewise, the Detonator 52.14's general forcing of the trilinear filter into a pseudo-trilinear filter is nearly impossible to see in still images of real games, because the trilinear filter exists almost entirely to prevent the MIP banding that is visible in motion.
Thus it can be stated that the known "optimizations" of the Detonator 52.14 will not be recognized in screenshots unless you look for them explicitly (why AnandTech credits the 52.14 driver with finer filter quality than the 51.75 is a mystery to us, since the only difference between them is a correctly working Application mode in the 52.14). So NVIDIA's "optimizations" are not really visible, though there are clear exceptions, such as Tron 2.0 (screenshots will follow). Whether that is a reason to excuse NVIDIA's "optimizations" is surely open to argument."
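The stage clamp 3DCenter describes above is simple to state numerically. This sketch is our own illustration of the described behavior, not actual driver code, and the per-stage rule is an assumption based on their report:

```python
# Illustrative model of the alleged Control Panel mode behavior: the
# requested anisotropy is honored only on texture stage 0, while stages
# 1-7 (detail textures, lightmaps, etc.) are silently clamped to 2x.
def effective_af(stage, requested):
    if stage == 0:
        return requested        # base texture keeps the full setting
    return min(requested, 2)    # all other stages clamped to 2x

# With 8x AF requested, only stage 0 actually gets 8x:
print([effective_af(s, 8) for s in range(4)])  # -> [8, 2, 2, 2]
```

Since the base texture on stage 0 dominates most screenshots, a clamp like this is nearly invisible in stills while still saving a substantial number of texture samples.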
All online computer journalists should strive to inform their readers the way these folks do.
Once again: the 51/52.xx NVIDIA drivers do *not* apply trilinear filtering in D3D when AF is on. The 51.75 at least applies trilinear to the first (0) stage, though not *AT ALL* to any other stage; the 52 series does not apply trilinear filtering to any stage in D3D, regardless.
Bing! Bing! Try again!
May I suggest the filtering tester used by 3DCenter, perhaps a mipmap shading program (as used by everyone in the known universe), and rthdribl to discern *ACTUAL* image quality via high dynamic range light source rendering.
http://www.daionet.gr.jp/~masa/rthdribl/index.html
Anonymous User - Saturday, October 11, 2003 - link
http://www.3dcenter.de/artikel/detonator_52.14/ind...
Take a look at this if you think the 5x.xx drivers have the same image quality as the 45s.
Anonymous User - Saturday, October 11, 2003 - link
The only thing I give Anand credit for is allowing us to write freely about his review. I mean, he did not have to let us reply in an open forum. After reading it, I am not at all surprised at the heat he is taking; I hope he was not either.
The review had potential, but it was squandered.
Today's cards are all fast enough to run DX8 games.
The question is: can they do it with all the goodies turned on?
The main reason to buy an ATI 9600-9800 or a 5900U is to clean up the graphics, but not at the expense of speed. If you don't care about image quality, stick with your GF4 or 8500, as both look horrid next to the new-generation cards.
An old GeForce 4 kicks butt in many games as long as you don't have FSAA turned on.
Most people know that the 5x.xx Detonator drivers do reduce image quality in many areas. This is not a driver bug; it's what NVIDIA chose to do to keep pace. Image quality is much more subjective than FPS, and people are not buying $400 video cards for speed alone.
Anand glossed over/hid the quality issues, the one area where subtle reductions here and there add up to large FPS gains.
People will say, "So what? The XT gets recommended in the end, why bitch?"
Well, it's the principle. The review made the 5900 seem much closer to the XT than it actually is.
When a driver (a beta one at that) improves speed that much, it deserves a much closer inspection than what Anand gave it.
Someone threw Anand a pass, but he dropped the ball :(
Anonymous User - Saturday, October 11, 2003 - link
I didn't care for this review for the following reasons:
Many comments on IQ in part 1, but no follow-up in part 2. There were so many comments that they needed to be addressed, even if only to say that some wrong setting was discovered and fixed.
Small, cropped, compressed images used for IQ comparison. If the image is compressed, how can we judge it? The only way to present IQ comparisons to the reader is to show the exact images the reviewer saw, without compression or cropping.
Apples to apples: all of the benchmarks for all games should have been run in the same format unless certain settings were impossible to achieve on a given card at a given resolution. Changing the metric for TR:AOD was a bad idea. Both parts should also have been done on the same system; for all we know, the ignored IQ issues from part 1 could have been due to the AGP implementation on the first board. We just don't know.
Gunmetal is also a very poor DX9 benchmark, since it relies on VS 2.0 and PS 1.1 only. Since most of the benefits of DX9 (and the controversies, for that matter) revolve around PS 2.0, this benchmark is not a good exemplar of DX9 performance. I also find the fact that Gunmetal was co-developed by NVIDIA something that needs examination; IHVs have no place in developing benchmarks, and should stick to technology demos.
Now, I don't know whether the 52.14 drivers do what the article says they do. I know Digit-Life said they gave up to a 20% improvement in some cases, and some improvement is certainly credible. However, this article as written does not support the conclusion that the 52.14 drivers provide significant performance boosts with no IQ loss. I am not commenting on whether they perform as advertised, only that you cannot draw that conclusion from the article.