3dMark 2003

3dMark is not a benchmark that we routinely bring you here at AnandTech, as our editorial policy is to focus on benchmarks from real-world games and engines rather than synthetic metrics. That said, it would be inappropriate to leave 3dMark out in this case, given the significant cheating incidents surrounding it. And, as a flashy, system-draining benchmark backed by a unique database for comparisons, it remains an important title in the eyes of many consumers, OEMs, and GPU makers looking for bragging rights.

With 3dMark, what matters in this regression is not so much the performance improvements themselves - those were almost certainly exaggerated, in part by the synthetic nature of the benchmark - but rather what they suggest can happen when ATI dedicates its resources to a game or benchmark that it considers most important. We should note that ATI has admitted to "cheating" on 3dMark 2003; however, these were what we consider honest shader-replacement optimizations (same mathematical output) that ATI voluntarily removed, though they were apparently re-introduced at some point. We used the latest version of 3dMark 2003, so this "cheat" was not activated in the older drivers.

For these benchmarks, 3dMark was run at its default resolution of 1024x768.

3DMark 2003

3DMark 2003 HQ

With 3dMark, we are seeing a very common theme, one that we have observed in most of our other benchmarks that worked with the Catalyst 3.00 drivers: there is a very significant performance improvement between 3.00 and 3.04 when AA/AF are used. Otherwise, 3dMark shows a very slow, very steady performance improvement over the life of the 9700 Pro, both with and without AA/AF.



Catalyst 5.11 versus 3.00 (mouse over to see 3.00)

As far as IQ goes, however, we may as well have just shown you the same screenshot twice - there is no difference between the 3.00 and 5.11 drivers. It's the same story for all of the other screenshots in between. It should be noted, however, that this IQ comparison highlights one of the flaws of 3dMark: its non-interactive nature means that certain cheats can be employed because every camera perspective is known ahead of time. So, while we have no reason to believe that ATI is being dishonest here, we have no way of being completely sure that they aren't using some sort of perspective cheating.

Overall, then, 3dMark is much like Halo: a benchmark that saw slow but steady improvement, without any fixes.

58 Comments


  • timmiser - Wednesday, December 14, 2005 - link

    That is why they are so good. It shows that the early drivers are already well optimized and that there is not much improvement over the months/years from driver release to driver release.

    Nvidia, on the other hand, will have a driver release (typically around the launch of a competing ATI card) that all of a sudden shows a 25% gain or some ungodly number like that. This shows us that either A) Nvidia didn't do a very good job of optimizing their drivers prior to that big speed increase, or B) held the card back some via the driver so that they could raise the speed upon any threat (new release) by ATI.

    Either way, it reflects poorly on Nvidia.
  • DerekWilson - Monday, December 12, 2005 - link

    lots of people have requested more modern games.

    our goal at the outset was to go back as far as possible with the drivers and select a reasonable set of games to test. most modern games don't run on older drivers, so we didn't consider them.

    for future articles of this nature, we will be including a couple modern games (at the very least, half-life 2 and doom 3). we will handle the driver compatibility issue by starting with the oldest driver that supports the game.

    very new games like FEAR won't be useful because they've only got a driver revision or two under their belt. Battlefield 2 is only about 6 months old and isn't really a suitable candidate either as we can't get a very good look at anything. My opinion is that we need to look at least a year back for our game selection.

    thanks for the feedback. we're listening, and the next article in the series will definitely incorporate some of the suggestions you guys are making.
  • Cygni - Tuesday, December 13, 2005 - link

    I can't believe people missed this point. I thought it was pretty obvious in the text of the article. Throwing teh gaem of teh futar at a video card running drivers from 1997 is going to have obvious consequences. That doesn't give you any way to measure driver performance increases over time, whatsoever.
  • nserra - Monday, December 12, 2005 - link

    I agree.

    But I think the best candidate for testing would be the R200 (8500),
    since everyone said it was a good card (hardware) with bad drivers (software).

    So a good retro test would be how the R200 stands up with recent drivers vs. an Nvidia GeForce 3/4 with the older games.
    The whole idea is to see if the 8500 could keep up with the GeForce 3/4 if it had good drivers.

    In summary:
    2002/2003 games | radeon8500 card | 2002/2003 driver
    2002/2003 games | geforce3/4 card | 2002/2003 driver

    2002/2003 games | radeon8500 card | 2005 driver
    2002/2003 games | geforce3/4 card | 2005 driver

    2004/2005 games | radeon8500 card | 2005 driver
    2004/2005 games | geforce3/4 card | 2005 driver
  • JarredWalton - Monday, December 12, 2005 - link

    The problem is that the R200 is no longer acceptable for even moderate gaming. If you have a 9700 Pro, you can still get reasonable performance on almost any modern game. Yes, you'll need to drop to medium and sometimes even low quality settings, but a 9700 Pro is still three or four times (or more) as fast as the best current IGP.

    I'm not working on these articles, but personally I have little to no interest in cards that are more than 3 generations old. It might be interesting to study from an academic perspective, but for real-world use there's not much point. If enough people disagree with this, though, I'm sure Ryan could write such an article. :)
  • Hardtarget - Monday, December 12, 2005 - link

    Neat article idea, but I would definitely have thrown in one more game, a modern one. Probably Half Life 2: see how it does on the older hardware in general, and see what sort of driver revisions do for it. Would have been pretty interesting.
  • Avalon - Monday, December 12, 2005 - link

    I think Far Cry, HL2, and Doom 3 ought to be tested. I remember running those games on my 9700 pro. Far Cry and D3 ran just fine at 10x7, and HL2 ran great at 12x9. I'm pretty sure quite a few people were using these cards before upgrading, in these games.
  • WileCoyote - Monday, December 12, 2005 - link

    My conclusion after seeing the numbers: ATI prefers directing money/man-power/time/resources towards synthetic benchmarks rather than improving game performance. I consider THAT cheating.
  • Questar - Monday, December 12, 2005 - link

    Explain the image quality increases then.

    Or do you consider Nvidia lowering image quality from generation to generation an improvement?
  • Jedi2155 - Monday, December 12, 2005 - link

    Explain the Halo benchmark then?
