Conclusion

So, now that we have gone through 6 applications and 12 drivers, what have we learned? Not much, if we want to talk about consistency.

In general, there was only one significant performance improvement across all games that we can attribute to the drivers, and that was the move from the Catalyst 3.00 drivers to the 3.04 drivers. Otherwise, for anyone who was expecting multiple across-the-board improvements, this will come as a disappointment.

Breaking down the changes by game, we see an interesting trend in which games saw the greatest performance improvement. Jedi Academy, UT2004, and really every other non-next-generation game saw no significant performance improvements that we can isolate to targeted driver optimizations; they only benefited from the one aforementioned general improvement. With our next-generation benchmarks, Halo and 3dMark, on the other hand, we saw a fairly consistent performance improvement over time in both, unlike with the other games.
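To make that distinction concrete, the sketch below shows one way to tabulate per-game deltas between driver releases and flag whether a gain is across the board or isolated to a single title. All of the frame rates (and the 3% significance threshold) are made-up illustration values, not our benchmark data.

```python
# A minimal sketch (not the article's actual data) of how per-driver benchmark
# results can be compared to separate an across-the-board driver improvement
# from a game-specific optimization. All frame rates below are hypothetical.

fps = {
    # driver release: {game: average fps}
    "Catalyst 3.00": {"Jedi Academy": 90.0, "UT2004": 75.0, "Halo": 38.0},
    "Catalyst 3.04": {"Jedi Academy": 96.0, "UT2004": 80.0, "Halo": 41.0},
    "Catalyst 3.10": {"Jedi Academy": 96.5, "UT2004": 80.2, "Halo": 45.0},
}

def pct_change(old: float, new: float) -> float:
    """Percent change in average frame rate from one driver to the next."""
    return (new - old) / old * 100.0

def classify(drv_a: str, drv_b: str, threshold: float = 3.0) -> None:
    """Print per-game deltas and label the gain as general, game-specific, or neither."""
    deltas = {game: pct_change(fps[drv_a][game], fps[drv_b][game])
              for game in fps[drv_a]}
    for game, delta in deltas.items():
        print(f"{drv_a} -> {drv_b} | {game:13s} {delta:+6.1f}%")
    gainers = [game for game, delta in deltas.items() if delta >= threshold]
    if len(gainers) == len(deltas):
        print("  => every title improved: looks like a general driver improvement")
    elif gainers:
        print(f"  => only {', '.join(gainers)} improved: looks game-specific")
    else:
        print("  => no significant change")

classify("Catalyst 3.00", "Catalyst 3.04")  # flags a general improvement
classify("Catalyst 3.04", "Catalyst 3.10")  # flags a game-specific (Halo) gain
```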

There was also a consistent performance improvement in most of the titles we used that only appeared once we enabled AA/AF, which is a positive sign given just how important AA/AF has become. With the latest cards now capable of running practically everything at high resolutions with AA/AF, it looks like ATI made a good bet in deciding to put some of their time into these kinds of optimizations.

Getting back to the original question then, are drivers all they're cracked up to be? Yes and no. If the 9700 Pro is an accurate indicator, then other cards certainly have the potential to see performance improvements from new drivers, but out of three years of drivers, we only saw one general performance improvement. It seems unreasonable to expect that any given driver will offer a massive performance boost across the board, or even that most titles will be significantly faster in the future. However, if you're going to be playing next-generation games that push the latest features of your hardware to their limits, then it seems likely that you'll see higher performance as time goes on, but again, this will come mostly in small increments, not as a night-and-day difference within a related set of drivers.

As for the future, the Radeon 9700 Pro is by no means a crystal ball into ATI's plans, but it does give us some places to look. We've already seen ATI squeeze out a general performance improvement for OpenGL titles on their new X1000 series, and it seems likely that their memory controller is still flexible enough that there could be one more improvement of that kind. Past that, it seems almost a given that we'll see further performance improvements in the most feature-intensive titles, likely no further game-specific changes for lighter games, and plenty of bug fixes along the way.

Comments

  • timmiser - Wednesday, December 14, 2005 - link

    That is why they are so good. It shows that the early drivers were already well optimized and that there is not much improvement over the months/years from driver release to driver release.

    Nvidia, on the other hand, will have a driver release (typically around the launch of a competing ATI card) that all of a sudden shows a 25% gain or some ungodly number like that. This shows us that either A) Nvidia didn't do a very good job of optimizing their drivers prior to that big speed increase, or B) they held the card back some via the driver so that they could raise the speed upon any threat (new release) by ATI.

    Either way, it reflects poorly on Nvidia.
  • DerekWilson - Monday, December 12, 2005 - link

    lots of people have requested more modern games.

    our goal at the outset was to go back as far as possible with the drivers and select a reasonable set of games to test. most modern games don't run on older drivers, so we didn't consider them.

    for future articles of this nature, we will be including a couple modern games (at the very least, half-life 2 and doom 3). we will handle the driver compatibility issue by starting with the oldest driver that supports the game.

    very new games like FEAR won't be useful because they've only got a driver revision or two under their belt. Battlefield 2 is only about 6 months old and isn't really a suitable candidate either as we can't get a very good look at anything. My opinion is that we need to look at least a year back for our game selection.

    thanks for the feedback. we're listening, and the next article in the series will definitely incorporate some of the suggestions you guys are making.
  • Cygni - Tuesday, December 13, 2005 - link

    I can't believe people missed this point. I thought it was pretty obvious in the text of the article. Throwing teh gaem of teh futar at a videocard running drivers from 1997 is going to have obvious consequences. That doesn't give you any way to measure driver performance increases over time, whatsoever.
  • nserra - Monday, December 12, 2005 - link

    I agree.

    But I think the best candidate would be the R200 (8500) for testing,
    since everyone said it was a good card (hardware) with bad drivers (software).

    So a good retro test would be how the R200 stands up with recent drivers vs. the nvidia geforce 3/4 in the older games.
    The whole idea is to see if the 8500 could keep up with the geforce 3/4 if it had good drivers.

    To summarize:
    2002/2003 games | radeon8500 card | 2002/2003 driver
    2002/2003 games | geforce3/4 card | 2002/2003 driver

    2002/2003 games | radeon8500 card | 2005 driver
    2002/2003 games | geforce3/4 card | 2005 driver

    2004/2005 games | radeon8500 card | 2005 driver
    2004/2005 games | geforce3/4 card | 2005 driver
  • JarredWalton - Monday, December 12, 2005 - link

    The problem is that the R200 is no longer acceptable for even moderate gaming. If you have a 9700 Pro, you can still get reasonable performance on almost any modern game. Yes, you'll need to drop to medium and sometimes even low quality settings, but a 9700 Pro is still three or four times (or more) as fast as the best current IGP.

    I'm not working on these articles, but personally I have little to no interest in cards that are more than 3 generations old. It might be interesting to study from an academic perspective, but for real-world use there's not much point. If enough people disagree with this, though, I'm sure Ryan could write such an article. :)
  • Hardtarget - Monday, December 12, 2005 - link

    Neat article idea, but I would definitely have thrown in one more game, a modern one. Probably Half Life 2: see how it does on the older hardware in general, and see what the driver revisions do for it. Would have been pretty interesting.
  • Avalon - Monday, December 12, 2005 - link

    I think Far Cry, HL2, and Doom 3 ought to be tested. I remember running those games on my 9700 pro. Far Cry and D3 ran just fine at 10x7, and HL2 ran great at 12x9. I'm pretty sure quite a few people were using these cards before upgrading, in these games.
  • WileCoyote - Monday, December 12, 2005 - link

    My conclusion after seeing the numbers: ATI prefers directing money/man-power/time/resources towards synthetic benchmarks rather than improving game performance. I consider THAT cheating.
  • Questar - Monday, December 12, 2005 - link

    Explain the image quality increases then.

    Or do you consider nvidia lowering image quality from generation to generation an improvement?
  • Jedi2155 - Monday, December 12, 2005 - link

    Explain the Halo benchmark then?
