GeForce 9800 GTX and 3-way SLI: May the nForce Be With You
by Derek Wilson on April 1, 2008 9:00 AM EST - Posted in GPUs
Crysis, DX10 and Forcing VSYNC Off in the Driver
Why do we keep on coming back to Crysis as a key focal point for our reviews? Honestly, because it's the only game out there that demands the kind of ultra high end horsepower enabled by recently released hardware.
That and we’ve discovered something very interesting this time around.
We noted that some of our earlier DX10 performance numbers on Skulltrail looked better than anything we could get more recently. In general, the higher number is more likely to be "right", and it has been a frustrating journey trying to hunt down the issues that led to our current situation.
Many reinstalls and configuration tweaks later, we've got an answer.
Every time I set up a system, because I want to ensure maximum performance, the first thing I do is force VSYNC off in the driver. I also generally run without having the graphics card scale output for my panel; centered timings allow me to see what resolution is currently running without having to check. But I was in a hurry on Sunday and I must have forgotten to check the driver after I set up an 8800 Ultra SLI for testing Crysis.
Lo and behold, when I looked at the numbers, I saw a huge performance increase. No, it couldn't be that VSYNC was simply not forced off in the driver, could it? After all, Crysis has a setting for VSYNC and it was explicitly disabled; the driver setting shouldn't matter.
But it does.
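For readers wondering exactly what "forcing VSYNC off in the driver" overrides: in a DX10 title, the in-game VSYNC option ultimately determines the sync interval the game passes to DXGI every time it presents a frame, while the driver control panel setting is a separate layer that can override that request. The following is a minimal, hypothetical sketch (our own illustration, not code from Crysis or NVIDIA) of a bare D3D10 present loop, just to show where the application-side setting lives.

```cpp
// Hypothetical, minimal D3D10 present loop -- illustration only, not taken from
// any shipping game. The driver control panel's "Force VSYNC off" is a separate
// override applied on top of whatever sync interval the application requests here.
#include <windows.h>
#include <d3d10.h>
#pragma comment(lib, "d3d10.lib")

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_DESTROY) { PostQuitMessage(0); return 0; }
    return DefWindowProcA(hwnd, msg, wParam, lParam);
}

int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE, LPSTR, int nCmdShow)
{
    // Plain Win32 window to own the swap chain.
    WNDCLASSA wc = {};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInstance;
    wc.lpszClassName = "VsyncDemoWindow";
    RegisterClassA(&wc);
    HWND hwnd = CreateWindowA("VsyncDemoWindow", "VSYNC demo", WS_OVERLAPPEDWINDOW,
                              CW_USEDEFAULT, CW_USEDEFAULT, 1024, 768,
                              NULL, NULL, hInstance, NULL);
    ShowWindow(hwnd, nCmdShow);

    // Basic windowed swap chain description.
    DXGI_SWAP_CHAIN_DESC sd = {};
    sd.BufferCount       = 1;
    sd.BufferDesc.Width  = 1024;
    sd.BufferDesc.Height = 768;
    sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    sd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    sd.OutputWindow      = hwnd;
    sd.SampleDesc.Count  = 1;
    sd.Windowed          = TRUE;

    ID3D10Device*   device    = NULL;
    IDXGISwapChain* swapChain = NULL;
    if (FAILED(D3D10CreateDeviceAndSwapChain(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL,
                                             0, D3D10_SDK_VERSION, &sd,
                                             &swapChain, &device)))
        return 1;

    // The in-game "VSYNC" option ultimately becomes this value:
    // 0 = present immediately (VSYNC off), 1 = wait for vertical blank (VSYNC on).
    const UINT syncInterval = 0;

    MSG msg = {};
    while (msg.message != WM_QUIT)
    {
        if (PeekMessageA(&msg, NULL, 0, 0, PM_REMOVE))
        {
            TranslateMessage(&msg);
            DispatchMessageA(&msg);
        }
        else
        {
            // ... per-frame rendering would go here ...
            // Whatever the application asks for, the driver-level "Force off"
            // setting can still intercept this call -- which is the layer where
            // the slowdown described above appears to come from.
            swapChain->Present(syncInterval, 0);
        }
    }

    if (swapChain) swapChain->Release();
    if (device)    device->Release();
    return 0;
}
```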
Forcing VSYNC off in the driver can decrease performance by 25% under the DX10 applications we tested, and we see a heavier impact in CPU limited situations. Interestingly enough, as we discussed last week, with our high end hardware Crysis and World in Conflict were heavily CPU and system limited. Take a look for yourself at the kind of performance gains we saw simply by not forcing VSYNC off in the driver and relying on the in-game setting instead (a 25% loss in one direction works out to a 33% gain in the other). These tests were run on Crysis using the GeForce 9800 GX2 in Quad SLI.
We would have tried overclocking the 790i system as well if we could have done so and maintained stability.
In looking at these numbers, we can see some of the major issues we had between NVIDIA platforms and Skulltrail diminish. There is still a difference, but 790i does have PCIe 2.0 bandwidth between its cards and it uses DDR3 rather than FB-DIMMs. We won't be able to change those things, but right now my option is to run half the slots with 800 MHz FB-DIMMs or all four slots at 667 MHz. We should be getting a handful of higher speed, lower latency FB-DIMMs in for testing soon, which we believe will help.

Now that we've gotten a better feel for the system, we also plan on trying some bus overclocking to help offset the PCIe 2.0 bandwidth advantage the 790i has. It also seems possible to push our CPUs over 4GHz with air cooling, but we really need a larger PSU to keep the system stable (without any graphics load, a full CPU load can pull about 700W at the wall at 4GHz and 1.5V), especially when you start running a GPU on top of that.
Slower GPUs will benefit less from not forcing VSYNC off in the driver, but whenever the framerate gets near a CPU limit (say, within 20% of it), performance will improve. NVIDIA seems more affected by this than AMD, but we aren't sure at this point whether that's simply because NVIDIA's cards expose more of a CPU limit due to their higher performance.
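To make the "within 20% of a CPU limit" idea concrete, here is a tiny back-of-the-envelope model (all numbers are made up for illustration, not measured): treat delivered framerate as capped by whichever of the CPU or GPU is slower, and treat the forced-off driver path as extra per-frame CPU-side cost.

```cpp
#include <algorithm>
#include <cstdio>

// Toy model: delivered fps is limited by the slower of CPU and GPU.
// "cpu_overhead" stands in for the extra per-frame cost we appear to pick up
// when VSYNC is forced off in the driver (25% here is illustrative, not measured).
double delivered_fps(double cpu_fps, double gpu_fps, double cpu_overhead)
{
    double effective_cpu_fps = cpu_fps / (1.0 + cpu_overhead);
    return std::min(effective_cpu_fps, gpu_fps);
}

int main()
{
    const double cpu_fps  = 60.0;  // hypothetical CPU-side limit
    const double overhead = 0.25;  // hypothetical driver VSYNC-off penalty

    // A fast GPU sitting within 20% of the CPU limit shows a real drop...
    std::printf("fast GPU: %.1f -> %.1f fps\n",
                delivered_fps(cpu_fps, 55.0, 0.0),
                delivered_fps(cpu_fps, 55.0, overhead));

    // ...while a much slower GPU hides it entirely, because the GPU stays the bottleneck.
    std::printf("slow GPU: %.1f -> %.1f fps\n",
                delivered_fps(cpu_fps, 35.0, 0.0),
                delivered_fps(cpu_fps, 35.0, overhead));
    return 0;
}
```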
NVIDIA is aware of the VSYNC problem, as they were able to confirm our findings yesterday.
Since we're the ones reviewing hardware, it's conceivable that this issue doesn't affect most gamers. Many people like VSYNC on, and since most games offer the option it isn't usually necessary or beneficial to force VSYNC off in the driver. So we decided to ask in our video forum just how many people force VSYNC off in the driver, and whether they do so always or only some of the time.
More than half of our 89 respondents (at the time of this writing) never force VSYNC off in the driver, while just over 40% do so at least some of the time; half of those always force VSYNC off (just as we do in our testing).
This is a big deal, especially for readers who want to play Crysis on lower end CPUs. We didn't have time to rerun all of our numbers without forcing VSYNC off in the driver, so keep in mind that the results throughout this review could improve quite a bit without it.
49 Comments
Jangotat - Friday, April 18, 2008 - link
The way they're setting this up is great but they need to fix a few things: 1) use a 790i Asus motherboard, 2) use OCZ 1600 platinum memory, 3) let us see some benchmarks with 3-way 8800 ultra cards. That would be sweet.

Platinum memory has custom timings for Asus, and Asus doesn't have issues like EVGA and XFX do. And we really need to see the 3-way ultra setup to see what's really the best for Crysis and everything else
You guys could do this, right?
LSnK - Wednesday, April 2, 2008 - link
What, are you guys running out of zeros or using some ancient text mode resolution?

Mr Roboto - Thursday, April 3, 2008 - link
Derek, you say that a 25% decrease in performance resulted from disabling VSYNC in Crysis and WIC. However you then say in the next sentence that performance gains can be had by disabling VSYNC? Maybe I'm misunderstanding?

"Forcing VSYNC off in the driver can decrease performance by 25% under the DX10 applications we tested. We see a heavier impact in CPU limited situations. Interestingly enough, as we discussed last week, with our high end hardware, Crysis and World in Conflict were heavily CPU and system limited. Take a look for yourself at the type of performance gains we saw from disabling VSYNC".
Evilllchipmunk89 - Wednesday, April 2, 2008 - link
Seriously, what about the AMD 790FX board? You'll test the NVIDIA cards on their "home platform" (790i), but why not the ATI cards' home platform? Obviously you could get more performance with the 790FX board that was made specifically for the Radeon 3870s, where you can tweak more aspects of the card. In an earlier review you showed us that with nothing changed but the board, the 780i outperformed Skulltrail with the NVIDIA cards, but you won't even mess with the ATI boards
just4U - Tuesday, April 1, 2008 - link
I don't quite understand why they just didn't go with a 512bit interface like on the X2's. That's what I was expecting anyway.

One thing that has me surprised: I was checking my local store on the web for "new arrivals" (a feature where new listings appear daily) and saw the GTX and was thinking hey wait.. Anand hasn't even reviewed this yet and it's in stock???! wow. I immediately came here and there the review was :D So nvidia is trying to stay on top of the hard launch which is nice to see but mmmm.. still troubled by that no 512bit interface. To me it still seems like a GTS/512.
7Enigma - Wednesday, April 2, 2008 - link
And yet the GTS wasn't included in the review...

deeznuts - Tuesday, April 1, 2008 - link
It's actually "lo and behold" and I'm not even sure it's being used right. You propably are, but essentially you're saying, "look, see, I looked, and saw ..."Olaf van der Spek - Tuesday, April 1, 2008 - link
So what is the cause of the vsync issue? I don't see an explanation of that. It'd be interesting to know why performance drops with vsync off.
finbarqs - Tuesday, April 1, 2008 - link
Haha Happy April Fools day!

prophet001 - Tuesday, April 1, 2008 - link
you guys write some nice reviews on this website but the visuals are a little lacking. i guess when i read an RSS feed that talks about 9800 gtx triple SLI then i kinda expect to see at least a picture of a mobo with 3 cards on it and a uranium iv. i know, it's about the results, but more neat pictures would be nice :)