Gaming Tests: Gears Tactics

Remembering the original Gears of War brings back a number of memories – some good, and some involving online gameplay. The latest iteration of the franchise, Gears Tactics, launched as I was putting this benchmark suite together. It is a high-fidelity turn-based strategy game with an extensive single-player mode, and as with a lot of turn-based games, there is ample opportunity to crank up the visual effects. The developers have put a lot of effort into creating those effects, a number of which appear to be CPU limited.

Gears Tactics has an in-game benchmark: roughly 2.5 minutes of AI gameplay starting from the same position but using a random seed for actions. Much like the racing games, this leads to some variation in the run-to-run data, so for this benchmark we take the geometric mean of the results (a quick sketch of that aggregation follows the list below). One of the game's standout features is its resolution scaling, which goes all the way up to 8K, and so we are testing the following settings:

  • 720p Low, 4K Low, 8K Low, 1080p Ultra
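
As a rough illustration of how the run-to-run variation gets folded into one number, here is a minimal Python sketch; the FPS values are invented, and statistics.geometric_mean (Python 3.8+) does the work:

    from statistics import geometric_mean

    # Hypothetical per-run average FPS from repeated benchmark passes at
    # one setting; the random AI seed makes each pass land slightly apart.
    runs = [87.3, 84.9, 86.1, 85.5]

    print(f"Combined result: {geometric_mean(runs):.1f} FPS")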

For results, the game showcases a mountain of data when the benchmark is finished, such as how much of the benchmark was CPU limited and where; however, none of that is ever exported into a file we can use. It's just a screenshot that we have to read manually.
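
In principle that manual step could be scripted with OCR. The following is only a sketch of the idea, not our actual workflow: it assumes Tesseract is installed along with the pytesseract and Pillow packages, and the screenshot filename and regex pattern are purely illustrative:

    import re

    import pytesseract
    from PIL import Image

    # Run OCR over the end-of-benchmark screenshot (hypothetical filename)
    text = pytesseract.image_to_string(Image.open("gears_tactics_result.png"))

    # Pull out the average FPS, assuming an "Average FPS: 91.7" style label
    match = re.search(r"Average\s+FPS[:\s]+([\d.]+)", text)
    if match:
        print(f"Parsed average FPS: {float(match.group(1)):.1f}")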

If anyone from the Gears Tactics team wants to chat about building a benchmark platform that would help not only me but also every other member of the tech press build benchmark testing platforms to help our readers decide what the best hardware is for their games, please reach out to ian@anandtech.com. Some of the suggestions I want to give would take less than half a day to implement, and having the benchmark used over the next couple of years (or more) is easily free advertising.

As with the other benchmarks, we run the test repeatedly until 10 minutes have passed for each resolution/setting combination. For this benchmark, we manually read the screenshot for each quality/setting/run combination. The benchmark also gives 95th percentiles and frame averages, so we can use both of these data points.
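
To show how both data points feed into the final numbers, here is a short hedged sketch; the per-run values are invented stand-ins for what we transcribe from the screenshots:

    from statistics import geometric_mean

    # Hypothetical (average FPS, 95th-percentile FPS) pairs transcribed by
    # hand from the screenshots of each run at one setting.
    runs_1080p_ultra = [(92.4, 71.2), (90.8, 69.5), (91.7, 70.3)]

    avg = geometric_mean([a for a, _ in runs_1080p_ultra])
    p95 = geometric_mean([p for _, p in runs_1080p_ultra])
    print(f"1080p Ultra: {avg:.1f} avg FPS, {p95:.1f} 95th percentile FPS")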

[Benchmark charts: Average FPS and 95th Percentile results at Low Res Low Qual, Medium Res Low Qual, High Res Low Qual, and Medium Res Max Qual.]

All of our benchmark results can also be found in our benchmark engine, Bench.

Comments

  • realbabilu - Monday, November 2, 2020 - link

That larger cache may need a specially optimized BLAS.
  • Kurosaki - Monday, November 2, 2020 - link

    Did you mean BIAS?
  • ballsystemlord - Tuesday, November 3, 2020 - link

BLAS == Basic Linear Algebra Subprograms.
  • Kamen Rider Blade - Monday, November 2, 2020 - link

    I think there is merit to having Off-Die L4 cache.

Imagine the low latency and high bandwidth you could get by shoving in some stacks of HBM2 or DDR5, whichever is more affordable and makes better use of the bandwidth over whatever link you're providing.
  • nandnandnand - Monday, November 2, 2020 - link

    I'm assuming that Zen 4 will add at least 2-4 GB of L4 cache stacked on the I/O die.
  • ichaya - Monday, November 2, 2020 - link

    Waiting for this to happen... have been since TR1.
  • nandnandnand - Monday, November 2, 2020 - link

    Throw in an RDNA 3 chiplet (in Ryzen 6950X/6900X/whatever) for iGPU and machine learning, and things will get really interesting.
  • ichaya - Monday, November 2, 2020 - link

    Yep.
  • dotjaz - Saturday, November 7, 2020 - link

    That's definitely not happening. You are delusional if you think RDNA3 will appear as iGPU first.

At best we can hope for the next I/O die to integrate full VCN/DCN with a few RDNA2 CUs.
  • dotjaz - Saturday, November 7, 2020 - link

Also doubly delusional if you think RDNA3 is any good for ML. CDNA2 is designed for that.
Adding a powerful iGPU to Ryzen 9 serves literally no purpose. Nobody will be satisfied with that tiny performance. Guaranteed recipe for instant failure.

The only iGPU that would make sense is a mini iGPU in the I/O die for desktop/video decoding, OR an iGPU coupled with a low-end CPU for a complete entry-level gaming SoC, aka an APU. A chiplet design makes almost no sense for an APU as long as GloFo is in play.
