Benchmark Results

The first thing you should realize is that these are very preliminary benchmarks, using the brand-new Tomb Raider built-in benchmark, just released as part of the DX12 patch. Because there is no other way to record average frames per second in DX12, it was a wise move on the developer's part to include this new benchmark test. We still have a lot to learn about how best to utilize this tool, and one thing we realized halfway through our testing is that the first run of the benchmark yields lower results than subsequent runs. We didn't have time to reinstall the video cards we had already tested, so we just used the first run for each card, as they all seemed to be affected in the same way. Specifically, we found that minimums were a bit lower in the first runs, typically by 1-2fps. In future tests, we'll run the benchmark several times to get more accurate results.

DirectX 11

DX11

This isn't an AMD vs. Nvidia shootout, but we've color-coded the results in the traditional red and green so you can tell at a glance which manufacturer each card comes from. The key takeaway here is that something seems to be amiss with the way the benchmark runs on AMD cards, as the minimums seem unreliable. Let's move on to our DX12 benchmarks, because that's really what you're here for, right?

DirectX 12

DX12

Yes, DX12 affects performance, and yes, it does so in a negative way. We examined the results from each of the three scenes that make up the benchmark and found them consistent: frame rates dropped nearly across the board. Interestingly, though, the average hit each card takes is almost identical: 1-2fps. That means it's a bigger hit in percentage terms for the slower cards (the R9 290 and GTX 970, for example), but no card suffers significantly, at least with regard to average frames per second. On the other hand, the minimums dropped considerably for the Nvidia cards, while the 390X's shot up. We'll say more on that issue below and on the next page, because the reason behind it may not be what you think.

Note that, as per the developer's release notes, the benefits of DX12 will be felt most in CPU-intensive situations: where DX11 would previously have loaded most of the CPU processing onto one core, under DX12 the load is spread out over many. This helps the "bigger" CPUs the most, such as the hex-core we used, as well as AMD's eight-core FX CPUs and Intel's top-of-the-line 5960X. These are still early days for DX12, and we have no doubt that we'll see more benefits down the road. But also take note: DX12 is meant to benefit CPUs more than GPUs, so if you're looking for major differences between how Nvidia's and AMD's GPU offerings handle DX12, you may well be disappointed...
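To picture what that core-spreading looks like, here's a minimal, purely conceptual C++ sketch using nothing but standard threads; it makes no actual Direct3D calls, and the RecordedList and recordDrawCalls names are our own hypothetical stand-ins. Each worker thread records its own slice of the frame's draw calls, and the main thread then submits everything in one pass, much as a DX12 engine hands several command lists to the GPU queue, whereas a DX11 driver would have serialized most of that work on a single core.

```cpp
// Conceptual sketch only: shows how a DX12-style renderer can spread
// command recording across several CPU cores. Not actual D3D12 API code;
// RecordedList and recordDrawCalls are hypothetical stand-ins.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <string>
#include <thread>
#include <vector>

struct RecordedList {                 // stand-in for a per-thread command list
    std::vector<std::string> commands;
};

// Each worker records its own slice of the frame's draw calls independently.
static void recordDrawCalls(RecordedList& list, int first, int count) {
    for (int i = first; i < first + count; ++i)
        list.commands.push_back("draw object " + std::to_string(i));
}

int main() {
    const int totalDraws = 10000;
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    std::vector<RecordedList> lists(workers);
    std::vector<std::thread> threads;

    // DX12-style: one list per core, all recorded in parallel.
    const int perWorker = totalDraws / static_cast<int>(workers);
    for (unsigned w = 0; w < workers; ++w)
        threads.emplace_back(recordDrawCalls, std::ref(lists[w]),
                             static_cast<int>(w) * perWorker, perWorker);
    for (auto& t : threads)
        t.join();

    // The main thread then submits every recorded list in one go, roughly
    // analogous to handing command lists to the GPU queue for execution.
    size_t submitted = 0;
    for (const auto& list : lists)
        submitted += list.commands.size();
    std::printf("Submitted %zu draw calls recorded on %u threads\n",
                submitted, workers);
    return 0;
}
```

The point of the exercise is simply that the recording work, which dominates CPU time in draw-call-heavy scenes, scales with the number of cores available rather than bottlenecking on one of them.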

But that's not the end of the story for our DX12 testing. We stumbled upon an interesting pattern in our benchmarks, related to VRAM and system RAM usage. Have a look below:

VRAM Usage

VRAM

Perhaps the general trends here aren't too surprising: video cards with more VRAM allocate more VRAM to the game engine. But look more closely: the R9 290 doesn't use all of its VRAM, while all the Nvidia cards are pegged at their maximum, except, interestingly, the GTX 970, which dropped to 3.8GB in the DX12 run. We bet this is Nvidia's driver working hard to reallocate (or more likely evict) data from the slow 0.5GB portion of that card's 4GB of memory. Also look at the R9 390X 8GB: we see a significant increase under DX12, to a whopping 7.1GB. Again, we only had a chance to run these benchmarks once, so additional testing may fine-tune these findings.

And now onto some really surprising findings! 

RAM Usage

RAM

We have for a long time insisted that 8GB is plenty for gaming, as we found in our 2013 DDR3 benchmark analysis and again in our 2016 DDR benchmark analysis. We've never specifically looked, however, at whether VRAM amounts have implications for system RAM requirements. The common wisdom has been that if you have a card with more VRAM, you need more RAM. We've never subscribed to that belief, as we never, ever saw it play out in the real world of our game benchmarking.

But the results we show above truly astounded us: video cards with less VRAM (in this case 4GB) actually forced the game to allocate far more system RAM to the game engine. Our guess is that because Rise of the Tomb Raider was originally coded for the PS4 and Xbox One, it expects 8GB of dynamic system RAM, and when porting it over to the PC, the developers had to find a way to access that much RAM if required. Now, we're not saying this affected performance in any measurable way, but it's possible that the sky-high minimums of the 390X under DX12 were not a fluke: it in fact requires the least system RAM of any card here. And here's an interesting fact we realized after looking at the data a second time: all five systems require 14GB of total memory (VRAM plus system RAM) when using DX12. Coincidence? We think not!

Our system was equipped with 32GB, and we're pretty sure you'd see none of this on an 8GB system, and perhaps a less extreme version of it on a 16GB system. Again, more testing needs to be done to figure out exactly what's going on here, but that will have to wait for another day!

For now, let's take a closer look at the 390X 8GB, which seemed to derive at least some benefit from DX12.
