We know from plenty of previous testing, as well as the results a few pages back, that this game is seriously CPU-bound with a high-end video card. So six cores wins the day. But taking away two of the 5820K's cores doesn't turn it into a 4790K, apparently, as it still exhibits significantly better minimums than its nearly identical cousin. The most obvious explanation is that either the extra cache or the extra memory bandwidth offered by the 5820K/X99 duo provides a boost even when you don't have its extra cores at your disposal.
Far Cry 4
And finally, we conclude with a total lack of drama. No, disabling cores on the 5820K doesn't magically bring it up to par with the 4790K at the same clock speed, but nor does it hurt performance. This game is clearly held back by something in the X99 platform that extra cores cannot remedy.
Does having more cores perform miracles? Well, not quite. In fact, only a handful of the games we tested see any boost at all, with the biggest coming in Crysis 3. And that speaks to the overall "core debate" that enthusiasts have been engaged in ever since the first dual-core CPU appeared on the market more than a decade ago. Unless games are designed to take advantage of extra cores, those cores do no good at all. Indeed, looking at our six-game average shown below, one is almost forced to conclude that the extra cores act as extra baggage, exacting an overall performance penalty.
There are a few other angles we can take on the data we collected beyond just averages. First, while results varied by game, overall both the Radeon and GeForce cards were negatively affected by being paired with the 5820K. That negates one of our original hypotheses: that AMD driver overhead would allow the Radeon to unleash hidden potential when paired with more cores. Second, both averages and minimums are negatively impacted, meaning whatever is holding back the 5820K-based system is a broad-based effect. And there's one last piece of the puzzle we haven't yet touched upon, namely power use. We found that our 5820K-based system drew 75W at idle, versus just 41W for our 4790K-based system, and 418W under a 3DMark combined load, versus 359W for the 4790K-based system. That alone may be a tie-breaker for many people, especially the idle numbers.
Overall, then, this is what we'd suggest to potential PC builders: unless you're looking to gain additional future-proofing for games coded à la Crysis 3, a fast quad-core is really what you're after. And that's not all. With the release of the faster Core i7-6700K and the DDR4-based Z170 platform, things could get nasty for the Core i7-5820K. That's because the Skylake-based 6700K offers about 10% higher performance than the Haswell design at the same core clock. Also looming on the horizon are DirectX 12-based games, which will be coded much more efficiently to reduce CPU overhead. This could work either in favor of lower core counts, or perhaps in favor of higher core counts if developers use that increased efficiency to load on extra processing effects.
There is a caveat to all of this, however, and that is the dual-GPU scenario. If you're going to go with two high-end cards in CrossFire or SLI, the extra cores offered by the 5820K and its Haswell-E brethren, the Core i7-5930K and Core i7-5960X, along with their greater number of PCIe lanes, will more than even the odds. The more GPU power you throw at a game, the more the CPU will struggle to keep up, so the extra muscle of the X99 platform pays off in dual-card scenarios.
Need more advice? Well, for our take on the best overall system builds, check out our TBG Do-It-Yourself Buyer's Guides, which are updated on a monthly basis.