Until September 2014, the Nvidia GeForce GTX 780 Ti 3GB was the fastest video card in town, and at that time it was selling for around $600-$650. Based on Nvidia's "Kepler" microarchitecture, it was a formidable (and power-hungry) beast. Then along came the svelte Maxwell-based Nvidia GeForce GTX 980 4GB, which slotted in at $550, shed about 75W of power draw, and initially posted benchmark results 10-15% faster than the 780 Ti. The extra 1GB of VRAM was nice, and so was the performance per watt, but it wasn't anything groundbreaking. Rumors have been circulating lately, though, that Nvidia has intentionally driven performance of the 780 Ti down over time to make the 980 look better. We took the hot new game The Witcher 3 for a spin on our in-house 780 Ti and 980, along with a Radeon R9 290, to see if there was any truth to the rumor of Kepler's untimely death.

First off, we should note that there are two highly demanding Nvidia Gameworks features in play here: HBAO+ ambient occlusion and Nvidia Hairworks. We tested the game at 1920×1080 at Ultra settings, toggling these features on and off, in part to see how they affected gameplay, but also to allow for apples-to-apples comparisons against our AMD-based Radeon R9 290 video card.

Our test setup was as follows:

CPU: Intel Core i7-4790K
Motherboard: AsRock Z97 Extreme4
Memory: 4x4GB G.Skill TridentX DDR3-2400
OS: Windows 8.1

The three video cards we tested were as follows, with each one run at both reference and as-shipped clocks:

EVGA GeForce GTX 980 FTW 4GB
MSI GeForce GTX 780 Ti Gaming 3GB (discontinued)
Sapphire Radeon R9 290 Tri-X 4GB

We tested one of the introductory sequences in The Witcher 3: Wild Hunt, in which the protagonist Geralt passes through a decimated village. We ran the game with its required release-day patch, along with Nvidia's Game Ready 352.86 driver, which is optimized for The Witcher 3. AMD's latest beta driver was used for the R9 290, although it predates the release of the game.

To start, we tested with all settings at maximum, meaning Ultra details with Nvidia's Hairworks and HBAO+ enabled:

[Benchmark chart: 1920×1080, Ultra settings, Hairworks and HBAO+ enabled]

Performance really isn't that great on the Kepler-based 780 Ti. At reference clocks, the 980 is 48% faster, although the 780 Ti is 22% faster than the R9 290. That's roughly what we'd expect, though the 780 Ti lands a bit lower than it should. But then we tested with Hairworks and HBAO+ off, and things got very strange, very fast…

[Benchmark chart: 1920×1080, Ultra settings, Hairworks and HBAO+ disabled]

Wow, something is not right here. It seems that Nvidia Gameworks was actually masking performance issues lurking in the background. With the Nvidia features disabled, the Kepler-based 780 Ti underperforms the lower-end R9 290, and the 980 is now 52% faster than its older cousin. We also tested the 780 Ti with HBAO+ on and Hairworks off, and the hit was only 1 fps versus having HBAO+ off, so the 33% drop in performance on that card with the Gameworks features turned on can be attributed almost entirely to Hairworks (which looks pretty good, by the way). The R9 290 is 77% faster without Gameworks, showing that yes indeed, Nvidia-specific features can hammer AMD cards. Note that the impact of Gameworks is actually greater on the 980 than on the 780 Ti, proving without a doubt that Gameworks is not what's killing the 780 Ti.
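For anyone who wants to sanity-check these numbers, the percentages throughout this article are simple ratios of average frame rates: "X% faster" is (faster ÷ slower − 1) × 100, and the Gameworks "cost" on a card is its drop relative to its own Gameworks-off result. The short Python sketch below shows that math; the frame rates in it are placeholder values for illustration only, not our measured results.

# Sketch of how the relative-performance figures above are derived.
# The FPS values here are hypothetical placeholders, NOT our measured results;
# plug in the numbers from the charts to reproduce the article's percentages.
fps = {
    "GTX 980":    {"gw_on": 40.0, "gw_off": 60.0},
    "GTX 780 Ti": {"gw_on": 27.0, "gw_off": 40.0},
    "R9 290":     {"gw_on": 22.0, "gw_off": 39.0},
}

def percent_faster(a, b):
    # How much faster a is than b: 60 fps vs 40 fps -> 50% faster.
    return (a / b - 1.0) * 100.0

def gameworks_cost(card):
    # Performance lost by enabling Hairworks + HBAO+, relative to having them off.
    return (1.0 - card["gw_on"] / card["gw_off"]) * 100.0

for name, card in fps.items():
    print(f"{name}: Gameworks costs {gameworks_cost(card):.0f}% of performance")

print("980 vs 780 Ti, Gameworks off: "
      f"{percent_faster(fps['GTX 980']['gw_off'], fps['GTX 780 Ti']['gw_off']):.0f}% faster")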

What can we conclude from this? Something in the game engine is definitely taking a serious toll on the 780 Ti, but it's not Gameworks. Given that the game is unplayable with Hairworks enabled on either the R9 290 or the 780 Ti, we think owners of both cards will play with it off, and of the two, R9 290 (and 290X) owners will have the better experience.

We certainly haven't figured out what is causing the discrepancy between Kepler and Maxwell performance, but it's something we'll keep an eye on in upcoming games. And despite what the conspiracy theorists out there might think, it's not Gameworks. But there is something unexpected going on here…

Finally, we also dish on AMD in today's post on HBM and the upcoming 390X. Read on for our thoughts on the big announcement.