Today, August 20th, 2018, Nvidia announced its hotly-anticipated new line of Turing-based GPUs, the RTX 2000 series. Using an architecture adapted from the professional-market Volta series, Turing devotes up to 18.6 billion transistors to gaming graphics. Turing brings two major additions to the gaming space: real-time ray tracing using dedicated RT cores, and AI-enhanced supersampling, with the neural networks behind it trained on Nvidia’s servers. Nvidia’s newest GPUs still include CUDA cores (organized into streaming multiprocessors, or SMs), and Nvidia is offering about 20% more of them at each tier this time around versus what was released in 2016 and 2017. Also on board is new GDDR6 memory, which operates at 14Gbps, up from around 8-11Gbps in previous GDDR5- and GDDR5X-based models.
The other big news is that Nvidia has finally given up on its blower cooler, which we’ve previously shown to be entirely inadequate to cool its modern GPUs. Even in small form factor cases, open-air coolers almost always perform significantly better, and with modern case design finally catching up with this reality, it no longer made sense to offer blower-style cards. Founders Edition cards will all be dual-fan open-air units, although some board partners will still offer blower cards (Asus having already announced one, its “Turbo” model).
Finally, there's mixed news when it comes to running multiple cards in SLI. As with the Pascal generation, Nvidia has taken the opportunity with Turing to further restrict SLI usage. It has dropped SLI support from yet another model (this time the 2070 loses it; last time it was the 1060), and Nvidia has also gone ahead and invented yet another SLI bridge, this time called NVLink (last time it was the HB SLI bridge). That means if you want to run SLI, you'll need a 2080 or 2080 Ti, plus a new bridge, which will not be included with your video card or your motherboard. Nvidia is offering them up for $80.
Now for the really bad news: pricing. Because Nvidia has zero competition and could therefore delay this release until the premium “Ti” version of its 2000-series was ready (as opposed to staggering the releases), it could endow its GPUs with very high prices. The RTX 2070 8GB is launching at $500, which is $50 more than the effective price of 2016’s GTX 1070, while the RTX 2080 8GB is coming in at $700, the same as 2016’s GTX 1080 8GB. Note that those older products were “launched” at lower prices that didn’t actually come to pass: the GTX 1070 was listed at $380 and the GTX 1080 at $600, but these were fake prices, as the very real $450 and $700 Founders Edition cards ended up setting the price floor for all subsequent versions. The Founders Edition versions of the RTX 2070 and RTX 2080 will be sold for $600 and $800, respectively, and while they come factory overclocked this time around, these prices are still pretty staggering. The biggest disappointment for gamers is the new RTX 2080 Ti 11GB, which will hit the market at $1,000 (and a shocking $1,200 for the factory-overclocked Founders Edition), a price point previously reserved for exclusive Titan offerings. That’s roughly $300 above the GTX 1080 Ti 11GB’s launch price of $699 in March 2017.
We’re a bit concerned that, as with the 1000-series GPUs, Founders Edition pricing will end up setting an unofficial price floor, making the RTX 2070 8GB $600, the RTX 2080 8GB $800, and the RTX 2080 Ti 11GB $1,200. Nvidia is truly pushing the boundaries of what consumers will tolerate just to get better performance every few years. We’re also very concerned that Nvidia will forbid board partners from overclocking their products, just as it did with the 1070 Ti it released late in the Pascal product cycle (update: partner RTX 2080 and RTX 2080 Ti cards are now up for pre-sale, and while they are factory overclocked, the clock speeds aren’t being disclosed). Furthermore, all cards announced so far are priced at or above the FE cards. It looks like the “$500, $700, and $1,000” prices were indeed fake after all, which makes the Founders Edition pricing both better justified and a bitter pill for enthusiasts to swallow.
And what kind of performance can we expect this time around? Well, Nvidia was pretty clear it wasn’t going to provide any hard numbers before launch, but our hunch is that Turing’s CUDA cores aren’t actually faster per clock than Pascal’s (which in turn weren’t faster per clock than Maxwell’s, released in 2014). While Pascal had the advantage of very high clock speeds, Turing doesn’t appear to clock any higher, so instead Nvidia is giving users more CUDA cores at each product tier to compensate (20% more in the 2070, 15% more in the 2080, and 21% more in the 2080 Ti). Like the 1080 Ti, the 2080 Ti will utilize a 352-bit memory bus, providing far more bandwidth than the 256-bit bus in the lower models.
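For the curious, those percentages line up with the announced CUDA core counts for each card; here’s a quick back-of-the-envelope sketch in Python to show the arithmetic:

```python
# Generation-over-generation CUDA core increases, using announced specs:
# Turing core count vs. the Pascal predecessor at the same tier.
core_counts = {
    "RTX 2070 vs GTX 1070": (2304, 1920),
    "RTX 2080 vs GTX 1080": (2944, 2560),
    "RTX 2080 Ti vs GTX 1080 Ti": (4352, 3584),
}

for matchup, (turing, pascal) in core_counts.items():
    gain = (turing / pascal - 1) * 100
    print(f"{matchup}: {gain:.0f}% more CUDA cores")
```

That works out to 20%, 15%, and 21%, matching the figures above.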
Given that the RT cores and AI-enhanced supersampling will require significant game development work to utilize, our hunch is that in existing games, Turing GPUs will offer performance 15-20% better than their respective predecessors, based on the additional CUDA cores alone. So the $500-$600 RTX 2070 will be about 20% faster than today’s ~$400 GTX 1070 8GB, the $700-$800 RTX 2080 about 15% faster than today’s ~$500 GTX 1080, and the $1,000-$1,200 RTX 2080 Ti about 20% faster than today’s ~$650 GTX 1080 Ti. In future games, of course, that advantage could increase, particularly if game developers incorporate ray tracing into their graphics engines. Of the offerings, we’re most surprised by the RTX 2080, which just isn’t enhanced enough to justify its price (it will probably be slower than the ~$650 GTX 1080 Ti 11GB in current games). At least with Pascal, the x80 card had faster memory than the x70 edition, but this time around, it doesn’t. And while the 2080’s rated core clock is higher than the 2070’s, if history is any guide, both cards will actually hit essentially the same speeds in practice. In our opinion, there’s no question the 2070 and 2080 Ti will be the cards to get, if you can afford them!
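To put that value argument in concrete terms, here’s a rough price-versus-performance sketch in Python. All figures are the estimates quoted above (current street prices for Pascal, launch price ranges for Turing, and our guessed performance uplifts), not measured benchmarks:

```python
# Price premium vs. estimated performance gain: Turing launch pricing
# against today's Pascal street prices. All figures are estimates from
# the discussion above, not benchmarks.
matchups = [
    # (new card, (low, high) launch price, old card, street price, est. gain)
    ("RTX 2070",    (500, 600),   "GTX 1070",    400, 0.20),
    ("RTX 2080",    (700, 800),   "GTX 1080",    500, 0.15),
    ("RTX 2080 Ti", (1000, 1200), "GTX 1080 Ti", 650, 0.20),
]

for new, (low, high), old, old_price, gain in matchups:
    print(f"{new}: +{gain:.0%} est. performance for "
          f"+{low / old_price - 1:.0%} to +{high / old_price - 1:.0%} "
          f"more money than the {old}")
```

In every matchup, the price premium outstrips the estimated performance gain, which is the heart of our concern.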
With the 2080 and 2080 Ti launching on September 20th (and the 2070 coming in October), enthusiasts will have plenty of time to pick up closeout deals on Pascal GPUs, which we expect to drop another $100 to $150 from today’s prices (our guess is $300 for the 1070, $400 for the 1080, and $500 for the 1080 Ti). As Turing benchmarks won’t be available until its release date, you’ll have to roll the dice on whether it makes sense to pick up a closeout here, but hey, that’s just part of the game!