A new generation of consumer gaming GPUs is finally arriving, over two years after the launch of the Nvidia Pascal-based GTX 10xx series. Unfortunately for gamers, AMD has not been competitive at the top end, and GPU prices were extremely inflated by the cryptomining craze. So let us take a look at what the new Turing architecture promises to bring and how much the green team is asking for it.
From a technical standpoint, today's announcement is fascinating, and Nvidia seems justified in calling it the greatest leap since CUDA and the GeForce 8xxx series. For a long time, Nvidia had been crippling advanced computation features in GeForce cards, as they did not affect games much but were important for specialists paying over $2000 for Quadro and Tesla cards. However, as Nvidia aims to bring real-time raytracing to gaming, some of those features are now necessary. To speed up raytracing itself, Turing GPUs have dedicated RT cores that are very efficient at calculating ray intersections with objects. Even with that power, producing noise-free raytraced effects in a scene would take too long. To keep rendering time in check, Turing also includes Tensor cores, which are very good at running deep learning algorithms, such as de-noising partially rendered raytraced images to give a pleasing result.

Another example of their use that Nvidia has demonstrated is DLSS, a new anti-aliasing algorithm. It is trained on far more powerful server farms against extremely high resolution targets that cannot be rendered in real time. After such extensive training, DLSS can run on the Tensor cores to avoid aliasing while the game renders at a more reasonable resolution. Based on Nvidia's existing work with deep learning algorithms, I expect it to work well, but training it for each game may be necessary for the best results.
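For a rough idea of the work an RT core offloads, here is a minimal sketch of a standard ray-triangle intersection test, the classic Möller-Trumbore algorithm, in plain Python. Nvidia has not disclosed how its hardware actually performs the test, so treat this purely as an illustration of the math involved:

```python
# A minimal Moller-Trumbore ray/triangle intersection test. This is the
# kind of math RT cores evaluate in fixed-function hardware; illustrative
# only, not Nvidia's implementation.

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-7):
    """Return the distance t along the ray to the hit point, or None on a miss."""
    edge1, edge2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, edge2)
    det = dot(edge1, p)
    if abs(det) < eps:           # ray is parallel to the triangle's plane
        return None
    inv_det = 1.0 / det
    t_vec = sub(origin, v0)
    u = dot(t_vec, p) * inv_det
    if u < 0.0 or u > 1.0:       # hit point lies outside the triangle
        return None
    q = cross(t_vec, edge1)
    v = dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(edge2, q) * inv_det
    return t if t > eps else None

# A ray fired down the z-axis at a triangle lying in the z=0 plane:
print(ray_hits_triangle((0.25, 0.25, 1.0), (0.0, 0.0, -1.0),
                        (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)))
# -> 1.0 (the ray hits the triangle one unit away)
```

A real-time frame fires enormous numbers of such tests against large scenes, which is exactly the workload the fixed-function RT cores are meant to take off the programmable shaders.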
While the performance benefits for raytracing from the new hardware and algorithms are impressive, they are not enough to run fully raytraced games yet. Thus the favoured approach at the moment is hybrid rendering: games that support the new hardware still use rasterization for most of the scene, while raytracing is used to improve indirect lighting, shadows and reflections. Current screen-space effects try to fake these phenomena convincingly based on what is visible on screen; the new raytraced effects are far more realistic and can easily include off-screen objects present in the scene. Nvidia has announced 21 games using these effects, including Shadow of the Tomb Raider, Metro: Exodus and Battlefield V.
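To make the hybrid structure concrete, here is a hedged sketch of what a single frame might look like. Every function below is a hypothetical stub rather than any real engine's API; only the overall shape is the point:

```python
# A hedged sketch of one hybrid-rendered frame. All functions are stubs
# standing in for real engine work. The structure is what matters:
# rasterize first, then ray trace only a few select effects.

def rasterize_gbuffer(scene, camera):
    # Stub: a real renderer would rasterize geometry into screen buffers.
    return {"albedo": ..., "normals": ..., "depth": ...}

def trace_rays(scene, gbuffer, effect):
    # Stub: a real renderer would fire a limited ray budget per pixel.
    return {"effect": effect, "noisy": True}

def denoise(buffer):
    # Stub: on Turing, a Tensor-core deep learning denoiser would run here.
    buffer["noisy"] = False
    return buffer

def composite(gbuffer, *effects):
    # Stub: blend the ray-traced terms over the rasterized base image.
    return "final frame"

def render_frame(scene, camera):
    gbuffer = rasterize_gbuffer(scene, camera)   # bulk of the scene, as today
    shadows     = denoise(trace_rays(scene, gbuffer, "shadows"))
    reflections = denoise(trace_rays(scene, gbuffer, "reflections"))
    gi          = denoise(trace_rays(scene, gbuffer, "indirect lighting"))
    return composite(gbuffer, shadows, reflections, gi)

print(render_frame("scene", "camera"))  # -> "final frame"
```

The key design point is that the expensive rays are confined to a handful of effects layered over an ordinary rasterized base, which is what keeps frame times playable.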
Three RTX cards were announced: the GeForce RTX 2080 Ti, GeForce RTX 2080, and GeForce RTX 2070. All of them use 14 Gbps GDDR6 memory and support the new VirtualLink standard, which provides all power and data to a VR headset via a single cable with a USB-C connector. The reference design has changed considerably from the blower design of the previous generation: Nvidia's own cards now have a dual-fan design with a vapour chamber. There is also a new SLI connector based on NVLink that provides higher bandwidth, but it is only available on the RTX 2080 and RTX 2080 Ti.
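The memory spec alone tells us something concrete: peak bandwidth is simply the per-pin data rate times the bus width. Assuming the announced bus widths (352-bit on the RTX 2080 Ti, 256-bit on the other two), a quick check:

```python
# Sanity check on the GDDR6 numbers: peak bandwidth is the per-pin data
# rate multiplied by the bus width. Bus widths are the announced specs.

def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    # Gbps per pin * number of pins / 8 bits per byte = GB/s
    return data_rate_gbps * bus_width_bits / 8

for name, bus_bits in [("RTX 2080 Ti", 352), ("RTX 2080", 256), ("RTX 2070", 256)]:
    print(f"{name}: {bandwidth_gb_s(14, bus_bits):.0f} GB/s")
# RTX 2080 Ti: 616 GB/s; RTX 2080 and RTX 2070: 448 GB/s.
# For comparison, the GTX 1080 Ti managed 484 GB/s and the GTX 1080 just 320 GB/s.
```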
The RTX 2080 Ti and RTX 2080 are set to launch on September 20th, while the RTX 2070 will follow in October. It is hard to judge the new architecture's performance from the specs alone. Going by FLOPS, the new cards do not seem to be a full tier above their predecessors in games that do not use the new effects (see the quick calculation below); we will have to wait for launch and independent benchmarks to be sure. For the new effects themselves, the result is obvious: old GeForce cards are simply hopeless at raytracing compared to the RTX series.
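For the FLOPS comparison mentioned above, the usual back-of-the-envelope formula counts two floating point operations (one fused multiply-add) per CUDA core per clock. Applying it to the announced specs and their Pascal counterparts, using reference boost clocks from the spec sheets:

```python
# Back-of-the-envelope FP32 throughput: each CUDA core can retire one
# fused multiply-add (2 floating point ops) per clock cycle.
# Reference boost clocks in MHz; Founders Editions clock slightly higher.

def tflops(cuda_cores, boost_mhz):
    return 2 * cuda_cores * boost_mhz * 1e6 / 1e12

cards = {
    "GTX 1070": (1920, 1683), "RTX 2070": (2304, 1620),
    "GTX 1080": (2560, 1733), "RTX 2080": (2944, 1710),
    "GTX 1080 Ti": (3584, 1582), "RTX 2080 Ti": (4352, 1545),
}
for name, (cores, mhz) in cards.items():
    print(f"{name}: {tflops(cores, mhz):.1f} TFLOPS")
# RTX 2080 at ~10.1 TFLOPS vs GTX 1080 at ~8.9 is a step up, but well
# short of the leap Pascal made over Maxwell.
```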
Now, for the part that can only be described as bad: pricing. MSRPs are $499 for the RTX 2070, $699 for the RTX 2080 and $999 for the RTX 2080 Ti. Founders Editions are even more expensive, adding $100 for the two lower parts and $200 for the RTX 2080 Ti. The RTX 2070 is priced where the xx80 cards usually sat in the past, and the RTX 2080 is at Ti pricing. While we don't get a new Titan, the 2080 Ti is at Titan pricing (excluding the Titan V). We will see whether the cards deliver performance worth their price, but such extreme price creep is troubling nevertheless. While Nvidia's technological dominance does not seem to hinder GPU advancement yet, it looks very unhealthy for gamers' wallets.