
Nvidia GeForce RTX 2080 50 per cent faster than GTX 1080

by Tarinder Sandhu on 22 August 2018, 19:01

Tags: NVIDIA (NASDAQ:NVDA)



Nvidia's big announcement at Gamescom this year is a trio of GeForce RTX 20-series graphics cards. A lot has been made of their forward-looking features that will offer far more impressive reflections through raytracing and improved anti-aliasing performance via intelligent AI, amongst other benefits.

We already know that Turing, the architecture underpinning RTX 20-series, will combine traditional rasterisation (SM cores) with raytracing (RT cores) and AI (Tensor cores) for best-in-class image quality, but the question on most enthusiasts' lips is just how fast these cards are going to be in present games.

Nvidia has previously alluded to RTX 20-series being fundamentally better than the incumbent 10-series for pure rasterisation - in other words, for the games you play today - and now we can shed some light by sharing a performance slide from a presentation that backs up this assertion.

Performance slide provided by Nvidia

Comparing the GeForce RTX 2080 against the GeForce GTX 1080 at 4K resolution - a model-to-model comparison more than a price-to-price one - we see the new GPU is up to 50 per cent faster in today's games when benchmarked at the same image-quality settings, and that is without activating the machine-learning-based DLSS anti-aliasing; with DLSS enabled, it is said to be 2x as fast as the GTX 1080 running traditional TAA.

That's a goodly amount. Understanding how it achieves this performance uptick requires an appreciation of the additional efficiencies present in the Turing SM block (think of how the Volta architecture works), plus simply more shaders and memory bandwidth - 2,944 vs. 2,560 shaders and 448GB/s vs. 320GB/s, respectively.
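As a rough sanity check - a minimal Python sketch using only the figures quoted above, and treating the 50 per cent as Nvidia's own best-case claim - the raw specification gains alone fall short of the headline number, which is where the reworked SM has to make up the difference:

# Back-of-the-envelope check of the quoted specs (shader counts and
# memory bandwidth in GB/s from the article; the 50 per cent uplift
# is Nvidia's own claim).
rtx_2080 = {"shaders": 2944, "bandwidth_gbs": 448}
gtx_1080 = {"shaders": 2560, "bandwidth_gbs": 320}

shader_uplift = rtx_2080["shaders"] / gtx_1080["shaders"] - 1
bandwidth_uplift = rtx_2080["bandwidth_gbs"] / gtx_1080["bandwidth_gbs"] - 1

print(f"Shader count:     +{shader_uplift:.0%}")     # +15%
print(f"Memory bandwidth: +{bandwidth_uplift:.0%}")  # +40%
# A claimed up-to-50 per cent gaming uplift outstrips the shader increase,
# so clock speed and per-SM efficiency have to supply the remainder.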

Interestingly, that purported performance increase ought to put it a shade or two above a GeForce GTX 1080 Ti.

There you go. Finally, some hard-and-fast numbers on what makes RTX 2080, and the rest of the RTX line, a better gaming GPU than its predecessor. The numbers are better than a high-level perusal of the spec sheet would suggest.

Of course, it is always prudent to treat manufacturer-provided numbers with caution; they will inevitably present a best-case scenario to illustrate the generational performance uplift, and our own upcoming testing ought to separate the wheat from the chaff.

Is a potential 50 per cent enough of a performance hike in your opinion, especially when one factors in the matching 50 per cent pricing hike? We look forward to hearing your thoughts. Fire away.



HEXUS Forums :: 85 Comments

A few things about the figures. Pascal had some issues with HDR, which means a GTX 1080 can lose 10% to 15% of its performance, unlike a Vega 64, and it looks like a software issue:

https://www.computerbase.de/2018-07/hdr-benchmarks-amd-radeon-nvidia-geforce/2/

Edit!!

Also, Wolfenstein II is DX12/Vulkan, so that indicates async compute now works properly on Nvidia cards. That is the bigger deal about the figures. That means we are more likely to see DX12/Vulkan games on PC now.
There you go. Finally, some hard-and-fast numbers on what makes RTX 2080, and the rest of the RTX line, a better gaming GPU than its predecessor. The numbers are better than a high-level perusal of the spec sheet would suggest.

Don't believe it; I'll wait until we see some actual real-world benchmarks not provided by a marketing team.

Is 50 per cent enough of a performance hike in your opinion, especially when one factors in the matching 50 per cent pricing hike? We look forward to hearing your thoughts. Fire away.

No. Normally you'll get the previous generation's top performance for the same or a lower price point, not a massive price hike. Nvidia is not endearing itself to me, not with the price hike.
Oh, sneaky, Nvidia, sneaky and quite clever too.

Wolfenstein uses FP16 for certain effects, and Pascal lacks decent FP16 performance, which Turing will have!! (There's a short FP16 sketch after this post.)

Edit!!

Having said that, I would hope it's quicker than a GTX 1080 Ti at that price!! :p
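To illustrate the FP16 point above, here is a minimal NumPy sketch (an illustration only, not Wolfenstein's actual rendering code): half precision stores the same data in half the memory, and on GPUs with fast packed FP16, such as Vega and now Turing, it can also double arithmetic throughput for effects that tolerate the reduced precision.

import numpy as np

# Half precision: same number of values, half the memory footprint.
fp32 = np.linspace(0.0, 1.0, 1_000_000, dtype=np.float32)
fp16 = fp32.astype(np.float16)

print(fp32.nbytes, "bytes in FP32")  # 4000000
print(fp16.nbytes, "bytes in FP16")  # 2000000

# The trade-off: FP16 carries only ~3 decimal digits of precision,
# which is fine for colour/lighting work but not for everything.
print(np.abs(fp32 - fp16.astype(np.float32)).max())  # worst-case rounding error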
Iota
No. Normally you'll get the previous generation's top performance for the same or a lower price point, not a massive price hike. Nvidia is not endearing itself to me, not with the price hike.
Indeed, because otherwise a high-end GPU (or any GPU) would cost £10,000s.
There used to be someone who compiled what they called a Voodoo Point rating, where the original Voodoo 2 scored 1 point. I think an HD 7970 GHz was around 400, so if perf/price hadn't changed that would have been $80,000, and we'd be over £100k by now.
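For what it's worth, the arithmetic checks out if you assume a roughly $200 launch price for the Voodoo 2 - that price is my assumption for illustration, not a figure from the post:

voodoo2_price_usd = 200      # assumed 3dfx Voodoo 2 launch price (1 Voodoo Point)
hd7970ghz_points = 400       # rating quoted above
print(hd7970ghz_points * voodoo2_price_usd)  # 80000 - the $80,000 figure quoted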


LOL, full HDR test suite!!