
Inflation adjusted price history of high end Nvidia GPUs tabulated

by Mark Tyson on 13 March 2017, 10:11

Tags: NVIDIA (NASDAQ:NVDA)


Just ahead of the weekend, HardOCP contributor Zarathustra published a pricing table comparing all the high-end consumer graphics cards released by Nvidia since the year 2000. The table is interesting because it records Nvidia's launch pricing of flagship (non-Titan) graphics cards aimed at gamers alongside the same prices adjusted using the US Bureau of Labor Statistics' published values for the Consumer Price Index (CPI-U). In other words, they are inflation adjusted prices.
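The adjustment itself is just a ratio of CPI-U index values. As a rough sketch (the CPI-U figures below are approximate BLS values, and the $499 GeForce 2 Ultra launch price is our own figure, not taken from the table):

```python
def adjust_for_inflation(nominal_price, cpi_then, cpi_now):
    """Scale a historical price by the ratio of CPI-U index values."""
    return nominal_price * (cpi_now / cpi_then)

# GeForce 2 Ultra: launched in 2000 at roughly $499.
# CPI-U: roughly 172.2 (2000 annual average) vs roughly 243.8 (early 2017).
price_2017 = adjust_for_inflation(499, 172.2, 243.8)
print(f"${price_2017:.0f}")  # roughly $706 in 2017 dollars
```

Which lands close to the GTX 1080 Ti's $699 MSRP, illustrating the 'same price' point made below.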

Putting the table results into perspective, it does seem that Nvidia high-end graphics card launches are often met with cries of 'price gouging' and similar emotive language about Nvidia's pricing. First of all, we must remember that the flagships are premium and/or halo products within a range of graphics cards. Secondly, the historical chart below, again by Zarathustra, plots the inflation adjusted prices as an easy-to-read green wiggly line, which indicates that the newly launched GeForce GTX 1080 Ti is "pretty much in line with where NVIDIA has typically been".

It is noted in the original report that Nvidia's pricing of its flagships has fluctuated significantly over the years, even after the inflation adjustments. Neatly, the year-2000 GeForce 2 Ultra (pictured below) and the brand spanking new GTX 1080 Ti look to be the 'same' price. As anyone with even a faint grasp of economics might have guessed, even without this data, the years in which Nvidia priced its flagship most competitively were the years in which it faced "higher competition in the market". Let's hope for a downswing in the green line caused by fierce competition in 2017-18.



HEXUS Forums :: 14 Comments

They may be consistent in dollar pricing in Nvidia's home market, but the chart does not reflect the fact that when the £/$ exchange rate wasn't as weak as it is now, Nvidia, conspiring with price gouging UK retailers, ripped off the UK consumer.

Taking into account 20% VAT, prices in the UK are basically $1 = £1 at the moment, but there was a time when £1 = $1.50 to $1.60, and even after accounting for VAT there was still a roughly 30% currency difference.

…and I can assure you that prices in the UK were NEVER cheaper by an amount that accurately reflected the higher exchange rate.
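The arithmetic behind that claim can be sketched quickly (the $699 card and the £699 UK sticker are hypothetical figures for illustration, not prices from the article):

```python
def uk_fair_price_gbp(usd_price, usd_per_gbp, vat=0.20):
    """Convert a US (pre-tax) dollar price to pounds, then add UK VAT."""
    return usd_price / usd_per_gbp * (1 + vat)

# Hypothetical $699 card with the pound at $1.60:
fair = uk_fair_price_gbp(699, 1.60)   # roughly £524 including VAT
premium = 699 / fair - 1              # overcharge if the UK sticker was £699
print(f"fair price £{fair:.0f}, implied premium {premium:.0%}")
```

At £1 = $1.60 a straight $-for-£ sticker works out to about a 33% premium over the VAT-inclusive converted price, which is where the "roughly 30%" figure comes from.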
Currency movements are not that relevant, except of course that if their prices end up too high in non-US markets they may not sell (although with Nvidia there seems to be a lot of buyers who think 'higher prices = good').
Far more interesting are the die sizes, cost to make, and Nvidia's margins (and since we don't know how much each product makes, we can only go with their overall margins, even if they include non-gaming stuff).
We know that for a lot of expensive cards the die sizes have gone down, and yes, wafer costs per node have gone up. But another thing with Nvidia is that a lot of their cards are under-engineered (bean-counted) with poor VRMs, PCBs and cheap parts.
But against all that, people keep buying Nvidia's stuff by the ton, and even when AMD have far better value products at a given price, the Nvidia stuff sells better. So unless consumers change their ways, why would Nvidia change?
(Personally I've mostly avoided their stuff since their infamous solder defects back in the 65nm era which affected me and people I know rather a lot.)
Why non-Titan? That just makes the chart incomplete.
DanceswithUnix
Why non-Titan? That just makes the chart incomplete.

Because the Titan isn't purely a gaming GPU? Or at least it wasn't when it was first released, it had a much higher configuration for DP compute. I think that may have changed with the latest iteration though…
DanceswithUnix
Why non-Titan? That just makes the chart incomplete.

To my mind the Titans are semi-pro cards, and therefore there's a reasonable argument for keeping them out. Personally I'd have done it both ways, but I can appreciate the logic.

Point is, whether they are there or not, it clearly demonstrates that when people moan about NVIDIA pushing prices into the stratosphere they're often talking rubbish, and in reality are just failing to appreciate inflation.