
NVIDIA GeForce GTX 480: the way it was meant to be played?

by Tarinder Sandhu on 29 March 2010, 14:55

Tags: GeForce GTX 480, NVIDIA (NASDAQ:NVDA)

Quick Link: HEXUS.net/qaxor


NVIDIA finally unleashed its Fermi architecture with the introduction of its GeForce GTX 480 and GTX 470 graphics cards late last week, but despite claiming victory with the fastest GPU in history, the graphics giant has been met with a mixed reception from both the media and its enthusiast fan base.

The range-topping GeForce GTX 480 is a beast of a graphics card, but it isn't without its well publicised drawbacks - including an average pre-order price of around £466, a potential maximum system-wide power draw of around 450 watts, and a sizzling operating temperature of around 93°C under load.

Despite the massive amount of performance on offer, the card is undoubtedly hot and expensive to run, and it offers only a marginal improvement over competing cards in current-generation gaming titles.

It's important to remember, however, that cutting-edge GPUs have a history of pushing thermal design to its limits. In the past, we've been able to forgive hot-and-pricey GPUs for the sake of jaw-dropping visuals in the latest triple-A titles. I remember picking up a Radeon 9800 XT in the winter of 2003, only to be plagued by overheating issues throughout the following summer. Back then, it was all worth it just to play Half-Life 2 with the eye-candy turned up to max.

And herein lies Fermi's biggest challenge - the architecture is second to none, as is performance, but triple-A PC exclusives are nowhere to be seen. NVIDIA's best technologies, it seems, are largely being ignored.

Developers, developers, developers

A GPU's greatness will ultimately be measured by the games it runs. For the current generation, the ability to achieve that greatness lies largely with DirectX 11.

The latest version of Microsoft's graphics API brings numerous improvements to the table, and promises better-than-ever visuals through technologies such as hardware tessellation and real-time physics.
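For the technically curious, checking for that capability is straightforward. Below is a minimal C++ sketch against the public Direct3D 11 API - assuming the DirectX SDK headers are available, and with HasDirectX11Hardware being our own illustrative helper rather than code from Microsoft or NVIDIA - that asks Windows for a hardware device and reports whether it reached feature level 11_0, the level the new tessellation and compute stages require:

    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    // Illustrative helper: create a Direct3D 11 device and report whether
    // the GPU reaches feature level 11_0, which the new hull/domain
    // (tessellation) and compute shader stages require.
    bool HasDirectX11Hardware()
    {
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0,  // full DX11: tessellation, compute
            D3D_FEATURE_LEVEL_10_0   // DX10-class fallback, no tessellation
        };

        ID3D11Device*        device  = nullptr;
        ID3D11DeviceContext* context = nullptr;
        D3D_FEATURE_LEVEL    reached = D3D_FEATURE_LEVEL_10_0;

        HRESULT hr = D3D11CreateDevice(
            nullptr,                    // default adapter
            D3D_DRIVER_TYPE_HARDWARE,   // real GPU, not a software rasteriser
            nullptr, 0,
            requested, 2,
            D3D11_SDK_VERSION,
            &device, &reached, &context);

        const bool dx11 = SUCCEEDED(hr) && reached == D3D_FEATURE_LEVEL_11_0;
        if (context) context->Release();
        if (device)  device->Release();
        return dx11;
    }

A Radeon HD 5000-series or GeForce GTX 400-series card will report 11_0 here; the DirectX 9-class GPUs discussed below never will, and that is precisely the gap developers have to code around.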

[Image: rollover comparison showing the extra detail tessellation can add]

Yet, DirectX 11 - first demonstrated in 2008 and brought to retail via ATI Radeon HD 5000-series GPUs some six months ago - has thus far struggled to live up to expectations, with supporting titles remaining few and far between.

To NVIDIA's credit, the GeForce GTX 480's scalable geometry pipelines offer the best hardware tessellation performance we've ever seen, and the GPU's compute architecture provides a number of other promising goodies, too. PhysX springs first to mind, and when implemented well, the on-screen results can be breathtaking.
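To illustrate why tessellation is a hardware feature rather than a driver tweak, here is a hedged C++ sketch of how a Direct3D 11 renderer engages the two new pipeline stages - myHullShader and myDomainShader are hypothetical shader objects, compiled elsewhere:

    #include <d3d11.h>

    // Sketch: switching on Direct3D 11's tessellation stages. The hull
    // shader decides how finely each patch should be subdivided, the
    // fixed-function tessellator generates the new vertices, and the
    // domain shader positions them - for example, displacing them against
    // a height map to add the surface detail shown in the image above.
    void EnableTessellation(ID3D11DeviceContext* ctx,
                            ID3D11HullShader*    myHullShader,
                            ID3D11DomainShader*  myDomainShader)
    {
        // Geometry must now be submitted as control-point patches
        // rather than plain triangles.
        ctx->IASetPrimitiveTopology(
            D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);

        ctx->HSSetShader(myHullShader, nullptr, 0);    // new in DX11
        ctx->DSSetShader(myDomainShader, nullptr, 0);  // new in DX11
    }

None of these stages exist in DirectX 9, which is why a console-era engine cannot simply switch tessellation on - and why Fermi's strength here goes unused without dedicated DirectX 11 code paths.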

Unfortunately for NVIDIA, adoption of its technologies is far from strong, and there's a reason why developers are reluctant to set their sights on newer and improved development platforms such as DirectX 11.

Games consoles: the real thorn in Fermi's side

To the dismay of many a PC gamer, the vast majority of developers have in recent years become devoted to creating games primarily for consoles - particularly Microsoft's DirectX-driven Xbox 360.

One could argue that, in the hope of maximising profits, developers have opted to shun the PC marketplace - which has admittedly been plagued by piracy for many years - in favour of the high margins available via console releases. Driving that point home, last year's Call of Duty: Modern Warfare 2 became the biggest entertainment launch in history, yet first-day PC sales were reported to account for less than five per cent of the overall $310 million haul.

Why is a developer's focus on games consoles a concern for PC users? Well, both the Xbox 360 and PlayStation 3 utilise older GPU technology. Microsoft's console makes use of an ATI Xenos GPU that resembles 2006's ATI Radeon X1900 PC graphics card, whilst the PlayStation 3 features an RSX Reality Synthesiser GPU based on NVIDIA's GeForce 7800 architecture. Importantly, both chips offer only DirectX 9.0c-class functionality, and as a result developers continue to create games using an API and shader model first introduced in 2004.

Unfortunately, that means many of the games being played on your £400 DirectX 11 GPU were actually created from the ground up for DirectX 9. Any additional DirectX 11 features are likely to have been added at a later date, and improved visuals through technologies such as tessellation or NVIDIA PhysX are therefore likely to be little more than an afterthought. This increasingly common practice has produced a vast number of PC games now dismissed simply as "console ports".

Hardware development will of course always remain a step ahead of software releases, and there's a case to be made for upcoming PC titles that may make full use of GeForce GTX 480 features, but can you name a single one?

Despite the numerous criticisms fired at NVIDIA's latest GPU, the most important appears to have been overlooked: there isn't anywhere near enough software support. Had the GeForce GTX 480 launched alongside a big-name PC exclusive - imagine Half-Life 3 built from the ground up to take advantage of DirectX 11's array of technologies - it would have made perfect sense, and there may have been good reason to rush out and buy one. As it stands, it's a graphics card that offers a couple of extra frames per second, and little else.

The list of DirectX 11-compatible titles remains short, and the console ports currently on offer can in most cases be played with the highest detail settings on a lesser, cheaper GPU such as AMD's ATI Radeon HD 5850.

If you've ever wondered about the state of the enthusiast PC gaming marketplace, consider this: NVIDIA has launched the fastest single GPU ever known to man, yet we're struggling to think of compelling reasons to buy one.

The worrying truth for PC gamers is that the buzzwords of today - tessellation, real-time physics and interactive ray-tracing - may not be fully realised in terms of software until we see a new generation of games consoles.



HEXUS Forums :: 29 Comments

it isn't without its well publicised drawbacks - including an average pre-order price of around £466, a substantial maximum power draw of around 450 watts

Little bit misleading that - shouldn't you state that you mean a system power draw? Implies that it's just the card itself at the moment.

Regarding the article… I'm not sure I completely agree. I mean, DX11 is for the 5*** range as well as the GTX 480, so I don't think it would've been a deal-maker for nVidia. But on the general idea of DX11, yes - definitely. It's like the whole Physx conundrum… the only reason to buy Physx-capable cards is to play a Physx-capable game. And the only reason to develop Physx-capable games is to sell them to owners of Physx-capable cards.

Having said that, it only takes one forward-thinking developer to come along - like with Crysis a while back - and produce something really spectacular. Looking at that demo image right now, just imagine if a game like Oblivion was released looking like that tomorrow - people would be raving about the graphics (and too right), and it would sell extremely well… assuming the gameplay is of a good calibre. Obviously though, the developers make a hard decision… they either start off with DX9 and fully redevelop for DX11, and pay a fortune, or start off with DX11 and leave XP and consoles out of the picture. Not to mention that people will need a recent graphics card to even get those visuals, of which ownership is low. I still think there's a lot of money in it for a company who takes one of those risks, but especially given the times I'd say it's fairly unlikely to happen for a while.
If you've ever wondered about the state of the enthusiast PC gaming marketplace; consider this, NVIDIA has launched the fastest single GPU ever known to man, but we're struggling to think of compelling reasons to buy one.

That's because you're limiting your viewpoint to the gaming market? I've CUDA simulations that'd love a few of these.
borandi:
That's because you're limiting your viewpoint to the gaming market? I've CUDA simulations that'd love a few of these.

I'd want the full-fat Fermi that doesn't have so much of the compute performance disabled, not the 480.

Great article Hexus. We can quibble at the details, but the bottom line is a very important and accurate one.
kalniel:
Great article Hexus. We can quibble at the details, but the bottom line is a very important and accurate one.

Interesting indeed but curiously similar to an article on ‘Australian PC Authority' that was referenced over on /. 3 days ago…? Surely not…

Anyway, can this be blamed squarely on the console market? New GPUs are always right on the bleeding edge, and while ‘enthusiasts' may be running a couple of these in SLI a week after launch, the majority of PC gamers are using hardware a year or two old as well.

If one looks at something like the Steam hardware survey - which could itself be considered slightly weighted towards the enthusiast end of the market - there's still only 3.3% using DX11 GPUs.
w1ntergr33n:
Interesting indeed but curiously similar to an article on ‘Australian PC Authority' that was referenced over on /. 3 days ago…? Surely not…

Curiously similar to the discussions that many of us have been having on Hexus over the last year or so - are you saying slashdot plagiarised us?