Review: NVIDIA GeForce GTX 780

by Tarinder Sandhu on 23 May 2013, 14:00

Tags: NVIDIA (NASDAQ:NVDA)

A chip off the Titan block

Kerching

Nvidia pulled off an enviable trick with its high-end GeForce GTX 600-series cards. The trick had two parts: a change in architecture from Fermi (GTX 580) to Kepler (GTX 680), along with an accompanying shrink of the transistors that make up the GPU, down from 40nm to 28nm.

And what do you know, Nvidia managed to increase performance by 30-odd per cent, drop power by around 50W, and herald a new GeForce performance champion at the same $499 price point. But the real benefit to Nvidia lay in substantially reducing production costs, because, by historical standards, GTX 680 uses a mid-range-sized die.

In tenuous car parlance, this was like swapping out an expensive-to-produce 3.0-litre engine for a cheaper, more powerful 1.6-litre and pocketing the difference. The Kepler architecture's impressive performance-per-watt has since been distilled much further down the product stack and, crucially, propagated into the notebook market, where perf-per-watt is king.

Though we were impressed with just how much performance Nvidia squeezed out of a small die known as GK104, the proper enthusiast wanted the chip company to bring a genuine high-end part to the gaming market. Suitably emboldened by the success of Kepler cards and their ability to hamstring arch-rival AMD's Radeon pricing, Nvidia decided to think big, really big, and reworked the Tesla K20X workstation GPU into the card that, just recently, became the GeForce GTX Titan.

Building on the energy-efficient nature of the Kepler architecture by integrating 7.1 billion transistors into a card with a moderate 250W TDP, Titan handily beats GeForce GTX 680 and the competing Radeon HD 7970 GHz into submission at the kind of resolutions and image-quality settings that define the very cutting edge of hardware.

The trouble is Titan costs £850-plus and, as such, offers very little in terms of value for money, no matter how fast it is. And for those after ultimate performance, dual-GPU cards such as the GTX 690 and HD 7990 provide more performance and typically cost less.

So how about a Titan-like card, built from the same architectural fabric, costing substantially less? That kind of card would be deserving of a name such as GeForce GTX 780, don't you think?

GeForce GTX 780 - a Titan in all but name

Nvidia is going to make a bunch of enthusiasts rather happy and a few rather cross at the same time. It is today announcing the GeForce GTX 780 GPU. This card uses the same GK110 die as Titan but has two notable deficiencies that keep performance some 15-20 per cent below the range-topper. GTX 780, you see, drops two of Titan's 14 SMX units - which means a commensurate drop in shaders, texture units and the like - and halves the card memory from a mammoth 6GB to an economical 3GB. The back-end setup remains the same, however, retaining 48 ROPs.

You'd expect a card costing around 40 per cent less to also rein in frequencies, but Nvidia actually increases the core clock from Titan's 837MHz to 863MHz - with an expected jump in GPU Boost - and keeps the 384-bit memory interface fed with 6Gbps memory chips. Early Titan owners, having shelled out the best part of a grand, are likely to be miffed that a similar, albeit less powerful, card is available for a lot less.
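
For those who like to sanity-check the cut-down arithmetic, here's a minimal sketch - our own working, assuming the publicly documented Kepler GK110 layout of 192 shaders and 16 texture units per SMX - showing how Titan's 14 enabled SMX units and GTX 780's 12 map onto the shader and texture-unit counts in the table below.

```python
# Minimal sanity check: derive shader and texture-unit counts from enabled SMX units.
# Assumes the standard Kepler GK110 layout of 192 shaders and 16 texture units per SMX.
CORES_PER_SMX = 192
TEX_UNITS_PER_SMX = 16

cards = {"GeForce GTX Titan": 14, "GeForce GTX 780": 12}

for name, smx in cards.items():
    print(f"{name}: {smx} SMX -> {smx * CORES_PER_SMX} shaders, "
          f"{smx * TEX_UNITS_PER_SMX} texture units")
# GeForce GTX Titan: 14 SMX -> 2688 shaders, 224 texture units
# GeForce GTX 780: 12 SMX -> 2304 shaders, 192 texture units
```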

A snip here and there doesn't mean a whole heap in isolation, so let's compare the vital specs of GeForce GTX 780 against other high-end cards.

| | GeForce GTX Titan (6,144MB) | GeForce GTX 780 (3,072MB) | GeForce GTX 680 (2,048MB) | Radeon HD 7970 GHz (3,072MB) |
|---|---|---|---|---|
| Launch Date | February 2013 | May 2013 | March 2012 | June 2012 |
| DX API | 11.1 | 11.1 | 11.1 | 11.1 |
| Process | 28nm | 28nm | 28nm | 28nm |
| Transistors | 7.1bn | 7.1bn | 3.54bn | 4.3bn |
| Approx Die Size | 551mm² | 551mm² | 294mm² | 352mm² |
| Processors | 2,688 | 2,304 | 1,536 | 2,048 |
| GPU Boost | v2.0 | v2.0 | v1.0 | AMD-specific |
| Texture Units | 224 | 192 | 128 | 128 |
| ROP Units | 48 | 48 | 32 | 32 |
| GPU Clock/Boost (MHz) | 837 (876) | 863 (902) | 1,006 (1,058) | 1,000 (1,050) |
| Shader Clock/Boost (MHz) | 836 (876) | 863 (902) | 1,006 (1,058) | 1,000 (1,050) |
| GFLOPS | 4,494 | 3,977 | 3,090 | 4,096 |
| Memory Clock (MHz) | 6,008 | 6,008 | 6,008 | 6,000 |
| Memory Bus (Bits) | 384 | 384 | 256 | 384 |
| Max Bandwidth (GB/s) | 288.4 | 288.4 | 192.2 | 288 |
| Power Connectors | 8+6 | 8+6 | 6+6 | 8+6 |
| TDP (Watts) | 250 | 250 | 195 | 250 |
| GFLOPS Per Watt | 17.98 | 15.85 | 15.84 | 16.38 |
| MSRP | $999 | $649 | $449 | $449 |
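
The derived rows in the table follow from the headline specs. As a rough illustration - our own working rather than Nvidia's or AMD's published method, assuming two FLOPs (one fused multiply-add) per shader per clock - peak GFLOPS is twice the shader count multiplied by the base clock, bandwidth is the bus width in bytes multiplied by the effective memory data rate, and GFLOPS-per-watt simply divides by TDP. The results land within a per cent or so of the table's figures, the small differences coming down to how the clocks are rounded.

```python
# Rough reproduction of the table's derived columns from the headline specs.
def peak_gflops(shaders, base_clock_mhz):
    # Two FLOPs (one fused multiply-add) per shader per clock.
    return 2 * shaders * base_clock_mhz / 1000

def bandwidth_gb_s(bus_bits, mem_mhz_effective):
    # Bus width in bytes multiplied by the effective memory data rate.
    return (bus_bits / 8) * mem_mhz_effective / 1000

# name: (shaders, base clock MHz, bus width bits, effective memory MHz, TDP W)
specs = {
    "GeForce GTX Titan":  (2688,  837, 384, 6008, 250),
    "GeForce GTX 780":    (2304,  863, 384, 6008, 250),
    "GeForce GTX 680":    (1536, 1006, 256, 6008, 195),
    "Radeon HD 7970 GHz": (2048, 1000, 384, 6000, 250),
}

for name, (shaders, clock, bus, mem, tdp) in specs.items():
    gflops = peak_gflops(shaders, clock)
    print(f"{name}: {gflops:,.0f} GFLOPS, "
          f"{bandwidth_gb_s(bus, mem):.1f} GB/s, "
          f"{gflops / tdp:.2f} GFLOPS per watt")
```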

Analysis

We'd liken the difference between Titan and GTX 780 to that between GTX 680 and GTX 670, or Radeon HD 7970 and HD 7950 - you get a second-rung card that's rather close in specification to the best of the bunch.

Our benchmarks show Titan to be some 30-50 per cent faster than a GTX 680 across a broad range of games. This therefore means that GTX 780 is likely to be 20-40 per cent nippier, relegating GTX 680 to the third rung in Nvidia's current single-GPU line-up. GTX 680 owners will feel further bruised by the knowledge that another Nvidia GPU, which we can't name here, is soon to replace their previous top-of-the-line card and demote it further still.

AMD's made very decent strides in driver-related performance boosts in recent months, so much so that the Radeon HD 7970 GHz Edition is quicker than the GTX 680. Our thoughts are that Nvidia has devised the GTX 780 to beat AMD's best at what can be considered premium (but not Titan-like outrageous) money.

Nvidia's recipe for creating a winning enthusiast card is disarmingly simple: take the best GPU, artificially hobble it in a few meaningful ways, and release at £550. But before the workstation mob become all excited and think of GTX 780 as a small-framebuffer Tesla on the super, super cheap, Nvidia has seen fit to cripple the GPU's double-precision ability, which runs at a small fraction of the rate available on Titan.

We've managed to get this far without laying into Nvidia for calling this a 7-series GPU when there's been no advancement in process or architecture. The purist in us would prefer Nvidia to reserve a nomenclature change for the next iteration of GPU, Maxwell, but such thoughts are now shared by too few and fly in the face of 'bigger-means-better' marketing that Nvidia, AMD and Intel routinely engage in.