Introducing G84
With today's launch of the GeForce 8600 series, we finally get our hands on the new G84 GPU that sits at its heart.
Naturally, this chip is based on the G80 that powers the all-conquering 8800 range of cards. But NVIDIA hasn't been so crude as to simply hack away at its flagship GPU until it's small enough to fit into the lower transistor budget; it has taken the opportunity to massage the design, improving efficiency and implementing new features.
So how does the G84 (somewhat confusingly branded 8600) line up against other SKUs in the NVIDIA range?
SKU | GeForce 8600GT | GeForce 8600GTS | GeForce 8800GTS | GeForce 7600GT | GeForce 7900GS |
---|---|---|---|---|---|
GPU Core | G84 | G84 | G80 | G73 | G71 |
API support | DX10 (SM 4.0) | DX10 (SM 4.0) | DX10 (SM 4.0) | DX9 (SM 3.0) | DX9 (SM 3.0) |
Fabrication Process | 80nm | 80nm | 90nm | 90nm | 90nm |
Millions of Transistors | 289 | 289 | 681 + NVIO | 177 | 278 |
Core Clock Speed | 540MHz core / 1.19GHz shaders | 675MHz core / 1.45GHz shaders | 500MHz core / 1.2GHz shaders | 560MHz | 450MHz |
Memory Clock Speed | 1400MHz | 2000MHz | 1600MHz | 1400MHz | 1320MHz |
Memory Interface Width | 128-bit | 128-bit | 320-bit | 128-bit | 256-bit |
Memory Bandwidth | 22.4GB/sec | 32GB/sec | 64GB/sec | 22.4GB/sec | 42.2GB/sec |
Shader Units | 32 (unified) | 32 (unified) | 96 (unified) | 12 pixel / 5 vertex | 24 pixel / 8 vertex |
ROPs | 8 | 8 | 24 | 8 | 16 |
As with the G80, the G84 uses a unified shader approach and is fully Shader Model 4.0 compliant. The number of shader units is slashed from 96 for the 8800GTS (128 for the 8800GTX) down to 32, but these have been tweaked to improve their performance per clock. The manufacturing process shrinks to 80nm, too, a first for the 8-series.
Disappointingly, the memory interface is cut all the way down to 128-bit, and this leaves the 8600GTS, even with its 2GHz GDDR3, with a 10GB/sec deficit compared to the 7900GS. The 8600s do match the total number of shader units within the 7900GS's G71 core, albeit unified ones.
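The bandwidth figures in the table follow directly from effective memory clock and bus width; a quick sketch of the arithmetic (the helper name here is ours, not anything from NVIDIA):

```python
def mem_bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth: effective clock (MHz) x bus width (bytes/transfer)."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# 8600GTS: 2GHz-effective GDDR3 on a 128-bit bus
gts = mem_bandwidth_gb_s(2000, 128)  # 32.0 GB/sec
# 7900GS: 1320MHz-effective memory on a 256-bit bus
gs = mem_bandwidth_gb_s(1320, 256)   # 42.24 GB/sec
print(f"8600GTS: {gts:.1f} GB/sec, 7900GS: {gs:.1f} GB/sec, deficit: {gs - gts:.1f} GB/sec")
```

Which is where the roughly 10GB/sec gap comes from: the 8600GTS's faster memory cannot make up for a bus half as wide.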
Content protection for most
G84 supports HDCP over DVI-I, as well as HDMI should manufacturers choose to implement it. This, however, is only compulsory on 8600GTS cards.
If you are buying an 8600GT then HDCP support is at the manufacturer's discretion.
The 8600GTS SKU ships with not just HDCP through both DVI-I outputs, but also HDCP over a dual-link DVI connection, a first for any graphics card, including the 8800-series cards.
Speaking of the 8800: on G80-based boards, all non-PCIe I/O is handled by the separate NVIO chip, moved off-die in an effort to improve yields of the large G80 die. For G84, that functionality has been moved back into the main die.
PureVideo HD
If you are going to provide all this content protection, it would be rude not to lend a hand in the decoding process, and the 8600 cards are more than happy to oblige, taking a step up not just beyond the GeForce 7-series GPUs but beyond the G80 parts as well.
The video processor has been upgraded to help reduce CPU load by carrying out more video processing on the GPU for content encoded with MPEG-2, VC-1 and H.264 CODECs, even at high bit rates.
Totally new to PureVideo with the G84 is the Bitstream Processor, which accelerates the first stage of the H.264 video-decoding pipeline, and the AES128 engine, which provides hardware acceleration of the encryption scheme used by HD DVD and Blu-ray.
These units, combined, are able to accelerate four stages of the video-decode pipeline and, according to NVIDIA's documentation, can reduce CPU load to just 18-21 per cent, compared with 59-75 per cent for a 7-series part.