System Setup
Hardware
- ASUS Radeon 9800XT/TVD, 256MB, AGP8X, R360
- NVIDIA NV38 Reference board, 256MB, AGP8X, NV38
- ATI Radeon 9800XT, 256MB, AGP8X, R360
- Intel Pentium 4 3.0 'C', 15 x 200MHz, 512KB L2, HyperThreading
- EPoX EP-4PDA2+, Intel i865PE (Springdale-PE), Socket 478, dual DDR400
- Corsair XMS3200LLPT, 2 x 256MB, 2-2-2-5
- Enermax 431W PSU
- Swiftech MCX478
- Seagate Barracuda ATA IV 80GB
Software
- Windows XP Professional w/SP1 and current Windows Update patches (as of 14/10/03)
- ATI CATALYST 3.8
- Detonator FX 52.16
- Intel Chipset Driver 5.00.1012
- DirectX 9.0a Runtime
- 3DMark 2001SE v330
- 3DMark03 v330
- Quake 3 v1.30 (Four demo)
- Aquamark 3
- Gun Metal 2
- Serious Sam 2 (SDC demo, Extreme addon)
- Unreal Tournament 2003 Retail (HEXUS custom demo)
- X2: The Threat Rolling Demo
The usual test suite, driver setup and test platform. Driver quality was set to 'Quality' in both the CATALYST and Detonator FX driver sets on all tested cards, to level the playing field as far as possible. With a cloud hanging over NVIDIA's use of trilinear mip-level filtering in Direct3D, it might have been more prudent to drop the D3D quality setting a notch on the Radeons. I decided against that, and against doing separate runs with a comparable trilinear filter setting to populate the graphs. Instead, I simply set all cards to render as good an image as they possibly could, levelling the playing field that way. If you like, imagine a tiny touch more performance from the 9800XTs in the multitexture-heavy benchmarks, to account for the lower-quality trilinear filter on the NVIDIA boards; it just won't look as nice.
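To see why a reduced trilinear filter buys performance at the cost of image quality, here is a minimal sketch (my own illustration, not anything from the drivers themselves): full trilinear filtering blends bilinear samples from two adjacent mip levels for every filtered texel, while an optimised scheme falls back to a single bilinear fetch outside a narrow transition band between mip levels. The function names and the `band` width are hypothetical.

```python
def trilinear(sample_bilinear, mip, frac):
    # Full trilinear: bilinear fetches from two adjacent mip levels,
    # blended by the fractional distance between them. Eight texel
    # reads per filtered sample instead of four.
    lo = sample_bilinear(mip)
    hi = sample_bilinear(mip + 1)
    return lo * (1.0 - frac) + hi * frac

def reduced_trilinear(sample_bilinear, mip, frac, band=0.25):
    # Optimised ("brilinear"-style) filter: only blend near the mip
    # transition; elsewhere a single bilinear fetch suffices, halving
    # the texel reads for most of the screen. 'band' is a made-up
    # illustrative parameter, not a real driver setting.
    if frac < band:
        return sample_bilinear(mip)
    if frac > 1.0 - band:
        return sample_bilinear(mip + 1)
    # Rescale frac so the blend still spans the transition smoothly.
    t = (frac - band) / (1.0 - 2.0 * band)
    return trilinear(sample_bilinear, mip, t)
```

The saving comes from skipping the second mip fetch; the cost is a visible band where the blend abruptly starts and stops, which is why the filtered image doesn't look as nice as true trilinear.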
Onto the graphs.