System Setup
Hardware
- NVIDIA NV36 Reference board, 128MB, AGP8X, NV36
- NVIDIA NV38 Reference board, 256MB, AGP8X, NV38
- ASUS V9950 GeForceFX 5900 Ultra, 256MB, AGP8X, NV35
- ATI Radeon 9600XT, 128MB, AGP8X, RV360
- ATI Radeon 9800XT, 256MB, AGP8X, R360
- Intel Pentium 4 3.0 'C', 15 x 200MHz, 512KB L2, HyperThreading
- EPoX EP-4PDA2+, Intel i865PE (Springdale-PE), Socket 478, dual DDR400
- Corsair XMS3200LLPT, 2 x 256MB, 2-2-2-5
- Enermax 431W PSU
- Swiftech MCX478
- Seagate Barracuda ATA IV 80GB
Software
- Windows XP Professional w/SP1 and current Windows Update patches (as of 14/10/03)
- ATI CATALYST 3.8
- Detonator FX 45.32 (MSI build supporting FX5950 Ultra)
- Detonator FX 52.16
- Intel Chipset Driver 5.00.1012
- DirectX 9.0a Runtime
- 3DMark 2001SE v330
- 3DMark03 v330
- Quake 3 v1.30 (Four demo)
- Aquamark 3
- Gun Metal 2
- Serious Sam 2 (SDC demo, Extreme addon)
- Unreal Tournament 2003 Retail (HEXUS custom demo)
- X2: The Threat Rolling Demo
This is an intermediate, semi-DX9 test suite for this article; a proper DX9-focussed suite will appear in articles featuring retail samples of boards using these new GPUs. Sorry I couldn't use new tests just yet.
So it's 5700 Ultra vs 9600XT in the mid-range, and 5950 Ultra vs 5900 Ultra vs 9800XT at the high end.
Driver Issues
Lots of people are going to want to see driver image quality screenshots by the dozen with reviews of these new cards. Indeed, I promised an IQ comparison article last week on the 52.16 driver that's released to the public today, which I should have finished by now. Apologies for it not being done; writing this article, plus illness, has stopped me from doing much on that side of things.
It looks like the 3DCenter article on 52.14 applies to 52.16 too. In 52.14, FX boards do an alternate, poorer quality (compared to ATI boards) trilinear mip filter in D3D than what's normally expected (and what appeared in previous drivers). I still contend that you can't really see it while playing games, but it does exist, and the freely available D3D AF tester will show you it in action. The same issue arises with 9500+ series Radeons, which can also do a poorer quality trilinear filter in D3D, in exchange for some extra performance.
This affects our testing like so. ATI's CATALYST 3.8 suite doesn't make it obvious how to configure the image quality sliders to enable the weaker trilinear, and in its default maximum quality mode it's off, so you get the higher IQ version out of the box. That isn't the case with NVIDIA's driver, where you get the weak trilinear in D3D all the time.
The AF filter itself is excellent on FX boards; the actual aniso sample quality seems to me to be top notch (I'll show screenshots to hopefully prove it). It's simply the trilinear mip stage filter that's still a bit broken. I'm semi-assured by an NVIDIA PR person that it should be fixed in a future driver. You can be certain that we, like everyone else, will pick any new driver carrying such a claim to pieces to see if it's true.
As it stands, there's no explicit IQ testing on every game from me (screenshots you can analyse). But the driver is out now, so download it and see for yourself, and wait for us and other sites to do explicit IQ analysis. My money is on a 3DCenter or Beyond3D article showing you what I mean, if I don't get there first. Pesky application-specific driver IQ filtering makes it hard to write articles analysing IQ.
So: broken mip stage trilinear in D3D (the filter that blends between mipmap texture levels, so you don't see banding between textures of different detail), but nothing else generally broken, IQ-wise, that I can see. Application-level weirdness I'll comment on as appropriate.
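For readers who want a picture of what that mip stage filter actually does, here's a rough software sketch of full trilinear filtering: a bilinear sample from each of two adjacent mipmap levels, blended by the fractional level-of-detail. This is an illustration only, not how the hardware (or the 52.14/52.16 driver) actually implements it; the textures are just nested lists of brightness values.

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by t in [0, 1]."""
    return a + (b - a) * t

def bilinear(tex, u, v):
    """Bilinear sample of a 2D texture (list of rows) at normalised (u, v)."""
    h, w = len(tex), len(tex[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = lerp(tex[y0][x0], tex[y0][x1], fx)
    bot = lerp(tex[y1][x0], tex[y1][x1], fx)
    return lerp(top, bot, fy)

def trilinear(mips, u, v, lod):
    """Full trilinear: bilinear in two adjacent mip levels, blended by
    the fractional LOD.  The driver optimisation discussed above
    effectively narrows the LOD range over which this blend happens,
    dropping back towards plain bilinear elsewhere."""
    lo = min(int(lod), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    frac = lod - int(lod)
    return lerp(bilinear(mips[lo], u, v), bilinear(mips[hi], u, v), frac)
```

With the blend in place, a sample taken halfway between two mip levels returns the average of the two, which is exactly what hides the banding between mipmap transitions in-game.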
On to the benchmarks.