
Review: GeForce FX 5200 Ultra and FX 5600 Ultra

by David Ross on 13 March 2003, 00:00

Tags: NVIDIA (NASDAQ:NVDA)

Quick Link: HEXUS.net/qaqp


Benchmarks - Image Quality



Editor's Note: Please see the conclusion for more information about our IQ statements.

The current trend throughout the graphics industry seems to be focused on producing cinema-quality images rather than pure speed. To test these developments we have attempted to compare the image quality of the cards under review.

The images shown are taken from screen captures in Unreal Tournament 2003. The screen resolution was set at 1024x768 and the images are not magnified. If you wish to examine the images more closely we suggest that you load them into an image viewer and use the magnification tool.

Reading from left to right the images show:

1. AA and AF turned off
2. AA Off and 8xAF
3. 4xAA and AF Off
4. 4xAA and 8xAF
5. 4xAA and 8xAF with NVIDIA’s texture sharpening enabled.

GF FX5200



Radeon 9000



GF FX5600



GF4 Ti4200



Looking at these fairly crude images, we can begin to gauge the image quality produced by these graphics cards. Because it was almost impossible to place the view in exactly the same spot in UT2003 each time the machine was restarted, bear in mind that the viewing angle and distance from the object are not uniform and may affect the results.

Hopefully these images answer one of the most common graphics questions we are asked: what difference does changing the AA and AF settings make to image quality? The effect of these features can be seen very clearly as you move from left to right.
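For readers curious why antialiasing blends edges rather than merely blurring the whole image, here is a minimal Python sketch of the idea behind supersampling AA. It is an illustration only, not the pipeline these cards actually use (real 4xAA implementations vary by vendor): the scene is rendered at a higher resolution, then blocks of samples are averaged down to screen pixels, so pixels straddling an edge take on intermediate shades instead of a hard 0/1 step.

```python
# Illustrative sketch of supersampling antialiasing (not vendor hardware):
# render at higher resolution, then average sample blocks down to pixels.

def render(width, height):
    """Render a hard diagonal edge: 1.0 above the line y = x, else 0.0."""
    return [[1.0 if y < x else 0.0 for x in range(width)]
            for y in range(height)]

def downsample(image, factor):
    """Average factor x factor blocks -- the 'resolve' step of supersampling."""
    h = len(image) // factor
    w = len(image[0]) // factor
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            block = [image[y * factor + j][x * factor + i]
                     for j in range(factor) for i in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

aliased = render(8, 8)              # one sample per pixel: hard 0/1 steps
aa = downsample(render(16, 16), 2)  # 2x2 supersampled, then resolved to 8x8

# The aliased edge jumps straight from 0.0 to 1.0; the resolved image
# passes through intermediate grey levels along the edge.
print(sorted(set(v for row in aliased for v in row)))  # [0.0, 1.0]
print(sorted(set(v for row in aa for v in row)))       # [0.0, 0.25, 1.0]
```

The intermediate 0.25 values fall exactly on the diagonal, which is why antialiased edges in the screenshots above look blended rather than blocky.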

There are two major differences in image quality demonstrated above.

Firstly, the brown area in the centre of each image contains a wooden-style texture. This texture is very well defined on both the Ti4200 and the FX5200, but the other cards display it as blurred. It is debatable which image is of better quality, but in my opinion image quality is partly defined by the quality and detail of textures, and by that definition the Ti4200 scores the highest marks.

Secondly, you can see huge variances in the quality of the vertical edges. Without the combined AA and AF features these edges are poorly defined and made up of blocks. Moving further to the right, the edges are blended and look far more realistic. Again, measuring this is very subjective, but in my opinion the cleanest lines are produced by the Radeon and the Ti4200.

These images would seem to contradict NVIDIA's claim that the GeForce FX cards produce enhanced image quality. It may be that UT2003 is unable to fully utilise DX9. As stated, image quality is very subjective, so please draw your own conclusions.