2,560x1,600 8xAA Max AF
But what happens when the load is increased to 2,560x1,600 with maximum image-quality settings invoked? We've run the Sapphire Radeon HD 4870 Vapor-X 2GB up against a reference-clocked 1GB card to see how the extra frame buffer manifests with everything turned up.

| Call of Duty 4: MW (high-end) 2,560x1,600 8xAA 16xAF | Sapphire HD 4870 Vapor-X 2GB | Sapphire HD 4870 1,024MB |
|---|---|---|
| Average FPS | 40.6 | 40.2 |
Nothing between the two cards in Call of Duty 4.
| Company of Heroes: OF (high-end) 2,560x1,600 8xAA 0xAF | Sapphire HD 4870 Vapor-X 2GB | Sapphire HD 4870 1,024MB |
|---|---|---|
| Average FPS | 32.76 | 28.87 |
Company of Heroes, though, shows a noticeable improvement, and with frame rates this close to the playability threshold, the extra average frame-rate makes a tangible difference.
| Enemy Territory: Quake Wars (high-end) 2,560x1,600 8xAA 16xAF | Sapphire HD 4870 Vapor-X 2GB | Sapphire HD 4870 1,024MB |
|---|---|---|
| Average FPS | 33.95 | 33.85 |
| Far Cry 2 2,560x1,600 8xAA Max AF | Sapphire HD 4870 Vapor-X 2GB | Sapphire HD 4870 1,024MB |
|---|---|---|
| Average FPS | 21.44 | 14 |
Far Cry 2 shows significant gains - or a lack of performance loss, if you will - when moving from 1GB to 2GB, but the game's hardly playable at these settings.
| Race Driver: GRID (high-end) 2,560x1,600 8xAA 0xAF | Sapphire HD 4870 Vapor-X 2GB | Sapphire HD 4870 1,024MB |
|---|---|---|
| Average FPS | 49.23 | 49.24 |
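To put the deltas in perspective, here's a quick back-of-envelope sketch of our own (Python, with the averages from the tables above hard-coded) that works out the percentage change the 2GB card delivers in each title:

```python
# Average frame rates at 2,560x1,600 8xAA, taken from the tables above
# (2GB Vapor-X first, reference-clocked 1GB card second).
results = {
    "Call of Duty 4: MW": (40.6, 40.2),
    "Company of Heroes: OF": (32.76, 28.87),
    "Enemy Territory: Quake Wars": (33.95, 33.85),
    "Far Cry 2": (21.44, 14.0),
    "Race Driver: GRID": (49.23, 49.24),
}

for game, (fps_2gb, fps_1gb) in results.items():
    gain = (fps_2gb - fps_1gb) / fps_1gb * 100
    print(f"{game}: {gain:+.1f}% for the 2GB card")
```

Far Cry 2's roughly 53 per cent uplift stands out; Company of Heroes manages around 13.5 per cent, and everything else sits within a fraction of a per cent.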
Image quality
We see no performance drop-off going from 2GB down to 1GB in Call of Duty 4: MW, but how different is the image quality between 4xAA and 8xAA?
Can you tell the difference in one portion of the screen, captured at 1,920x1,200? And is it worth losing around 40 per cent of the frame-rate for?
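To make that second question concrete: taking Call of Duty 4's 40.6fps at 8xAA from the table above and assuming the roughly 40 per cent penalty quoted here, a minimal sketch of the 4xAA frame rate that figure implies:

```python
# Back-of-envelope: if 8xAA costs roughly 40% of the frame rate
# relative to 4xAA, the 4xAA rate implied by a measured 8xAA result is:
fps_8xaa = 40.6   # CoD4: MW on the 2GB card, from the table above
loss = 0.40       # the ~40 per cent penalty quoted in the text
fps_4xaa = fps_8xaa / (1 - loss)
print(f"Implied 4xAA frame rate: {fps_4xaa:.1f}fps")  # ~67.7fps
```

In other words, dropping to 4xAA would buy back nearly 30fps in this title, which frames just how expensive the step to 8xAA is.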