There's been a lot of talk about GPUs and image quality lately, and as the party on the receiving end of some of the accusations, AMD felt the need to set the record straight. That's why we were invited to talk to Senior Manager of Software Engineering Andy Pomianowski and Technical Marketing Manager Dave Nalasco about image quality and the ruckus that NVIDIA kicked off last week.
The settings, they are a-changin'
Dave explained to us that there had been some changes to the Catalyst drivers to coincide with the release of the HD 6000-series GPUs, and that image quality had been a big part of that. At the heart of all this is Catalyst AI, which controls a whole host of different settings via a single slider.
In response to feedback, that single slider was divided into a number of separate settings in the latest release, giving users a bit more control. One of the new additions was a slider to control texture filtering, with settings for 'High Quality', 'Quality' and 'Performance'.
High Quality turns off all optimisations and lets the software run exactly as it was originally intended to. Quality - which is now the default setting - applies some optimisations that the team at AMD believes - after some serious testing, benchmarking and image comparisons - will maintain the integrity of the image while improving application performance. Lastly, the Performance setting applies even more of these optimisations to squeeze out a few more frames, but risks degrading image quality a little.
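To make the split concrete, here's a minimal, purely hypothetical sketch of how a single quality preset might bundle several filtering optimisations - the flag names are ours for illustration, not AMD's actual driver internals.

```python
# Purely hypothetical mapping of texture-filtering presets to optimisation
# toggles -- the flag names are illustrative, not AMD's real driver settings.
PRESETS = {
    "High Quality": {"trilinear_optimisation": False, "aniso_sample_optimisation": False},
    "Quality":      {"trilinear_optimisation": True,  "aniso_sample_optimisation": False},
    "Performance":  {"trilinear_optimisation": True,  "aniso_sample_optimisation": True},
}

def optimisations_for(preset: str) -> dict:
    """Return the bundle of filtering optimisations a given preset would enable."""
    return PRESETS[preset]

print(optimisations_for("Quality"))  # the new driver default
```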
What do you see?
Dave acknowledged that some sources had observed visual anomalies when running a few games and benchmarks. He explained that the algorithms the drivers run - notably anisotropic filtering - are very complex and that, despite the team's best efforts, the image wasn't going to be perfect 100 per cent of the time, even on default settings.
What he stressed was that, in the opinion of the whole driver development team, the default settings and optimisations still offered the best performance with no noticeable drop in quality for the vast majority of users the vast majority of the time. For those who did experience problems, High Quality mode would always be there to deliver a picture-perfect image. This, he made clear, wasn't going to change any time soon.
And then something strange happened - Andy asked us what we thought. These guys seemed genuinely concerned about what we felt were the best settings to use, whether we'd experienced any problems, and what we would change if we were designing the Catalyst tools. They're clearly committed to delivering the best product that they can, and that means listening to feedback and taking on board what the press, as well as average gamers, think.
Hopefully, this whole image quality debate can now be put to bed. At least for the time being.
Editor's note
AMD has admitted tinkering with the default image-quality settings in post-Catalyst 10.9 drivers. What's important here is the implied takeaway we got from Dave Nalasco: "...best performance with no noticeable drop in quality for the vast majority of users the vast majority of the time." This tells me that AMD is prepared to sacrifice driver-default IQ for performance but, importantly, still keeps the quality option available in the Catalyst Control Panel.
We found that an AMD Radeon HD 5850's numbers jumped by around five per cent when benchmarking a range of games with Catalyst 10.9 and then Catalyst 10.10 - the latter including the new driver layout and optimisations.
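For context, that 'around five per cent' is simply the average of the per-game uplifts between the two driver releases. A minimal sketch with made-up frame rates (not our actual benchmark data) shows the arithmetic:

```python
# Made-up frame rates (fps) for illustration only -- not our benchmark data.
# The geometric mean of the per-game ratios gives the overall uplift between
# the two driver releases.
cat_10_9  = {"game_a": 62.0, "game_b": 48.5, "game_c": 91.2}
cat_10_10 = {"game_a": 65.3, "game_b": 50.9, "game_c": 95.6}

ratios = [cat_10_10[g] / cat_10_9[g] for g in cat_10_9]

geomean = 1.0
for r in ratios:
    geomean *= r
geomean **= 1.0 / len(ratios)

print(f"Average uplift: {(geomean - 1) * 100:.1f}%")  # ~5 per cent on these numbers
```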
What is conspicuous by its absence is a statement from AMD indicating that the 'new' default driver settings are comparable with arch-rival NVIDIA's. The implication is that, at absolute defaults, NVIDIA produces slightly better image quality, though 'better' is a hugely subjective term.
The methods by which AMD's and NVIDIA's GPUs run filtering algorithms - anti-aliasing and anisotropic filtering - are slightly different, so a perfect apples-to-apples comparison is all but impossible; the best you can do is compare each company's output against a reference raster from Microsoft. What irks me, however, is that I now have to manually change the slider to get back the IQ settings that were the default in pre-Catalyst 10.10 drivers.
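If you want to put a number on those differences rather than eyeball screenshots, one rough approach is to diff each vendor's output against the reference image. Here's a minimal sketch assuming you already have two same-sized captures saved as 'gpu_output.png' and 'reference.png' (placeholder file names); it computes mean absolute error and PSNR as crude, objective IQ metrics.

```python
# Minimal sketch: compare a GPU capture against a reference-rasteriser image.
# File names are placeholders; assumes numpy and Pillow are installed.
import numpy as np
from PIL import Image

gpu = np.asarray(Image.open("gpu_output.png").convert("RGB"), dtype=np.float64)
ref = np.asarray(Image.open("reference.png").convert("RGB"), dtype=np.float64)

mae = np.abs(gpu - ref).mean()            # average per-channel pixel error
mse = ((gpu - ref) ** 2).mean()
psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")

print(f"MAE: {mae:.2f}  PSNR: {psnr:.2f} dB")
```

Lower MAE and higher PSNR mean the output is closer to the reference, though neither metric captures how visible a given artefact actually is in motion.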
Is AMD cheating here? The answer, I feel, is no, but it is being somewhat underhanded in the way it's changed the default IQ settings without drawing explicit attention to it - at least until NVIDIA pointed it out. Will NVIDIA be forced to follow suit and degrade its IQ settings in future ForceWare drivers, just to stand on a level playing field? I don't know, but AMD is at the top of a very slippery slope if it continues with driver optimisations that compromise default image-quality settings.
I've got no problem with optimisations per se, and the more choice the enthusiast has in the control panel, the better, but increased choice cannot and should not come at the expense of degraded IQ on freshly installed drivers.