
AMD puts image quality debate to bed?

by Pete Mason on 1 December 2010, 18:08

Tags: Catalyst Drivers, AMD (NYSE:AMD)

Quick Link: HEXUS.net/qa3ek


There's been a lot of talk about GPUs and image quality lately, and as the party on the receiving end of some of the accusations, AMD felt the need to set the record straight. That's why we were invited to talk to Senior Manager of Software Engineering Andy Pomianowski and Technical Marketing Manager Dave Nalasco about image quality and the ruckus that NVIDIA kicked off last week.

The settings, they are a-changin'

Dave explained to us that there had been some changes to the Catalyst drivers to coincide with the release of the HD 6000-series GPUs, and that image quality had been a big part of that. At the heart of all this is Catalyst AI, which controls a whole host of different settings via a single slider.

Responding to feedback, this single slider was divided into a number of different settings in the latest release, giving users a bit more control. One of the new additions was a slider to control texture filtering with settings for 'High Quality', 'Quality' and 'Performance'.

High Quality turns off all optimisations and lets the software run exactly as it was originally intended to. Quality - which is now the default setting - applies some optimisations that the team at AMD believes - after some serious testing, benchmarking and image comparisons - will maintain the integrity of the image while increasing the application performance. Lastly, the Performance setting applies even more of these optimisations to squeeze out a few more frames, but risks degrading the image quality just a bit.

What do you see?

Dave acknowledged that some sources had observed visual anomalies when running a few games and benchmarks. He explained that the algorithms that the drivers run - notably anisotropic filtering - are very complex and that despite their best efforts, the image wasn't going to be perfect 100 per cent of the time, even on default settings.

What he stressed was that, in the opinions of the whole driver development team, the default settings and optimisations still offered the best performance with no noticeable drop in quality for the vast majority of users the vast majority of the time. And for those who were experiencing any problems, High Quality mode would always be there to allow a picture-perfect image. This, he made clear, wasn't going to change any time soon.

And then something strange happened - Andy asked us what we thought. These guys seemed genuinely concerned about what we felt were the best settings to use, whether we'd experienced any problems, and what we would change if we were designing the Catalyst tools. They're clearly committed to delivering the best product that they can, and that means listening to feedback and taking on board what the press, as well as average gamers, think.

Hopefully, this whole image quality debate can now be put to bed. At least for the time being.

Editor's note

AMD has admitted tinkering with the default image-quality settings in post-Catalyst 10.9 drivers. What's important here is the implied takeaway we got from Dave Nalasco: "...best performance with no noticeable drop in quality for the vast majority of users the vast majority of the time." This tells me that AMD is prepared to sacrifice driver-default IQ for performance but, importantly, still keeps the quality option available in the Catalyst control panel.

We found that an AMD Radeon HD 5850's numbers jumped by around five per cent when benchmarking a range of games with Catalyst 10.9 and Catalyst 10.10 - the latter including the new driver layout and optimisations.

What is conspicuous by its absence is a statement from AMD indicating that the 'new' default driver settings are comparable with arch-rival NVIDIA's. This suggests that, at absolute default settings, NVIDIA produces slightly better image quality, though 'better' is a hugely subjective term.

The method by which AMD and NVIDIA's GPUs run filtering algorithms - anti-aliasing and anisotropic filtering - is slightly different. This means a perfect apples-to-apples comparison is all but impossible: the best you can do is compare the two companies' outputs against a reference raster from Microsoft. What irks me, however, is that I have to manually change the slider settings to obtain the default IQ settings of pre-Catalyst 10.10 drivers.

Is AMD cheating here? The answer, I feel, is no, but it has been somewhat underhanded in the way it changed the default IQ settings without drawing explicit attention to it... until NVIDIA pointed it out. Will NVIDIA be forced to follow suit and degrade its IQ settings in future ForceWare drivers to stand on a level playing field? I don't know, but AMD is at the top of a very slippery slope if it continues with driver optimisations that compromise default image-quality settings.

I've got no problems with optimisations per se, and the more choice the enthusiast has in the control panel, the better, but increased choice cannot and should not come at the expense of degraded IQ settings from freshly-installed drivers.

HEXUS Forums :: 5 Comments

So does this mean benchmarks need to be redone to get a like-for-like comparison, or did Hexus disable AMD's override in their testing?
While it shouldn't have to come down to this, I wish one of the many tech outlets around the web would just do a benchmark at the various settings and post up the findings.
lol - I don't care wot anyone says - I can tell the difference between running a game on ati v's nvidia. :)
ATI is better quality - physically noticeable on default settings of both types of card.
Now ATI optimise they're drivers and pay attention to performance, someone calls foul - let me guess who - nvidia - who've been known for doing it for years, lol….:clapping:
I think there would be quite a few people who disagree with the statement of ATI having optimised drivers…
zoomee
Now ATI optimise they're drivers and pay attention to performance, someone calls foul - let me guess who - nvidia - who've been known for doing it for years, lol….:clapping:
The first widely reported ‘cheating / optimisation’ I remember was the Quack3.exe incident. There is no denying that one; there were plenty of side-by-side screenshots. It may not justify nVidia's subsequent ‘cheat / optimisation’, but let's not be blind to either company's history and mistakes.

And then something strange happened - Andy asked us what we thought.
And what did you guys say? I'll give them credit for handling the issue quite well. I also think that if it takes a rival company to highlight the issue, then it probably is more negligible than past controversies (which were found by eagle-eyed reviewers). As enthusiasts, we like to tweak (I am sure many of us have, in the past, tweaked our settings to improve frame rate at the cost of quality), and so long as the options are transparent and in our hands, I am okay with multiple profiles (but don't wait for others to point it out, please).