
Review: Asus PQ321Q: 4K Gaming Tried and Tested

by Tarinder Sandhu on 17 July 2013, 12:00

Tags: ASUSTeK (TPE:2357), NVIDIA (NASDAQ:NVDA), AMD (NYSE:AMD)


Framebuffer usage, thoughts

We've established that gaming on a 4K monitor at high- or ultra-quality settings requires a beefy graphics card. Performance in Far Cry 3, for example, barely nudges past 30fps, let alone the desired 60fps, and it's lower still in Crysis 3. The 8.3MP load is substantially higher than the 6.2MP needed for three 1080p screens in a 3x1 surround setup. Put simply, the pixel count is worth an extra screen - four vs. three - as a single 4K PC monitor packs the same number of pixels as a quartet of 1080p screens operating in tandem.
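As a quick sanity check of that arithmetic, a few lines of Python (our illustration, not part of the original testing) confirm the pixel counts:

    # Pixel-count comparison: one 4K panel vs various 1080p setups
    uhd = 3840 * 2160   # single 4K panel
    fhd = 1920 * 1080   # single 1080p panel

    print(f"4K panel:     {uhd / 1e6:.1f}MP")      # 8.3MP
    print(f"3x1 surround: {3 * fhd / 1e6:.1f}MP")  # 6.2MP
    print(f"4K vs 1080p:  {uhd / fhd:.0f}x")       # exactly 4x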

We can argue that antialiasing is far less of a requirement at this resolution because the sheer pixel density makes jagged edges much less visible. Still, it's worth investigating what effect, if any, adding multisampling to the mix has on overall performance. We've chosen Far Cry 3 as the guinea pig because it's a thoroughly modern game and its non-AA results are borderline playable.

The test is run on the GeForce GTX Titan card and we use no AA, 2x MSAA and 4x MSAA.

Numbers don't mean much without pictorial references, so click on the following links to open up PNG files at the full 3,840x2,160 resolution: 0x AA, 2x AA, and 4x AA. The pictures are best viewed on a 4K panel, of course, but you'll soon get the idea that antialiasing doesn't help much once the resolution is dialled up to 11.

Remember that Titan is the fastest single-GPU card in the business; its performance is leagues ahead of a mid-range card's. Add 2x MSAA and it takes a 15 per cent performance hit; crank that up to 4x and performance falls by almost a third, pushing the game from barely playable to juddery.
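To put those percentages into perspective, here's a hypothetical worked example - the ~31fps baseline echoes the "barely past 30fps" Far Cry 3 figure above and is illustrative rather than a precise measurement:

    # Applying the quoted MSAA performance hits to an assumed baseline
    baseline_fps = 31.0                 # hypothetical no-AA average

    fps_2x = baseline_fps * (1 - 0.15)  # 2x MSAA: ~15 per cent hit
    fps_4x = baseline_fps * (1 - 0.33)  # 4x MSAA: almost a third lost

    print(f"No AA:   {baseline_fps:.1f}fps")  # 31.0fps
    print(f"2x MSAA: {fps_2x:.1f}fps")        # ~26fps
    print(f"4x MSAA: {fps_4x:.1f}fps")        # ~21fps - juddery territory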

Strenuously exercising the back end of the GPU also causes the framebuffer to fill up. Keeping all the data required for 4x MSAA increases the framebuffer load by almost 50 per cent. What's important to realise is that the maximum figure is still within the capabilities of the GTX 780 and HD 7970 GHz.
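For a feel of why that maximum stays comfortably inside 3GB, here's a simplified back-of-the-envelope estimate. It assumes 32-bit colour plus 32-bit depth/stencil per sample and ignores the many extra buffers a real engine allocates, so treat the figures as rough lower bounds rather than measured usage:

    # Rough multisampled render-target memory at 3,840x2,160
    pixels = 3840 * 2160

    def render_target_mb(samples):
        bytes_per_sample = 4 + 4  # 32-bit colour + 32-bit depth/stencil
        return pixels * bytes_per_sample * samples / 2**20

    for samples in (1, 2, 4):
        print(f"{samples}x: {render_target_mb(samples):.0f}MB")
    # 1x: ~63MB, 2x: ~127MB, 4x: ~253MB - a fraction of a 3GB card

Even at 4x MSAA the render targets themselves account for only a few hundred megabytes on this simplified reckoning; the bulk of the 3GB goes on textures, geometry and other buffers, which is why raw horsepower rather than capacity tends to be the limiter.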

Having larger framebuffers remains more of a marketing tool than a real-world benefit, even for enthusiast gamers... there simply won't be many harder cases than rendering to a 4K screen at ultra-quality settings with a semi-pointless 4x MSAA invoked. Should you really want pristine edges in games and find the default render from an 8.3MP screen lacking, more efficient techniques such as FXAA will doubtless take over.

4K - how does it look, thoughts

This article is more about framerate investigation at 4K than subjective analysis of the wow factor induced by having a 4K screen in the office. Sitting an arm's length away from the screen, the extra detail over a 24in, 1080p screen is immediate and obvious; there's just more of everything. The panel's size, at 31.5in across, helps engender the feeling that your peripheral vision is being swamped.

The 4K effect is less pronounced when moving from a 27in, 2,560x1,440 screen. You can see extra detail once you start looking for it, but play any fast-paced shooter and the pixel-density benefits of 4K diminish. We'd probably give up some of that lovely PPI for an even larger screen, to fully engulf our vision.

Our first foray into the world of 4K consumer monitors shows that current high-end cards are, for the most part, able to run high/ultra-quality settings in excess of 30fps when antialiasing is turned off. Performance in GPU-crunching titles such as Crysis 3 or Far Cry 3 reduces markedly once any form of antialiasing is applied, though we'd use it only sparingly given the insane resolution.

The sheer complexity of rendering at 4K with high-quality settings poses new challenges for cards' framebuffers. Our analysis shows that, even when antialiasing is used, it is unlikely that a Radeon HD 7970 GHz's or GeForce GTX 780's 3GB framebuffer will be the main limiting factor; sheer horsepower and memory bandwidth are greater stumbling blocks to silky-smooth performance.

A best guess is that 4K screens won't become mainstream for at least a couple of years. Today's best cards can handle the resolution and image quality without too much fuss, so we're hopeful that the mid-range cards of 2015/2016 will make a decent fist of rendering games at stupid-high resolutions.

So what do you reckon, folks? Is 4K the next big thing in monitor innovation, and how much would you be willing to pay for such high-res thrills?



HEXUS Forums :: 31 Comments

I would prefer to have a solid 120 FPS on every title rather than stupid-high resolution.
Why is it that the most popular games are not necessarily those that look the best, but those that have the best gameplay? For me, this is a step too far. As for how much I'd pay? Well, I won't pay what it costs to get your average 46" HD TV, so I probably wouldn't buy one unless they were competitively priced against standard screens. I cannot even justify the cost of the top-end GPU required to use it.
I'm still waiting for a decent 2,560x1,440 screen (pref IPS, a reasonable refresh rate and not too much lag) that doesn't break the bank.
It's going to be a looong time before I can afford a 4K screen, and even longer before I can justify buying one; especially as it would mean a heavy GPU investment.

I'm happy as I am. :)
Is screen tearing more noticeable on these massive displays? I definitely think ultra resolution is going to be nice for some kinds of slower-moving games - Civ 5 at 4K would be lush :) But others (driving games etc.) would probably do better with better response times - any time you get blur or pixel ghosting you've just killed any resolution advantage.