HEXUS Forums :: 16 Comments

Posted by kalniel - Wed 07 Oct 2020 14:35
Not sure of the value of this, given what's coming in a few days.
Posted by Kanoe - Wed 07 Oct 2020 15:09
Hexus, on the control page you state:

Intel enjoys a small lead at FHD and QHD, but it's not large enough to be perceptible during gameplay.

but your graph actually has AMD ahead at QHD unless the labels on the graph bars are wrong?
Posted by Tarinder - Wed 07 Oct 2020 15:11
kalniel
Not sure of the value of this, given what's coming in a few days.

It's the current state of play. Whichever AMD CPU comes out in a couple of weeks will be added to this to show how Ryzen has progressed. Consider this a baseline article.
Posted by nwmark - Wed 07 Oct 2020 15:14
If you're spending £500-600 on a CPU and £1,400 on a GPU, you are not gaming at 1080p or 1440p - so surely this is a pretty pointless exercise, other than to show Intel in a better light??
Posted by Tarinder - Wed 07 Oct 2020 15:17
nwmark
If you're spending £500-600 on a CPU and £1,400 on a GPU, you are not gaming at 1080p or 1440p - so surely this is a pretty pointless exercise, other than to show Intel in a better light??

We thought that, but there are games where fps is king, typically played at the lower resolutions. It also taps into the recent influx of high-framerate FreeSync/G-Sync monitors.

I'm personally more intrigued to see where upcoming Zen 3 sits in all this; a final nail in the 14nm+++++++++++++++++++++ coffin, perhaps.
Posted by nwmark - Wed 07 Oct 2020 15:21
Tarinder
nwmark
If you're spending £500-600 on a CPU and £1,400 on a GPU, you are not gaming at 1080p or 1440p - so surely this is a pretty pointless exercise, other than to show Intel in a better light??

We thought that, but there are games where fps is king, typically played at the lower resolutions. It also taps into the recent influx of high-framerate FreeSync/G-Sync monitors.

Fair point - didn't really think of that angle.
Posted by Hoonigan - Wed 07 Oct 2020 15:27
The FHD graph on the Tomb Raider page has the colours mixed up.
Posted by Iota - Wed 07 Oct 2020 15:35
nwmark
If you're spending £500-600 on a CPU and £1,400 on a GPU, you are not gaming at 1080p or 1440p - so surely this is a pretty pointless exercise, other than to show Intel in a better light??

Not really. If the 3090 gave me a solid 144fps with all the eye candy at QHD, I'd opt for that over all the eye candy and lower fps at UHD. Personal preference, obviously.
Posted by Tarinder - Wed 07 Oct 2020 15:42
Hoonigan
The FHD graph on the Tomb Raider page has the colours mixed up.

They don't. It's an anomaly of the game that the minimum is reported as more than the maximum. Only happens on the AMD platform.
Posted by BigBANGerZ - Wed 07 Oct 2020 15:48
Tarinder
I'm personally more intrigued to see where upcoming Zen 3 sits in all this; a final nail in the 14nm+++++++++++++++++++++ coffin, perhaps.

Yes indeed, looking forward to that one, though I think the 3080 is more relevant if using gaming benchmarks.
Posted by Hoonigan - Wed 07 Oct 2020 15:52
Tarinder
They don't. It's an anomaly of the game that the minimum is reported as more than the maximum. Only happens on the AMD platform.

Ah right, fair enough. That's very odd!
Posted by Tarinder - Wed 07 Oct 2020 16:49
BigBANGerZ
Yes indeed, looking forward to that one, though I think the 3080 is more relevant if using gaming benchmarks.

Yup, will switch to an RTX 3080 for that one - R9 3950X vs R9 59xx vs i9-10900K. Framerates will be around 10 percent lower.
Posted by LSG501 - Wed 07 Oct 2020 17:23
Would have been nice to have some benchmarks that weren't focused on gaming, especially seeing as Nvidia are promoting the 3090 towards content creators… just so we could see how GPU processing is affected by the CPU etc.
Posted by philehidiot - Wed 07 Oct 2020 19:06
Tarinder
nwmark
If you're spending £500-600 on a CPU and £1,400 on a GPU, you are not gaming at 1080p or 1440p - so surely this is a pretty pointless exercise, other than to show Intel in a better light??

We thought that, but there are games where fps is king, typically played at the lower resolutions. It also taps into the recent influx of high-framerate FreeSync/G-Sync monitors.

I'm personally more intrigued to see where upcoming Zen 3 sits in all this; a final nail in the 14nm+++++++++++++++++++++ coffin, perhaps.

Ah, that fully automatic nail gun coming into play again…
Posted by nobodyspecial - Fri 09 Oct 2020 09:19
I'd happily pay that and buy a $1,200 1600p monitor for it (c'mon Dell, with a G-Sync chip!), but I'd be happy on my 1200p monitor for a while longer even. 4K means nothing to most people. Get over it. Wake us when it hits 10% of the market; until then, 1080p is what 66% of those buyers are doing, pretty much. Right now 1440p + 4K users don't hit 7% IIRC (checked a week or two ago). Close enough. AMD is advertising the 5000-series CPUs at 1080p because that is what people are using two-thirds of the time. 4K just won't mean much to me for a few years at least. My games look great maxed at 1080p (nothing turned down), and I won't play a game unless I CAN do that 100%. I wait for the right card to get that done. Everyone I grew up with playing games and building PCs does the same today. They want to see what the dev built.
Posted by Iota - Sun 11 Oct 2020 18:51
@Tarinder.

Just out of curiosity, do you know how the Gears 5 game engine scales resolutions? I ask purely because I get pretty much the same fps on my 2080 FE when rendering at 2K or 4K in the game (4K definitely looks a lot better, though).

https://imgur.com/a/o8tP4Mj