HEXUS Forums :: 20 Comments

Posted by ZaO - Wed 17 Dec 2014 13:47
Relatively good value for money, this thing. I think these are gonna start selling like crazy with their new price! Shame AMD don't have a nice CPU equivalent to go with this thing lol… >_<
Posted by wingtit - Wed 17 Dec 2014 14:45
Very nice, but one would be the preferred option.
Posted by crossy - Wed 17 Dec 2014 15:42
We'd still opt for three GeForce GTX 980s instead of four GPUs housed inside two R9 295X2s, based on our testing, but AMD does win out when value is as important as sheer performance.
Unless I'm reading the figures wrong, I'd slightly disagree with that last bit. According to your benchmarks, the AMD solution (cheaper, remember!) manages to beat the best of the Green team in the BioShock, Crysis and GRID benchmarks, and it's pretty close in 3DMark, with Nvidia pulling ahead in Tomb Raider and Total War.

I'd come to the conclusion that - ignoring the noise and power draw - the top end from Nvidia and AMD are pretty evenly matched. The decision then becomes whether to take the purchase cost saving and weigh it against the increased power needed to run the beastie.

Not that this is likely to be an issue for me since I'm firmly stuck at 1080p gaming.
Posted by AlexKitch - Wed 17 Dec 2014 16:05
1200W?! That's like running a powerful microwave oven. Potentially for hours on end. Every day.

That'll make for an eye-watering electricity bill.
Posted by Brian224 - Wed 17 Dec 2014 16:15
AlexKitch
1200W?! That's like running a powerful microwave oven. Potentially for hours on end. Every day.

That'll make for an eye-watering electricity bill.

The difference between both options is about 7p an hour, so you would need to game for about 21,000 hours on quad Crossfire to eat up the price difference of about £1500. That is 8 hours a day, 7 days a week for 7 years. If you played 24/7 (or in shifts with friends) you could cut the payback on the Nvidia cards to about 2 1/2 years though :)
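For anyone who wants to check my sums, here's a quick sketch - the 7p/hour and £1500 gap are the rough figures above, not exact prices:

```python
# Rough payback sketch using the figures above: ~7p/hour extra running cost
# for the quad-CrossFire setup vs a ~£1500 purchase-price gap (both estimates).
extra_cost_per_hour = 0.07   # £/hour extra electricity on quad CrossFire
price_gap = 1500.0           # £ difference in purchase price

hours_to_break_even = price_gap / extra_cost_per_hour

years_at_8h_per_day = hours_to_break_even / (8 * 365)
years_at_24_7 = hours_to_break_even / (24 * 365)

print(f"Break-even: {hours_to_break_even:,.0f} hours "
      f"({years_at_8h_per_day:.1f} years at 8h/day, "
      f"{years_at_24_7:.1f} years running 24/7)")
```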
Posted by Spreadie - Wed 17 Dec 2014 16:53
Brian224
AlexKitch
1200W?! That's like running a powerful microwave oven. Potentially for hours on end. Every day.

That'll make for an eye-watering electricity bill.

The difference between both options is about 7p an hour, so you would need to game for about 21,000 hours on quad Crossfire to eat up the price difference of about £1500. That is 8 hours a day, 7 days a week for 7 years. If you played 24/7 (or in shifts with friends) you could cut the payback on the Nvidia cards to about 2 1/2 years though :)

Price difference of £1500? Are these cards free then?

The review says they'd still opt for three 980s, not specifically the Waterforce trio. Three OC'd 980s might come in at £1500 total, less the £1000 ish for the 295x2s. That's nearer £500 than £1500.
Posted by Roobubba - Wed 17 Dec 2014 19:54
Then you'd have to factor in whether it's worth using them for mining, and the relative hash rates you'll get between the red and green team.

I am pretty sure that I could do the calculations, but really I can't be bothered :)
Posted by Brian224 - Wed 17 Dec 2014 20:01
Spreadie
Price difference of £1500? Are these cards free then?

The review says they'd still opt for three 980s, not specifically the Waterforce trio. Three OC'd 980s might come in at £1500 total, less the £1000 ish for the 295x2s. That's nearer £500 than £1500.

True, I was taking the price of the graphics setup that was used in the comparative tests, as some of the comments on noise and temperature may not apply to the standard cards. Even so, it would still take about 2 1/2 years at 8 hours a day, 7 days a week of 4K gaming to recover the difference. Frankly, if you can afford either setup, I doubt you'll worry too much about the electricity bill - especially if you factor in the free space heating provided by the AMD cards :)
Posted by KeyboardDemon - Thu 18 Dec 2014 08:55
What I found most interesting was that for less than £500 it's possible to get a dual-GPU setup that beats single GTX 980 cards, which start at £430 for stock reference models and slightly more for overclocked non-reference models. Had the choice been available when I bought my GTX 780 Ti, my GPU may well have been flying a different colour.
Posted by GrahamC - Thu 18 Dec 2014 11:04
Brian224
AlexKitch
1200W?! That's like running a powerful microwave oven. Potentially for hours on end. Every day.

That'll make for an eye-watering electricity bill.

The difference between both options is about 7p an hour, so you would need to game for about 21,000 hours on quad Crossfire to eat up the price difference of about £1500. That is 8 hours a day, 7 days a week for 7 years. If you played 24/7 (or in shifts with friends) you could cut the payback on the Nvidia cards to about 2 1/2 years though :)

The point is that BOTH are eye-watering on the running-cost front. These setups will raise your electricity bill each month compared to a ‘normal’ setup.
Posted by AlexKitch - Thu 18 Dec 2014 12:59
Well, numbers aside, I think I'm a bit unusual in that I'd rather actually not own power guzzling hardware like this, regardless of whether I could afford the electricity bill.

I'm actually more impressed by moderately powerful setups that manage to be ‘Green’. My own setup at home could be described as ‘high end’ in terms of its benchmark and gaming performance, and can probably chew through 500W on full throttle, but when I'm just sat listening to music or writing code it's using barely anything at all. Probably less than 100W. This pleases my OCD.
Posted by Michael H - Thu 18 Dec 2014 22:53
Tarinder
Positive performance-related upsides are ameliorated by huge power consumption and increased noise that is rather too conspicuous when compared to a custom setup like the Gigabyte WaterForce

I don't think the use of ameliorated is correct as melioration/amelioration is a positive development process.

Perhaps counterpoised would be more apposite?

I only discovered the word melioration when looking for an alliterative title for a F1 blog post a few months back, so it stuck out like a sore thumb when I read it.
Posted by scaryjim - Thu 18 Dec 2014 23:20
GrahamC
… These setups will raise your electricity bill each month compared to a ‘normal’ setup.

Turning on more things uses more electricity - shocker! ;)

If you've invested heavily in your gaming rig as your main pastime, you'll probably find that the cost of running your PC during gaming for a month isn't actually any more expensive than, say, going to the movies once a week, or having a gym membership. I mean, it's not like your computer is just using that power up to no end - you're investing that running cost in your entertainment. 1200W from the wall costs about 15p an hour; a frugal (say, 200W) machine will cost around 4p an hour. I'm willing to bet that plenty of people would consider 11p an hour a reasonable cost for being able to turn all the pretties up to maximum on a 4K screen ;)
Posted by ZaO - Fri 19 Dec 2014 01:59
I'm pretty sure my electricity costs 26p per kWh during the day. So, if I gamed on a rig drawing 1000W for 4 hours a day, that'd be £87.36 per quarter. Now if I had a rig that only drew 500W while gaming, that'd be £43.68 per quarter. Something like that can make a big difference to me when it comes to paying the bills! Please correct me if I'm wrong on those calculations. I do have a habit of messing up with numbers :P
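For what it's worth, those numbers do check out if a ‘quarter’ means 12 weeks (84 days) - here's a quick sketch, with the tariff and hours taken from the post (the 84-day quarter is my assumption, since it's the only reading that reproduces £87.36 exactly):

```python
# Sanity check of the quarterly figures above, assuming a 12-week (84-day)
# quarter and the quoted daytime tariff.
TARIFF = 0.26          # £ per kWh (daytime rate quoted above)
HOURS_PER_DAY = 4
DAYS_PER_QUARTER = 84  # 12 weeks - assumed, not stated in the post

def quarterly_cost(watts):
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_QUARTER
    return kwh * TARIFF

print(f"1000W rig: £{quarterly_cost(1000):.2f}")  # £87.36
print(f" 500W rig: £{quarterly_cost(500):.2f}")   # £43.68
```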
Posted by glenster - Fri 19 Dec 2014 04:29
Doesn't “ameliorated” mean “made better”?
Posted by ElManCub - Fri 19 Dec 2014 10:52
crossy
We'd still opt for three GeForce GTX 980s instead of four GPUs housed inside two R9 295X2s, based on our testing, but AMD does win out when value is as important as sheer performance.
Unless I'm reading the figures wrong, I'd slightly disagree with that last bit. According to your benchmarks, the AMD solution (cheaper, remember!) manages to beat the best of the Green team in the BioShock, Crysis and GRID benchmarks, and it's pretty close in 3DMark, with Nvidia pulling ahead in Tomb Raider and Total War.

I'd come to the conclusion that - ignoring the noise and power draw - the top end from Nvidia and AMD are pretty evenly matched. The decision then becomes whether to take the purchase cost saving and weigh it against the increased power needed to run the beastie.

Not that this is likely to be an issue for me since I'm firmly stuck at 1080p gaming.
But it is 4 GPUs versus 3, so I'd imagine red would win anyway - but what if it had been 4 for the green team?
Posted by crossy - Tue 23 Dec 2014 15:26
ElManCub
But it is 4 GPUs versus 3, so I'd imagine red would win anyway - but what if it had been 4 for the green team?
I know what you're getting at, but I'll politely suggest that the way these top-end cards deliver is pretty irrelevant. I mean, the kind of “xtr3m3” gamer that's going to sink a lot (!) of money into these kinds of products isn't going to much care whether that 60fps+ is delivered by a single GPU, a pair, a trio or a quad. In addition, the Nvidia fanboys (of whom I'm not one) have been saying that the 970/980s are “slam dunks” against naff AMD products. Hexus' benchmarks - as I said before - rather put the lie to this assertion; in fact the top-end devices from each camp are actually pretty evenly matched.
Posted by claylomax - Wed 24 Dec 2014 12:13
AlexKitch
1200W?! That's like running a powerful microwave oven. Potentially for hours on end. Every day.

That'll make for an eye-watering electricity bill.

Are you sure? An 800W microwave uses around 2400W, and a 1200W one probably uses more than 3000W. Always check the sticker on the back, not the number on the door.
Posted by jackjack - Tue 06 Jan 2015 07:35
Excellent review! Thank you!
Posted by go4brendon - Tue 06 Jan 2015 10:55
I recently upgraded to a 295X2. Probably one of the finest gfx cards I've ever owned. My only gripe is coil noise - couldn't imagine running two of them in CrossFire.