I think we'd all welcome less power usage and less heat, provided we still have the performance we like. My graphics card alone uses a lot more power than a current gen games console haha… :/
Replacing your GTX780 with a 970 to save the planet? “Just off to the shops to save the planet dear”, don't think that really flies as an excuse, but am willing to give it a go ;)
But replace your i7 4820K with a Pentium Anniversary Edition? They lost me there.
Edit to add: Non paid for version of the article seems to be linked here
https://sites.google.com/site/greeningthebeast/energy
So, replace your PC parts (ie: make them redundant, increase waste, increase landfill) with new parts that require power to create in order to save energy?
Oh, and let's also not forget the cost.
Don't get me wrong, I do think that newer architectures should be more energy efficient, but don't make everyone replace stuff just for the sake of it.
Also, how about investing some of that time and money in clean, renewable energy - let's make generating it more efficient too!
I'd quite happily change my system for a more energy efficient machine, but I do not have the wherewithal to do that. My ‘old’ machine is what I have and that is that. Unless of course the manufacturers are willing to reduce the price of the components (I must be having a laugh :p ), I can't see myself doing what has been suggested anytime soon.
In an ideal world we all want to be saving power wherever we can, but we don't or can't, and therefore the power drain will continue :/
ZaO
I think we'd all welcome less power usage and less heat, provided we still have the performance we like. My graphics card alone uses a lot more power than a current gen games console haha… :/
I had a couple of 6990s a few years back (bought with bitcoins, to mine yet more).
Used to happily idle away at just over 1000W.
Cat used to love sitting by the vents.
Long since been sold off, but still have a hulking 1200W supply feeling underused.
goldcd
ZaO
I think we'd all welcome less power usage and less heat, provided we still have the performance we like. My graphics card alone uses a lot more power than a current gen games console haha… :/
I had a couple of 6990s a few years back (bought with bitcoins, to mine yet more).
Used to happily idle away at just over 1000W.
Cat used to love sitting by the vents.
Long since been sold off, but still have a hulking 1200W supply feeling underused.
To be honest it's probably more power efficient and quieter than a PSU just powerful enough to run your system. Assuming you aren't still running a 7900GTX.
I'm disappointed how shallow this study was, it should have at least gone further than testing two machines and comparing 10 arbitrary off the shelf rigs.
goldcd
I had a couple of 6990s a few years back (bought with bitcoins, to mine yet more).
Used to happily idle away at just over 1000W.
Cat used to love sitting by the vents.
Long since been sold off, but still have a hulking 1200W supply feeling underused.
That's a ridiculous amount of power to be used while idle….. Something must've been wrong!?
I love having a lot of headroom with the psu. While an overpowered psu can draw a bit more power when the computer is idle, they can be more efficient when the computer is under load, as they're not as near to their maximum output.
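That headroom point is easy to sketch with numbers. Here's a rough back-of-envelope in Python; the efficiency figures are illustrative guesses loosely shaped like typical 80 Plus curves (efficiency peaks around mid load and drops off near maximum output), not measurements of any real PSU:

```python
# Sketch: why an oversized PSU can draw less from the wall under load.
# Efficiency values below are made-up illustrative numbers.

def wall_draw(dc_load_w, psu_rating_w):
    """AC power drawn from the wall for a given DC load,
    using a hypothetical efficiency curve keyed on fractional load."""
    frac = dc_load_w / psu_rating_w
    if frac < 0.2:
        eff = 0.82   # very light load: efficiency suffers
    elif frac < 0.8:
        eff = 0.90   # mid-load sweet spot
    else:
        eff = 0.85   # near max output: efficiency drops again
    return dc_load_w / eff

load = 400  # watts of DC the system needs while gaming
for rating in (450, 850):
    print(f"{rating}W PSU at {load}W load -> {wall_draw(load, rating):.0f}W from the wall")
```

With these assumed curves the 850W unit sits in its sweet spot at a 400W load and draws slightly less from the wall than the 450W unit running near its limit, which is the headroom argument in a nutshell.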
The heat thing is also something I like to look at as some sort of bonus sometimes haha.. I don't have central heating at my place, only a 2.5kW electric fan heater that I bought. So if I'm gaming in the winter, I don't need the heater on so much. I might think about hooking up a frying pan in place of the heatsinks so I can cook with it too. Now that's efficiency! :P
Do you knit yogurt? Concerned that your yurt is the right colour? Want to game it up but worried about your carbon footprint? Well now you can play games to your heart's content with the game-o-cycle! Our patented game-o-cycle uses 100% renewable power for all your gaming needs. A unique set of chains and cogs allows up to 40 bicycles to power your PC (You'll need 'em). Fun for all the commune!
On a more serious note, efficiency has to be balanced with expenditure. No point “upgrading” to a more efficient part if the money saved on electricity bills is going to be less than the cost of the upgrade over the life of that upgrade.
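That break-even calculation is simple enough to write down. A quick Python sketch, where every number is hypothetical; plug in your own part price, tariff and usage:

```python
# Break-even check for an "efficiency upgrade".
# All figures are placeholders, not real prices or tariffs.

upgrade_cost = 150.0      # cost of the more efficient part (currency units)
watts_saved = 60.0        # power saved while the machine is in use
hours_per_day = 4.0       # daily usage at that saving
price_per_kwh = 0.15      # electricity tariff per kWh

kwh_saved_per_year = watts_saved / 1000 * hours_per_day * 365
saving_per_year = kwh_saved_per_year * price_per_kwh
years_to_break_even = upgrade_cost / saving_per_year

print(f"Saves {kwh_saved_per_year:.0f} kWh/year "
      f"({saving_per_year:.2f}/year), break-even in "
      f"{years_to_break_even:.1f} years")
```

With those placeholder figures the upgrade takes over a decade to pay for itself, which is usually longer than the part's useful life.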
All my upgrades create second hand parts which get recycled to family and friends. It's a waste to chuck good parts into the landfill.
Efficiency comes with technological progress. As does “cleaner” energy supply. No one would give a monkeys how much your PC used if it was being supplied by electricity from fusion power.
We aren't even at Type 1 on the
https://en.wikipedia.org/wiki/Kardashev_scale
Putting on my imaginary Treasury politician hat….. This means that we should all contribute to the cost of these power stations by way of a series of non-means-tested graphics card tax bands, with CPU-integrated/<25 watt parts being tax-free, and then a one-off purchase tax set at, say, £40 for every 20 additional watts. Or alternatively, perhaps a simpler way would be to create a new VAT category of “computer graphics output equipment” at, say, 50% or 80%. Maybe even have it over 100%.
You know it's probably the sort of thing they would start thinking about if they saw stories like this…
/s
I've long been someone who picks lower power alternatives when putting together a PC - speed isn't everything, so if something can be 99% of performance but with 10% less power (or more), then it's worth it. My current CPU is running at stock speed, but undervolted, saving me around 16 watts on the CPU side at full load; not much, but it all helps.
I should hopefully see a few efficiency increases going from E6550/GTX480 -> i7 6700K/GTX970, the difference alone between the power the 480 uses and the 970 uses is pretty substantial to say the least.
I'm guessing that a heavy processor overclock with increased voltages is not helping!!!
Please don't let my wife see this
A fuller comparison would also include:
- Average duration of gaming power load compared to idle/normal workload, and both as a percentage of yearly power-on time
- Energy cost of manufacturing a new card and disposal of the old card
- Average working lifetime of a card, compared to card power (bets on high-end card owners upgrading more regularly than mid/low-end card owners)
- Amortised energy cost of manufacture over projected card lifetime
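The last two points can be sketched together. A Python back-of-envelope, where the embodied-energy figure and usage pattern are placeholder assumptions, not data from any study:

```python
# Sketch: amortising a card's manufacturing energy over its lifetime
# and comparing it to use-phase energy. All inputs are hypothetical.

embodied_kwh = 300.0        # assumed energy to manufacture one card
lifetime_years = 3.0        # how long the owner keeps it
gaming_hours_per_day = 2.0  # average daily gaming load
card_power_w = 180.0        # card draw while gaming

use_kwh = card_power_w / 1000 * gaming_hours_per_day * 365 * lifetime_years
amortised_kwh_per_year = embodied_kwh / lifetime_years

print(f"Use-phase energy: {use_kwh:.0f} kWh over {lifetime_years:.0f} years")
print(f"Manufacturing amortised: {amortised_kwh_per_year:.0f} kWh/year")
```

The point of the exercise: if high-end owners really do upgrade more often, the amortised manufacturing term grows even as the use-phase term shrinks, so a fair comparison has to include both.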
They were doing this story about TV screens a few years back. If anything PC parts are using less power than a few years ago because the industry is already self-aware that the power requirements were getting out of hand.
The only people guilty of using a lot of power are the people with dual or triple high end cards, and this is a minority of users.
If you want to moan on about unnecessary power use in home electronics, how about getting people to sit closer to the TV instead of having an 80 or 90 inch one.
NB: I went from a GTX460 to a GTX970 and the power (measured at socket) was almost the same
I think APUs and FreeSync monitors will have a very big impact between now and 2020. You can't get much more efficient than removing the GPU and its associated power overheads.
jigger
I think APUs and FreeSync monitors will have a very big impact between now and 2020. You can't get much more efficient than removing the GPU and its associated power overheads.
errm GAMING. High end gaming. On an APU? High end. really?
maybe one day.
In the meantime, why waste so much time doing these BS studies when we could just build a raft of nuclear power stations and be done with it.
DanceswithUnix
But replace your i7 4820K with a Pentium Anniversary Edition? They lost me there.
Yup.
For a start few bought the 4820k for gaming, and then to imagine that a Pentium Anniversary Edition is going to make no difference to performance is absurd. Maybe not in the artificial benchmark they tested (unigine heaven) but I'd expect to see some significant minimum/99% FPS differences in actual games.
The Display is an old Apple Cinema HD 23" display that's been replaced with, not another £500 specialist IPS photo editing display, but a TN that's at least half a decade newer and half the price.
You only have to glance through the rest of the article too to see that there's a lot of pointless ‘research’ here. Tables with TDP figures labelled as ‘nameplate power’ and ‘W/GHz’ figures calculated using them.
Shame on Hexus for reporting it.
Shallow reporting, not enough substance… but things like this do at least make us all think and start a dialogue… I reckon the natural progression of upgrades and tech improvements will mean that we all gradually move into more efficient kit.
However, it is all moot anyway, because by then the zombie apocalypse will have descended upon us and we won't get much chance to game, and before long no chance at all once all the power stations fail. Finally, once the power station staff are all zombies themselves, all those nuclear power stations are going to go into meltdown and we'll all be toasted anyway :)
Power consumption problems solved!
Vorlon99
Shallow reporting, not enough substance… but things like this do at least make us all think and start a dialogue… I reckon the natural progression of upgrades and tech improvements will mean that we all gradually move into more efficient kit.
However, it is all moot anyway, because by then the zombie apocalypse will have descended upon us and we won't get much chance to game, and before long no chance at all once all the power stations fail. Finally, once the power station staff are all zombies themselves, all those nuclear power stations are going to go into meltdown and we'll all be toasted anyway :)
Power consumption problems solved!
I'll let you off cos you're new here, but as we've gone over in other threads at quite some length, modern reactor designs have automatic fail-safe cut-outs that, in the event of everyone stopping doing anything, would shut the core down safely. This constant scaremongering about nuclear power station design serves only to worry the public into naive thinking that “nuclear bad, wind and solar good”. It's rubbish and we should get on with nuclear while we have time before oil really gets scarce.
ik9000
I'll let you off cos you're new here, but as we've gone over in other threads at quite some length, modern reactor designs have automatic fail-safe cut-outs that, in the event of everyone stopping doing anything, would shut the core down safely. This constant scaremongering about nuclear power station design serves only to worry the public into naive thinking that “nuclear bad, wind and solar good”. It's rubbish and we should get on with nuclear while we have time before oil really gets scarce.
Well, thanks, but I think you misconstrued my comment. Bearing in mind I was talking about zombies (not real), I would have thought you would see my comment for what it was, i.e. humorous. However, to continue with the serious tone you have adopted (and I agree with you re nuclear power), in the event of a zombie scenario ALL the utilities would break down. There could easily be a situation where power is disrupted by violent action/reaction to the apocalypse, and the flow of water (essential coolant) could be disrupted by bodies blocking intakes, pumps failing etc etc, so there could easily be the odd reactor that goes into meltdown. After all, the earthquake in Japan (no zombies involved) and Chernobyl (no zombies involved) certainly went wrong…
But my original post was meant in a humorous vein… happy?
ik9000
errm GAMING. High end gaming. On an APU? High end. really?
maybe one day.
In the meantime, why waste so much time doing these BS studies when we could just build a raft of nuclear power stations and be done with it.
I use a 7850k and was very surprised just how much performance is on offer. With some overclocking and fine tuning it can run just about anything at 1920x1080 on mid/high. Yeah it's not high end and you will be at about 30-40FPS most of the time, but you can see the potential for high-end performance APUs in the future.
I can see a Zen based APU with HBM powering games at 2560x1440 next year. Add FreeSync and the new APIs to the mix, and I think the future for APUs looks very good.
Vorlon99
Well, thanks, but I think you misconstrued my comment. Bearing in mind I was talking about zombies (not real), I would have thought you would see my comment for what it was, i.e. humorous. However, to continue with the serious tone you have adopted (and I agree with you re nuclear power), in the event of a zombie scenario ALL the utilities would break down. There could easily be a situation where power is disrupted by violent action/reaction to the apocalypse, and the flow of water (essential coolant) could be disrupted by bodies blocking intakes, pumps failing etc etc, so there could easily be the odd reactor that goes into meltdown. After all, the earthquake in Japan (no zombies involved) and Chernobyl (no zombies involved) certainly went wrong…
But my original post was meant in a humorous vein… happy?
Once the core is shut down, water is only needed for a temporary period to cool the reactor. Once it is cool no further water is needed. It's only an issue if there's no water or other coolant (liquid metal etc) available for the automatic systems to use.
Fukushima stood up to a larger earthquake than it was designed for, and a simultaneous tsunami that it was never designed for, and still didn't go into meltdown. Had the Japanese cultural inability to admit a problem not got in the way, there would have been time for proper assistance to pump water into the thing properly before there was any release of anything. Remember it was several days afterwards that it happened. And again, modern designs are far better on a number of levels.
ik9000
Once the core is shut down, water is only needed for a temporary period to cool the reactor. Once it is cool no further water is needed. It's only an issue if there's no water or other coolant (liquid metal etc) available for the automatic systems to use.
Fukushima stood up to a ……
Thanks m8 for the additional info, and for taking the time to explain… seriously appreciated :)
The level of stupid in this article amazes me.
The ideal would be low enough power consumption to eliminate all PC fans - then we reduce all that damn noise pollution too - the bane of my life.