I tried using my GPU for video transcoding with HandBrake. The speed was nice but the file sizes were much larger, so I stopped.
I've also used my GPU intermittently for Folding@home for a couple of years now.
Not with the present 1070 Ti but, if I were daft enough to invest in a 3090, I'm sure it would keep our office nicely warm through the winter, with the central heating radiator turned off.
Used to do folding@home years ago, as well as graphics work.
Now? Nope, games and for reading stuff online.
DxO uses the GPU for RAW conversion.
Space heater in winter? FE and rendering; some of my software does use CUDA and GPU compute stuff that I'd be lying if I said I understood in any detail, but certain cards do help certain packages. OpenGL is more important than CUDA, IIRC.
ik9000
Space heater in winter?
Damn it, you beat me to it :p
I could be pedantic and say everything I do on a PC is at least displayed by the GPU, but no, apart from being able to watch high res video and play games, it serves not much other purpose for me :D
Use Lightroom a lot. Don't know if it helps or not
MPC-BE hardware acceleration, and if I'm feeling really crazy I even enable it in Firefox.
Fogoldgaming
Use Lightroom a lot. Don't know if it helps or not
Adobe say it can. But my laptop runs it with just Iris graphics, so I'm not sure how much it actually matters.
Yes, a heater! It heats my front room up great! A couple of hours of Prime + FurMark, nice and toasty!
Back in 2010, I wrote a proggie in college that used CUDA to accelerate the workflow. But that was for a class, and CUDA wasn't one of the technologies I kept using after college.
Nowadays, the only thing would be video acceleration, which happens in the background, and considering that my Matrox Millennium is fast enough to play DVDs, I don't need a modern dGPU for that.
I did run some GPU-based WCG and f@h for a while. I still run WCG, but it hasn't had a GPU-accelerated project in years. For f@h, I used to have issues on my late-2000s NVIDIA GPU where it would use so much of the GPU that moving the mouse was laggy, so I stopped. With my more recent AMD GPUs, f@h always recorded the minimal credit even if I finished early, so I lost confidence that the results were completing properly and went back to just running WCG on my CPU.
I have my web browsers set to not use GPU acceleration, as I find I have fewer visual artifacts with that disabled.
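For anyone wanting to try the same, here's a rough sketch of the relevant Firefox about:config prefs. Pref names change between Firefox versions (these are from older releases), so treat them as illustrative rather than definitive:

```
# about:config — disabling GPU acceleration in Firefox (verify names in your version)
layers.acceleration.disabled = true    # turn off hardware-accelerated compositing
gfx.webrender.all            = false   # don't force the GPU-based WebRender compositor
```

Other browsers expose the same idea differently; Chromium-based ones have a "Use hardware acceleration when available" toggle in settings.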
I use mine for anything but gaming. Mostly photo editing.
I've used it for some Photoshop and Sony Vegas acceleration in the past but I've no idea how much it helped me. Aside from that, sod all.
Yes, video work. That's about it; as I don't game, a 5700 XT is pretty overkill, because it's not like I work with video a lot.
I wish I could game, but today's games are so stupid.
OpenCL, compute stuff. Most of my programs can use the GPU for a speed-up these days.
Let me tell you, there are many applications that can use the GPU, even if they won't use 100% of it: video players, window compositing (Windows Aero, or whatever it's called these days), web browsers, MS Office, etc. I know Adobe Lightroom can use it for various tasks too; you only know it's there because there's an option to turn it off if it doesn't work for you.
I do not use the GPU for anything that is demanding other than games. Heck, I do not tend to use the home computer other than for games.
Mostly encoding video whilst streaming. It gives about a 10% performance hit, so I'm not really doing much streaming atm, since the RTX 2080 FE can't quite do that and 100+ fps in Warzone at the same time, but that won't be an issue with the 3080 ;)
Other than gaming just BOINC (MilkyWay@Home, Collatz, PrimeGrid & Einstein@Home)
Folding @ home. Web browsing. Once in a long while, Adobe stuff.
I used to, but they're too slow: 2x 1070 JetStreams, and SLI has never worked on X570 boards because NVIDIA is too cheap to fix it. Anything that uses CUDA will just make the second card stop working after a while, or be disabled on boot, even though both cards work fine on their own and both slots run single cards fine without SLI. Still, H.264 NVENC is 3x faster than H.264 on the CPU (Ryzen 3950X) in HandBrake.
The cost-to-CUDA-performance ratio has never been worth upgrading for, outside of some rubbish games.
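For comparison's sake, this is roughly the sort of invocation involved, sketched from memory; encoder names and flags vary by HandBrake version, so check `HandBrakeCLI --help` before relying on them:

```shell
# CPU x264 encode — slow, but typically smaller files at a given quality level
HandBrakeCLI -i input.mkv -o out_cpu.mp4 -e x264 -q 22 --encoder-preset medium

# NVENC hardware encode — much faster, typically larger files at similar quality
HandBrakeCLI -i input.mkv -o out_nvenc.mp4 -e nvenc_h264 -q 22
```

That size/speed trade-off is why several posters here tried GPU encoding and went back to the CPU.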
I tried frying an egg on one once…
3D rendering, 2D design (where it's used), encoding/decoding/transcoding video… in some cases just using the software can be as heavy as gaming, if not heavier.
To be honest, I do look for GPU support in software these days; there's no point in having an expensive GPU if all it's going to do is display stuff on the screen.
Mining on my Plex/Sonarr server, but otherwise only gaming.
Well, first and foremost I use the GPU in my PC to get an image on my monitor. A close second would be gaming. ;)
As for general purpose usage, not so far. If that will change in the future, who knows, but I'm definitely happy that I have the opportunity to do so if I so choose.
DaVinci Resolve, otherwise the rendering speed is pretty naff on my Hades Canyon.
Yes! Everyone does! Modern browsers are designed to use the GPU for video playback/decoding and some other features. Modern video players also use GPUs for video decoding and effects. But also, yeah! I do purposefully use my GPU for accelerating 3D rendering. And, of course, rendering the viewport. And to accelerate other creative apps like Photoshop, Illustrator, etc.
The built in GPU on my CCTV desktop is used by the software for decoding the camera streams, does that count?
As above, tried out some folding and mining previously. Also tried using it for encoding, but the file sizes were poor, so I stuck with my CPU.
I use it to display many many chrome tabs
pp05
As a fan heater.
This is exactly what I intend to use my 3090 for.
From this question I'm wondering if I have a couple of old cards I can use as table tennis bats.
Yes, I use my GPU for 3D rendering with several programs.
BOINC GPU projects - e.g. Collatz, PrimeGrid, Milky Way etc…
Mostly for Distributed Computing (Boinc).
Not now. When I was day trading, it was a necessity.
Multiple screens with multiple windows and multiple data feeds.
The data was very intense, and needed to be up to date and quickly displayed.
All the time…
Editing images and high-resolution videos…
Almost every other day…