Do high-end GPUs waste power when you’re not gaming?


You bought a beefy GPU for gaming, or maybe for generating AI art, and when it's running at full throttle, it consumes a lot of power. But what about the boring day-to-day stuff? Is it a waste?

Idle power is nearly equivalent across all GPUs

Whenever new GPUs are released, there's always talk of how much power they draw, and curious PC builders want to know if they'll need a new power supply to support its new (and almost always higher) power demands.

It would be easy to assume that those new beefier GPUs consume a lot of power. In fact, a thought like “I spend almost all my time working on spreadsheets and documents for work and almost no time gaming, so I’m probably wasting a ton of electricity on this thing” might cross your mind.

Fortunately, that is not the case. Despite the significant difference between the maximum capabilities of different graphics card generations, there is little variation between idle or near-idle loads.

In terms of capacity and maximum processing power, there is a big difference between cards like the GTX 1060 and RTX 3080. But their idle power consumption varies by an almost negligible amount. A GTX 1060 uses around 5W of power while idle and an RTX 3080 uses around 15W of power while idle.

That’s not exactly the same power consumption, but it’s a pretty trivial difference. You’re not exactly destroying the environment or risking paying a power bill you can’t afford with that variation in idle power consumption. At 12 cents per kWh and using your computer for 8 hours per day, the difference in idle power consumption between the two cards equates to ~$0.29 per month.
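The arithmetic behind that figure is easy to check. Here is a minimal Python sketch using the article's example numbers (5W vs. 15W idle, 8 hours per day, 12 cents per kWh, and an assumed 30-day month):

```python
# Monthly cost of the *difference* in idle power draw between two GPUs,
# using the article's example figures (not measurements of your own setup).

def monthly_cost(watts, hours_per_day, price_per_kwh, days=30):
    """Electricity cost of running a load at `watts` for a month."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

gtx_1060_idle = monthly_cost(5, 8, 0.12)   # ~$0.14/month
rtx_3080_idle = monthly_cost(15, 8, 0.12)  # ~$0.43/month
print(f"Idle-cost difference: ${rtx_3080_idle - gtx_1060_idle:.2f}/month")
# prints "Idle-cost difference: $0.29/month"
```

Plug in your own local electricity rate and usage hours to see how little the idle gap matters for your bill.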

If you want to delve into the energy consumption data for different cards, you can poke around the internet and search for specific cards. However, a useful shortcut to a lot of digging and comparisons is to check out the power consumption stats in the in-depth card reviews on TechPowerUp, like this review of the ASUS RTX 3080 Noctua OC.

For every card they review, they test it extensively, giving you power breakdown data for idle, multiple monitors, gaming, and peak power consumption. You may not want to worry so much about power consumption data, but if you do, it’s there. Simply find your card or a similar model and drill down.

Speaking of worrying about power consumption, if you want to test your PC’s overall power consumption at idle and under load (or any other device), check out our guide to monitoring power usage.

Power usage under load is a different story

Naturally, power consumption under load is different for newer, more powerful cards, which is why you often have to upgrade your power supply to meet the demands of a new GPU.

Under load, while playing a demanding game or doing some rendering work, the aforementioned GTX 1060 could reach 125W of power consumption.

On the other hand, an RTX 3080 could easily hit 345W while playing demanding games. That's a 220W difference, which is certainly not a trivial amount.

However, it still doesn't have as much of an impact on your energy bill as you might think. Let's say you're gaming for four hours a night, playing games that really push your GPU. Again, using 12 cents per kWh as our benchmark, you'd spend $1.80 per month on the GTX 1060's power consumption and $4.97 per month on the RTX 3080's power consumption.

So even assuming every second of your gaming maxed out the GPU (which it almost certainly doesn't), you'd only spend $3.17 more a month for the same amount of gaming.
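The gaming figures check out the same way. A quick sketch, again using the article's example numbers (125W vs. 345W under load, 4 hours a night, 12 cents per kWh, an assumed 30-day month):

```python
# Monthly gaming cost at full load, using the article's example draws.

def monthly_cost(watts, hours_per_day, price_per_kwh, days=30):
    """Electricity cost of running a load at `watts` for a month."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

gtx_1060 = monthly_cost(125, 4, 0.12)  # $1.80/month
rtx_3080 = monthly_cost(345, 4, 0.12)  # ~$4.97/month
print(f"GTX 1060: ${gtx_1060:.2f}, RTX 3080: ${rtx_3080:.2f}, "
      f"difference: ${rtx_3080 - gtx_1060:.2f}")
# prints "GTX 1060: $1.80, RTX 3080: $4.97, difference: $3.17"
```

Note that this is a worst-case estimate: real gaming sessions rarely hold the card at its peak draw the whole time.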

And who knows, maybe upgrading your GPU will even save you money! There's a running joke in the PC gaming community that the moment you upgrade your build, you'll be too busy to enjoy it (a kind of upgrade curse, if you will). So install that beefy new GPU with the confidence that you'll probably be too busy with school, work, kids, or all of the above to add to your power bill.
