Comic Strips
Comic Strips is a community for those who love comic stories.
The rules are simple:
- The post can be a single image, an image gallery, or a link to a specific comic hosted on another site (the author's website, for instance).
- The comic must be a complete story.
- If it is an external link, it must be to a specific story, not to the root of the site.
- You may post comics from others or your own.
- If you are posting a comic of your own, a maximum of one per week is allowed (I know, your comics are great, but this rule helps avoid spam).
- The comic can be in any language, but if it's not in English, OP must include an English translation in the post's 'body' field (note: you don't need to select a specific language when posting a comic).
- Be polite.
- Adult content is not allowed. This community aims to be fun for people of all ages.
Web of links
- [email protected]: "I use Arch btw"
- [email protected]: memes (you don't say!)
It depends on the computer, but the power usage could easily be 250W+. While not a ton of power, it adds up quickly.
But that's only if you don't have your computer set to sleep/hibernate.
Idle power is not usually that high unless you're talking about a multi-socket server.
A gaming PC is usually less than 100W and an office PC is usually less than 25W at idle.
Why waste 25 W when entering and leaving sleep mode takes five keystrokes and three seconds?
~~25 W still adds up. The general rule of thumb is to add a zero to the wattage to get the cost of running it for a year. I don't want to spend $250 a year letting my computer idle.~~
I definitely misremembered things.
That's some hella expensive electricity you're buying there. I'm getting mine at 14 cents/kWh; a watt running around the clock uses about 8.8 kWh a year, so that works out to roughly 1.2 €/W per year. And this isn't even close to the cheapest option available.
You know what, you're right. Idk what the fuck I was thinking. I must have misremembered the math from the last time I did it.
I swear I did the math like a year ago and it added up, but that's clearly a false memory. It's closer to $1 per watt per year. I downvoted my own comment.
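For anyone who wants to sanity-check that, here's a minimal back-of-the-envelope sketch (assuming a flat 14 cents/kWh and a machine left on around the clock; the function name is just for illustration):

```python
# Back-of-the-envelope check of the "~$1 per watt per year" figure.
# Assumes a flat rate of $0.14/kWh and 24/7 operation.

HOURS_PER_YEAR = 24 * 365.25   # ~8766 hours
RATE_PER_KWH = 0.14            # assumed flat rate, $/kWh

def annual_idle_cost(watts: float) -> float:
    """Dollars spent drawing `watts` continuously for one year."""
    kwh_per_year = watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * RATE_PER_KWH

print(f"{annual_idle_cost(1):.2f}")    # ~1.23   -> roughly $1.2 per watt-year
print(f"{annual_idle_cost(25):.2f}")   # ~30.68  -> idle office PC, on 24/7
print(f"{annual_idle_cost(100):.2f}")  # ~122.72 -> idle gaming PC, on 24/7
```

At that rate, the "add a zero" rule overestimates by roughly a factor of eight.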
It could've been closer to the truth in 2022. At least in Europe, when energy prices skyrocketed, I think I paid closer to 1 €/kWh.
Maybe it was 2022. Working from home has fucked my perception of time.
That calculation only makes sense if you never shut down your computer, not if you leave it on only when you accidentally hit "restart" and need to go right away.
Lots of people leave their computers running 24/7, though. The top-level comment said the power draw would be small, so I just wanted to point out that what might look like a negligible amount of power can add up to more than you'd expect.
That's not really what's being discussed here, though. There's a big difference between doing it all the time and only doing it once in a blue moon.
I mean, a beefy GPU could be ~400 W, and a beefy CPU another ~200 W. But that's peak draw from those components, and they're designed to drastically reduce power consumption when they aren't actually under load. You don't have to power them down via sleep/hibernation to get that; they can already reduce runtime power themselves. You normally shouldn't have software significantly loading them anyway (especially right after a reboot). And if you do have something crunching during idle time to that degree, like, I don't know, SETI@home or something, then you probably don't want it shut off.
The reason fans "spin up" on the CPU and the GPU under load is that those parts are dissipating much more heat, which in turn is because they're drawing much more power.
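If you want to see that scaling for yourself, here's a minimal sketch that compares the GPU's reported draw to its power limit (assuming an NVIDIA card with the nvidia-smi CLI installed; other vendors expose similar counters through different tools):

```python
# Compare the GPU's current draw to its configured power limit.
# Assumes an NVIDIA GPU and the nvidia-smi CLI on PATH.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=power.draw,power.limit",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
# Example idle output (numbers are illustrative): "18.54 W, 400.00 W"
print(result.stdout.strip())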