this post was submitted on 23 Feb 2024
566 points (100.0% liked)
196
I have a watt meter right on my PC plug next to my monitor so I can always see how much I consume. It's crazy how much the monitors alone take up, it's like 40 KW/h each. I'm considering removing one of them.
Do you mean kW or kWh?
Neither hopefully. The former at least is a unit of power, but 40 kW is enough to heat up a whole apartment building.
In reality a large and older monitor might use a couple hundred watts. A small modern 24" will probably use closer to 50 W (guesstimating), which is still a decent chunk of the power draw of a budget build.
True, they probably meant to say 40W, or as the EU energy label likes to put it: 40kWh/1000h
What? That’s a really stupid way to put it. You can’t just “per x hour” a per hour based unit.
40 kilowatt-hours used per one thousand hours of runtime is a perfectly valid, if cumbersome, unit.
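That label figure collapses straight back to average watts, which a quick sanity check can show (numbers taken from the comments above):

```python
# EU energy label style: energy consumed over a standard 1000-hour period
label_energy_kwh = 40   # kWh on the label
label_period_h = 1000   # hours the label assumes

# average power = energy / time; converting kW to W with the factor 1000
avg_power_w = label_energy_kwh * 1000 / label_period_h
print(avg_power_w)  # -> 40.0, i.e. "40 kWh/1000h" is just a 40 W average
```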
kWh is a measure of total energy, not instantaneous power. Your watt meter was saying that since the last reset it has measured 40 kWh of energy use. That's not an insignificant amount - a Chevy Bolt can go around 180 miles on 40 kWh. Watts, or kilowatts, are instantaneous power. That same Bolt can easily pull 100 kW while accelerating, and if it could somehow do that for an hour, it would have used 100 kWh. It could never make it the whole hour as it has a 65 kWh battery, so it would run out after 39 minutes.
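The 39-minute figure checks out if you divide battery capacity by draw (Bolt numbers as quoted in the comment, treated as constants for illustration):

```python
# How long a 65 kWh battery lasts at a constant 100 kW draw
battery_kwh = 65
draw_kw = 100

# runtime = energy / power; kWh divided by kW leaves hours
minutes = battery_kwh * 60 / draw_kw
print(minutes)  # -> 39.0 minutes
```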
What you're describing is kWh, not kW/h. You need to multiply power by time to get energy. An appliance using 1kW of power for 1h "uses" 1kWh of energy. The same appliance running for 2h requires 2kWh instead.
kW/h doesn't really make sense as a unit, although it could technically describe the rate at which energy consumption changes over time.
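The power-times-time relationship described above is one line of arithmetic (the 1 kW / 2 h figures are just the examples from the comment):

```python
def energy_kwh(power_kw: float, hours: float) -> float:
    """Energy is power multiplied by time: kW * h = kWh."""
    return power_kw * hours

print(energy_kwh(1.0, 1.0))  # 1 kW for 1 h -> 1.0 kWh
print(energy_kwh(1.0, 2.0))  # same appliance for 2 h -> 2.0 kWh
```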
Autocorrect seems to disagree, but autocorrect doesn't know shit about power vs energy. Fixed it.
A typical wall outlet can only supply 1800W (1.8kW), so no way is it drawing 40kW (kW/h is a nonsense unit in this context). If it's drawing 40W, that's actually quite low; a typical monitor is closer to 80-100W while powered on.
Where I live electricity is about 10c/kWh (cheap, I know), so a 100W monitor costs me about a cent an hour. More than worth it imo, but you make your own decisions.
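The cent-an-hour estimate works out like this (100 W and $0.10/kWh taken from the comment; your rate will differ):

```python
# Hourly running cost of a monitor
power_w = 100          # monitor draw in watts
price_per_kwh = 0.10   # electricity price in dollars per kWh

# watts -> kilowatts, times one hour, times price
cost_per_hour = round(power_w / 1000 * price_per_kwh, 4)
print(cost_per_hour)   # -> 0.01, about one cent per hour
```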