this post was submitted on 10 Aug 2024
94 points (100.0% liked)

Technology

[–] chameleon@fedia.io 20 points 3 months ago (1 children)

It's absolutely not the case that nobody was thinking about computer power use. The Energy Star program had been around for roughly 15 years at that point and even had an EU-US agreement, sitting alongside the EU's own energy program. Getting an 80Plus-certified power supply was already common advice for anyone custom-building a PC, and that group was by far the primary one doing Bitcoin mining before it had any kind of mainstream attention. And the original Bitcoin PDF includes the phrase "In our case, it is CPU time and electricity that is expended.", despite not going in-depth (it doesn't go in-depth on anything).
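That "CPU time and electricity" line describes proof-of-work, which can be sketched in a few lines. This is a toy illustration, not Bitcoin's actual double-SHA256 header hashing; `difficulty` here just counts leading zero hex digits:

```python
import hashlib
from itertools import count

def mine(data: bytes, difficulty: int = 2):
    """Toy proof-of-work: burn CPU time until the hash has enough leading zeros."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest  # every failed iteration was spent electricity

nonce, digest = mine(b"block header", difficulty=3)
print(nonce, digest)
```

Each extra zero of difficulty multiplies the expected number of hash attempts by 16, which is exactly why mining turns into an electricity bill.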

The late 00s weren't the late 90s, when the most common OS in use didn't support CPU idle without third-party tooling hacking it in.

[–] sxan@midwest.social 1 points 2 months ago

The Energy Star program had been around for around 15 years at that point

And, for computers, was almost exclusively limited to monitors. In 2009, the Energy Star specification was version 4.0, released in 2006. In that specification, the EPA's objective was to get 40% of the computers on the market to have power management capabilities by 2010 -- 40% by the year after Bitcoin was introduced. Intel's 2009 TCO-driven upgrade cycle document mentions power management, but power use isn't included in any of the TCO metrics.

All of the focus on low-power processing units in 2009 was on mobile devices and DSPs. Computer-oriented energy saving at the time focused on practices, e.g. manually powering computers down or using suspend and hibernation -- there was very little CPU clock scaling available for desktop computers; you turned them off to save power. DVFS (dynamic voltage and frequency scaling) didn't become widely available -- or effective -- until 2006, and a study published in 2009 (again, the same year Bitcoin was introduced) found that "only 20% of initiatives had measurable targets."
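For contrast with that crude on/off era: modern Linux exposes DVFS through the cpufreq sysfs interface, so you can see per-core scaling policy from userspace. A minimal sketch, assuming a Linux system (it returns None where cpufreq isn't present):

```python
from pathlib import Path

def scaling_governor(cpu: int = 0):
    """Read the DVFS governor (e.g. 'powersave', 'ondemand') for one CPU.

    Returns None if the cpufreq sysfs interface is absent (non-Linux,
    container without sysfs, or a CPU without frequency scaling).
    """
    path = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/scaling_governor")
    try:
        return path.read_text().strip()
    except OSError:
        return None

print(scaling_governor())
```

None of that plumbing existed in any usable form on a typical 2009 desktop, which is the point: the machine drew roughly the same power no matter what it ran.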

So, yes: technically, there were people thinking about these sorts of things, but it wasn't a common consumer consideration, and the tools for power management were crude: your desktop was on and consuming power -- always the same amount of power -- or it was off. And people did power down their computers to save energy. But, like I said, if your desktop was on, it consumed the same amount of energy whether or not you were running a miner. There was a motto bandied about at the time by SETI@home: your computer was using energy anyway, so you might as well do science with the spare CPU cycles. That was the mindset of most people who had computers at the time.