this post was submitted on 10 Aug 2024
94 points (100.0% liked)
Technology
Calling those that don't depend on proof-of-work "more energy efficient" is understating it to the point of being dishonest. The difference is not that they're more efficient in any conventional way. It's that they don't have the amazing bitcoin feature of relying for their operation on the practice of deliberately wasting enormous amounts of energy for the purpose of being able to prove that you've wasted enormous amounts of energy.
All the way through the cryptocurrency crash, which the average reader of headlines might've thought had put an end to it by now, the bitcoin network has kept on burning absurd amounts of power.
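The "prove that you've wasted energy" mechanism the comment above is describing is hash-based proof-of-work: miners grind through nonces until a hash falls below a target, and the only way to find one is brute force. A minimal toy sketch in Python -- the `mine` function, the leading-zeros difficulty scheme, and the block data here are illustrative simplifications, not Bitcoin's actual target arithmetic:

```python
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    """Find a nonce such that sha256(block_data + nonce) starts with
    `difficulty` zero hex digits -- a toy stand-in for Bitcoin's target check."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1  # every failed attempt is pure burned CPU time

nonce = mine(b"example block", 4)
# Checking the answer takes one hash; finding it took ~16**4 hashes on average.
# That asymmetry -- cheap to verify, expensive to produce -- is the whole point,
# and the "expensive" half is exactly the energy expenditure being discussed.
assert hashlib.sha256(b"example block" + str(nonce).encode()).hexdigest().startswith("0000")
```

Raising `difficulty` by one hex digit multiplies the expected work by 16 while verification stays a single hash, which is why the network's energy draw scales with difficulty rather than with transaction volume.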
C'mon, that's being disingenuous. Back when Bitcoin was released, nobody was giving a thought to computer energy use. A consequence of proof-of-work is wasted energy, but low-power modes and throttling have been developed in the intervening years. The prevailing paradigm at the time was, "your C/GPU is going to be burning energy anyway, you may as well do something with it."
It was a poor design decision, but it wasn't a malicious one like you make it sound. You may as well accuse the inventors of the internal combustion engine of designing it for the express purpose of creating pollution.
It's absolutely not the case that nobody was thinking about computer power use. The Energy Star program had been around for around 15 years at that point and even had an EU-US agreement, and that was sitting alongside the EU's own energy program. Getting an 80Plus-certified power supply was already common advice to anyone custom-building a PC which was by far the primary group of users doing Bitcoin mining before it had any kind of mainstream attention. And the original Bitcoin PDF includes the phrase "In our case, it is CPU time and electricity that is expended.", despite not going in-depth (it doesn't go in-depth on anything).
The late 00s weren't the late 90s, when the most common OS in use didn't support CPU idle without third-party tooling hacking it in.
And, for computers, Energy Star was almost exclusively limited to monitors. In 2009, the Energy Star specification was version 4.0, released in 2006. In that specification, the EPA's objective was to get 40% of the computers on the market to have power management capabilities by 2010 -- 40%, by the year after Bitcoin was introduced. Intel's 2009 TCO-driven upgrade cycle document mentions power management, but power use isn't included in any of the TCO metrics.
All of the focus on low-power processing units in 2009 was for mobile devices and DSPs. Computer-oriented energy saving at the time was focused on practices, e.g. manually powering down computers or using suspension and hibernation -- there was very little CPU clock scaling available for desktop computers; you turned them off to save power. DVFS didn't become widely available -- or effective -- until 2006, and a study published in 2009 (again, the same year Bitcoin was introduced) found that "only 20% of initiatives had measurable targets."
So, yes: technically, there were people thinking about these sorts of things, but it wasn't a common consumer consideration, and the tools for power management were crude: your desktop was on and consuming power -- always the same amount of power -- or it was off. And people did power down their computers to save energy. But, like I said, if your desktop was on, it was consuming the same amount of energy whether you were running a miner or weren't. There was a motto at the time bandied about by SETI@home, that your computer was using energy anyway, so you might as well do science with the spare CPU cycles. That was the mindset of most people who had computers at the time.