this post was submitted on 25 Jun 2025
606 points (98.4% liked)

Greentext


This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.

Be warned:

If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.

founded 2 years ago
[–] LostXOR@fedia.io 65 points 1 week ago (8 children)

This article estimates that GPT-4 took around 55 GWh of electricity to train. A human needs maybe 2000 kcal (about 2.3 kWh) a day and lives 75 years, for a lifetime energy consumption of roughly 64 MWh (around 860x less than just training GPT-4).

So not only do shitty "AI" models use >20x the energy of a human to "think," training them uses the lifetime energy equivalent of hundreds of humans. It's absolutely absurd how inefficient this technology is.
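The arithmetic behind the comparison above can be checked in a few lines. Note the 55 GWh and 2000 kcal/day figures are the commenter's estimates, not measured values:

```python
# Back-of-envelope check of the human-vs-GPT-4 energy comparison.
KWH_PER_KCAL = 1.163e-3            # 1 kcal = 1.163 Wh (exact conversion)

daily_kwh = 2000 * KWH_PER_KCAL    # ~2.33 kWh per day of food energy
lifetime_mwh = daily_kwh * 365.25 * 75 / 1000   # 75-year lifetime, in MWh

train_mwh = 55_000                 # 55 GWh estimated GPT-4 training energy
ratio = train_mwh / lifetime_mwh   # how many human lifetimes of energy

print(f"lifetime ≈ {lifetime_mwh:.0f} MWh, ratio ≈ {ratio:.0f}x")
# lifetime ≈ 64 MWh, ratio ≈ 863x
```

So training alone works out to the lifetime food energy of roughly 860 people under these assumptions.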

[–] 1rre@discuss.tchncs.de 6 points 1 week ago* (last edited 1 week ago)

Human energy needs are so variable that "normal consumption" estimates are wrong for most people. And if you count only the essential systems (basically cardiovascular and nervous, not even including digestion or any muscle movement), you need even less: the average man (by weight, height, and age) needs around 1950 kcal, and the average woman around 1450 kcal.

When we replace AI with brains in jars I'm sure we can cut it down even more though
