this post was submitted on 10 Jun 2025
ADHD memes
[–] Sir_Kevin@lemmy.dbzer0.com -2 points 4 days ago* (last edited 4 days ago) (2 children)

People act like generating a meme is going to burn down the rainforest. That's like thinking you'll save the environment by taking one less shower each week. It's a drop in the ocean compared to what FAANG and other corporations are doing at scale. Direct your hate toward them, not the guy using *50 Wh to make an image.

*Number pulled from my ass but the point is that it's less than your last gaming session.

[–] brucethemoose@lemmy.world 14 points 4 days ago* (last edited 4 days ago) (1 children)

Let's look at a "worst case" on my PC. Say 3 attempts, with 1 main pass and 3 controlnet/postprocessing passes each, so 64-ish seconds of generation at 300 W above idle.

...That's about 5 watt-hours. You know, basically the same as using Photoshop for a bit, or gaming for 2 minutes on a laptop.
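The back-of-envelope arithmetic is easy to check; the 300 W and 64 s figures are the ones from this comment:

```python
# Back-of-envelope energy for a local image-generation session.
# Inputs: power draw above idle (watts) and total runtime (seconds).
def watt_hours(power_w: float, seconds: float) -> float:
    """Energy in watt-hours for a given power draw and duration."""
    return power_w * seconds / 3600

session = watt_hours(300, 64)
print(f"{session:.1f} Wh")  # ~5.3 Wh for the whole session
```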

Datacenters are much more efficient because they batch the heck out of jobs: 60 seconds on a 700 W H100 or MI300X serves many, many generations in parallel.
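To illustrate why batching matters, here's a hypothetical sketch. Only the 700 W and 60 s figures come from the comment above; the batch sizes are made up for illustration, not real H100 serving numbers:

```python
# Per-request energy when an accelerator's power draw is amortized
# over every request served in the same batch. Batch sizes below are
# illustrative, not measured serving configurations.
def per_request_wh(power_w: float, seconds: float, batch_size: int) -> float:
    """Energy attributed to one request in a batched workload."""
    return power_w * seconds / 3600 / batch_size

for batch in (1, 8, 32):
    print(f"batch={batch:>2}: {per_request_wh(700, 60, batch):.2f} Wh/request")
```

The same 60 seconds of GPU time costs each individual request a smaller and smaller energy share as the batch grows.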

Not trying to be critical or anything, I hate enshittified corpo AI, but that's more-or-less what generation looks like.

[–] Sir_Kevin@lemmy.dbzer0.com 4 points 4 days ago (1 children)

Thanks for taking the time to crunch the numbers. Good points!

[–] brucethemoose@lemmy.world 4 points 4 days ago* (last edited 4 days ago)

At the risk of getting more technical: some near-future combination of BitNet-like ternary models, less-autoregressive architectures, taking advantage of sparsity, and models not being so stupidly general-purpose will bring inference costs down dramatically. Like, a watt or two on your phone dramatically. The AI energy-cost panic is a meme perpetuated by Altman so people will give him money, kinda like an NFT scheme.

...In other words, it's really not that big a deal. Like, a drop in the bucket compared to global metal production or something.
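For the curious, the ternary idea is roughly: store each weight as one of {-1, 0, +1} plus a shared scale, so matrix multiplies collapse into additions. A toy sketch, where the thresholding rule is an illustrative simplification rather than any specific paper's exact method:

```python
# Toy BitNet-style ternary quantization: weights map to {-1, 0, +1}
# with one per-tensor scale. The 0.5 * mean-abs threshold here is an
# illustrative choice, not the exact rule from the BitNet papers.
def ternarize(weights: list[float]) -> tuple[list[int], float]:
    scale = sum(abs(w) for w in weights) / len(weights)
    quant = [0 if abs(w) < 0.5 * scale else (1 if w > 0 else -1)
             for w in weights]
    return quant, scale

q, s = ternarize([0.9, -0.7, 0.05, -0.02])
print(q, round(s, 4))  # q == [1, -1, 0, 0]
```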

The cost of training a model in the first place is more complex (and really wasteful at some "money is no object" outfits like OpenAI or X), but it can also be quite cheap: DeepSeek and Flux were trained with comparatively little electricity, as was Cerebras's example model.

[–] copygirl@lemmy.blahaj.zone 5 points 4 days ago (1 children)

I've heard some scary numbers when it comes to waste, but I don't have a source, and I don't intend to go digging for one because I'm already depressed enough. But you addressed neither of my other grievances. In the end I'd prefer a future where work is automated, not creativity or thinking. I will speak up in this small space where I might be heard, because I believe corporations are betting on getting people hooked on AI, people who never learned to think or bothered to create for themselves, just so they can extract massive profits.

By all means, keep investing and being interested in specialized AI, AI research and AI ethics. But stay away from generative (text/image/video) AI.

[–] Sir_Kevin@lemmy.dbzer0.com 4 points 4 days ago* (last edited 8 hours ago) (1 children)

I didn't address your other grievances because for the most part I agree with you.

I don't have a problem with AI when it's used by an artist/creator as a tool, even better when it runs on renewable energy. What irks me are blanket statements that demonize any and all use of AI, or people who misrepresent themselves as creators when all they did was enter a prompt.

[–] copygirl@lemmy.blahaj.zone 2 points 9 hours ago (1 children)

I happened across a podcast episode about AI that I was listening to with friends. I don't know if you'll take anything away from it, but I figured I'd mention it here in case anyone's interested: Serious Inquiries Only episode 477, "Debunking Bad AI Research, and Bad Coverage of AI Research". It might not be super interesting for you, since it's aimed at listeners who don't already know much and spends time debunking some bad studies, but toward the end they discuss the environmental impact, with two experts, I believe.

One claim that stood out: training a "moderately large" model produces roughly twice the CO₂ output of an average American over their entire lifetime. They mention water usage is really bad, too. And "moderately large" refers to what a university research team might be cooking up; big companies have orders of magnitude more environmental impact from training their huge models.

(There is also a part 2 in the followup episode.)

[–] Sir_Kevin@lemmy.dbzer0.com 1 points 8 hours ago

That sounds worth a listen! Thanks!