this post was submitted on 22 Apr 2025
728 points (94.9% liked)

Fuck AI

[–] Voyajer@lemmy.world 4 points 2 days ago (2 children)

I mean, Adobe Firefly addresses the properly-licensed-dataset issue, and afaik the dataset is all viewable (though I'd much prefer something anyone could run offline, locally). Environmental impact will always be an issue unless we see some evidence of mitigation, either direct use of green energy or at least additional green energy generation built out by any organization doing the base model training.

The environmental impact of gen AI pales in comparison to the impact of making all those generated pieces manually instead. Say Shutterstock switches purely to genAI images trained on its own licensed stock photos. Do you think its total carbon output goes up or down once it stops doing photoshoots of people and objects in seemingly random situations?

[–] Trainguyrom@reddthat.com 3 points 1 day ago

There's a good amount of research going into reducing the compute needed for training and inference, as well as a ton of R&D into far more energy-efficient hardware for both.
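One common compute-reduction trick from that research is quantization: storing weights as 8-bit integers instead of 32-bit floats, so inference moves a quarter of the data (and moving data is a big chunk of the energy cost). A minimal sketch with a made-up weight matrix, using simple symmetric int8 quantization (not any specific model's scheme):

```python
import numpy as np

# Hypothetical weight matrix standing in for one layer of a model.
rng = np.random.default_rng(0)
weights = rng.normal(size=(1024, 1024)).astype(np.float32)

# Symmetric int8 quantization: one scale factor maps floats into [-127, 127].
scale = np.abs(weights).max() / 127.0
q = np.round(weights / scale).astype(np.int8)

# Dequantize on the fly at inference time; storage and memory traffic
# drop 4x, at the cost of a small rounding error per weight.
dequantized = q.astype(np.float32) * scale

print(weights.nbytes // q.nbytes)  # → 4 (4x smaller)
print(float(np.abs(weights - dequantized).max()))  # small rounding error
```

Real deployments go further (per-channel scales, 4-bit formats, quantization-aware training), but the basic trade of precision for memory and energy is the same.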

Just like 3D rendering went from dedicated $40,000 workstations and render farms to something done for funsies on your phone, the capabilities of today's really powerful models will eventually be squished onto the cheapest, lowest-power mass-market computers of the day.

The biggest long-term challenge will be training data and the licensing of outputs. If AI outputs stay stuck in a legal state where you simply can't use them commercially, the whole industry will collapse back into the most ignored corners of university computer science programs. And if model makers aren't required to license all of their training data, we'll probably keep seeing companies hoover up data in the most unethical ways possible to train their big models.