This post was submitted on 08 Jun 2025
1288 points (97.0% liked)

Microblog Memes


A place to share screenshots of microblog posts, whether from Mastodon, Tumblr, ~~Twitter~~ X, KBin, Threads, or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion, or guerrilla marketing.
  4. Posters are encouraged to link to the toot, tweet, etc. in the description of posts.

top 50 comments
[–] DragonAce@lemmy.world 7 points 5 hours ago
[–] bizza@lemmy.zip 18 points 11 hours ago

He tweeted, with a Ghibli-slop avatar

[–] supersquirrel@sopuli.xyz 24 points 14 hours ago (1 children)

In 30 years the world will be an ecological wasteland from all the energy we'll have spent pursuing dumb shit hype like "AI".

[–] Tryenjer@lemmy.world 4 points 8 hours ago* (last edited 8 hours ago) (1 children)

It seems we are heading towards the Fallout timeline.

[–] Noodle07@lemmy.world 2 points 8 hours ago

That would be the best-case scenario

[–] menas@lemmy.wtf 20 points 15 hours ago (1 children)

Running LLMs in 30 years seems really optimistic

[–] TriflingToad@sh.itjust.works 3 points 10 hours ago (3 children)

How so? They can't make locally run LLMs worse after the fact, and I assume hardware isn't going to get any worse.
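
For context, a minimal sketch of what "locally run" means here, using the llama-cpp-python bindings; the model path and prompt below are placeholders, not anything from the thread:

```python
# Minimal local-inference sketch with llama-cpp-python (assumed installed).
# The GGUF model path is a placeholder; any quantized local model would do.
from llama_cpp import Llama

llm = Llama(model_path="./models/local-model.gguf", n_ctx=2048)  # hypothetical path
out = llm("Q: Can a local model keep running offline? A:", max_tokens=64)
print(out["choices"][0]["text"])
```

Once the weights are on disk, nothing upstream can degrade them, which is the point being made here.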

[–] frezik@midwest.social 5 points 9 hours ago* (last edited 9 hours ago) (1 children)

There are local LLMs; they're just less powerful. Sometimes they do useful things.

The human brain uses around 20W of power. Current models are obviously using orders of magnitude more than that to get substantially worse results. I don't think power usage and results are going to converge enough before the money people decide AI isn't going to be profitable.
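
As a rough back-of-envelope check on the "orders of magnitude" claim, assuming ~20 W for the brain (as stated above) and ~10 kW for a single 8-GPU inference server (an assumed round figure, not from the thread):

```python
# Back-of-envelope power comparison; both figures are rough assumptions.
import math

BRAIN_WATTS = 20            # approximate human brain power draw
GPU_SERVER_WATTS = 10_000   # assumed draw of one 8-GPU inference server

ratio = GPU_SERVER_WATTS / BRAIN_WATTS
print(f"One server draws ~{ratio:.0f}x the brain's power, "
      f"about {math.log10(ratio):.1f} orders of magnitude more.")
```

Whether power per useful result ever converges with the brain's is exactly the open question raised here.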

[–] jj4211@lemmy.world 4 points 9 hours ago (1 children)

The power consumption of the brain doesn't really indicate anything about what we can expend on LLMs... Our brains are not just a biological implementation of what LLMs do.

[–] frezik@midwest.social 5 points 9 hours ago (2 children)

It gives us an idea of what's possible in a mechanical universe. It's possible that artificial human-level consciousness and intelligence will use less power than that, or maybe somewhat more, but it's a baseline we know exists.

[–] Tryenjer@lemmy.world 3 points 8 hours ago

Yeah, but an LLM has little to do with a biological brain.

[–] spicehoarder@lemm.ee 1 points 8 hours ago

You're making a lot of assumptions, one of them being that the brain is more efficient in terms of compute per watt than our current models. I'm not convinced that's true, especially for specialized applications. Even if we could bring power usage below 20 watts, the reason we currently use more is that we can, not that each model is becoming more and more bloated.

[–] Buddahriffic@lemmy.world 2 points 9 hours ago (1 children)

I was thinking in a different direction, that LLMs probably won't be the pinnacle of AI, considering they aren't really intelligent.

[–] menas@lemmy.wtf 2 points 9 hours ago

Assuming there would be enough food for the people maintaining and fixing that hardware, I'm not confident that we will have enough electricity to run LLMs at a massive scale.

[–] BlessedDog@lemmy.world 5 points 16 hours ago (4 children)

This guy's name translates to something like "Matt Cock"

[–] Fleur_@aussie.zone 16 points 21 hours ago

Unlike you bigots, I've already masturbated to AI generated images

[–] throwawayacc0430@sh.itjust.works 28 points 23 hours ago

Step 1: Give Robots Voting Rights

Step 2: ???

Step 3: Plot twist, all those Robots are actually under direct control of the Evil Corporation Inc. and they already won every future election.

Long Live the Cyberlife CEO!

[–] vga@sopuli.xyz 16 points 23 hours ago* (last edited 23 hours ago)

They're called artificial persons, you fascist.

[–] Cowabunghole@lemmy.ml 33 points 1 day ago (9 children)

The type of guy to say "clanka" with a hard r

[–] HugeNerd@lemmy.ca 9 points 22 hours ago

What's wrong with large labia majora?

[–] TheBat@lemmy.world 27 points 1 day ago

And then your LLM-in-law ends up using as much water as Detroit.
