this post was submitted on 27 Jul 2024
579 points (98.2% liked)

Programmer Humor

[–] leisesprecher@feddit.org 29 points 3 months ago (5 children)

I wonder what will happen with all the compute once the AI bubble bursts.

It seems like gaming made GPU manufacturing scale enough for GPUs to become viable as general-purpose compute. Then Bitcoin pumped billions into the market, driving down prices per FLOP, and AI reaped the benefit of that once crypto moved to ASICs and later crashed.

But what's next? We've got more compute than we could reasonably use. The factories are already there, the knowledge and techniques exist.

[–] SlopppyEngineer@lemmy.world 35 points 3 months ago (1 children)

Finally, very detailed climate simulations to know how hard we're screwed

[–] pkill@programming.dev 3 points 3 months ago

...made using arguably the most criminally environmentally disastrous tech we've invented in the past few decades. How ironic!

[–] magic_lobster_party@kbin.run 11 points 3 months ago

It will be used for more AI research probably.

Most of the GPUs belong to the big tech companies, like OpenAI, Google, and Amazon. AI startups rarely buy their own GPUs (often they're just using the OpenAI API, roughly as in the sketch below). I don't think big tech will have any problem figuring out what to do with all that GPU compute.
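For context, "just using the OpenAI API" typically amounts to a thin wrapper like this minimal sketch; it assumes the official `openai` Python package (v1+) and an `OPENAI_API_KEY` in the environment, and the model name is only an illustrative example:

```python
# Minimal sketch of a startup "using the OpenAI API" instead of owning GPUs.
# Assumes: `pip install openai` (v1+) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name, not a recommendation
    messages=[{"role": "user", "content": "Summarize this bug report: ..."}],
)
print(response.choices[0].message.content)
```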

[–] BehindTheBarrier@programming.dev 5 points 3 months ago

Compute becomes cheaper, and larger undertakings happen. LLMs are huge, but new tech keeps moving things along. The key component of LLMs, the transformer, is getting new competition that may surpass it, both for LLMs and for other machine learning uses.

Otherwise, cheaper GPUs for us gamers would be great.

[–] abbadon420@lemm.ee 4 points 3 months ago

I'll buy a couple of top-tier GPUs from a failed startup on eBay to run my own AI at home.
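A minimal sketch of what "running my own AI at home" on those second-hand GPUs could look like, assuming the Hugging Face `transformers` library and an open-weights model (the model name is just an example):

```python
# Minimal sketch: local text generation on whatever GPUs you scored on eBay.
# Assumes: `pip install transformers accelerate torch` and an example open-weights model.
from transformers import pipeline

generate = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",  # example model, swap for whatever fits your VRAM
    device_map="auto",  # spread the weights across the available GPUs
)

out = generate("Explain the GPU bubble in one sentence.", max_new_tokens=60)
print(out[0]["generated_text"])
```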

[–] msgraves@lemmy.dbzer0.com 2 points 3 months ago

I think open source will build actually useful integrations thanks to all the available compute.