[–] cm0002@lemmy.world 18 points 1 month ago

It depends on your definition of "usually". High-end GPUs for data centers, AI, workstations, or "enthusiasts"? Yeah. For those applications you're starting at around 16 GB.

GPUs for us plebs? No.

[–] BombOmOm@lemmy.world 14 points 1 month ago

It's also fairly cheap to buy 32+ GB of RAM; there are lots of choices for under $80. Meanwhile, I'm not even sure where you'd find a video card with 32 GB of VRAM (not that you really need that much; 12 GB or 16 GB is pretty solid for a video card nowadays).

[–] 30p87@feddit.org 8 points 1 month ago

Afaik, for consumers only the 5090 has 32 GB of VRAM. So you're correct, it's practically impossible to find. And even if you do find one, it's prone to spontaneous combustion.

For servers it currently tops out at 288 GB, with the AMD Instinct MI355X.

[–] Anivia@feddit.org 3 points 1 month ago

> Afaik, for consumers only the 5090 has 32 GB of VRAM

Only if you don't count Apple Silicon with its shared RAM/VRAM. Ironically, a Mac Mini / Studio is currently the cheapest way to get a GPU with lots of VRAM for AI.
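
To make the dedicated-vs-unified distinction concrete, here is a minimal Python sketch (assuming PyTorch is installed; the specific capacities mentioned in the comments are illustrative, not measured here) showing what each backend exposes:

```python
# A minimal sketch, assuming PyTorch is available, of how the two memory
# models differ: an NVIDIA card reports its own dedicated VRAM, while an
# Apple Silicon GPU draws from the same unified memory pool as the CPU.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    # Dedicated VRAM, fixed by the card (e.g. 32 GB on an RTX 5090).
    print(f"{props.name}: {props.total_memory / 1024**3:.1f} GiB dedicated VRAM")
elif torch.backends.mps.is_available():
    # Apple Silicon: no separate VRAM pool; the GPU's working set is
    # bounded by the machine's total unified memory.
    print("Apple Silicon GPU (MPS): shares unified memory with the CPU")
else:
    print("No supported GPU backend detected")
```

That shared pool is why a Mac configured with a large amount of unified memory can hold models that won't fit on any consumer discrete card.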
