this post was submitted on 19 Feb 2025
110 points (99.1% liked)
Privacy
Yes for gaming, but for LLMs I've heard that the bandwidth limitations of using system RAM as VRAM hurt performance more than just running on the CPU with system memory directly, since smaller models tend to be memory-bandwidth limited.
I've never tried running AI on an iGPU with system memory though, so you could try it, assuming it lets you allocate something like 32 GB or even 64 GB. I think you'll also need a runner that supports iGPUs.
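The bandwidth argument above can be sketched with rough numbers: generating each token requires streaming every model weight from memory once, so tokens per second is capped by memory bandwidth divided by model size. The figures below are illustrative assumptions (theoretical DDR5 peak, approximate quantized model size), not benchmarks of any specific system.

```python
# Back-of-envelope ceiling on decode speed for a memory-bandwidth-limited
# model: each generated token streams all weights through memory once.

def max_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper bound on tokens/sec given memory bandwidth and weight size."""
    return bandwidth_gb_s / model_size_gb

# Assumed numbers: dual-channel DDR5-5600 peaks around 89.6 GB/s;
# a 7B-parameter model quantized to ~4 bits is roughly 4 GB of weights.
ddr5_bandwidth = 89.6   # GB/s, theoretical peak (assumption)
model_size = 4.0        # GB, approx. 7B at 4-bit quantization (assumption)

print(f"~{max_tokens_per_sec(ddr5_bandwidth, model_size):.1f} tokens/s ceiling")
```

Whether the compute runs on the CPU cores or the iGPU, both sit behind the same system-memory bus, so this ceiling is shared; that's why offloading to an iGPU often doesn't help the way a discrete GPU with its own VRAM would.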