submitted 7 months ago by alessandro@lemmy.ca to c/pcgaming@lemmy.ca

Only slightly related question: is there such a thing as an external NVIDIA GPU for AI models? I know I can rent cloud GPUs, but I'm wondering whether something like an external GPU might be worth it in the long term.

[-] baconisaveg@lemmy.ca 6 points 7 months ago

A used 3090 is the best bang for your buck for any LLM / Stable Diffusion work right now. I've seen external GPU enclosures, though they probably cost as much as slapping a used 3090 into a barebones rig and running it headless in a closet.
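As a rough sanity check before committing to that kind of headless build, a short script like the sketch below (assuming PyTorch with CUDA support is installed; the VRAM threshold is an illustrative assumption that roughly matches a 3090's 24 GB) confirms the card is visible on the box and reports how much memory it has.

```python
# Minimal sketch: verify a CUDA GPU is visible on a headless box and report its VRAM.
# Assumes PyTorch built with CUDA support; the 20 GB threshold is illustrative only.
import torch

def check_gpu(required_vram_gb: float = 20.0) -> bool:
    """Return True if a CUDA device is visible and its total VRAM meets the threshold."""
    if not torch.cuda.is_available():
        print("No CUDA device visible -- check drivers / passthrough.")
        return False

    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024**3
    print(f"GPU 0: {props.name}, {total_gb:.1f} GiB total VRAM")
    return total_gb >= required_vram_gb

if __name__ == "__main__":
    check_gpu()
```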

[-] AnotherDirtyAnglo@lemmy.ca 3 points 7 months ago

Generally speaking, buying outright works out cheaper than renting: you can keep running the hardware for years, and you can always sell it later to recoup some of the capital.
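To get a feel for where the break-even sits, here is a back-of-the-envelope sketch. Every number in it (used-card price, cloud hourly rate, power draw, electricity cost, resale value) is an illustrative assumption, not a quote.

```python
# Rough buy-vs-rent break-even sketch. All default values are illustrative
# assumptions: a used 3090 price, a cloud GPU hourly rate, a load power draw,
# a home electricity rate, and a guessed resale value.
def breakeven_hours(purchase_price: float = 700.0,
                    cloud_rate_per_hour: float = 0.50,
                    power_watts: float = 350.0,
                    electricity_per_kwh: float = 0.15,
                    resale_value: float = 300.0) -> float:
    """Hours of GPU use at which owning becomes cheaper than renting."""
    # Marginal cost of owning per hour: electricity while the card is under load.
    own_per_hour = (power_watts / 1000.0) * electricity_per_kwh
    # Net capital tied up in the card, assuming it is resold later.
    net_capital = purchase_price - resale_value
    return net_capital / (cloud_rate_per_hour - own_per_hour)

if __name__ == "__main__":
    hours = breakeven_hours()
    print(f"Break-even after roughly {hours:.0f} hours of GPU time")
```

With those assumed numbers the crossover lands around 900 hours of GPU time, after which owning pulls ahead.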
