this post was submitted on 21 Feb 2024

Technology

[–] Deceptichum@kbin.social 32 points 11 months ago (2 children)

Sick, I only need 90gb of VRAM!

[–] QuadratureSurfer@lemmy.world 15 points 11 months ago (1 children)

I've got it running with a 3090 and 32GB of RAM.

There are some models that let you run with hybrid system RAM and VRAM (it will just be slower than running it exclusively with VRAM).
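For context (a hedged sketch, not from the thread): runtimes like llama.cpp expose an offload option (commonly `-ngl` / `--n-gpu-layers`) that keeps the first N transformer layers in VRAM and leaves the remainder in system RAM. A rough back-of-the-envelope way to pick that split, assuming you know the approximate per-layer memory cost for your quantization:

```python
def split_layers(n_layers, layer_bytes, vram_bytes, reserve_bytes=0):
    """Estimate how many layers fit in VRAM; the rest go to system RAM.

    All numbers here are illustrative assumptions, not measured values:
    reserve_bytes approximates what the KV cache, activations, and the
    desktop itself already occupy on the GPU.
    """
    usable = max(0, vram_bytes - reserve_bytes)
    gpu_layers = min(n_layers, usable // layer_bytes)
    return gpu_layers, n_layers - gpu_layers

# Hypothetical model: 80 layers at ~1 GB each, on a 24 GB card
# with ~2 GB reserved for overhead.
gpu, cpu = split_layers(80, 10**9, 24 * 10**9, 2 * 10**9)
# gpu layers run fast; cpu layers are the slowdown the thread describes
```

Layers left on the CPU side are what make hybrid inference slower: every token still has to pass through them over the much lower memory bandwidth of system RAM.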

[–] Deceptichum@kbin.social 16 points 11 months ago (1 children)

Yeah but damn does it get slow.

I always find it interesting how text is so much slower than image generation. I can do a 1024x1024 image in maybe 20s, but I get like 1 word a second with text.

[–] aBundleOfFerrets@sh.itjust.works 5 points 11 months ago

Languages are complex and, more importantly, much less forgiving of error
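There is also a structural reason (my sketch, not stated in the thread): LLMs decode autoregressively, running one full forward pass per generated token, while a diffusion image model runs a fixed number of denoising steps no matter how much detail the image holds. The timings below are made-up assumptions purely to show the shape of the cost:

```python
def llm_decode_seconds(n_tokens, secs_per_forward_pass):
    # Autoregressive decoding: one full forward pass per token,
    # and each pass depends on the previous token, so it can't
    # be parallelized across the output.
    return n_tokens * secs_per_forward_pass

def diffusion_seconds(n_steps, secs_per_step):
    # Diffusion: a fixed step count; every pixel is refined
    # in parallel within each step.
    return n_steps * secs_per_step

# Hypothetical numbers: a 500-token reply at 0.2 s/pass vs.
# a 25-step image at 0.8 s/step.
text_time = llm_decode_seconds(500, 0.2)   # grows with reply length
image_time = diffusion_seconds(25, 0.8)    # roughly constant per image
```

So a long text reply scales linearly with its length, while the image cost is roughly flat per image, which matches the "1 word a second" experience above.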

[–] DarkThoughts@fedia.io 1 points 11 months ago (2 children)

Hopefully we see more specific hardware for this. Like expansion cards with pretty much just tensor cores and their own RAM.

[–] topinambour_rex@lemmy.world 1 points 11 months ago

Graphics cards without video outputs have existed for a while.

[–] Deceptichum@kbin.social 1 points 11 months ago (2 children)

I’d love to see some consumer-level AI stuff; sadly it all seems to be designed for server farms, and by the time it ages out into consumer prices it’s so obsolete there’s no point in getting it.

[–] raldone01@lemmy.world 1 points 11 months ago (1 children)

Do they want consumer ai cards to exist though?

Think about the data!

[–] Deceptichum@kbin.social 1 points 11 months ago (1 children)

Card makers? They only want money; if there’s enough consumer-level demand, they’ll make them.

[–] raldone01@lemmy.world 1 points 11 months ago

I guess you’re right.

[–] DarkThoughts@fedia.io 1 points 11 months ago

It's not quite consumer level, I'd say, but Coral.ai has some small Google Edge TPUs.