this post was submitted on 03 Jun 2024
1299 points (96.4% liked)

Technology

[–] spez_@lemmy.world 36 points 2 months ago (9 children)

I don't want to avoid it. I just want it locally

[–] kaityy@lemmy.blahaj.zone 1 points 2 months ago (3 children)

At least with the more advanced LLMs (and I'd assume the same for things like image processing and generation), it takes a considerable amount of GPU memory and compute just to get the model to run at all, and then even more to spit something out. Some people have enough hardware to run the basics, but most laptops would simply be incapable, and very few people have the resources to get the kind of outputs the more advanced AIs produce.
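The memory point above can be sketched with back-of-envelope numbers. This is my own illustration, not from the comment; the model sizes and quantization levels are assumptions, and it counts only the weights (ignoring the KV cache and activation overhead, which add more):

```python
# Rough estimate of the memory needed just to hold an LLM's weights,
# at full 16-bit precision vs. ~4-bit quantization.

def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """GiB required to store the weights alone."""
    return params_billions * 1e9 * bytes_per_param / (1024 ** 3)

# Common open-model sizes (assumed for illustration).
for name, params in [("7B", 7), ("13B", 13), ("70B", 70)]:
    fp16 = weight_memory_gb(params, 2.0)   # 16-bit floats: 2 bytes/param
    q4 = weight_memory_gb(params, 0.5)     # ~4-bit quantization: 0.5 bytes/param
    print(f"{name}: fp16 ~{fp16:.1f} GiB, 4-bit ~{q4:.1f} GiB")
```

Even quantized, a 70B model needs tens of GiB of memory, which is why most laptops can at best run the smaller models.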

Now, that's not to say it shouldn't be an option, or that they should force a remote AI baked into a proprietary OS that you can't remove without breaking the user license agreement. I'm just saying it's unfortunately harder to implement locally than we'd both probably wish it was.

[–] echodot@feddit.uk 2 points 2 months ago* (last edited 2 months ago) (2 children)

That's true, but if you don't mind that the AI can't learn anything new, you can go the hardware-optimization route and get pretty good performance. We're starting to see dedicated AI chips being made. They will do for AI what GPUs did for graphics.

However, these hardware-optimized chips are only for running the AI; you still need GPUs for training it. I could see a situation where new models are trained by big companies and the results are sold to individuals, who buy the packages and install them on local chips.
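The training/inference split above has a simple memory explanation. The accounting below is a standard rule of thumb, not from the comment: inference needs only the frozen weights, while training with an Adam-style optimizer also keeps gradients plus two optimizer moments per parameter, roughly quadrupling the footprint.

```python
# Why inference chips can be small while training stays on big GPUs:
# a rough memory comparison (weights-only accounting, assumed figures).

def inference_memory_gb(weights_gb: float) -> float:
    # Inference: just the frozen weights (plus small activation buffers).
    return weights_gb

def training_memory_gb(weights_gb: float) -> float:
    # Training with Adam: weights + gradients + two optimizer moment
    # buffers, each the same size as the weights -> ~4x.
    return weights_gb * 4

w = 28.0  # e.g. a 7B-parameter model stored in 32-bit floats
print(inference_memory_gb(w), training_memory_gb(w))  # 28.0 112.0
```

So a fixed, pre-trained model can fit on a modest local accelerator even when training the same model would not.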

[–] kaityy@lemmy.blahaj.zone 1 points 2 months ago (1 children)

interesting. are these AI chips actually being released on the open market yet, or are things still in the development phase?

[–] echodot@feddit.uk 2 points 2 months ago

They're available on the open market, but you have to buy them as integrated systems, since no commonly available motherboard has a socket for them; I don't even think there's a standard for a socket. They come soldered to the board, which isn't ideal, because when a better version comes out you basically have to throw everything away and start again.

But in a few years I suspect we'll have proper socketed versions.
