FunkyStuff@hexbear.net 10 points 1 month ago

I might need reeducation, because I think that image is probably the closest thing to an appropriate use case for LLMs I've ever seen.

Zvyozdochka@hexbear.net 8 points 1 month ago (last edited 1 month ago)

I don't think the idea itself is terrible, and I can see its use cases, but implementing it directly in MySQL is extremely silly. Imagine having to run every instance of your database server on beefy hardware just to handle the prompts. Instead, do the actual processing on a separate machine with the appropriate hardware, then shove the result into the database with another query when it's done.
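For example, a minimal sketch of that split, assuming a hypothetical `reviews` table with a nullable `summary` column and a placeholder `summarize()` standing in for whatever inference service runs on the separate machine (none of this is an actual MySQL feature):

```python
# Hypothetical batch worker running on the inference box, not on the
# database server. Table and column names are made up for illustration.
import mysql.connector


def summarize(text: str) -> str:
    """Placeholder for a call to whatever model or inference service you
    actually run; swap in your own client code here."""
    raise NotImplementedError


conn = mysql.connector.connect(host="db.internal", user="worker",
                               password="...", database="shop")
cur = conn.cursor()

# Pull only the rows that still need a summary.
cur.execute("SELECT id, body FROM reviews WHERE summary IS NULL LIMIT 100")
rows = cur.fetchall()

# Do the heavy lifting here, then write the result back with a plain UPDATE.
for review_id, body in rows:
    short = summarize(body)[:100]  # enforce the character cap
    cur.execute("UPDATE reviews SET summary = %s WHERE id = %s",
                (short, review_id))

conn.commit()
cur.close()
conn.close()
```

The database server itself never touches a model; it just stores whatever the worker hands back.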

alexandra_kollontai@hexbear.net 5 points 1 month ago

Probably useful in data science

invalidusernamelol@hexbear.net 2 points 1 month ago (last edited 1 month ago)

It's nice to be able to do something like this without having to use an ORM, especially if you need a version of the data that's limited to a certain character size.

Like having a replica on the edge that serves 100-character summaries and only loads the full 1000+ character record when a user interacts with it.

A summary of a review is also more useful than just its first 100 characters.
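A rough sketch of that read pattern, reusing the same hypothetical `reviews` table (the host name and queries are illustrative, not anything MySQL ships):

```python
# Hypothetical edge-replica read pattern: the listing ships only the short
# summaries; the full record is fetched on demand.
import mysql.connector

conn = mysql.connector.connect(host="edge-replica.internal", user="app",
                               password="...", database="shop")
cur = conn.cursor()

# Listing view: cheap, only the ~100-character summaries go over the wire.
cur.execute("SELECT id, summary FROM reviews ORDER BY id DESC LIMIT 20")
listing = cur.fetchall()

# Detail view: load the full 1000+ character body only when the user
# actually opens a review.
if listing:
    selected_id = listing[0][0]
    cur.execute("SELECT id, body FROM reviews WHERE id = %s", (selected_id,))
    full_record = cur.fetchone()

cur.close()
conn.close()
```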

If the model used for this is moderately light, I could see it actually using less energy over time for high-volume applications, provided it lowers overall bandwidth.

This all assumes the model only runs once per record, or on record updates, though, and not every time the query is run...
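For what it's worth, a tiny sketch of the "run once at write time" version, again with made-up names and a placeholder summarize():

```python
# Sketch: the model runs exactly once, when the record is written; every
# later read just selects the stored summary column. Names are hypothetical.
def summarize(text: str) -> str:
    """Placeholder for your actual model call."""
    raise NotImplementedError


def insert_review(cur, body: str) -> None:
    short = summarize(body)[:100]  # inference happens here, once per record
    cur.execute("INSERT INTO reviews (body, summary) VALUES (%s, %s)",
                (body, short))
```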