this post was submitted on 09 Jan 2025
489 points (99.2% liked)

[–] IrateAnteater@sh.itjust.works 5 points 2 months ago (2 children)

Since VLC runs on just about everything, I'd imagine that the cloud service will be best for the many devices that just don't have the horsepower to run an LLM locally.

[–] GenderNeutralBro@lemmy.sdf.org 2 points 2 months ago

True. I guess they'll require you to enter your own OpenAI/Anthropic/whatever API key, because there's no way they can afford to run that centrally. Hopefully you can point it at whatever server you like (such as a self-hosted Ollama instance or similar).
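
If it does work that way, switching to a self-hosted backend should mostly be a matter of swapping the base URL, since Ollama exposes an OpenAI-compatible endpoint. Here's a rough sketch using the OpenAI Python client; the model name and the subtitle-translation prompt are just placeholders, not anything VLC has announced:

```python
# Sketch: point the standard OpenAI client at a self-hosted Ollama server.
# Ollama serves an OpenAI-compatible API under /v1; the api_key value is
# ignored by Ollama, but the client library requires one to be set.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local Ollama instead of api.openai.com
    api_key="ollama",                      # placeholder; not checked by Ollama
)

# Hypothetical use: ask a locally hosted model to translate one subtitle line.
response = client.chat.completions.create(
    model="llama3",  # whatever model you've pulled locally
    messages=[
        {"role": "system", "content": "Translate this subtitle line to English."},
        {"role": "user", "content": "Bonjour tout le monde."},
    ],
)
print(response.choices[0].message.content)
```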

[–] zurohki@aussie.zone 1 points 2 months ago

It's not just computing power - you don't always want your device burning through its battery either.