this post was submitted on 20 Aug 2024
1189 points (97.8% liked)
Nah, local LLMs can easily handle transcription/summarization. I bet you could do that nicely with Llama 8B without even needing a GPU.
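For what it's worth, here's a minimal sketch of that kind of summarization loop against a local Ollama server. It assumes Ollama is running on its default port (11434) and that you've pulled an 8B model; the model tag `llama3.1:8b` and the chunk size are assumptions, not anything the comment specifies.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint
MODEL = "llama3.1:8b"  # assumed tag; substitute whatever model you've pulled

def chunk_text(text, max_chars=4000):
    """Split a long transcript into pieces small enough for an 8B model's context."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def summarize(text):
    """Summarize each chunk with the local model, then join the partial summaries."""
    summaries = []
    for chunk in chunk_text(text):
        payload = json.dumps({
            "model": MODEL,
            "prompt": f"Summarize the following transcript excerpt:\n\n{chunk}",
            "stream": False,  # ask for one JSON object instead of a token stream
        }).encode()
        req = urllib.request.Request(
            OLLAMA_URL, data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            summaries.append(json.loads(resp.read())["response"])
    return "\n".join(summaries)
```

Chunking first and summarizing per chunk is what keeps this workable on CPU: each request stays small, so a quantized 8B model can churn through a long transcript without blowing its context window.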
Can't wait to have these.
You already can, I think? Ollama is something you can install, and then you can set up a web UI like SillyTavern for roleplay, or some other UI that better fits what you want. Also, Linux is great for projects like these; on Windows it's a fucking pain to set up, on Linux it's easy.
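The setup the comment describes boils down to a few commands on Linux. This is a sketch, not a full guide; the model tag is an assumption, and the web UIs mentioned (SillyTavern, Open WebUI) are installed separately and pointed at Ollama's local API.

```shell
# Install Ollama (official install script for Linux/macOS)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a small model; llama3.1:8b is an assumed tag -- pick any from the library
ollama pull llama3.1:8b

# Chat with it directly in the terminal
ollama run llama3.1:8b

# Ollama also serves an HTTP API on localhost:11434, which is what
# web UIs like SillyTavern or Open WebUI connect to.
```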