this post was submitted on 08 Sep 2024
749 points (94.2% liked)
Microblog Memes
I haven't looked into the chatbot thingy at all yet, but if it meets basic quality standards (local LLM, not too large, actually helpful), then personally I think it should be included by default, because it'll primarily help the kind of users who don't know to install add-ons.
Like, people had the same complaint about the translation feature they included, and I'm just thinking of my dad, who doesn't speak English and would never hear of such an add-on, and for him this just opens up a big chunk of the web.
It's functionality to integrate with whatever LLM you use, local or SaaS. I can't say I'm excited about the feature, but I think it's also a bit silly that people are angry about it (though I take the point about development priorities).
It's healthy for Firefox's market share to keep feature parity with Edge and other browsers, which ship the same function but tied to a manufacturer-pushed service.
Wait, it'll actually let you use local LLMs?
That would legitimately help me out. I use LLMs a lot for simple data restructuring, or rewording of explanations when I'm reading through certain sources. I was worried they would just do a simple ChatGPT API integration and have that be the end of it, but maybe this will end up being something I'd actually use.
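To give a sense of what I mean by that kind of use, here's a rough sketch of a "data restructuring" request sent to a locally hosted model behind an OpenAI-compatible chat endpoint. The endpoint URL (Ollama's default), the model name, and the prompt are all placeholder assumptions on my part, not anything Firefox itself ships.

```python
# Hypothetical sketch: restructure messy notes with a locally hosted LLM that
# exposes an OpenAI-compatible chat endpoint (e.g. Ollama or a llama.cpp server).
# The endpoint URL and model name below are assumptions, not a Firefox API.
import json
import urllib.request

LOCAL_ENDPOINT = "http://localhost:11434/v1/chat/completions"  # assumed Ollama default
MODEL = "llama3.1:8b"  # whatever model you happen to have pulled locally

def restructure(raw_text: str) -> str:
    """Ask the local model to convert free-form notes into a CSV table."""
    payload = {
        "model": MODEL,
        "messages": [
            {"role": "system",
             "content": "Rewrite the user's notes as CSV with headers. Output only CSV."},
            {"role": "user", "content": raw_text},
        ],
        "temperature": 0,  # keep output as deterministic as possible for data tasks
    }
    req = urllib.request.Request(
        LOCAL_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI-style chat completion response shape
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    notes = "Alice, joined 2021, Berlin. Bob joined in 2019 and lives in Oslo."
    print(restructure(notes))
```

The point is just that nothing in this leaves the machine, which is why whether the browser integration supports local endpoints (rather than only a hosted ChatGPT-style service) matters so much to me.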