Yeah, the Google one is noticeably worse than the latest ChatGPT models.
Most of the dumb responses people get from it are old Reddit jokes, so I wonder if they saw how often people add "reddit" to their searches and decided that meant they should train it mostly on Reddit.
It might also just be a cheaper model.
I read that they paid Reddit $60M to use its content as training data.
Google is pivoting hard into AI, so I doubt their model is cheap at all, unless they're running a much smaller version for Google Search than for bespoke Gemini conversations.
Never underestimate the ability of capitalists to cut corners.
Cutting so many corners off this thing we're just running in circles now...
I wouldn't be surprised if they're using a variant of their 2B or 7B Gemma models, as opposed to the monster Gemini.
Almost surely. If they're generating an overview at search time, they need a very fast model. You can use a cache for the most common searches, but otherwise you have to run inference at search time, so you need a fast, small model. Rough sketch of what I mean below.
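Just to illustrate the cache-plus-small-model idea, here's a rough Python sketch: serve a precomputed overview when the query is a common one, and fall back to a small, fast model only on a cache miss. Everything here is made up for illustration (the function names, the precomputed table, the stubbed-out model); it's obviously not how Google actually serves it.

```python
import time
from functools import lru_cache

# Hypothetical stand-in for a small, fast model (think a 2B-parameter Gemma-class model).
# A real deployment would call an actual inference service here.
def small_model_generate(query: str) -> str:
    time.sleep(0.05)  # pretend inference takes ~50 ms
    return f"Overview for: {query}"

# Precomputed overviews for the most common searches (the "cache" mentioned above).
PRECOMPUTED = {
    "how to tie a tie": "Cross the wide end over the narrow end, ...",
}

@lru_cache(maxsize=100_000)
def overview_for(query: str) -> str:
    """Serve a stored overview if we have one; otherwise run the small model."""
    normalized = query.strip().lower()
    if normalized in PRECOMPUTED:
        return PRECOMPUTED[normalized]
    # Cache miss: inference has to happen at search time, so the model must be fast.
    return small_model_generate(normalized)

if __name__ == "__main__":
    print(overview_for("how to tie a tie"))     # served from the precomputed table
    print(overview_for("why is the sky blue"))  # falls back to model inference
```

In a real system the cache would be shared across servers and the "small model" would be something distilled or quantized, but the tradeoff is the same shape: hit the cache when you can, pay for fast inference when you can't.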