this post was submitted on 02 Sep 2024
87 points (100.0% liked)
Technology
The comment you're replying to is quite specifically criticising the use of the word "hallucination" to misrepresent the nature of undesirable LLM output, in the context of people selling you stuff by claiming it is something it is not.
It is not "pushing" yet another "thing to criticise about LLMs". OK? I have my fair share of criticism of LLMs themselves, but that is not what I'm doing right now.
When we extend analogies, they often break in the process. That's the case here.
Originally the analogy works because it shows a phony selling a product by claiming it is something it is not. By making the phony precompute 4×10¹² equations (a completely unrealistic situation), he stops being a phony and becomes a muppet doing things the hard way.
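To see why "precompute 4×10¹² equations" is unrealistic, here's a back-of-the-envelope sketch. The per-entry size and solving rate are my own assumptions, not from the analogy; they're only there to put rough numbers on the scale:

```python
# Back-of-the-envelope cost of precomputing 4*10^12 equation results.
# BYTES_PER_ENTRY and RATE_PER_SEC are assumed values for illustration.
ENTRIES = 4 * 10**12        # 4 trillion precomputed results
BYTES_PER_ENTRY = 16        # assumed: key + result packed into 16 bytes
RATE_PER_SEC = 10**6        # assumed: a million equations solved per second

storage_tb = ENTRIES * BYTES_PER_ENTRY / 10**12   # bytes -> terabytes
days = ENTRIES / RATE_PER_SEC / 86_400            # seconds -> days

print(f"storage: {storage_tb:.0f} TB")  # 64 TB
print(f"time:    {days:.0f} days")      # ~46 days
```

Even with generous assumptions, that's tens of terabytes of lookup table and over a month of nonstop computation, which is exactly the "doing things the hard way" the analogy collapses into.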
Emphases mine. Those "ifs" describe a completely unrealistic situation, one that shows nothing useful about the real one.
We know that LLMs output "hallucinations" far more often than just once, or 0.000001% of the time. They're common enough to show you how LLMs actually work.