I think it's weird that "hallucination" would be considered a cute euphemism. Would you trust something that's perpetually tripping balls and confidently announcing whatever comes to it in a dream? To me that sounds worse than merely being wrong.
I think the problem is that it portrays them as weird exceptions, possibly even echoes from some kind of ghost in the machine, rather than a statistical inevitability when you're asking for the next predicted token instead of meaningfully examining a model of reality.
"Hallucination" applies only to the times when the output is obviously bad, and hides the fact that it's doing exactly the same thing when it incidentally produces a true statement.
I get the gist, but also it's kinda hard to come up with a better alternative. A simple "being wrong" doesn't exactly communicate it either. I don't think "hallucination" is a perfect word for the phenomenon of "a statistically probable sequence of language tokens forming a factually incorrect claim" by any means, but in terms of the available options I find it pretty good.
I don't think the issue here is the word, it's just that a lot of people think the machines are smart when they're not. Not anthropomorphizing the machines is a battle that was lost no later than when computer storage devices were named "memory", so I don't think that's really the issue here either.
As a side note, I've seen cases of people (admittedly, mostly critics of AI in the first place) call anything produced by an LLM a hallucination regardless of truthfulness.
Obvious bullshit is a good way to put it. It even implies the existence of less obvious bullshit.
Reminds me of A Scanner Darkly a bit. Yeah, I would not trust someone like that.