this post was submitted on 27 Nov 2023
1035 points (98.2% liked)

People Twitter

[–] merc@sh.itjust.works 5 points 7 months ago (6 children)

It doesn't "lie", though; it just generates a plausible sequence of words. The sort-of fortunate thing is that facts are often plausible, and it's trained on a lot of facts. But facts aren't the only plausible word sequences, and LLMs are tuned to be creative, which means sometimes picking a next word that isn't the single best fit, so the generated sentence can end up not being factual.
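The mechanism the comment describes is essentially temperature sampling over a next-word probability distribution. Here's a minimal sketch in Python; the toy word list, the probabilities, and the function name are all made up for illustration, not taken from any real model:

```python
import math
import random

def sample_next_word(probs, temperature=1.0):
    """Sample a next word from a toy next-word probability distribution.

    Near temperature 0 the most plausible word almost always wins; higher
    temperatures flatten the distribution, so less-likely (and possibly
    non-factual) words get picked more often. This is the "creative"
    knob the comment above alludes to.
    """
    words = list(probs)
    # Rescale log-probabilities by temperature.
    logits = [math.log(probs[w]) / temperature for w in words]
    # Subtract the max for numerical stability, then exponentiate.
    m = max(logits)
    weights = [math.exp(l - m) for l in logits]
    return random.choices(words, weights=weights)[0]

# Toy distribution for "The capital of France is ___": the factual
# continuation is the most plausible, but not the only plausible one.
probs = {"Paris": 0.7, "Lyon": 0.2, "Mars": 0.1}
print(sample_next_word(probs, temperature=0.7))
```

With a high temperature, "Lyon" or even "Mars" come out fairly often; nothing in the sampler knows or cares which continuation is true.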

Calling it a "lie" suggests that it knows the truth, or that it is being deceptive. But, that's giving "spicy autocomplete" too much credit. It simply generates word salads that may or may not contain truths.

[–] Anticorp@lemmy.ml 3 points 7 months ago (5 children)

The industry word for it is "hallucination", but I'm not sure that fits either.

[–] merc@sh.itjust.works 2 points 7 months ago (4 children)

It's better than "lying", but it still implies consciousness. It also implies that it's doing something different from what it normally does.

In reality, it's always just generating plausible words.

[–] MistakenBear32@lemmy.dbzer0.com 1 points 7 months ago (1 children)

It's bullshitting... Faking it till it makes it, if you will.

[–] merc@sh.itjust.works 2 points 7 months ago

No, that implies a goal. It's just spicy autocomplete.
