this post was submitted on 12 Jun 2024
393 points (95.4% liked)

Technology

[–] MentalEdge@sopuli.xyz 20 points 5 months ago* (last edited 5 months ago) (1 children)

There's also the fact that they can't tell reality from fiction in general, because they don't understand anything in the first place.

LLMs have no way of differentiating fantasy RPG elements from IRL things. So they can suddenly lose the plot on what is being discussed, for seemingly no reason.

LLMs don't just "learn" facts from their training data. They learn how to pretend to be thinking; they can mimic, but not really comprehend. If there are facts in the training data, they can regurgitate them, but they don't actually know which facts apply to which subjects, or when not to make some up.
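That "regurgitation without comprehension" can be illustrated with a deliberately tiny stand-in: a bigram model. This is nothing like a real transformer, and the training text and names below are made up for the sketch, but it shows the same failure mode in miniature: the model only tracks which word tends to follow which, so it will happily attach a fact about one subject (the sky being clear) to another (the dragon).

```python
# Toy sketch, NOT how real LLMs work: a bigram model that only learns
# which word follows which word in its training text. It has no notion
# of subjects, so it freely mixes facts across them.
from collections import defaultdict
import random

training = (
    "the sky is blue . "
    "the dragon is blue . "
    "the dragon breathes fire . "
    "the sky is clear ."
).split()

# Record every observed next word for each word.
follows = defaultdict(list)
for a, b in zip(training, training[1:]):
    follows[a].append(b)

def generate(start, n=4, seed=None):
    """Walk the chain, always picking a statistically plausible next
    word -- with no check that the result is true or even coherent."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(rng.choice(options))
    return " ".join(out)

# The training text never contains "the dragon is clear", yet every
# step of that sentence is a valid continuation under the model:
assert "is" in follows["dragon"] and "clear" in follows["is"]
```

Starting `generate("the")` can therefore emit "the dragon is clear": a "fact" the model never saw, assembled purely from word statistics. Real LLMs are vastly more sophisticated continuation machines, but the underlying principle of predicting plausible next tokens, rather than checking facts, is the same.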

[–] Buffalox@lemmy.world 9 points 5 months ago (1 children)

They learn how to pretend

True, and they are so darn good at it that it can be somewhat confusing at times.
But the current AIs are not the ones we read about in SciFi.

[–] SpaceNoodle@lemmy.world 7 points 5 months ago (1 children)

I'd argue that referring to it as "AI" is a stretch since it's all A and no I.

[–] Barbarian@sh.itjust.works 6 points 5 months ago

This is why I strictly refer to these things as LLMs. That's what they are.