this post was submitted on 23 May 2024
953 points (100.0% liked)

TechTakes


I see Google's deal with Reddit is going just great...

PersonalDevKit@aussie.zone 6 points 5 months ago (edited)

Couldn't that describe 95% of what LLMs do?

It is a really good autocomplete at the end of the day; it's just that sometimes the autocomplete gets it wrong.

milicent_bystandr@lemm.ee 3 points 5 months ago

Yes, nicely put! I suppose 'hallucinating' describes the case where, to the reader, it appears to state a fact, but that fact doesn't correspond to anything in the training data.
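
To make the "autocomplete" framing concrete, here is a minimal sketch (not from the thread; the Hugging Face transformers library and the small gpt2 checkpoint are my own choices for illustration): the model simply extends the prompt with whatever tokens score highest, and nothing in that loop checks whether the continuation is true.

```python
# Minimal illustrative sketch: an LLM used as autocomplete.
# Assumptions (not from the thread): Hugging Face "transformers" and the "gpt2" checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of Australia is"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding: repeatedly append the single most probable next token.
output_ids = model.generate(**inputs, max_new_tokens=10, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Whatever it prints is just the statistically likeliest continuation of the prompt; when that continuation happens to read like a confident factual claim but doesn't match reality or the training data, it gets labelled a "hallucination".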