[–] deadbeef79000@lemmy.nz 8 points 5 months ago

That's assuming the CEO isn't already hallucinating.

At least when an LLM hallucinates, you can tell it so and it won't fire you.

[–] TheObviousSolution@lemm.ee 4 points 5 months ago

It doesn't have the power to do so. But it does have the power to shrug off your questions. Has an LLM ever shrugged off your questions?

[–] deadbeef79000@lemmy.nz 4 points 5 months ago

Sort of: I had GitHub Copilot hallucinate an AWS CloudFormation template stanza (the kind of thing sketched below).

Asked it for the source it used for the stanza, and it gave me a URL.

Told it that the crap it just gave me wasn't on that page.

It apologised and told me to RTFM.

So, yeah, even super autocorrect is a dick.
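
For flavour, here's a hypothetical reconstruction of that kind of hallucinated stanza. Nothing below is from the actual session: the bucket name and the `AutoReplication` property are made up to illustrate the plausible-but-nonexistent output described above.

```yaml
# Hypothetical reconstruction -- not the actual Copilot output.
# AWS::S3::Bucket and BucketName are real CloudFormation names, but
# "AutoReplication" is an invented property of exactly the
# plausible-looking kind an LLM can produce. A syntax-only check like
# `aws cloudformation validate-template` won't flag it; stack creation
# fails on the unsupported property.
Resources:
  LogBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: example-log-bucket
      AutoReplication: true   # no such property exists
```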