this post was submitted on 21 Aug 2024
643 points (98.1% liked)
Programmer Humor
It just struck me that LLMs would be massively improved by simply having them prepend "I think" to every statement, instead of confidently stating absolute nonsense and then, right after, confidently stating that they were completely incorrect.
I've been experimenting with ChatGPT a little more over the past couple of weeks. It sounds confident and authoritative. The funny part is when you find inaccuracies. It seems good at recognizing that you're trying to correct it. I haven't tried lying to it when correcting it yet, but I wonder if it would accept those corrections too, even nonsensical ones, lol.