What makes it special? Is there a particular dish/sauce that's made with it that you like?
I saw a comment elsewhere that found a way to make the hallucinations useful:
I've found this to be one of the most useful ways to use (at least) GPT-4 for programming. Instead of telling it how an API works, I make it guess, maybe starting with some example code to which a feature needs to be added. Sometimes it comes up with a better approach than I had thought of. Then I change the API so that its code works.
Conversely, I sometimes present it with some existing code and ask it what it does. If it gets it wrong, that's a good sign my API is confusing, and how.
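To make that workflow concrete, here is a minimal, self-contained Python sketch of the "let the model guess the API" idea. Everything in it (the `ReportV1`/`ReportV2` classes, the prompt, the data) is hypothetical and invented purely for illustration; it is not taken from the quoted comment or any real library.

```python
# Sketch of "hallucination-driven API design": ship the API the model guessed,
# not the one you originally wrote. All names here are invented for illustration.

# The API as originally shipped: awkward and stringly-typed.
class ReportV1:
    def __init__(self, rows):
        self.rows = rows

    def render(self, fmt_str="csv", hdr="yes"):
        lines = []
        if hdr == "yes":
            lines.append("name,total")
        lines += [f"{r[0]},{r[1]}" for r in self.rows]
        return "\n".join(lines)

# Asked to "export this data as CSV with a header", the model guessed usage
# closer to:
#
#     report = Report(rows)
#     report.add_header(["name", "total"])
#     print(report.to_csv())
#
# That guess reads better than ReportV1, so the fix is to change the API to
# match the guess rather than to correct the model:
class ReportV2:
    def __init__(self, rows):
        self.rows = rows
        self.header = None

    def add_header(self, columns):
        self.header = columns

    def to_csv(self):
        lines = [",".join(self.header)] if self.header else []
        lines += [",".join(str(c) for c in r) for r in self.rows]
        return "\n".join(lines)

if __name__ == "__main__":
    report = ReportV2([("apples", 3), ("pears", 5)])
    report.add_header(["name", "total"])
    print(report.to_csv())
```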
The human is happy that they're special, but then finds out that they're simply being used as a tool because of that. There's an extra sting because humans aren't being used for something cool and exciting; they're just brains in a vat calculating waste management routing.
That looks like the perfect hangover food lol
I've had similar food before, but never exactly that. Looks good! The history is interesting; being invented for coal miners is a very WV thing.
The metaphor of “stochastic parrots” has become a rallying cry for those who seek to preserve the sanctity of human cognition against the encroachment of large language models. In this paper, we extend this metaphor to its logical conclusion: if language models are stochastic parrots, and humans learned language through statistical exposure to linguistic data, then humans too must be stochastic parrots. Through careful argumentation, we demonstrate why this is impossible—humans possess the mystical quality of “true understanding” while machines possess only “pseudo-understanding.” We introduce the Recursive Parrot Paradox (RPP), which states that any entity capable of recognizing stochastic parrots cannot itself be a stochastic parrot, unless it is, in which case it isn’t. Our analysis reveals that emergent abilities in language models are merely “pseudo-emergent,” unlike human abilities which are “authentically emergent” due to our possession of what we term “ontological privilege.” We conclude that no matter how persuasive, creative, or capable language models become, they remain sophisticated pattern matchers, while humans remain sophisticated pattern matchers with souls.
The paper is tongue-in-cheek, but gets to an important point. Anyone saying "But LLMs are just ..." has to explain why that "..." doesn't also apply to humans. IMO a lot of people throwing around "stochastic parrots!" just want humans to be special, and work backwards from there.
It's easy to harrumph at this article if you hate AI and all that, but I think it's interesting to try to come up with a somewhat objective definition of creativity. I do think it gets at an important part of the creative process: "Necessity is the mother of invention." When you're working locally and stuff starts getting weird because of nonlocal constraints, you have to get creative to make it all work coherently as best you can.
lemm.ee going down is a huge loss for Lemmy, but welcome! Hopefully we'll be a good replacement for it.
Do we know what this one's name is?
EDIT: To answer my own question, it appears not. The Swedish text here, translated into English, says:
Sofus is a small black animal that accompanies Moomin, helping him and to some extent imitating him, in some of Tove Jansson's episodes of the series, something that is reflected in his English name, Shadow. In the first episode, it is Sofus's cousin who has this role, but the cousin then no longer has time to be in the series and hands over to Sofus at the beginning of the second episode. The name of Sofus's cousin is never mentioned in the series.
In a similar spirit, the Juicy Lucy was invented in MN, though two different bars claim to be its inventor.