488
Someone got Gab's AI chatbot to show its instructions
(mbin.grits.dev)
Rumors, happenings, and innovations in the technology sphere. If it's technological news or discussion of technology, it probably belongs here.
Subcommunities on Beehaw:
This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.
is there any drawback that even necessitates the prompt being treated like a secret unless they want to bake controversial bias into it like in this one?
Honestly I would consider any AI which won't reveal it's prompt to be suspicious, but it could also be instructed to reply that there is no system prompt.
A bartering LLM where the system prompt contains the worst deal it's allowed to accept.