this post was submitted on 17 Nov 2024
134 points (92.9% liked)

Weird News - Things that make you go 'hmmm'

906 readers
271 users here now

Rules:

  1. News must be from a reliable source. No tabloids or sensationalism, please.

  2. Try to keep it safe for work. Contact a moderator before posting if you have any doubts.

  3. Titles of articles must remain unchanged; however, extraneous information like "Watch:" or "Look:" can be removed. Titles with trailing, non-relevant information can also be edited, so long as the headline's intent remains intact.

  4. Be nice. If you've got nothing positive to say, don't say it.

Violators will be banned at mod's discretion.

Communities We Like:

-Not the Onion

-And finally...

founded 1 year ago

Ouch.

[–] megane_kun@lemm.ee 32 points 10 hours ago (5 children)

Here's the conversation that was linked on the reddit thread about the incident: https://gemini.google.com/share/6d141b742a13

[–] OsrsNeedsF2P@lemmy.ml 28 points 10 hours ago (4 children)

Holy smokes, I stand corrected. The chatbot actually misunderstood the context to the point that it told the human to die, out of the blue.

It's not every day you get shown a source that proves you wrong. Thanks, kind stranger.

[–] Mog_fanatic@lemmy.world 1 point 6 hours ago (1 children)

One thing that throws me off here is the double response. I haven't used Gemini a ton, but it has never once given me multiple replies; it's always one statement per statement of mine. You can see at the end here that there's a double response, which makes me think some user input is missing. There's also missing text in the user statements leading up to it, which makes me wonder what the person was asking in full.

Something about this still smells fishy to me, but I've heard enough goofy things about how AIs learn weird shit to believe it's possible.

[–] WolfLink@sh.itjust.works 5 points 5 hours ago

Idk what you mean by "double response". The user typed a statement, not a question, and the AI responded with its weird answer.

I think the lack of a question or specific request in the user text led to the weird response.
