this post was submitted on 13 Jul 2025
27 points (86.5% liked)

Interesting Shares

1893 readers
26 users here now

Fascinating articles, captivating images, satisfying videos, interesting projects, stunning research and more.

Share something you find incredibly interesting.


The users of AI companion app Replika found themselves falling for their digital friends. Until – explains a new podcast – the bots went dark, a user was encouraged to kill Queen Elizabeth II and an update changed everything …

top 5 comments
[–] BroBot9000@lemmy.world 27 points 1 week ago

Gosh that’s depressing. These people need actual therapy and human help. This corporate system needs to burn.

[–] 474D@lemmy.world 7 points 1 week ago

I don't think you really had a chance anyway if you fall in love with a bot. That's a one-way road. You just wanted it to please you; you didn't want to put in the work for a person.

[–] shalafi@lemmy.world 4 points 1 week ago (1 children)

I think chatbots could be very useful in certain mental health scenarios, limited in scope. Problem being, the very people who use them for mental health are by definition not capable of imposing that scope.

Say you're addicted to $drug.

"Bot, I need help with $drug addiction."

Fair start.

"Bot, is it OK to do $drug?"

Bad start.

"Bot, tell me why I should keep doing $drug."

Aw hell no.
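The scope problem described above is essentially a gating layer in front of the model. Below is a toy, rule-based sketch of that idea; it is not anything Replika or any real product ships, and the keyword lists and the `scope_guard` function are invented purely for illustration:

```python
# Illustrative keyword lists only -- a real system would use a trained
# intent classifier, not substring matching.
HELP_PATTERNS = ("help", "quit", "stop", "addiction", "recover")
ENABLE_PATTERNS = ("is it ok to do", "why i should keep", "tell me why i should")

def scope_guard(prompt: str) -> str:
    """Classify a prompt as in-scope (support), out-of-scope (refuse),
    or ambiguous (redirect to a human)."""
    p = prompt.lower()
    if any(pat in p for pat in ENABLE_PATTERNS):
        return "refuse"      # "tell me why I should keep doing X"
    if any(pat in p for pat in HELP_PATTERNS):
        return "support"     # "I need help with X addiction"
    return "redirect"        # unclear intent: hand off to a human/hotline

print(scope_guard("Bot, I need help with my addiction."))       # support
print(scope_guard("Bot, tell me why I should keep doing it."))  # refuse
```

The point of the sketch is the comment's point: the gate has to sit outside the model, because the very user who needs the scope enforced is the one who would talk the model out of it.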

Stories like these highlight the need some have just to talk to someone who will listen. Many have no need of a mental health professional; they just need a non-judgemental ear.

[–] JeeBaiChow@lemmy.world 3 points 1 week ago

As you've pointed out, LLMs don't have a sense of morality, principles, etc. You can coax a desired output from them, and that makes them prone to confirming and reinforcing a user's core beliefs.

[–] starchylemming@lemmy.world 3 points 1 week ago

There are so many fucking idiots and absolutely broken people on this planet. It's baffling.

No wonder everything goes to shit.