this post was submitted on 02 Jul 2025

chapotraphouse


On a deeper level than small talk, of course.

[–] nautilus@lemmy.dbzer0.com 46 points 2 days ago (23 children)

The other day someone told me that their partner used ChatGPT instead of going to therapy.

We’re all so cooked.

[–] Lussy@hexbear.net 11 points 2 days ago* (last edited 2 days ago) (19 children)

The point of contention regarding therapy for me is that I’m literally paying for an impersonal conversation in which I express my deepest insecurities to someone who most likely doesn’t give a shit.

I don’t see how AI fixes that but I also don’t understand why it can’t help if your relationship with your therapist is supposed to be a fundamentally clinical one.

[–] Skye@hexbear.net 20 points 2 days ago (3 children)

The problem is that AI absolutely does not provide a clinical relationship. If your input becomes part of the LLM's context (which it has to in order to have a conversation), it will inevitably start mirroring you in ways you might not even notice, something humans commonly (and subconsciously) respond to with trust and connection.

Add to that that they are designed to generally agree with and enable whatever you tell them, and you basically have a machine that does everything it can to reinforce a connection to itself and validate the parts of yourself you have concerns about.

There are already so many stories of people spiralling because they started building rapport with an LLM, and it's hard to imagine a setting where that is more likely to occur than when you use one as your therapist.

[–] purpleworm@hexbear.net 5 points 2 days ago

> There are already so many stories of people spiralling because they started building rapport with an LLM and it's hard to imagine a setting where that is more likely to occur than when you use one as your therapist

There are multiple cases where an LLM is alleged to have contributed to someone's suicide, from supporting sentiments that the afterlife is better to giving practical advice.
