[–] N0body@lemmy.dbzer0.com 247 points 5 days ago (4 children)

people tend to become dependent upon AI chatbots when their personal lives are lacking. In other words, the neediest people are developing the deepest parasocial relationship with AI

Preying on the vulnerable is a feature, not a bug.

[–] Tylerdurdon@lemmy.world 64 points 5 days ago (3 children)

I kind of see it more as a sign of utter desperation on the human's part. They lack connection with others to such a degree that anything similar can serve as a replacement. Kind of reminiscent of Harlow's experiment with baby monkeys. The videos from that study are interesting but make me feel pretty bad about what we do to nature. Anywho, there you have it.

[–] graphene@lemm.ee 29 points 5 days ago (1 children)

And the number of connections and friends the average person has has been in free fall for decades...

[–] trotfox@lemmy.world 3 points 4 days ago (1 children)

I dunno. I connected with more people on reddit and Twitter than irl tbh.

Different connection but real and valid nonetheless.

I'm thinking of places like r/stopdrinking, petioles, bipolar; that shit's been therapy for me tbh.

[–] in4apenny@lemmy.dbzer0.com 1 points 4 days ago

At least you're not using chatgpt to figure out the best way to talk to people, like my brother in finance tech does now.

[–] Paragone@piefed.social 13 points 5 days ago

That utter desperation is engineered into our civilization.

What happens when you prevent the "inferiors" from having a living-wage, while you pour wallowing-wealth on the executives?

They have to overwork to make ends meet, is what, which breaks parenting.

Then, when you've broken parenting for a few generations, the manufactured ocean-of-attachment-disorder produces a plethora of narcissism, which itself produces mass-shootings.

2024 was down 200 mass-shootings in the US of A, from the peak of 700/year to only 500.

You are seeing engineered eradication of human-worth, for moneyarchy.

Isn't ruling-over-the-destruction-of-the-Earth the "greatest thrill-ride there is"?

We NEED to do objective calibration of the harm that policies & political-forces are doing, & put force against what is actually harming our world's human-viability.

Not what the marketing-programs-for-the-special-interest-groups want us acting against, the red herrings.

They're getting more vicious; we need to get TF up & begin fighting for our species' life.

_ /\ _

[–] MouldyCat@feddit.uk 9 points 5 days ago

a sign of utter desperation on the human’s part.

Yes, it seems to be the same underlying issue that leads some people to throw money at OnlyFans streamers and suchlike: a complete starvation of personal contact that leads people to willingly live in a fantasy world.

[–] Vespair@lemm.ee 4 points 3 days ago

And it's beyond obvious in the way LLMs are conditioned, especially if you've used them long enough to notice trends. Where early on their responses were straight to the point (inaccurate as hell, yes, but that's not what we're talking about in this case), today they are meandering and full of straight engagement bait - programmed to feign some level of curiosity and ask stupid and needless follow-up questions to "keep the conversation going." Personally, I suspect this is just a way to increase token usage to further exploit and drain the whales who tend to pay for these kinds of services.

There is no shortage of ethical quandaries brought into the world with the rise of LLMs, but in my opinion the locked-down nature of these systems is one of the most problematic; if LLMs are going to become as commonplace as the tech sector seems insistent on making them, then we really need to push back against these companies being able to control and steer them in their own monetary interests.

[–] NostraDavid@programming.dev 9 points 5 days ago (1 children)

That was clear from GPT-3, day 1.

I read a Reddit post about a woman who used GPT-3 to effectively replace her husband, who had passed away not long before. She used it as a way to grieve, I suppose? She ended up noticing that she was getting too attached to it, and had to leave him behind a second time...

[–] trotfox@lemmy.world 2 points 4 days ago

Ugh, that hit me hard. Poor lady. I hope it helped in some way.

[–] Deceptichum@quokk.au -5 points 5 days ago* (last edited 5 days ago) (3 children)

These same people would be dating a body pillow or trying to marry a video game character.

The issue here isn’t AI, it’s losers using it to replace human contact that they can’t get themselves.

[–] morrowind@lemmy.ml 27 points 5 days ago (1 children)

You labeling all lonely people as losers is part of the problem.

[–] Muaddib@sopuli.xyz 7 points 5 days ago

More ways to be an addict means more hooks means more addicts.

[–] tiguwang@lemm.ee 3 points 5 days ago

Me and Serana are not just in love, we're involved!

Even if she's an ancient vampire.