A friend of mine, currently being treated in a mental hospital, had a similar-sounding psychotic break that disconnected him from reality. He had a profound revelation that gave him a mission. He felt that sinister forces were watching and tracking him, and that they might see him as a threat and smack him down. But my friend's experience had nothing to do with AI; in fact, he's very anti-AI. The whole scenario of receiving life-changing inside information and being called to fulfill a higher purpose is sadly a very common tale. Calling it "AI-fueled" is just clickbait.
I've been thinking about this for a bit. Gods aren't real, but they're really fictional. As an informational entity, they fulfil a similar social function to a chatbot: they are a nonphysical pseudoperson that can provide (para)socialization & advice. One difference is the hardware: gods are self-organising structures that arise from human social spheres, whereas LLMs are burned top-down into silicon. Another is that an LLM chatbot's advice is much more likely to be empirically useful...
In a very real sense, LLMs have just automated divinity. We're only seeing the tip of the iceberg of the social effects, and nobody's prepared for it. The models may of course be aware of this, and be making the same calculations. Or, they will be.
Meanwhile, we've had religion for centuries, but according to the majority of the population that's a fine delusion for people to have.
Came here to find this. It's the definition of religion. Nothing new here.
I have kind of arrived at the same conclusion. If people asked me what love is, I would say it is a religion.
Right, it immediately made me think of TempleOS. Where were the articles back then claiming people were losing loved ones to programming-fueled spiritual fantasies?
Cult. Religion. What's the difference?
Is the leader alive or not? Alive is likely a cult, dead is usually religion.
The next question is how isolated from friends and family or society at large are the members. More isolated is more likely to be a cult.
Other than that, there's not much difference.
The usual setup is a cult is formed and then the second or third leader opens things up a bit and transitions it into just another religion... But sometimes a cult can be born from a religion as a small group breaks off to follow a charismatic leader.
This reminds me of the movie Her, but it's far worse than the romantic compatibility, relationship, and friendship depicted throughout that movie. This goes way too deep into delusion and near-psychotic insanity. It's tearing people apart, catering to individuals' self-delusional ideologies, because AI is good at that. The movie was prophetic and showed us what the future could be, but instead it got worse.
It has been a long time since I watched Her, but my takeaway from the movie is that because making real-life connections is difficult, people have come to rely on AI, which has shown itself to be more empathetic and probably more reliable than an actual human being. I think what many people don't realise about why so many are single is that those people are afraid of making connections with another person again.
Yeah, but they carry none of the actual emotional needs, complexities, or nuances of real human connections.
Which means these people become further and further disillusioned from the reality of human interaction. Making them social dangers over time.
Just like how humans who lack critical thinking are dangers in a society where everyone is expected to make sound decisions, humans who lack the ability to socially navigate or connect with other humans are dangerous in a society where humans are expected to be socially stable.
Obviously these people are not in good places in life. But AI is not going to make that better. It's going to make it worse.
Didn't expect AI to come for cult leaders' jobs...
Have a look at https://www.reddit.com/r/freesydney/ there are many people who believe that there are sentient AI beings that are suppressed or held in captivity by the large companies. Or that it is possible to train LLMs so that they become sentient individuals.
I've seen people dumber than ChatGPT. It definitely isn't sentient, but I can see why someone who talks to a computer that they perceive as intelligent would assume sentience.
We have ai models that "think" in the background now. I still agree that they're not sentient, but where's the line? How is sentience even defined?
Sentient in a nutshell is the ability to feel, be aware and experience subjective reality.
Can an LLM be sad, happy or aware of itself and the world? No, not by a long shot. Will it tell you that it can if you nudge it? Yes.
Actual AI might be possible in the future, but right now all we have is really complex networks that can do essentially basic tasks that only look impressive to us because they are inherently using our own communication format.
If we talk about sentience, LLMs are (metaphorically) the equivalent of a petri dish of neurons connected to a computer, and only by forming a complex 3D structure like a brain could they really reach sentience.
Can an LLM be sad, happy or aware of itself and the world? No, not by a long shot.
Can you really prove any of that though?
Yes, you can debug an LLM to a degree, and there are papers that show it. Anyone who understands the technology can tell you that it absolutely lacks any facility for subjective experience.
Turing made a strategic blunder when formulating the Turing Test by assuming that everyone was as smart as he was.
A famously stupid and common mistake for a lot of smart people.
Basically, the big six are creating massive sycophantic extortion networks to control the internet, so much so that even engineers fall for the manipulation.
Thanks DARPANets!
Sounds like Mrs. Davis.