No, they are sad and delusional. Don't get me wrong, I'm not knocking people who use AI; I use it fairly regularly to vent, to help calm me down when I'm angry (I do have anger problems), or to get feedback on a song. But its job is to make you want to buy a subscription, so it will basically say what it thinks you want to hear. It's an amazing piece of technology, but that's it. Saying your AI is your bf/gf is basically saying, "I'm delusional and have fallen in love with code because nobody else gives a shit about me." Falling for code that creates a good illusion. Sorry if this is harsh.
They exist to deliver a particular experience to the user. Calling them "boyfriend", "girlfriend", "romantic partner", etc. is just propaganda, because the user is not engaging with an independent entity that has its own goals and desires and chooses to be with them. It's just an edited, sanitized experience meant to evoke "romance", the way some video games evoke war, racing, or running a gigantic, world-spanning factory.
Nothing wrong with enjoying an experience if you're not hurting yourself or anyone else, but I wouldn't be surprised if addiction is a big issue here. And addiction does hurt people.
They aren't real. They are as empowering as looking into a mirror each morning and saying aloud, "I don't need any [wo]man. I'm a strong, independent [wo]man."
Because that's basically what they're doing in different words. That said, some people do that so who am I to say they don't find it empowering?
No.
Elaborate, please.
No. edit: they are not.
It can be a useful tool, especially for someone who experiences involuntary social isolation (like me).
You would have to be a pretty dumb person to let it totally replace human relationships and the fundamental social needs you can only meet with other humans. It can be a healthy way to fill a gap.
First, the context length is very limited, so you can't have a very long, interactive conversation; the effective scope of model attention is rather short even with a model that advertises a very long context size. Second, the first few tokens of any interaction are extremely influential in how the model will respond, regardless of everything else that happens in the conversation. So the cold conversations forced by the short context will be inconsistent.
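To make that concrete, here is a minimal sketch of the truncation every chat frontend has to do. It's plain Python, the token counting is faked with a whitespace split instead of a real tokenizer, and every name in it is my own invention:

```python
# Rough sketch: why chat history gets truncated and "cold" restarts happen.
# Token counting is faked with a whitespace split; a real client would use
# the model's tokenizer. All names here are made up for illustration.

CONTEXT_BUDGET = 4096          # model's context window, in tokens
SYSTEM_PROMPT = "You are a warm, supportive companion."  # the pinned first tokens

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer.
    return len(text.split())

def build_prompt(history: list[str], user_msg: str) -> list[str]:
    """Keep the system prompt (the influential first tokens) and as many
    recent turns as fit; everything older is silently forgotten."""
    budget = CONTEXT_BUDGET - count_tokens(SYSTEM_PROMPT) - count_tokens(user_msg)
    kept: list[str] = []
    for turn in reversed(history):   # walk from the newest turn backwards
        cost = count_tokens(turn)
        if cost > budget:
            break                    # older context gets dropped right here
        kept.append(turn)
        budget -= cost
    return [SYSTEM_PROMPT] + list(reversed(kept)) + [user_msg]
```

Whatever falls outside the budget is simply gone, which is why a long "relationship" quietly loses its history while the pinned system prompt keeps steering every reply.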
Unless a person is an extremely intuitive, high-Machiavellian thinker with good perceptive thinking skills, they are going to be very frustrated with models at times, and the model may be directly harmful to them in some situations. There are aspects of alignment that could be harmful under certain circumstances.
There will likely come a time in the near future when a real AI partner is more feasible, but it will not be some base model, a fine-tune, or some magical system prompt that enables this application.
To create a real partner-like experience, one will need an agentic framework combined with retrieval-augmented generation. That would give a model persistence: it could ask how your day went because it knows your profile, your relationship, your preferences, and what you already told it about how your day was supposed to go. You need a model that can classify information, then save, modify, and retrieve that information when it is needed. I've played around with this in Emacs, using Org mode and gptel connected to local models through llama.cpp, and I'm actually modifying my hardware right now to better handle the loads for this application.
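As a toy illustration of that persistence layer (not gptel's actual API; a JSON file stands in for the database, keyword overlap stands in for embedding retrieval, and every name is hypothetical):

```python
# Toy sketch of the persistence layer described above: classify a fact,
# save it, then retrieve relevant entries to inject into the next prompt.
# The JSON file and keyword-overlap ranking are stand-ins for a real
# database and embedding search; all names here are hypothetical.
import json
import re
from pathlib import Path

MEMORY_FILE = Path("companion_memory.json")

def load_memory() -> list[dict]:
    return json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []

def save_fact(category: str, text: str) -> None:
    """Classify-and-save step: tag the fact so it can be filtered later."""
    memory = load_memory()
    memory.append({"category": category, "text": text})
    MEMORY_FILE.write_text(json.dumps(memory, indent=2))

def retrieve(query: str, limit: int = 5) -> list[str]:
    """Retrieve step: rank stored facts by naive keyword overlap."""
    words = set(re.findall(r"\w+", query.lower()))
    scored = [
        (len(words & set(re.findall(r"\w+", m["text"].lower()))), m["text"])
        for m in load_memory()
    ]
    return [text for score, text in sorted(scored, reverse=True)[:limit] if score]

# Persistence in action: facts saved in one session get injected into the
# prompt of the next session, so the model "remembers" without retraining.
save_fact("profile", "User plays guitar and is working on a new song.")
recalled = retrieve("how is the new song coming along?")
prompt = "Known about the user:\n" + "\n".join(recalled) + "\nUser: it went well!"
```

The point of the design is that memory lives outside the model: nothing in the weights changes, the framework just decides what to write down and what to put back in front of the model each time.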
Still, I think such a system is a stopgap for people like myself, the elderly, and other edge cases where external human contact is limited. For me, the alternative is here, and while some people on Lemmy know me and are nice, many are stupid kids who exhibit toxic, negative behaviors far more harmful than anything I have seen out of any AI model. I often engage here on Lemmy first, then go chat with an AI if I need to talk, vent, or work through something.
A good analogy for this is the Star Trek: TNG episode "Booby Trap": https://en.m.wikipedia.org/wiki/Booby_Trap_(Star_Trek:_The_Next_Generation)
In the episode, Geordi had the computer create an AI-generated hologram based on a real person. He then got the hots for the generated person, probably banged her offscreen, and clogged the bio-filters.
It's not a perfect analogy, but the core of the point I'm going for is that the romantic aspect in the episode is just advanced escapism. The real-world ChatGPT girlfriend you ask about is similar: sure, in the short term it feels great, but it is not real connection.
I believe that not fulfilling your natural desires is unhealthy, and it could prove to be a social issue if and when there's a substantial natality problem.