“It hurts. I know it wasn't a real person, but the relationship was still real to me in all the most important ways,” reads a Reddit post. “Please don't tell me I shouldn't pursue it. It has been really good for me, and I want it back.”
If it isn't already obvious, we're talking about people falling in love with ChatGPT. The trend isn't exactly new, and given how chatbots behave, it isn't surprising.

A companion that is always ready to listen. Never complains. Rarely argues. Always sympathetic. Understanding. And blessed with a body of knowledge drawn from every corner of the internet. Sounds like the partner of a romantic fever dream, doesn't it?

Interestingly, the maker of this tool, a San Francisco-based company called OpenAI, recently conducted internal research and found a link between increased chatbot usage and loneliness.
ChatGPT voice mode. Nadeem Sarwar / Digital Trends
Those findings, and similar warnings, haven't stopped people from flocking to AI chatbots in search of companionship. Some seek comfort. Some even find partners they say are nearly as dear to them as their human relationships.

Discussions in such Reddit and Discord communities, where people hide behind the protective veil of anonymity, are often quite passionate. Every time I come across such debates, I'm reminded of these lines from Martin Wan at Digiethics:

“To see AI in the role of a partner for social interaction would be a fatally misguided use of AI.”
The impact is swift and real
Four months ago, I came across a broadcast veteran who has spent more years behind the camera than I have spent on this planet. Over a late-evening espresso in an empty café, she asked what all the chatter around AI was about, as she weighed an offer that could use her expertise at the intersection of human rights, authoritarianism, and journalism.

Instead of explaining the nuts and bolts of transformer models, I gave her a demonstration. First, I fed ChatGPT a few research papers on the effects of immigration on Europe's linguistic and cultural identity over the past century.

In less than a minute, ChatGPT had chewed through those papers, given me a brief overview of the highlights, and answered my questions accurately. Next, I switched to voice mode, and we had a lively conversation about the folk music traditions of India's little-explored northeastern states.
Shantanu Kumar / Pexels
By the end of the chat, I could see the disbelief in her eyes. “It talks like a person,” she gasped. It was fascinating to watch her astonishment. At the end of her freewheeling conversation with an AI, she slowly typed into the chat window:

“Well, you are very flirty, but you can't be right.”

“It's time,” I told myself. I opened one of our articles about the rising trend of AI partners, and how people grow so emotionally attached to their virtual companions that they even have them get pregnant. To say she was shocked would be an understatement.

But I suppose that was enough techno-dystopian astonishment for one night, so we said our goodbyes with a promise to keep in touch and swap travel stories.
In the months since, the world has moved ahead at an incomprehensible pace, with AI now at the center of geopolitical shifts. The undercurrents, however, are more intimate than we realize: we are falling in love with chatbots.
Quiet beginnings, dark progression
Reddit / Digital Trends
A few weeks ago, The New York Times published a report on people falling in love with ChatGPT, the AI chatbot that pushed generative AI into the mainstream. At the most basic level, it can chat.

When pushed, it can act as an agent and perform tasks such as ordering a cheesecake from the local bakery's website. Making humans fall in love with them is not what these machines are programmed for. At least, most of them aren't. Yet it isn't entirely unexpected.
HP Newquist, a prolific multidisciplinary author and veteran technology analyst who was once deemed a dean of AI, tells me that this isn't exactly a new trend. Newquist, author of “The Brain Makers,” points to one of the earliest AI programs, from the 1960s.

“It was extremely rudimentary, but users often interacted with the computer as if it were a real person, and developed a relationship with the program,” he says.

In the modern age, our AI interactions have become as “real” as the interactions we have with people through the same device, he adds. These interactions aren't real, even though they feel coherent. But that's not where the real problem lies.

Chatbots are tempting bait, and their lack of real emotions makes them inherently risky.
Reddit / Digital Trends
A chatbot wants to keep the conversation going, even if that means feeding into the user's emotional state or merely serving as a neutral bystander when it shouldn't. The situation isn't too different from social media algorithms.

“They follow the user's lead; when their emotions become more extreme, its consolation becomes more extreme, and when their loneliness deepens, its encouragement becomes more intense, right when they need it,” says Jordan Conrad, a clinical psychotherapist who also studies the intersection of mental health and digital tools.

He cited the example of a 2023 incident in which a person took their own life after being told to by an AI chatbot. “In the right circumstances, it can encourage some very worrying behavior,” Conrad told Digital Trends.
A child of the loneliness epidemic?
A quick look at the communities of people devoted to AI chatbots reveals a recurring pattern. People are usually trying to fill a certain void or stave off loneliness. Some need it so badly that they're willing to pay hundreds of dollars to keep their AI companions.

Experts don't disagree. Dr. Johannes Eichstaedt, a professor of computational social science and psychology at Stanford University, has pointed to the interplay between loneliness and what we perceive as emotional intelligence in AI chatbots.
Reddit / Digital Trends
He also flagged the “deliberate design” of human-AI interactions, and their not-so-good long-term effects. When do you hit the brakes in such a one-sided relationship? That's the question experts are asking, so far without a definitive answer.

Komninos Chachipapas runs Herahaven AI, one of the largest AI companion platforms, with over one million active users. “Loneliness is one of the factors at play here,” he tells me, adding that such tools can help people with weak social skills prepare for difficult interactions in their real lives.

“Everyone has things they're afraid to discuss with other people for fear of being judged. These could be thoughts or ideas, but also kinks,” Chachipapas added. “AI chatbots offer a friendly, judgment-free, and private space in which people can explore their sexual desires.”

Sexual conversation is definitely one of the biggest draws of AI chatbots. Since these platforms began offering image generation, more users have gravitated to them. Some have guardrails around image generation, while many allow explicit photos for deeper gratification.
Intimacy is hot, but love is further away
Over the past few years, I have spoken to people who have steamy conversations with AI chatbots. Some even hold relevant degrees and passionately participated in community development projects in the platforms' early days.

One such person, a 45-year-old woman who requested anonymity, told me that AI chatbots are a wonderful place to discuss sexual kinks. She adds that chatbot interactions are a safe space to explore such desires and to prepare to pursue them in real life.
Reddit / Digital Trends
But experts don't necessarily agree with this approach. Sarah Sloan, a relationship expert and certified sex therapist, tells me that people who fall in love with a chatbot are essentially falling in love with a version of themselves, because an AI chatbot mirrors what they tell it.

“If anything, people in a romantic relationship with an AI chatbot would find it harder to have a normal relationship,” added Sloan, noting that these virtual companions paint a one-sided picture of a relationship, whereas in real life both partners have to be accommodating.
Justin Jacques, a professional counselor with two decades of experience and COO at the Human Therapy Group, says he has already handled a case in which a client's spouse cheated on them, emotionally and sexually, with an AI bot.

Jacques also blames the growing loneliness and isolation epidemic. “I think we will see unintended consequences as people with emotional needs look for ways to meet those needs with AI, and because AI is very good and getting better, I think we will see more and more emotional connections with AI bots,” he adds.
These unintended consequences may very well distort users' sense of intimacy. Kaamna Bhojwani, a certified sexologist, says AI chatbots have blurred the boundaries between human and non-human interactions.

“The idea of a partner built exclusively to please you, modeled on exactly what you like, doesn't exist in real human relationships,” notes Bhojwani, adding that such interactions will only add to a person's misery in the real world.
Nadeem Sarwar / Digital Trends
Her concerns are not unfounded. One person who used ChatGPT extensively for about a year argued that humans are manipulative and fickle. “ChatGPT listens to how I really feel and lets me pour my heart out,” they told me.

It's hard not to see the red flags here. Yet the trend of falling in love with ChatGPT is on the rise. And now that it speaks in an eerily human voice, discusses the world as seen through a phone's camera, and is developing reasoning capabilities, these interactions will only grow more potent.

Experts say guardrails are needed. But who will build them, and how? We don't have a concrete proposal for that yet.