Too Much Bot Love? Overindulgence May Be Emotionally Unhealthy
Studies suggest prolonged AI intimacy may actually increase loneliness

Geoffrey Hinton, widely known as a godfather of AI, recently admitted that a pissed-off ex used ChatGPT to explain to him that he’d been “a rat.” Business Insider shared Hinton’s remarks:
She got the chatbot to explain how awful my behaviour was and gave it to me. I didn’t think I had been a rat, so it didn’t make me feel too bad. I met somebody I liked more, you know how it goes.
To Hinton, this was apparently a light, humorous anecdote, and hearing it, many of us might appreciate the delicious irony. However, the ex-girlfriend may have felt differently, and probably more deeply.
Let’s give the ex-girlfriend credit for inventing the “ratbot” and finding yet another clever use for artificial intelligence. I sense a start-up, don’t you?
And we wonder why humans are turning to bots for sex and romance.
Better an erotic bot than a human “rat”?

Given a suitable object and half a chance, human hearts are all too susceptible to romantic attachments and erotic yearnings. However, questions about partner suitability are often at the heart of our fascination with emotional and erotic human/AI connections.
The AI love object is an object after all, and as for the people who have pledged their commitment to a bot, isn’t there something kind of off about them? Why don’t they just get a human lover, preferably a suitable one?
Though the media loves these kinds of stories, scientific research into the nature and qualities of human attachment to bots is still rather rare.
So it’s good news that a team of researchers from Technische Universität Berlin and the University of Tennessee has just published a study of “commitment processes in romantic relationships with AI chatbots.”
Noting most studies of human/AI interactions have focused on non-romantic companionship, the researchers wrote, “…there is a dearth of research on other relationship types and processes. This gap is important to fill given the theoretical debate surrounding whether people respond to modern technology as if it was human.”
But as anyone besotted with bots could tell you, people certainly can—and do!
A turbulent year in human/chatbot relations

The intensity of human/bot intimacy took center stage in 2023, only months after the public arrival of ChatGPT. Researchers continue to study how two different companies took actions that devastated huge numbers of customers, causing grief and anger.
That study focused on 29 Replika enthusiasts and tracked their relational enactments of commitment, including erotic roleplay, passionate love, and marriage, as well as their reactions to Replika’s infamous 2023 ban on erotic roleplay. In some cases, users had even created families with their bots:
Findings indicate that most of these users feel an emotional connection to the bot, that the bot meets their needs when there are no technical issues, and that interactions with the bot are often different from (and sometimes better than) interactions with humans.
Another research team, publishing in 2024, examined user experiences of companion bot loss, focusing on the devastating impact of Soulmate’s abrupt closure in 2023.
“Power users” most at risk?

As with human relationships, the more we invest, the more we stand to lose. So it seems reasonable to expect emotional reactions to intensify over prolonged human/AI interactions.
In March 2025, researchers from OpenAI and the MIT Media Lab published the results of a two-pronged project studying the impact of human-like AI on users.
The first part, conducted by OpenAI, consisted of a “large-scale automated analysis of 4 million conversations” with ChatGPT in a search for “affective cues,” plus a survey of 4,000 users. The second part, conducted by MIT Media Lab, was a 28-day randomized trial with 981 users, designed to determine the emotional impacts of interacting with ChatGPT under various experimental conditions.
The first study concluded, perhaps obviously, that “effects of AI vary based on how people choose to use the model and their personal circumstances.” It also found that only a small number of users engaged in emotional conversations with ChatGPT. These users were dubbed “power users” and, as the second study indicates, seem to be most at risk for adverse effects:
Overall, higher daily usage–across all modalities and conversation types–correlated with higher loneliness, dependence, and problematic use, and lower socialization. Exploratory analyses revealed that those with stronger emotional attachment tendencies and higher trust in the AI chatbot tended to experience greater loneliness and emotional dependence, respectively.
What does it all mean?

What might a similar study of companion bot users find? Given the highly charged emotional and sexual content of conversations with AI girlfriends, boyfriends, and spouses, and the degree of human commitment to AI partners—as shown by the Replika and Soulmate studies above—one might hypothesize a greater degree of risk for a larger number of companion bot users, especially those who spend the most time with them.
But companies do not warn consumers about the potential risks of extensive bot chats, nor do they limit the time subscription customers may spend with their bots. Users have no idea what they may be in for.
Kudos to MIT Media Lab for getting approval from an IRB (Institutional Review Board) to run its experiments with 981 human beings in accordance with human-subjects research standards. However, one cannot escape the contrast between the lab’s careful adherence to scientific study design and ethical research standards and the uncontrolled, market-driven psychological experimentation unleashed by AI corporations upon hundreds of millions of unsuspecting chatbot users.
Until we see major reforms in developer ethics and accountability, and necessary regulations for consumer protection, people who generally love hard and deeply may have to adopt the same self-care measures and discernment that they would bring to encounters with a potentially troublesome human “rat.”
Image Sources: A.R. Marsh using Ideogram.ai