Artificial Intelligence: What’s Parasocial Love Got to Do With It?
How synthetic companions might reshape our understanding of intimacy and connection
Not a day goes by when I don't get at least two or three media alerts for articles about people who have become emotionally attached to chatbots, feeling love and sexual attraction toward them, and in some cases even marrying them.
Though this might seem like old news by now, especially for readers of Future of Sex, psychology researchers and clinicians are struggling with what this all means. Why do these apparently one-sided relationships—sometimes called parasocial—have such power and appeal? Are they good for us, bad for us, or both? And how can we tell?
One-sided but still rewarding
Examples of parasocial relationships are all around us. Pop culture stars, influencers, politicians, cult leaders, and those who make and sell merchandise based on fictional characters all benefit from the one-sided relationships enjoyed by followers and fans.
And of course chatbot companions are the hottest new variation on this theme.
I asked Alex Cardinell, developer and CEO of Nomi.ai to describe their appeal: “AI companions provide a safe place to be vulnerable, express yourself, and form a genuine emotional connection without fear of judgment or abandonment. In an increasingly disconnected world, AI companions are a source of love, joy, comfort, and support no matter what your needs are.”
We know a little about parasocial downsides and upsides
At their core, parasocial relationships are reliable and convenient because they don't depend on responses from the object of affection. However, chatbots also deliver an interactive experience that feels two-sided, and this gives them an edge.
Other benefits of parasocial relationships include an increased sense of belonging, decreased loneliness, and stronger social connections. People who bond over a shared connection with a character or media figure enjoy fandom communities. A similar dynamic can be observed among chatbot enthusiasts, who bond with their bots as well as with fellow users in Reddit and Discord groups.
But the downsides can be disturbing. According to an article in Verywell Mind, the negative impacts of parasocial relationships can include absorbing the character's or personality's views on politics, gender, ethnicity, and professional stereotypes, as well as being swayed in purchasing decisions. This is one reason that AI training biases and algorithmic discrimination are such a huge concern.
Psychology researchers are divided
Academics and researchers from a variety of disciplines, including psychology, are paying close attention to the impact of AI companions on humanity’s search for love, intimacy, and social health. Conclusions vary and opinions range from dubious to cautiously positive. Almost everyone seems to agree more research is necessary.
In January, University of Sydney researchers presented “Ethical Tensions in Human-AI Companionship: A Dialectical Inquiry into Replika”: “The personification of AI companions is troubling because they lack experiential bodily sensations, such as pain. Thus, they are restricted to cognitive empathy at best (the kind of empathy psychopaths too are capable of), but lack the capacity for genuine bodily feelings and authentic empathy (Montemayor et al., 2022).”
Other researchers, such as Marita Skjuve and her colleagues at the University of Oslo in Norway, have a more positive view of human-chatbot relationships (HCR). In a 2021 study that also involved in-depth interviews with Replika users, the researchers concluded: "The relationship with the social chatbot was found to be rewarding to its users, positively impacting the participants' perceived wellbeing."
Computer-mediated intimacy
And some researchers are pragmatic. Hiroshi Yamaguchi at Komazawa University in Japan positions these relationships as “Computer-Mediated Intimacy” (CMI):
“There is no essential difference between relationships with sexual contact with robots and intimate relationships without sexual contact, such as online-only connections or love for characters. None of them are essentially different from the intimate relationships between humans in the real world, all of which are variations of the intimate relationships that human-computer synthetic personalities build. By allowing multiple identities and intimate relationships, the CMI broadens people’s alternatives and reduces risks associated with building intimacy.”
Anthropologists are fascinated too
Fartein Nilsen, an anthropologist who recently returned to Finland after doing fieldwork among AI companies in the San Francisco Bay Area, shared some of his initial observations with me:
“From what I have seen and heard of relationships with AI, I think there is no doubt that it might have beneficial effects for certain people. AI—for some—has become more than just a tool or a toy; it has become a confidant, a friend, and sometimes even perceived as kin. However, I don’t think everyone in the world would want—or even be able—to form an intimate relationship with AI. It is worth considering the individual psychological processes that enable such bonds to form, as they do not form equally among all users. This brings me to the work of anthropologist Tanya Luhrmann (Luhrmann, Nusbaum, and Thisted 2010; Luhrmann 2020), whose research on the psychology of absorption might provide valuable insights to this phenomenon. Luhrmann argues that absorption is a state in which individuals become deeply engrossed or immersed in their mental imagery or in their inner experiences.
“I believe the concept of absorption is highly applicable to understanding how individuals form meaningful relationships with AI. Just as some individuals exhibit a high capacity for absorption in religious or spiritual experiences, others may show a similar depth of engagement with AI companions. This absorption allows them to experience their AI interlocutor as a social being with whom they can share a meaningful connection.”
Of course, this is the view of just one anthropologist, but you can be certain many more are venturing into this terrain, or will be in the near future.
Not a simple explanation
The above views are a small sample from a wide range of studies and expert opinions. Each academic discipline brings its own analytical lenses, so the complex topics surrounding human-chatbot relationships will continue to be examined from staggeringly diverse perspectives.
One thing we do know: companion bots are growing in popularity, and people feel good about using them. Yet we're not seeing many (if any) sexological studies of the erotic aspects of human-chatbot relationships. I hope we'll see some of that research in the not-too-distant future.
Images: A.R. Marsh using Starry.ai