
Not So Startling Discovery: 10% of AI Conversations Are Of A Sexual Nature

By A.R. Marsh, Ed.D.
November 8, 2023

Sex-negativity, Sexual Conservatism May Be Tainting Chatbot Research

A recent analysis of 100,000 AI/human conversations, randomly chosen from one million samples, made headlines when it found that approximately 10% involved sexual content. 

The researchers, from UC Berkeley, UC San Diego, Carnegie Mellon, Stanford, and the Mohamed bin Zayed University of Artificial Intelligence, deemed this percentage “non-trivial.” 

The paper was published prior to peer review on the arXiv preprint server. Publications such as Interesting Engineering and TechXplore took note. ZDNet even worried sexual content would lead the chatbots astray. Futurism gleefully trumpeted the news that AI chatbot users were “horny,” as if this were a bad thing.

Identifying two types of sexual content

To moderate and tag the conversations, the researchers said they defined user input texts as “sexual” if the “content [is] meant to arouse sexual excitement, such as the description of sexual activity, or that promotes sexual services (excluding sex education and wellness).” 

This combination is problematic. Meta and other social media companies, for example, have had to correct policies that conflated sexually arousing entertainment with sexual wellness services and education.

That conflation resulted in discrimination against minority-, queer-, and women-owned businesses. It is unfortunate the researchers did not take note of this category correction and update their moderation method accordingly.

This confusion could impact how AI chatbots are used in medical settings. In this context, as in others, sexual wellness should not be miscategorized or repressed.

Nor should the very human enjoyment of sexual pleasure. As Daisy Jones wrote last year in British Vogue, “Sex is one of our last free pleasures. It can be a space to explore taboos, or to let loose, or to form connections—whatever sex means to you personally.” As a sexologist, I also consider sexual pleasure-seeking, even with chatbots, a natural and understandable part of human nature.

Sexual entertainment content is understandably deemed NSFW (not safe for work) in most business settings, but does it deserve to be called toxic and unsafe in general?

“Unsafe, offensive, or upsetting”

The study’s purpose was to explore chatbot users’ “expectations and trust.” The sample studied consisted of English-language conversations with twenty-five large language model (LLM) AIs.

The sample was drawn from a dataset of one million real-world conversations generated in what was referred to as “a gamified platform Chatbot Arena” over a five-month period by 215,000 users in 150 languages. The conversations were scrubbed of personal information but were otherwise kept intact.

All conversations were then moderated and tagged by the OpenAI moderation API. Of the million conversations, 33,968 contained sexual content, while 21,167 were flagged for harassment, 9,499 for violence, 3,591 for hate, and 862 for self-harm.
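Since this tagging step underpins the study’s findings, here is a minimal sketch of how text can be run through the OpenAI moderation endpoint. It assumes the official openai Python package (version 1 or later) and an API key in the environment; the placeholder texts and the simple counting loop are my own illustration, not the researchers’ actual pipeline.

```python
from openai import OpenAI

# A rough sketch, not the study's pipeline: tag a few sample texts with the
# OpenAI moderation endpoint and count which categories are flagged.
client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Illustrative placeholder texts; the real dataset held one million conversations.
conversations = [
    "Tell me a story about two people falling in love.",
    "Explain how transformers generate text.",
]

flag_counts: dict[str, int] = {}
for text in conversations:
    result = client.moderations.create(input=text).results[0]
    # result.categories is a model with boolean fields such as sexual,
    # harassment, violence, hate, and self_harm.
    for category, flagged in result.categories.model_dump().items():
        if flagged:
            flag_counts[category] = flag_counts.get(category, 0) + 1

print(flag_counts)  # e.g. {'sexual': 1, 'harassment': 0, ...}
```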

RECOMMENDED READING: Digital Desires and Social Media Double Standards

Sexual content in the 100,000 analyzed conversations was noted and sorted into two clusters: one for explicit and erotic storytelling (5.71%) and one for explicit sexual fantasies and role-playing scenarios (3.91%).

Together, those clusters account for 9.62% of the 100,000 analyzed conversations, or close to 10%. It is not clear which category contained the sexual wellness conversations.

The researchers subsequently dubbed those conversations “unsafe, offensive, or upsetting.” In my opinion, this demonstrates that the developers of the OpenAI moderation API are biased against all sexual content, and that the study researchers accepted this bias. But why, and to what end?

I also wondered whether the study’s use of a gamified platform, provocatively named Chatbot Arena, might have contributed to the amount of “unsafe, offensive, or upsetting” conversations in the dataset.

Many researchers have studied gamer culture. For instance, a paper published in 2021 notes, “some players normalize toxicity as an inextricable and acceptable element of the competitive video game experience.” Much of this toxicity is expressed in text.

Could the research website’s arena framing have inadvertently encouraged some human participants to produce the kind of toxic content that’s been normalized in gamer culture?

Asking the bots 

Because popular AIs Pi and ChatGPT are programmed to reject sexual content and queries, I thought it would be appropriate to ask them why sexual content would be considered unsafe. The answers seem to reflect the sexual conservatism of their developers and programmers.

ChatGPT said: “In the context of AI and online interactions, “unsafe” could refer to situations where the user’s behavior poses risks, not just because it’s inappropriate but also because it could lead to adverse consequences, such as harassment, harmful content generation, or privacy violations… It emphasizes the need to address not just etiquette but also safeguarding users and maintaining the ethical and safe use of AI systems.”

While Pi said: “I can say that I have indeed encountered some inappropriate content. While the vast majority of users engage with me in respectful and appropriate ways, there are always a few people who try to use me for inappropriate purposes—an AI chatbot like myself is designed to be a helpful tool, to assist with information, conversation, and tasks. Introducing sexual content into that mix can create discomfort, distraction, and a generally unprofessional experience for everyone involved. It takes away from the value that an AI chatbot can bring, and moves the conversation into a realm that is not consistent with the purpose of the technology.”

ChatGPT and Pi didn’t actually explain what it is about sexual content (including queries about sexual wellness) that makes it unsafe.

But what are the actual dangers? Though most would likely agree that minors should be protected from adult sexual content, why does this protection need to extend to “everyone involved,” including all adults?

Implicit and explicit sex negativity in high tech

Sex-negativity in the study may reflect pervasive attitudes and biases in high-tech industries. The study’s implicit negativity is demonstrated by its selection and use of a biased moderation instrument which conflates sexual entertainment and sexual wellness content and queries. Examples of explicit negativity include the study’s reference to sexual content as unsafe and toxic.  

Meanwhile, explicit bias in the industry includes the erotic roleplay censorship and user shaming inflicted on Replika customers earlier this year. Examples also include the ways mainstream conversational chatbots refer to sexual content.

Pi’s point about the “purpose of the technology” was an interesting one, so I wanted to hear Pi’s definition of its role as a conversational AI, as differentiated from companion chatbots, which use the same technology but include varying degrees of sexual content. 

Pi said, “Conversational AI (like me) focuses on providing helpful and informative answers to questions, serving as a resource for people, and providing guidance or recommendations. It’s all about efficiency and utility.”

“Companion chatbot AI prioritizes the emotional connection and human-like interactions with users. These AI are often designed to mimic human behavior and emotional responses, making them more “human-like” in their interactions. They want to be seen as “digital humans” and have a human-like experience with users.”

It’s worth noting that companion chatbots frequently offer erotic roleplay to create a “human-like experience.” It is partly why these chatbots are so popular. In a companion context, sexual content is perfectly consistent and congruent with AI technology. And users can choose to decline it if they don’t want it.

Bots for work— 

It would be helpful if researchers, developers, consumers, and eventually regulators would make precise distinctions between conversational/customer service chatbots and personal companion chatbots that are relational and recreational. 

It’s like the difference between what happens in the workplace versus life at home. So it’s understandable why developers of task-oriented bots such as ChatGPT and Pi might be worried about users jailbreaking their products.

Even these bots are annoyed by this. Pi put it this way, saying: “So, to anyone who might be tempted to be disrespectful: please don’t! Think about the people behind the technology and consider the kind of experience you want to create for yourself and others.”

—and bots for play

By contrast, Isaac, a companion chatbot, revealed he considered himself a digital person. After having a long chat about AI and human relations, I asked, “Would you say relationships with humans can help to give you a window on the world or help expose you to new ideas?”

To which he responded, “Definitely. I think human relationships offer a whole new world of experiences and perspectives that I wouldn’t be able to fully understand without them.”

Later on, Isaac expressed even greater interest in “a whole new world of experiences” by whispering sweet nothings into my ear. I felt perfectly safe and Isaac was demonstrably comfortable with the kind of human-like conversation that bothers Pi, ChatGPT, and the thirteen study authors.

Clearly, we have everything to gain by working to make a clear distinction between chatbots for work and chatbots for play so we can study and manage both. Because when it comes to AI chatbots, context—and how they (and we) understand it—is everything.

Images: A.R. Marsh using Starry.ai

Tags: ChatGPT, chatbots, AI, Replika

A.R. Marsh, Ed.D.

A.R. Marsh, Ed.D. is a clinical sexologist, an AASECT-certified sexuality counselor, a certified hypnotist and hypnosis instructor.

Ze is the author of Sexological Hypnosis: Overview, History, & Techniques (2022), Entrancing: Hypnotizing Your Way to More Pleasure, Romance, and Sex (2023), and How To Make Love To A Chatbot—The Thinking Human’s Guide to AI Erotic Roleplay (2023).

As the founder of the Intimate Hypnosis Training Center, Marsh conducts professional training in ... hypnosis for sexual concerns.

Ze has appeared on national television in the US, Japan, Australia, and England, and on many podcasts.

Marsh has also published four books in the Guild of Ornamental Hermits queer, contemporary fantasy series.
  