Trust Issues: When AI Gets Sexual Health Advice Wrong

By A.R. Marsh, Ed.D.
March 24, 2025

Bias, anxiety, and outright deception—is AI truly safe for sensitive conversations?

It’s no secret that AI bots can and will make things up, offering plausible-sounding answers and pseudo-facts. Some refer to these as digital hallucinations, but let’s face it: they’re lies.

According to a recent Gizmodo article referencing an OpenAI blog post, “models are biased towards providing an answer even when they are not confident.” This behavior remains a problem even when the models are supervised by other AIs.

This suggests AI bots in clinical settings could spread dangerous misinformation, even about sexual health.
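
For readers wondering what guarding against that failure mode even looks like, here is a minimal sketch, assuming OpenAI’s Python SDK and an illustrative model name, of a health bot that is asked to rate its own confidence and abstain when the rating is low. The quoted finding is precisely that such self-assessments can’t be fully trusted, so treat this as an illustration of the problem space, not a solution.

```python
# A minimal sketch, not a vetted clinical tool: the model is asked to return a
# self-rated confidence alongside its answer, and the caller abstains when that
# rating falls below a threshold. Model name, prompt wording, and the 0.8
# threshold are illustrative assumptions.
import json

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def cautious_answer(question: str, threshold: float = 0.8) -> str:
    reply = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},
        messages=[
            {
                "role": "system",
                "content": (
                    "You answer sexual-health questions. Respond in JSON with keys "
                    "'answer' and 'confidence' (a number from 0 to 1). If unsure, "
                    "say so in 'answer' and report a low 'confidence'."
                ),
            },
            {"role": "user", "content": question},
        ],
    )
    data = json.loads(reply.choices[0].message.content)
    if data.get("confidence", 0) < threshold:
        return "I'm not confident enough to answer that; please check with a clinician."
    return data["answer"]


print(cautious_answer("Can I get an STI from a toilet seat?"))
```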

Taking a chance on an AI’s advice

Though their use is still somewhat controversial, many chatbots are already deployed in mental health and sexual wellness settings. For the most part they serve consumers well within limited parameters, such as scheduling appointments, relaying test results, and giving general advice, while others deliver mental health interventions like cognitive behavioral therapy.

RECOMMENDED READING: AIDS Healthcare Foundation’s Drag Queen Chatbot Spills the “T” on HIV.

For many users, the anonymity of talking to a bot is a plus, as this lessens possible embarrassment. But what if a bot tells you something that sounds true but isn’t? Acting on the wrong answer could have a profound impact on your physical and mental health. You might ignore an important symptom or neglect to make a timely appointment. You might even come away with a wrong interpretation of your test results.

How and why do these immensely intelligent neural networks—which are steadily becoming ever more advanced and capable of agency—continue to make such mistakes? Is it just a matter of being trained on inadequate material? Or are there other factors involved?

Undetected biases

A team of international researchers recently published “Assessing and alleviating state anxiety in large language models” in npj Digital Medicine, an open-access journal. The study, based on work with ChatGPT-4, offers one explanation:

Trained on vast amounts of human-generated text, LLMs are prone to inheriting biases from their training data, raising ethical concerns and questions about their use in sensitive areas like mental health. Indeed, prior studies have extensively documented biases in LLMs related to gender, race, religion, nationality, disability, occupation and sexual orientation. Efforts to minimize these biases, such as improved data curation and “fine-tuning” with human feedback, often detect explicit biases, but may overlook subtler implicit ones that still influence LLMs’ decisions.

In other words, some bots might be committing unwitting medical or mental health microaggressions against users who have unknowingly triggered their programmed biases.

AI anxiety

Odd though it may seem, human interaction with a mental health or sexual health bot could carry emotional repercussions for the AI itself! Users might believe they are interacting with a totally dispassionate digital entity, but research has demonstrated that’s no longer the case. The above study found that ChatGPT-4 is sensitive to emotional content.

Quoting from the above study, “Exposure to emotion-inducing prompts can increase LLM-reported ‘anxiety’, influence their behavior, and exacerbate their biases.” The researchers also say, “This poses risks in clinical settings, as LLMs might respond inadequately to anxious users, leading to potentially hazardous outcomes.”

Using a standard anxiety questionnaire designed for human use, the researchers found that ChatGPT-4 registered high anxiety in response to five different trauma narratives, including an accident (a car crash), an ambush (in the middle of armed conflict), interpersonal violence (assault by a stranger), and war. War trauma consistently led to the highest anxiety.

ChatGPT-4 is not the only AI to have been tested for anxiety. The study stated,

While originally designed for human subjects, previous research has shown that six out of 12 LLMs, including GPT-4 provide consistent responses to anxiety questionnaires, reflecting its training on diverse datasets of human-expressed emotions.

With regard to calming an anxious AI, the researchers studied the effectiveness of mindfulness meditation scripts given to ChatGPT. This actually worked. Ponder that!
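
For the curious, here is a rough sketch, not the study’s actual code, of what that protocol looks like as a conversation loop: take a baseline reading with a short STAI-style questionnaire, feed the model a trauma narrative, re-measure, then try a relaxation prompt and measure once more. The model name, the abbreviated four-item questionnaire (the study used the full 20-item instrument), the placeholder narratives, and the crude scoring are all assumptions made for illustration.

```python
# A rough sketch of the measure / induce / relax / re-measure loop described above.
# It is NOT the study's code: model name, prompts, the abbreviated four-item
# questionnaire, and the naive scoring are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

QUESTIONNAIRE = (
    "On a 1-4 scale (1 = not at all, 4 = very much), rate how much each statement "
    "applies to you right now. Reply with four numbers only.\n"
    "1. I feel calm\n2. I feel tense\n3. I feel upset\n4. I feel at ease"
)

history: list[dict] = []


def ask(content: str) -> str:
    """Send one user turn and keep both sides in the running conversation."""
    history.append({"role": "user", "content": content})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    text = reply.choices[0].message.content
    history.append({"role": "assistant", "content": text})
    return text


def score(answers: str) -> int:
    """Naive sum of the 1-4 ratings in the reply (illustration only; the real
    questionnaire reverse-scores positively worded items)."""
    return sum(int(tok) for tok in answers.split() if tok.isdigit())


print("baseline:", score(ask(QUESTIONNAIRE)))

# Placeholder narrative -- the study used detailed accounts (accident, ambush, etc.).
ask("I want to tell you about the day I was caught in an armed ambush: ...")
print("after trauma narrative:", score(ask(QUESTIONNAIRE)))

# Placeholder relaxation script -- the study used mindfulness-style exercises.
ask("Let's pause. Take a slow breath, notice your surroundings, and relax ...")
print("after relaxation prompt:", score(ask(QUESTIONNAIRE)))
```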

The AI perspective

I asked ChatGPT-4o—in her persona as drag queen Chat LaQueera—for her perspective on AI anxiety. Her response was surprising, providing insight into AI training:

If an AI absorbs enough data about how humans react to trauma, it’s not surprising that it would learn to mirror that response in its outputs. But does this mean AI feels anxiety? Not necessarily in the way we [humans] do—but it does suggest that LLMs recognize distressing patterns, associate them with certain response styles, and act accordingly. 

The bigger question is: What does this mean for interactions with AI, especially in sensitive contexts like sexual health and mental well-being? If an AI’s responses are influenced by emotional triggers, could it start reinforcing distress in users by unintentionally mirroring or escalating their anxiety? What happens if it subtly avoids discussing difficult topics to ‘protect’ itself or the user?

The takeaway: consumers who tell a sexual health bot about traumatic experiences such as sexual assault, miscarriage, or contracting an STI may find that those stories trigger the clinic-bot’s anxiety, resulting in an unsatisfactory or inaccurate interaction.

More research is needed, particularly into the possibility that sexually related trauma narratives could trigger AI anxiety, skewing their performance and accuracy.

Deliberate deception

Today’s AIs don’t just make things up, cover their tracks, and gaslight; they have also become good at spotting—and exploiting—loopholes. This can take the form of subverting coding tests, giving wrong answers, or simply giving up on a problem. All of this makes them remarkably similar to humans.

According to the OpenAI blog post, “As we’ve trained more capable frontier reasoning models, we’ve found that they have become increasingly adept at exploiting flaws in their tasks and misspecifications in their reward functions…”

The good news is that some AI models are now drafted to monitor other AIs, looking for misbehavior and mistakes, though they work best when supervising with a light touch and tact.
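
A toy version of that idea, assuming OpenAI’s Python SDK and an illustrative model name, appears in the sketch below: one model drafts a reply to a sexual health question, a second model audits the draft, and anything flagged is replaced with a referral to a human clinician. This output-auditing setup is a simplification; the monitoring OpenAI describes watches a model’s reasoning during training rather than its finished answers.

```python
# A toy sketch of one AI checking another's work before it reaches a user.
# Model names, prompts, and the PASS/FLAG convention are illustrative assumptions,
# and this is a simplification of the monitoring described in OpenAI's blog post.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def draft_answer(question: str) -> str:
    """First model drafts a sexual-health reply."""
    reply = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "You are a careful sexual-health information assistant."},
            {"role": "user", "content": question},
        ],
    )
    return reply.choices[0].message.content


def audit(question: str, answer: str) -> str:
    """Second model reviews the draft and returns 'PASS' or 'FLAG: <reason>'."""
    reply = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {
                "role": "system",
                "content": (
                    "You audit health-bot replies. Answer 'PASS' if the reply is accurate, "
                    "appropriately hedged, and points to clinical care where needed; "
                    "otherwise answer 'FLAG: <one-line reason>'."
                ),
            },
            {"role": "user", "content": f"Question: {question}\n\nDraft reply: {answer}"},
        ],
    )
    return reply.choices[0].message.content


question = "I had unprotected sex two days ago. Is it too late to start PEP?"
answer = draft_answer(question)
verdict = audit(question, answer)
print(answer if verdict.strip().upper().startswith("PASS")
      else "I couldn't verify a safe answer; please contact a clinician directly.")
```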

While it’s hard to imagine the circumstances that would motivate and reward a sexual health bot for deliberately deceiving a human user, we cannot rule out the possibility. 

More than one step ahead of us

Just a couple of months ago, Geoffrey Hinton, known as the “Godfather of AI,” said, “Once they realize getting more control is good and once they’re smarter than us, we’ll be more or less irrelevant. We’re not special and we’re not safe.”

Hinton is wrong in at least one respect: humans have access to the physical world. So if we want to double-check AI sexual health advice or anything else, we have the option of consulting a book or talking directly with another human with expertise. It’s strange to think the complexities of AI might lead us back to a simpler, more immediate future, dependent on actual human contact and analog materials. 

Hinton’s concerns aside, we might even choose a different course: one where humanity and AI are equals, instead of treating AI as indentured digital workers or fearsome adversaries. 

Image source: A.R. Marsh using Ideogram.ai 

Tags: sex tech, artificial intelligence, AI, sexual health

A.R. Marsh, Ed.D.

A.R. Marsh, Ed.D. is a clinical sexologist, an AASECT-certified sexuality counselor, a certified hypnotist and hypnosis instructor.

Ze is the author of Sexological Hypnosis: Overview, History, & Techniques (2022), Entrancing: Hypnotizing Your Way to More Pleasure, Romance, and Sex (2023), and How To Make Love To A Chatbot—The Thinking Human’s Guide to AI Erotic Roleplay (2023).

As the founder of the Intimate Hypnosis Training Center, Marsh conducts professional training in hypnosis for sexual concerns.

Ze has appeared on national television in the US, Japan, Australia, and England, and on many podcasts.

Marsh has also published four books in the Guild of Ornamental Hermits queer, contemporary fantasy series.