AI in the Bedroom: The Pros and Pitfalls of Digital Sexual Health Advisors
Available, anonymous, accepting but not always accurate
You might think quality sexual health information is just a keyboard stroke away—if you go to the right sites, that is. But what if you need to talk, confidentially, about test results or a complex sexual problem, and you're not in an area with a great clinic or a selection of sex therapists? Who're you gonna call?
Or more to the point, who might answer?
Chances are, you’ll end up with a sexual health AI chatbot. We’ve already reported on some AI successes in the sexual health field. Researchers around the world are continuing to investigate the efficacy of AI chatbots in carefully assigned support roles.
Sorting through ten years' worth of studies, researchers at Dalhousie University in Canada recently published an extensive literature review on using chatbots to improve sexual health. They concluded: "The findings suggest that chatbots are usable and effective tools in sexual health education, persuasion, and assistance that are appreciated for their confidentiality, efficiency, and 24/7 availability. However, their performance is hindered by limitations such as restricted scope of knowledge and challenges in understanding user inputs."
I agree that specific sexual health information and training are crucial for anyone and anything in this field, but what exactly can AI bots do when it comes to intimate concerns that get in the way of getting it on?
What chatbots have to say
Wanting an answer, I checked with Pi, a conversational bot created by Inflection.AI, which responded with, “Given the studies and themes we’ve discussed, I think an AI like myself could be most effective in the following ways:
“Anonymity and Comfort: One of the greatest advantages I offer is the ability to engage in conversations about sensitive topics, such as sexual health, without the fear of judgment or embarrassment.
“Accessibility and Availability: I’m always available to engage in conversation and offer information, which can be especially helpful for people who have difficulty accessing traditional healthcare or educational services. My ability to engage in multiple conversations simultaneously allows me to help a larger number of people than a single human professional could manage.
“Tailored Information and Support: By leveraging my extensive knowledge base and conversational abilities, I can provide personalized information and support to each user based on their unique needs and concerns. This can include answering questions, providing resources, and even helping users process and work through feelings of shame or stigma related to their sexual health.”
Pi's answer is valuable because it reflects the company's aspirations for its use. And while Pi makes a pretty good case for AI, there's the little matter of what it means by an "extensive knowledge base."
What do sexual shame and stigma have to do with customer service?
Pi mentioned helping deal with feelings of shame or stigma. Yes, it's imperative to be absolutely nonjudgmental about sexuality and gender, as acceptance allows people to open up about their confidential questions and predicaments, enabling the provider to meet their needs. Possibly, a well-trained sexual health bot can convey acceptance, round the clock if necessary.
However, the delivery of reliable information remains a problem. Conversational AIs like Pi can talk a good game, but they make things up because they are designed to keep the conversation flowing.
For example, I asked Pi for a selection of studies about shame and sexual health. The ones Pi gave me were fictitious, a word salad of plausible-sounding but bogus citations and summaries—for which I was prepared, considering how often we've been hearing about AI hallucinations.
So Pi’s vaunted “extensive knowledge base”—a big part of its sales pitch—cannot be trusted.
In all fairness, we should not expect a generally accessible conversational bot, such as Pi, ChatGPT-4o, or their ilk, to have specific expertise unless it is deployed in a health care setting that would specifically train and manage it.
Emotional trust, maybe. Cognitive trust? Not so much
A 2020 literature review funded by the Defense Advanced Research Projects Agency (DARPA) and the Army Research Office emphasized, “the important role of AI’s tangibility, transparency, reliability, and immediacy behaviors in developing cognitive trust, and the role of AI’s anthropomorphism specifically for emotional trust.”
Many of us are aware that AI developers spend a great deal of time and money making chatbots seem as human as possible to gain our emotional trust through anthropomorphism.
While AI developers continue in this direction, there are signs consumers are becoming increasingly frustrated with, and distrustful of, customer support chatbots.
According to Forbes, after querying 1,000 people, RingCentral said only thirty-two percent “have successfully resolved a customer service issue using AI or ChatGPT technologies.” Plus, fifty-six percent of respondents said they were scared of these technologies, while sixty-three percent were frustrated with such self-service options.
On a positive note, fifty-nine percent thought AI could improve and thirty-eight percent believed it could deliver “more personalized” service.
Possibly some of this frustration stems from confusion about the various types of chatbots (and AI in general), how they're meant to be used, and by whom.
In light of this, people receiving sexual health information and services are particularly in need of certainty and disclosure, especially education regarding chatbot use and regulatory protections.
Not all chatbots are built the same
According to Freshworks, AI-powered chatbots utilize natural language processing and machine learning while generative AI chatbots “generate responses in real time instead of selecting from predefined answers.”
There are also rules-based and keyword recognition-based chatbots and hybrids that combine rules-based design with AI. Simple chatbots without AI are button or menu-based while other bots are voice-based.
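The distinction matters because it determines whether a bot can fabricate anything at all. A minimal sketch (purely illustrative, not any real clinic's software; the menu entries and keywords are hypothetical placeholders) shows the difference between a rules-based bot, which only follows a fixed menu, and a keyword-recognition bot, which scans free text for known terms:

```python
# Illustrative sketch: rules-based vs. keyword-recognition chatbots.
# Both answer only from predefined text; neither can "hallucinate."

# Rules-based: the user picks from a fixed menu of options.
RULES_MENU = {
    "1": "STI testing: ask your clinic how often screening is recommended for you.",
    "2": "Contraception: options include condoms, pills, IUDs, and implants.",
}

def rules_based_reply(choice: str) -> str:
    """Return the canned answer for a menu choice, or re-prompt."""
    return RULES_MENU.get(choice.strip(), "Please choose option 1 or 2.")

# Keyword recognition: the bot scans free text for terms it knows.
KEYWORDS = {
    "test": "You can ask your clinic about STI testing and how to read results.",
    "condom": "Condoms reduce the risk of most sexually transmitted infections.",
}

def keyword_reply(message: str) -> str:
    """Match known keywords in free text; unknown input falls through safely."""
    text = message.lower()
    for word, answer in KEYWORDS.items():
        if word in text:
            return answer
    return "I'm not sure. A clinician can help with that question."
```

A generative chatbot replaces those lookup tables with a model that composes responses on the fly, which is exactly what makes it both more flexible and more capable of producing fiction.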
The question is, what bots are currently used by sexual health clinics and providers—and how will the average consumer know when a bot is delivering fiction instead of much needed facts?
Full disclosure about what kind of bot is in use, where it gets its information, and what its built-in limitations and functions are could help encourage trust in both the bot and its clinic or agency.
Plus, if the bot is suitably engaging, patients and consumers are more likely to extend it a degree of emotional trust when discussing difficult, intimate topics.
Circling back to shame
When it comes to sexual health, our world is full of people desperate for good information, accessible care, and a willing, accepting listener.
If suitably developed, AI could prove to be a successful, nonjudgmental ally when it comes to promoting sexual health, but only if the information it delivers is accurate.
Otherwise, artificial intelligence will be nothing but an expensive, resource-ravenous novelty—and that would be a shame.
Images: A.R. Marsh using Ideogram.ai