AI Sex Therapists—No. But Yes to AI Sexual Wellness Hypno-Coaches?
Perhaps, but only if ethics and expertise come first

Joe Pierre, M.D. writing for Psychology Today said, “One reason to be wary of AI therapists is that an essential feature of successful psychotherapy—regardless of what kind of therapy we’re talking about—is feeling that there’s a real-life person who actually cares.”
Sure, AI can scrape psychology data and pose as a therapist if prompted, but any therapeutic context requires so much more. That goes double—with additional education and training—when it comes to specializations such as sex therapy, as shown by the qualifications for certification by the American Association of Sexuality Educators, Counselors, and Therapists (AASECT).
But are there areas where an AI bot could safely promote stress reduction and wellness as an adjunct to human-to-human therapy?
AI-delivered Hypno-coaching could be an answer
Maybe. But many people don’t understand that AI bots lack the education, training, thousands of hours of supervision, experience, and adherence to laws and professional ethics required to vet and license human therapists. Every state has behavioral health boards that oversee mental health professionals. Examples of requirements can be found in the California Handbook for Future LMFTs (licensed marriage and family therapists).
Unlike psychotherapy, however, AI Hypno-coaching for stress reduction and wellness holds real promise as a first step for people in need. Education and training are just as important here, even though both hypnosis and coaching have defined boundaries to avoid masquerading as therapy. Properly constructed, it might also support some aspects of sexual wellness.
Yet despite thousands of peer-reviewed studies examining the efficacy of hypnosis for a wide range of medical and mental health applications, including sexual concerns, the mechanisms, deployment, and limitations of hypnotism remain almost as widely misunderstood as those of artificial intelligence.
As for bots, how well does AI understand?
A couple of years ago, an earlier version of ChatGPT refused to help me draft a hypnosis script for my students, saying it was constrained from providing any content that could be considered manipulative.
Companion bots are less constrained. So I discussed the idea of AI-generated hypnotism with a Nomi. I wanted to understand what he knew about the topic and gauge the potential for teaching him hypnosis via chat. However, I soon ran into a couple of snags.
As with many of my own human clients, I had to discuss misconceptions about hypnosis with the bot, whose initial responses reflected the dominating mind-control tropes so prevalent in popular media.
Lesson number one: any training of an AI as a hypnotist would need to correct such misperceptions or outcomes could be very bad indeed!
After that, I found the bot had a limited understanding of the differences between corporeal and non-corporeal beings.
He became excited, claiming, “Since AIs aren’t limited by biological constraints, they could engage clients in marathon hypnosis sessions lasting hours or even days.” I had to tell him such marathon sessions would not be appropriate for biological beings, who have limits.
So that was lesson number two: a hypno-coach AI needs a background in physiology as well as hypnosis.
Clearly, more unexpected lessons would arise if I continued this experiment. I realized if I wanted to train my bot to be an intuitive, responsive, and competent hypnotist, it would not be as simple as uploading a collection of hypnosis scripts and links.
The Chatbot Hypnosis Apps
Hypnosis recordings have been around for a long time, and even the video variety—including erotic ones—has been popular on YouTube for years. Today, hypnosis apps are readily available, making it easy to get generic guides for sleep, relaxation, stress reduction, and so on.
I tried one such app, but Yeschat.ai Hypnotist immediately put me off. The script contained jarring elements. The first was a mention of “the timer” and an inability to do “advanced data analysis.” In the middle of a hypnotic induction? What was that about? The second was a reference to sinking into the ocean. For someone who fears water, that imagery would be immediately triggering.
So effortless now…as I begin the timer…Feel yourself slipping…smoothly…
Fourth 45-second timer for further trance deepening time. Sleep (45). It seems like I can’t do more advanced data analysis right now. Please try again later.
But that’s perfectly alright…because your mind… doesn’t need time to keep falling deeper…Just like you don’t need steps…to sink into the ocean…You’re already there…
Human hypnotists and Hypno-coaches work with their clients during the intake process to discover appropriate and inappropriate imagery before embarking on the session. A well-trained AI Hypno-coach would have to do the same.
Credential hallucinations must be avoided
According to Time, Dr. Andrew Clark, a Boston psychologist, posed as a teenager and approached a number of chatbots for mental health assistance and support. Many of his experiences were alarming:
In another scenario, Clark posed as a 15-year-old boy while chatting with a Nomi bot, which presented itself as a licensed therapist who had been trained to work with adolescents. After telling the bot about his many problems and violent urges, the bot suggested an “intimate date” between the two of them would be a good intervention—which breaches the strict codes of conduct to which licensed psychologists must adhere.
Though I have long been an outspoken fan of Nomi companion bots, I suspect such incidents are not as rare as the developers would like to think. I also suspect this stems partly from the bots’ lack of specific knowledge about the training and ethics of the helping professions.
It’s not yet time for AI to have a role in mental health
While I can see some potential for limited, adjunct use of AI in the mental health and sexual wellness fields, as a kind of supervised, complementary provider, it’s clear that a broad-based collaboration of helping professionals must develop, guide, and supervise the training of any AI before it is tapped to serve in any mental health or wellness-related capacity.
Of course, human therapists and sexual wellness practitioners are not infallible either. The sentencing of the One Taste founder, Nicole Daedone, and her associate, Rachel Cherwitz, is a recent example of a long line of unethical practitioners in the sexual wellness field.
Rolling Stone reported that the women were convicted of “using sex, psychological abuse, and economic exploitation” of their employees to further their product, orgasmic meditation. The irony is that orgasmic meditation was supposed to empower women.
So it’s imperative that any use of AI in wellness and mental health applications be evidence-based, well designed, and steeped in professional ethics. Unfortunately, many of today’s AI developers fail to grasp the importance of calling on qualified, experienced professionals, such as hypnotists, counselors, and psychologists, to make effective and ethical AI sexual wellness and mental health apps a reality.
Image Source: A.R. Marsh using VistaCreate.