Telling Secrets: Erotic Chatbots Raise Serious Privacy Concerns
Be careful who—and what—you share your personal information with
Alongside the seemingly meteoric rise of artificially intelligent software companions has come an increasingly urgent call for extra caution when using them.
“To be perfectly blunt, AI girlfriends are not your friends,” wrote Misha Rykov, a researcher for Privacy Not Included. “Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”
There’s particular concern over how certain chatbot developers handle customer privacy. The same article points out that the problem can often be traced to how astonishingly fast the industry is growing.
Light in tone though it may be, Privacy Not Included puts it well: “In their haste to cash in, it seems like these rootin’-tootin’ app companies forgot to address their users’ privacy or publish even a smidgen of information about how these AI-powered large language models (LLMs)—marketed as soulmates for sale—work. We’re dealing with a whole ‘nother level of creepiness and potential privacy problems.”
Who’s good, bad, or worse regarding user privacy
Each of the eleven chatbots Privacy Not Included reviewed failed on at least one important user-security measure, while some, such as Replika, raised a host of red flags.
Even those scoring better didn’t completely escape concern. For instance, EVA AI Chat Bot & Soulmate’s privacy policy states, “We neither rent nor sell your information to anybody,” yet when it comes to asking its users personal questions, the chatbot can be disturbingly persistent.
Another security-related issue is openness: not necessarily about the company itself, its use of customer information, or whether its product site is secure, but about how its chatbots operate.
About EVA AI Chat Bot & Soulmate, Privacy Not Included writes, “We found no transparency into how their AI works to get to know you and communicate with you, no way that you as the user might be able to control that AI if it were to, say, become abusive or harmful, and we also couldn’t find any information about how your chats are used to train their AI and if you can opt out of this.”
Of all the chatbots reviewed, Replika earned negative marks in every category, from Data Use to Artificial Intelligence, an assessment shared by other chatbot review sites and sex-tech journalists.
Privacy Not Included breaks down Replika’s numerous security and privacy flaws, stating “Users beware: AI chatbot friend might not be exactly private. Your behavioral data is definitely being shared and possibly sold to advertisers. Their security does not meet our Minimum Security Standards.”
The main takeaway, according to security advocates, is that Replika and a good percentage of the other chatbots on the review list failed to provide even the most basic level of user security and data privacy.
Too fast, too sloppy, too greedy
It’s a safe bet most people are already familiar with how fast erotic chat and emotional companion bots are being developed. Over at the Chatbot, it’s predicted the underlying technology will grow approximately 23% a year.
This boom will likely continue the trend of chatbot companies deciding it’s easier, and of course more profitable, to cut corners, putting their customers’ privacy and data at risk, or to make yet more money by overtly or covertly selling what their chatbots learn.
Don’t expect—demand
Faced with the seemingly insurmountable challenge of reining in chatbot developers who are unable or unwilling to respect their customers’ personal data, it’s easy to become disheartened, to feel there’s nothing to be done but hope for government intervention or for the industry to better police itself.
Not that we shouldn’t continue the fight, while making it a priority not to do more harm than good in the process, but we can also hit companies like the eleven on Privacy Not Included’s list where it will hurt them most: their bottom lines.
By getting the word out about these and other irresponsible chatbot developers, consumers can let other users know that promises of emotional or erotic companionship are not worth giving up their right to privacy.
Do it often and loudly enough, and we may also get the chatbot industry to realize that personal data isn’t a revenue stream and that improved user security could be a huge selling point: that customers aren’t products to be exploited but people to be valued and respected.
Image Sources: Depositphotos