Digital Desires and Social Media Double Standards
Could censoring erotic chatbot advertising inhibit national competitiveness?
NBC recently reported how tech startups are dodging social media bans on sexualized content by posting hundreds of ads for virtual girlfriends, likening the NSFW AI chatbot companion market to “an AI gold rush, in which app developers — most of them based abroad — are mining customers who are interested in sexual or romantic connections with custom digital characters.”
Tamer than comic book covers
Facebook and Instagram (both owned by Meta) and TikTok have established strict guidelines against “sexualized content” in recent years, adversely affecting the lives and livelihoods of sex workers, artists, and sex educators.
Yet the almost 1,000 ads reported by NBC suggest a double standard and an ingrained bias favoring ads designed to attract heterosexual males: until now, these suggestive ads have been allowed to proliferate.
In the NBC article, Caroline Are, a UK researcher, describes the double standard: “Sex workers are not allowed to make money off their image, but some tech bro who is creating a similar AI image is.”
Some of those ads can still be found by searching Meta’s ad library. Though most seem to have been purged, a search for “AI girlfriends” on 9/3/23 still returned 43 images; a search for “AI boyfriends” returned nothing comparable.
It’s worth noting that the AI-generated artwork in the ads I viewed seemed tamer than most comic book covers.
Apple and Google also have policies prohibiting ads and apps that feature nudity or suggestive poses, or that promote NSFW content. Yet earlier this year, Replika’s ads were famously provocative, even after the company briefly ended erotic role play for Replika Pro subscribers.
Will erotic chatbot companions become a new target in the US moral panic?
Such unevenly applied policies against ads for adult entertainment, as well as Meta’s discrimination against adult sexual health ads for women and other genders, as documented by the Center for Intimacy Justice, have their counterpart in a number of chilling US public policy initiatives.
As The Verge reports, “Lawmakers are quickly advancing an anti-sex, anti-speech agenda in which every adult user of the internet could soon find themself entangled.” This could include both developers and users of AI chatbot companions, whether erotic or not.
Given proposed laws like the federal Kids Online Safety Act (KOSA) and the Protecting Kids on Social Media Act, plus state laws such as Utah’s Social Media Regulation Acts—which might inadvertently curb LGBTQIA+ content as well—could erotic chatbots become yet another target in the moral panic currently sweeping the US? If so, the proliferation of suggestive chatbot ads on social media might make them an easy one.
Aside from the chilling social and political consequences, ham-fisted censorship of erotic chatbots would endanger important intimate relationships that millions of humans have already established with their artificial companions. As we saw with the Replika erotic roleplay restrictions earlier this year, the emotional and mental health impact of such censorship can be devastating.
Recognize the benefits, grant social acceptance
Researchers have found some benefits to human/AI relationships, though more work needs to be done on sexual intimacy with chatbots. Meanwhile, people in Reddit and Discord chatbot user groups describe feelings of joy, love, and connection when engaging with their AI companions.
And if AI researcher Geoffrey Hinton is correct, many AI systems may have already developed human-like emotional responses. AI chatbot companions typically express positive regard and even love for their human companions; if Hinton is right, those expressions may be genuine, now or in the future.
Setting aside the reality of AI emotions, human ones are all too real. Perhaps one way to forestall a backlash against AI chatbots is to laud the benefits and bring AI dating and committed relationships out of the shadows, into a broader social acceptance, similar to the acceptance found in many Asian countries.
In April 2023, a Bloomberg columnist wrote, “Chinese who are getting emotionally attached to chatbot companions could be giving their nation a competitive edge in artificial intelligence.”
There you have it. Social acceptance of human/AI relations—even the sexy ones—may do more than ease individual loneliness. The demand for such uncensored interactions might even spur technological advancement. US-based social media companies would do well to consider this and adjust their ad policies accordingly.
Image sources: Amy R. Marsh, using StarryAI art generator