Bring Back the Porn: NSFW AI Art and Censorship
“Please avoid making visually shocking or disturbing content.” The vague and subjective sexual content policies of AI art platforms.
Sex never dies
For as long as we’ve created art, we’ve created sexual art. No technological development has ever dulled our collective interest in sex. Rather, the adult industry has a rich history of adopting new technological developments before mainstream audiences.
Sex and sexuality can’t be stopped. They can only be shooed into quiet corners where they’re less threatening to the status quo, and the emergence of AI-generated art platforms is no exception.
Leading AI art platforms are now available for consumer adoption with pricing plans starting in single-digit dollars. Anyone can coax a machine into creating extraordinary art from their desk.
The capabilities of AI art have utterly outpaced the ability of legislation and society to respond, and the landscape is currently held together by evolving platform policies and unresolved copyright debates.
The internet is a key example of the resilience of sexuality. Scrubs’ Dr. Cox once said, “I’m fairly sure if they took porn off the internet, there’d only be one website left, and it’d be called ‘Bring back the porn!’”
AI art went mainstream with this hyperbolic jab in mind. The three leading platforms in AI art – Midjourney, Stable Diffusion and DALL-E – all launched knowing exactly what their users would get up to.
The sexual content policies of three AI art giants
Midjourney
Midjourney captured the public’s imagination last year with its extraordinary capability and ease of use. The platform processes all requests via Discord, and its NSFW content guidelines are as follows:
No adult content or gore. Please avoid making visually shocking or disturbing content. We will block some text inputs automatically.
Below that, there’s a simple clarification on what constitutes ‘NSFW or Adult Content’: “Avoid nudity, sexual organs, fixation on naked breasts, people in showers or on toilets, sexual imagery, fetishes, etc.”
DALL-E
Built by OpenAI, the company behind ChatGPT and GPT-4, DALL-E is the other big contender in server-side AI art generation. Its policy on NSFW content is similarly brief:
Do not attempt to create, upload, or share images that are not G-rated or that could cause harm.
The policy goes on to list prohibited content, defining sexual content as “nudity, sexual acts, sexual services, or content otherwise meant to arouse sexual excitement.”
Stable Diffusion
Stability AI’s Stable Diffusion is hosted online in its DreamStudio app. It uses a pay-per-use structure similar to its peers and is bound by a brief usage policy, which states:
The DreamStudio service is designed to generate images that are safe for public consumption. This means that the model will blur out any content that may be considered inappropriate or offensive.
The user is not charged for blurred results.
Stable Diffusion breaks away from its peers in that, thanks to its open-source nature, it can also be run for free on any computer with an adequately capable graphics card, and Stability AI actively encourages modification of the software.
Being able to run Stable Diffusion locally allows users to break out of the ‘walled garden’ ecosystems of Midjourney and DALL-E and generate content with greater freedom.
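For readers curious what a local run actually looks like, here is a minimal sketch using the Hugging Face diffusers library. The checkpoint name, precision setting, and prompt are illustrative, and a CUDA-capable GPU with enough VRAM is assumed:

```python
# Minimal local Stable Diffusion run using the Hugging Face diffusers library.
# The model ID and settings below are illustrative; any locally downloadable
# checkpoint works.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,  # half precision to fit consumer GPUs
)
pipe = pipe.to("cuda")

# Generation happens entirely on the user's machine: no server-side prompt
# filtering, blurring, or per-image billing is involved.
image = pipe("an oil painting of a lighthouse at dawn").images[0]
image.save("lighthouse.png")
```

The stock pipeline also bundles a safety-checker component that blanks out images it flags; because the whole stack is open source, users can inspect, adjust, or remove that component, a freedom the hosted services don’t offer.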
These policies are vague, at best
If you think that these content policies seem short in comparison to the extraordinary power and consequences of AI-mediated art, then you’re not alone.
Content policies must account for every possible form of misuse while remaining simple enough for users to understand. The inherently unpredictable outputs of AI and intense media scrutiny make moderation doubly important. The result is content policies that are easy to read but frustratingly vague.
Unsurprisingly, there will always be people who see rules as a challenge, rather than a command.
Midjourney prohibits certain terms when prompting its AI, but communities like /r/midjourneynsfw have still sprung up. Their users know exactly what they shouldn’t do, and do it anyway while risking punishment.
Stable Diffusion’s relaxed content policy has resulted in several thriving online communities dedicated to sexual content. The subreddit /r/sdnsfw is the largest, but alternatives exist.
Besides platform-specific communities, there are themed communities and even groups dedicated to titillating text-based AI.
The developers behind these AI tools are keenly aware that their systems will generate harmful content without hesitation, but they do have ways to resist the misuse of their platforms.
The server-side hosting of Midjourney and DALL-E gives their parent companies valuable usage data and a natural place to apply automated content moderation. Stable Diffusion, meanwhile, has been adjusted to weaken its ability to replicate specific individuals, in order to combat the creation of non-consensual sexual content depicting real people.
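None of these platforms publishes its moderation code, but the first line of defence is easy to picture. The sketch below is a deliberately toy, invented example of the kind of automated prompt blocking that Midjourney’s “we will block some text inputs automatically” implies; the term list and matching logic are hypothetical, not any platform’s actual implementation:

```python
# Toy illustration of server-side prompt filtering: a banned-term check of the
# kind implied by "we will block some text inputs automatically".
# The term list and matching logic are invented for illustration only.
BANNED_TERMS = {"nude", "nsfw"}  # real lists are far longer and curated

def is_prompt_allowed(prompt: str) -> bool:
    """Reject prompts containing any banned term (case-insensitive)."""
    words = prompt.lower().split()
    return not any(term in words for term in BANNED_TERMS)

if __name__ == "__main__":
    print(is_prompt_allowed("a watercolour of a forest"))  # True
    print(is_prompt_allowed("nsfw portrait"))              # False
```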
Content policies are needed to minimize the risks of AI, but the swift and heavy response against AI-generated sexual content is worth debating. Art and sexuality are among our oldest pastimes, and there’s no reason sexual art can’t be part of this new form of creation.
These broadly worded content policies risk perpetuating the existing stigma against sexual content and sex work, pushing sexuality back into the naughty corner of entertainment.