Reality Under Siege: Challenges in Detecting and Combating Deepfakes
Navigating the ethics of AI-generated erotic content
Deepfake technology and nudification services can create sexualized, nonconsensual images of real people, whether celebrities or private individuals. A shocking number of these images depict women and others presumed female, including minors.
Celebrity deepfake images, such as those of singer Taylor Swift that flooded the internet in January, have drawn the most media attention—and views—yet hundreds of thousands, or perhaps millions, of people worldwide have had their careers, personal reputations and brands, social lives, and sense of privacy and security violated.
Sexually explicit, nonconsensual, deepfake images have also become a staple of revenge porn, according to an article published in the University of Cincinnati Law Review.
An Australian ABC News piece noted that most of these images are created by diffusion models, “a type of generative artificial intelligence model that can produce new and photorealistic images from written prompts.”
Revealing the extent of Non-Consensual Intimate Imagery (NCII)
Though hardly household names, organizations such as Graphika have investigated the deepfake problem and issued reports, such as their early 2023 “Deepfake it Till You Make It.”
However, their more recent report, “A Revealing Picture,” reveals the truly massive extent of the undressing-images problem:
“A group of 34 synthetic NCII providers identified by Graphika received over 24 million unique visitors to their websites in September, according to data provided by web traffic analysis firm Similarweb. Additionally, the volume of referral link spam for these services has increased by more than 2,000% on platforms including Reddit and X since the beginning of 2023, and a set of 52 Telegram groups used to access NCII services contain at least 1 million users as of September this year.”
Graphika concluded “the primary driver of this growth is the increasing capability and accessibility of open-source artificial intelligence (AI) image diffusion models. These models allow a larger number of providers to easily and cheaply create photorealistic NCII at scale. Without such providers, their customers would need to host, maintain, and run their own custom image diffusion models–a time-consuming and sometimes expensive process.”
Defending reality
According to its FAQ, Reality Defender, founded in 2018, is a “deepfake detection platform for enterprises, governments, and platforms to detect AI-generated content and manipulations across audio, video, image, and text files.” Unfortunately, it is accessible only to large organizations, businesses, and governments.
Though Reality Defender’s detection services are not available to the average consumer, it’s helpful to look through the website’s blog posts. For starters, I recommend “A Brief History of Deepfakes”, paying special attention to this quote:
“No history of deepfakes can be complete without acknowledging contributions made by the average internet user. The open-source deepfake creation tools have been tested and refined by legions of hobbyists, who have utilized these tools for purposes of benign entertainment (memes, swapping out actors’ faces in classic movies) and more sinister, appalling goals, like the creation of deepfake pornography.”
“It was the participation and driving interest of everyday users, beginning in 2017, that has brought the technology to where it is now. This continued democratization of such powerful tools demonstrates the dire need for adequate countermeasures to detect and isolate deepfakes before they can be used for malicious purposes.”
So while average internet users have helped advance deepfake technology, other average internet users have no comparable way to protect themselves from it.
Calls for regulation, social responsibility, and technical safeguards
In the US, politicians appear to be finally paying attention. In September 2023, Representatives Yvette D. Clarke (D-NY) and Glenn Ivey (D-MD) introduced the DEEPFAKES Accountability Act of 2023.
According to the Los Angeles Times, Senators Chris Coons (D-Del.), Marsha Blackburn (R-Tenn.), Thom Tillis (R-N.C.), and Amy Klobuchar (D-Minn.) drafted legislation last October called the “Nurture Originals, Foster Art, and Keep Entertainment Safe Act,” or NO FAKES.
The law would enable victims to sue deepfake creators plus platforms that distribute the material, though difficulties in forensic identification of the offending software plus Section 230 of the Communications Decency Act might make such cases difficult to impossible.
And in January 2024, Representatives Joe Morelle (D-NY) and Tom Kean (R-NJ) announced HR 3106, the Preventing Deepfakes of Intimate Images Act, a bill banning the sharing of deepfake pornography.
But regulations can be frustratingly ineffective. Scientific American points out that ideas such as digital watermarks and other technological safeguards can still be hacked by “sophisticated users.” The same article also points out the limitations of corporate social responsibility and self-regulation.
Our images, ourselves
Consent counts, but violations of digital consent seem to have run amok, making even the most enthusiastic sex tech proponent think twice about the implications and ethics, say, of AI-generated chatbot images and the creation of AR/VR sexual avatars.
As Graphika’s “A Revealing Picture” points out, “the increasing prominence and accessibility of these services [AI diffusion models] will very likely lead to further instances of online harm, such as the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sextortion, and the generation of child sexual abuse material.”
So while it might be tempting to create a sexy chatbot or avatar that resembles your favorite celebrity, influencer, or the hottie next door, consent still counts, even in the digital realms.
So unless you have that express, personal consent from the object of your desire—don’t, just don’t. There are plenty of sexy, imaginary friends you can access through the wonders of technology and even plenty of erotic images you can make, without using images of the unsuspecting and nonconsenting.
Images: Depositphotos