Celebs Aren’t Alone, Adult Performers Are Harmed by Deepfake Tech Too
Content creators lose cash and lack legal options when their videos are pirated.
Over the past few years, the rise of deepfake technology has caused a significant stir online.
Arguably, its most attention-grabbing moments occurred when it was used to depict celebrities and public figures in unlikely situations: Jimmy Fallon interviewing Jimmy Fallon, a shirtless Vladimir Putin bragging about the 2016 election win, and so on.
What is deepfake technology?
The video manipulation technique uses artificial intelligence to edit and reproduce video footage, superimposing a new face over the face of the person originally featured.
This sophisticated process poses a new breed of risk to people who are falsely depicted, and it demonstrates the danger of trusting everything that meets the eye in the digital landscape.
To create a believable deepfake, hundreds (if not thousands) of images of a face are required to give the artificial intelligence program a realistic model of how that face looks across a range of expressions and during speech.
It’s for this reason that public figures are often used, since plenty of imagery is available online to feed into the program. The initial previews of deepfake technology that began popping up on social media platforms often drew on source videos such as politicians’ speeches and Saturday Night Live skits, but it was only a matter of time before the adult entertainment industry saw an influx of celebrity faces digitally imposed into videos.
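For readers curious about the mechanics, the sketch below illustrates the shared-encoder, dual-decoder autoencoder design behind the early face-swap tools that popularized the term "deepfake." This is a simplified, hypothetical example, not the code of any specific tool: real systems add face detection and alignment, blending masks, and much deeper networks. All names and dimensions here are illustrative assumptions.

```python
# Minimal sketch of the shared-encoder / dual-decoder autoencoder design
# used by early face-swap tools. Simplified for illustration only.

import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), # 16 -> 8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face from the latent code; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),   # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),    # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 128, 8, 8)
        return self.net(h)

# One shared encoder learns facial structure common to both people;
# each decoder learns to render one specific identity.
encoder = Encoder()
decoder_a = Decoder()  # would be trained on many images of person A
decoder_b = Decoder()  # would be trained on many images of person B

# The swap: encode a frame of person A, then decode it as person B.
frame_a = torch.rand(1, 3, 64, 64)  # stand-in for a real aligned face crop
fake_b = decoder_b(encoder(frame_a))
print(fake_b.shape)  # torch.Size([1, 3, 64, 64])
```

In this setup, each decoder only produces a convincing likeness after training on the hundreds or thousands of face images mentioned above, which is why public figures, with abundant photos and footage online, are the easiest targets.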
A growing number of victims being targeted by deepfake technology
FutureofSex.net reported on celebrity deepfakes inserted into adult videos in 2018 after the Screen Actors Guild-American Federation of Television and Radio Artists released a statement against such acts.
The actors’ union also addressed actions it was taking to “combat any and all uses of digital recreations, not limited to deepfakes, that defame [its] members and inhibit their ability to protect their images, voices and performances from misappropriation.”
Although the impact of deepfake technology in adult content has been widely discussed in relation to the victims whose faces are used, one adult producer spoke out in 2019 about the toll this trend is taking on the original owners of the content.
She highlighted that adult content producers are being harmed just as much as those whose faces are used without consent.
Raquel Roper [NSFW] is an adult producer and performer who bears a striking resemblance to Selena Gomez, which made her content an ideal target for celebrity deepfake pornography.
Speaking about the matter in a YouTube video, she said she hadn’t realized her content was being stolen until she saw a clip from one of her videos online and noticed a different face where her own should have been.
“For my content to be used in this way is really disheartening. I don’t think people realize how harmful it can be,” Roper said. She explained that the income from her original videos is what allows her to continue working as an independent producer.
“I’m not a big company, so when my content is stolen and pirated, [they’re directly] stealing from adult performers.

“I saw one of my videos and what looked like my face but it wasn’t my face. It was Selena Gomez’s face. It’s really upsetting because it’s not just my content that’s being stolen, it’s my identity. I didn’t give consent for this to happen, Selena Gomez has not given consent for her face to be morphed into an adult video.”
What does the law say?
Roper added that what’s most concerning is the fact that she’s unable to take legal action, such as filing a suit for defamation.
This is because deepfakes of this nature are typically shared in nearly untraceable corners of the deep web, the parts of the web that search engines do not index.
“I don’t know where this video came from, I don’t know who did it and there’s no way for me to take legal action—I think that’s what the scariest part is. I feel so shocked that this is legal. I’m scared that this is going to become more of a problem.”
While defamation and copyright laws should technically apply and protect victims of adult deepfake content, it’s first necessary to track down the person responsible for creating the fake—typically someone operating on the deep web.
For non-adult content, such as faked videos of celebrities making statements they never made, current laws don’t specifically account for the technology, meaning there is nothing in place to protect victims.
Whether or not lawmakers will review and update the current laws or introduce new regulations to address the threat posed by this technology remains to be seen, but there is a clear need to protect victims of all types when it comes to adult deepfake videos.
Image sources: Raquel Roper