Editor’s Note: Sophie Compton is a British Independent Film Awards (BIFA) longlisted, SXSW award winning documentary director and producer who tells women’s stories of injustice and healing. Reuben Hamlyn is a New York-based BIFA longlisted, SXSW award winning filmmaker of documentary and fiction. Their documentary, “Another Body,” opens in theatres and on VOD October 20th. The views expressed in this commentary are their own. View more opinion at CNN.
A friend sends you a message with a link. It’s a Pornhub URL, paired with the message: “I’m really sorry but I think you need to see this.” You click, and what pops up is your face, looking back at you, depicted in hardcore pornography. You go numb — you have never acted in porn in your life, and all you can think is who would do this, and why?
This is what happened to Taylor (whose name has been changed to protect her privacy), a 22-year-old engineering student from New England, who is the subject of our new SXSW Special Jury Award-winning feature documentary, “Another Body.” As Taylor discovers, the videos are deepfakes — videos doctored using artificial intelligence to insert one person’s face onto another person’s body.
Taylor’s anonymous perpetrator had uploaded six deepfake videos to several porn profiles, pretending to be her. Chillingly, he also included the names of her real college and hometown, and encouraged men visiting the profile to DM her, with a wink emoji. And they did — she started receiving disturbing messages on Facebook and Instagram from men she didn’t know.
But when Taylor called the police, a detective told her the perpetrator had a right to do it, and that no laws had been broken.
There are currently no federal laws in the United States against the creation and sharing of non-consensual deepfake pornography. We are determined to change this, calling for a federal law that makes non-consensual deepfake porn illegal, and changes to Section 230 of the Communications Decency Act, which shields online platforms from liability over user-generated content. This is the online landscape that has allowed the creation and trading of non-consensual deepfake pornography to develop into a thriving business.
Taylor is not alone. With advancements in artificial intelligence, deepfake pornography is becoming increasingly common — and it almost exclusively targets women. Researchers at the identity verification company Sensity AI found the number of pornographic deepfakes online roughly doubled every six months from 2018 to 2020. The company also found that a shocking 96% of deepfakes are sexually explicit and feature women who didn’t consent to the videos.
While it once took hundreds of images of a person’s face to create a convincing deepfake, it is now possible with just one or two images. When deepfakes were first created in 2017, they required significant computer processing power and some programming knowledge. Now, there are user-friendly deepfake applications available for iPhones. In other cases, deepfake creators take commissions, charging as little as $30 to create explicit videos of a customer’s favorite celebrity, or ex, or teacher.
As a result, both celebrities and average citizens are finding their faces inserted into pornography without their consent, and these videos are being uploaded to porn sites for anyone to see. One of the most prominent deepfake porn websites is averaging about 14 million hits a month.
This practice is no longer niche — it has hit the mainstream. And with it, we run the risk of a new generation of young people who might consider watching a pornographic deepfake of their favorite actress — or their classmate — the norm.
The impacts on victims can be devastating. For Taylor, not knowing who had created the videos, or who had seen them — her classmates? Her friends? Her boss? — meant she didn’t know who to trust.
This triggered a period of extreme OCD and anxiety, during which she reevaluated her social circle and debated whether to continue her studies. Other survivors we spoke to experienced similar effects. Helen, a 34-year-old teacher, started having panic attacks and told us, “Every time I left the house, I got a sense of dread.”
Amnesty International coined the phrase “the silencing effect” to talk about the way online abuse against women can diminish female participation in public forums. In one report about harassment on Twitter, the organization found that many women shut down their accounts, censor what they post and are discouraged from pursuing public-facing careers, like journalism or politics. We’ve seen a similar response among women who are victims of deepfake abuse.
Danielle Citron, a professor at the University of Virginia School of Law, notes that Section 230 is the reason that there are 9,500 sites in the US that “are devoted to non-consensual intimate imagery.” She added, “It becomes something that is like an incurable disease. … You can’t get [non-consensual content] down. The sites don’t care. They can’t be sued.”
And it’s not only the deepfake porn creators who are profiting off this abuse. Deepfake porn sites are enabled by search engines that drive web traffic toward their content. Internet service providers host them, credit card and payment companies process transactions on their sites, and other companies advertise their products under these videos.
In England and Wales, a new law will criminalize the sharing of pornographic deepfakes without consent. In the US, however, only a handful of states have laws addressing non-consensual deepfake pornography, and many of them are limited in scope.
With Taylor and other survivors, we set up the My Image My Choice campaign, and are calling for federal laws that would target both creators and platforms, criminalizing the creation and distribution of non-consensual deepfake pornography, and forcing sites to remove this content from their platforms.
While these laws alone may not put an end to the problem, they would go a long way toward ensuring that the existing business model, which relies on violating women’s consent and privacy, is no longer viable.