Essay by Maggie MacDonald
2018 has been the year of deepfakes: algorithmically generated videos that insert faces into existing footage, fabricating scenes that never took place. They’re named after the Reddit user who circulated many of the earliest videos made with the technique, one that can now be mastered by anyone with a computer and some free time. The resulting videos are convincing; in fact, they’re often nearly indistinguishable from real footage: politicians delivering speeches they never gave, for example, and endless Nicolas Cage appearances. But, perhaps unsurprisingly, internet denizens have put deepfakes to use almost entirely in the manufacture of pornography. Porn consumers have become the producers of their own scenes, swapping celebrities’ faces into thousands of preexisting porn videos.
Deepfakes are being received as harbingers of a moral apocalypse, with news coverage proclaiming them “a looming crisis for national security, democracy and privacy” and warning, vaguely but alarmingly, that “AI-assisted fake porn is here and we’re all fucked.” But this framing forgets a crucial point: the techniques and technologies behind the practice are neutral. Machine learning is a tool that can be employed for good or for nefarious purposes. More than that, deepfakes are only the latest point on a continuum of doctored sexual material stretching back over a century. From Tijuana Bibles to slash fan fiction to revenge porn, pornography has long been produced within a culture that does not value the consent of its subjects.
It’s not that the moral panic leveled at the practice is undeserved. Deepfakes do, of course, have consequences that must not be minimized. But framing them as the exception rather than the rule detracts from our ability to engage with our culture’s larger, underlying devaluation of women’s consent and disrespect for their bodily autonomy. This problem, in essence, is not unique to deepfakes, or even restricted to pornography; it saturates our social and political life, especially women’s involvement in it. Deepfakes are ethically troubling, to be sure, but if we look beyond the panic, observing these media makers can help us understand emerging media ecosystems.
As recently as this time last year, the technique required dedicated teams with training, technical expertise and advanced editing equipment; since the early months of 2018, the videos have worked their way onto message boards and laptops the world over. Deepfakes train a deep learning algorithm to recognize the characteristics of a given face using an image bank. As technology journalist Samantha Cole explains, after enough of this “training,” the assigned nodes arrange themselves to “correct” faces that appear in selected videos. Using a machine learning algorithm, a home computer, publicly available photos and some spare time, anyone can now fake a hyperrealistic video.
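For readers curious about the mechanics, the sketch below illustrates the shared-encoder, dual-decoder autoencoder design popularized by the original deepfakes software: a single encoder learns a face representation common to both people, while each identity gets its own decoder. The layer sizes, names and training loop here are illustrative stand-ins, not the actual application’s code; a real pipeline also involves face detection, alignment, warping and blending.

```python
# Minimal sketch of the deepfakes autoencoder: one shared encoder, two decoders.
# All sizes and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a 64x64 RGB face crop to a latent code shared by both identities."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face from the shared latent code; one decoder per identity."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per face
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

# Stand-ins for the two "image banks" of aligned face crops.
faces_a = torch.rand(16, 3, 64, 64)
faces_b = torch.rand(16, 3, 64, 64)

for step in range(100):  # real training runs for many thousands of steps
    opt.zero_grad()
    # Each decoder learns to reconstruct its own identity from the shared code.
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()

# The swap: encode frames of person A, then decode with person B's decoder,
# yielding B's face with A's pose and expression.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

Because the encoder is shared, it is pushed to capture what the two faces have in common (pose, expression, lighting) while each decoder specializes in rendering one identity; this is what lets the swap “correct” one face into another.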
This type of content borrowing is typical of today’s participatory culture. In this media ecosystem, audiences construct their own culture from content appropriated from mass media, reshaping it to serve their interests. This circuit of simultaneous production and consumption depends heavily on the engagement of the average participant: the producer-consumer, or ‘prosumer’, a figure central to Henry Jenkins’s account of participatory culture.
As prosumers, deepfake creators are gesturing towards what they want to see in their media. When they appropriate and transform porn and celebrity into new hybrid forms of expression, they harness the technique’s potential to make anything appear real. Unlike the audiences of studio-produced porn, deepfake communities behave like fans: a free economy flourishes, and engagement is remarkably high in many porn-sharing forums, especially those that allow for creative freedom. We could have seen it coming: before this year, the immense popularity of porn GIFs already suggested an active audience, eager to produce as well as to consume pornography. Porn scholar Helen Hester has defended these forums as exemplary spaces of participatory interaction, at odds with the image of porn browsers as “passive, thoughtless, and wholly receptive”. The tactics of deepfake prosumers were predictable, given that these fans are enmeshed in the modern world of porn – a system already riddled with piracy and an alarming disregard for consent.
Actresses including Gal Gadot, Emma Watson and Scarlett Johansson have been targeted by the practice, but in practical terms there isn’t much legal recourse for victims of a deepfake, which is a large part of why the practice took off so explosively. The ethical issues of consent and objectification make clear that a video does not itself need to be real for the personal damages it incurs to be. Defamation or copyright law may be a good place to start, but as one redditor put it, “You can’t effectively sue someone for exposing intimate details of your life when it’s not your life they’re exposing.” While sites like Reddit, Discord and Pornhub have nominally banned deepfakes under nonconsensual pornography clauses, the videos are emerging faster than they can be contained. There is currently no straightforward route for getting such videos taken down, given their free circulation, anonymous creation and capacity to re-emerge perpetually after removal.
It’s unquestionably alarming that porn is being faked without the consent of those depicted, and there remains distressingly little recourse for victims. Moderating these videos is nightmarish, but legal and regulatory responses have never kept pace with any form of porn making, regardless of the technology involved. Treating the practice as an unprecedented moral problem brought about by technology ignores much larger cultural problems – a troubling red herring.
In this emerging media system, not only are media producers and consumers transformed from two separate categories into a shared pool of prosumers, but they interact with each other according to a new set of rules that none of us completely understands yet. A moral panic only obfuscates the cultural context from which these technologies emerge. Rather than recoiling from emergent practices like this one, we should closely consider these techniques and communities in order to understand our media futures.
_______
Cole, Samantha. 2018. “We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now.” Motherboard, January 24, 2018.
Hester, Helen, Bethan Jones, and Sarah Taylor-Harman. 2015. “Giffing a Fuck: Non-Narrative Pleasures in Participatory Porn Cultures and Female Fandom.” Porn Studies 2 (4): 356–66.
Jenkins, Henry. 2013. “Layers of Meaning: Fan Music Video and the Poetics of Poaching.” In Textual Poachers: Television Fans and Participatory Culture. Routledge.
This piece was originally produced for Pause Button – an online publication about technology and culture published by the Milieux Institute at Concordia University in Montreal.
Maggie MacDonald is an MA candidate in Media Studies at Concordia University as well as the coordinator of the Media History Research Centre at Milieux Institute. Her research focuses on the transformation of pornography as a cultural industry through the platformization of pornographic content online.