Deep Fake as a New Form of Violence against Women

April 29, 2019, 8:38 am | By Sri Handayani Nasution

In 2017, the University of Washington created a synthetic video featuring former US President Barack Obama.[1] In 2018, a Belgian political party generated a synthetic video featuring President Donald Trump in an attempt to push the Belgian government toward more ambitious climate action.[2] Using similar tools, an internet user superimposed the face of actress Scarlett Johansson onto a pornographic video.[3] All of this was made possible by “deep fake” technology.

To put it simply, a deep fake is the product of artificial intelligence technology that can generate a realistic, digitally manipulated video of a person doing or saying whatever its creator desires.[4] Many easy-to-use deep fake applications exist, allowing practically anyone who wishes to create a deep fake to do so.[5] Because of this accessibility, there are concerns that the technology could spark a new disinformation war[6] or become a challenge to democracy.[7] This piece, however, discusses the closer impacts of this new technology on our daily lives: specifically, how it might become a harmful tool of male violence against women through the creation of pornographic videos using women’s faces without their consent. Bear in mind that the conscious decision to use the words ‘male violence’ and ‘women’ does not come with the intention of blaming a particular group, but from the bitter and depressing fact that in most cases males are the perpetrators and women are the survivors.[8] This decision does not undermine the struggles of my male counterparts, who are also subject to various forms of oppression in this patriarchal world. With that clear, I argue in this piece that, first, the capability of deep fakes to create personalized pornographic videos is more harmful to women, and second, that deep fake-generated pornographic videos can create an environment in which women are more easily controlled.

First, the existence of Scarlett Johansson’s fake porn videos shows that the same thing could be done to any ordinary woman.[9] Recognizing this misuse of deep fakes, some pornographic websites such as Pornhub have adopted policies banning all deep fake-generated porn videos;[10] however, as the internet is limitless, it offers many alternative places to share such videos. The objectification of women to fulfill male fantasy is not uncommon in this society,[11] and deep fakes enable an easier way to make personalized porn videos using the faces of ordinary women, collected from their social media without their consent. This will be more socially harmful to women because society exercises various forms of social control over women, especially with regard to sexual activity. The gender expectations embedded in societal norms create a clear distinction between what is considered appropriate and inappropriate for women to do.[12] The display of a woman performing sexual activity, whether fake or not, will be considered inappropriate at best, evidence of infidelity at a slightly worse perception, and will result in victim-blaming at worst.

With regard to deep fakes and their impact on women, the issue is not only whether society believes a video is genuine; it will instantly result in unfair judgment of women because of the sexist and misogynist culture still rooted in society. Simply put, women’s bodies are shackled by a one-sided social control imposed on them because of their identity. As with revenge porn, this can result in women being forced to change jobs (or being barred from finding another one), move to another neighborhood, or even die because of the tormenting pressure from the people around them.[13] Especially for ordinary women, whose voices are rarely considered significant, deep fake-generated porn videos can have more harmful impacts than they would on their male counterparts.

Second, fake porn videos made with deep fake technology using women’s faces will result in a more controlling environment for women. The impacts resemble those of revenge porn in asserting male supremacy over women cast as inferior, only with more complexity because, unlike with revenge porn, the perpetrator can easily create any video they want. Statistically, 6% of women under 30 in the USA are victims of revenge porn,[14] and around 421 cases of revenge porn were reported in Scotland alone as of 2018.[15] Revenge porn is usually committed by a woman’s previous partner as a means of blackmail, threat, or coercion, or for the enjoyment of putting someone in a suffering situation.[16] With deep fakes, the perpetrator can be a complete stranger to the woman. As laws against revenge porn are still largely missing in many places,[17] the lack of protection for women in the era of deep fakes will further endanger women in controlling relationships or threatening situations.

In conclusion, deep fakes can create a more harmful environment for women to live in. The misogynist and sexist culture will further proliferate the use of deep fakes to fulfill male fantasy, resulting in the oppression of women. Similar to revenge porn, only more complex and dangerous, deep fake-generated porn videos using women’s faces without their consent are a manifestation of male violence against women. The lack of legal protection for women and the patriarchal structure thus combine to create a more miserable world for women in the era of deep fakes.

Editor: Anisa Pratita Mantovani

Read another article written by Sri Handayani Nasution


[1] BBC (2017). Fake Obama created using AI video tool. [online] BBC News. Available at: https://www.bbc.com/news/av/technology-40598465/fake-obama-created-using-ai-tool-to-make-phoney-speeches [Accessed 12 Mar. 2019].

[2] Schwartz, O. (2018). You thought fake news was bad? Deep fakes are where truth goes to die. [online] the Guardian. Available at: https://www.theguardian.com/technology/2018/nov/12/deep-fakes-fake-news-truth [Accessed 12 Mar. 2019].

[3] Hern, A. (2018). AI used to face-swap Hollywood stars into pornography films. [online] the Guardian. Available at: https://www.theguardian.com/technology/2018/jan/25/ai-face-swap-pornography-emma-watson-scarlett-johansson-taylor-swift-daisy-ridley-sophie-turner-maisie-williams [Accessed 12 Mar. 2019].

[4] Citron, D. and Chesney, R. (2018). Deepfakes and the New Disinformation War. [online] Foreign Affairs. Available at: https://www.foreignaffairs.com/articles/world/2018-12-11/deepfakes-and-new-disinformation-war [Accessed 12 Mar. 2019].

[5] Ibid.

[6] Ibid.

[7] Boylan, J. (2018). Opinion | Will Deep-Fake Technology Destroy Democracy?. [online] Nytimes.com. Available at: https://www.nytimes.com/2018/10/17/opinion/deep-fake-technology-democracy.html [Accessed 12 Mar. 2019].

[8] Mackay, F. (2015). Radical feminism. United Kingdom: Palgrave Macmillan, p.12.

[9] Harwell, D. (2018). Fake-porn videos are being weaponized to harass and humiliate women: ‘Everybody is a potential target’. [online] The Washington Post. Available at: https://www.washingtonpost.com/technology/2018/12/30/fake-porn-videos-are-being-weaponized-harass-humiliate-women-everybody-is-potential-target/?utm_term=.0b273bb0ef2a [Accessed 12 Mar. 2019].

[10] Sharman, J. (2018). Pornhub bans AI-generated ‘deepfakes’ videos that put female celebrities into porn films. [online] The Independent. Available at: https://www.independent.co.uk/life-style/gadgets-and-tech/pornhub-twitter-deepfakes-ban-ai-celebrity-faces-porn-actress-bodies-emma-watson-jennifer-lawrence-a8199131.html [Accessed 12 Mar. 2019].

[11] Mackay, F. (2015), p.41.

[12] Pina, A., Holland, J. and James, M. (2017). The Malevolent Side of Revenge Porn Proclivity. International Journal of Technoethics, 8(1), p.31.

[13] Poole, E. (2015). Fighting back against non-consensual pornography. USFL Rev., 49, pp.181-214.

[14] Data & Society Research Institute (2016). Nonconsensual Image Sharing: One in 25 Americans Has Been a Victim of “Revenge Porn”. [online] New York. Available at: https://datasociety.net/pubs/oh/Nonconsensual_Image_Sharing_2016.pdf [Accessed 12 Mar. 2019].

[15] BBC (2018). More than 400 revenge porn crimes reported. [online] BBC News. Available at: https://www.bbc.com/news/uk-scotland-45638796 [Accessed 12 Mar. 2019].

[16] Pina, A., Holland, J. and James, M. (2017).

[17] Poole, E. (2015).