A Posthuman Argument to a Human Problem: Discourse on Lethal Autonomous Weapons Systems
July 16, 2021 2:29 pm
Hizkia Yosie Polimpung, researching the historiography of the technological revolution, identified three sectors where technological development takes place, two motives behind it, and one overarching purpose.[i] The three sectors are (1) the military, (2) industry, and (3) the domestic/personal sphere. The two motives are to hurt human bodies and to extract labor from them. The one purpose is efficiency. Polimpung named this formula “technopolitics 321”.
One product of this development[ii], following Polimpung’s formula, is the incorporation of artificial intelligence (AI)—and other disruptive technologies—into the military. The intended result is the hard-to-define Lethal Autonomous Weapons Systems (LAWS). Perhaps it is easier to begin with what LAWS are not—yet. They are not Terminators, Decepticons, Ultron, or HAL 9000. They are weapons that can, with the help of AI, identify and engage targets without intervention by a human operator. LAWS can take the shape of a drone, a sentry gun, a missile system, and so on. The difficulty in defining these weapons stems from the degree to which humans are removed from the interaction: are they merely highly automated, or are they actually autonomous?
The lack of human control is why Human Rights Watch and other organizations have banded together under humanist ideals, raising ethical, legal, technical, and security concerns under the umbrella of the Campaign to Stop Killer Robots (KRC).[iii] The campaign rebuts arguments such as “robots would be more ethical than humans; they would not rape or commit other crimes” and “autonomous weapons systems are able to process data and operate with greater speed than those controlled by humans” with the question of who would be accountable if LAWS were to cause unlawful deaths. Moreover, a lack of meaningful human control undermines the basic principles of International Humanitarian Law (IHL) and human rights law. With these concerns, the KRC is advocating for a treaty to ban the development, production, and use of LAWS.
A law for LAWS
Under the framework of the UN Convention on Conventional Weapons (CCW), the KRC and several states party to the CCW have called for a legally binding treaty banning LAWS since 2014.[iv] But as of 2021 progress had been rather slow, leading the KRC to consider forums other than the CCW for developing a legal instrument.[v] Among others, Rosert and Sauer present two reasons why LAWS are peculiarly difficult to regulate.[vi] First, LAWS do not necessarily violate legally binding IHL principles, and technological progress might render LAWS at least IHL-compliant, or even more compliant than remotely operated weapons. Second, “meaningful human control” as a principle of IHL is not yet legally binding.[vii] Further discussion of LAWS as a subject matter is better framed by what I would call “the human problem”.
The human problem
The human problem in the subject of LAWS arises from, but goes beyond, the KRC’s argument that LAWS lack the human capacity to make ethically complex choices, and that letting a machine decide who lives and dies crosses a moral threshold.[viii] Rosi Braidotti would call this a “cognitive panic at the prospect of the posthuman turn, blaming our advanced technology for it.”[ix] For Braidotti, life is not the exclusive prerogative of humans; if it were, we would need to question the totality of advanced capitalism, which threatens the sustainability of our planet, commodifies all that lives, and perpetuates our inequality.[x] This is a critique the KRC does not espouse.
Indeed, the danger of technology as it is used within a capitalist system is real. It is brought to the fore by the most vulgar and sensational realization of our humanity: warfare and mortality. LAWS is only the face; the body is a complex mix of technology, capitalism, class, gender, race, and ability. Queerly, the way to confront the face and the body is to turn and become posthuman.
The posthuman argument
The posthuman, succinctly put, is a way of thinking about new forms of subjectivity, one that rejects human exceptionality.[xi] It works against the binaries of life/death and human/nonhuman, among others. Emily Jones posits that to understand the totality of human mortality and technology, we need to examine not only LAWS but the broad array of human-machine life/death decision-making.[xii]
Datafication is already an integral part of conventional weapons operation within the United States military. However, its mundanity has made it invisible in discussions of the lethal use of digital intelligence. Jones argues that framing the issue around autonomy sets the bar so high that machine involvement already in use in life/death decision-making escapes scrutiny, even as it is only set to increase.[xiii] In line with this thought, Jones also emphasizes the possibility, and the present reality, of human soldier enhancement through wearable military technologies and even bodily modifications.[xiv] One example of existing technology is a wearable called Boomerang Warrior-X, which uses acoustic detection and an algorithm to detect incoming small-arms fire.[xv] In discussing technology and its use to kill, Jones thus refuses to position LAWS as an exceptional, autonomous Other. Instead, using the posthuman framework, she pushes the discussion further: rejecting the militaristic use of technology while maintaining a flat ontology.
Author: Alfredo (Research Intern CfDS)
Editor: Amelinda Pandu Kusumaningtyas (Research Project Officer CfDS)
[i] Polimpung, Hizkia Yosie. 2018. Teknopolitika Kemudahan Hidup. June 4. Accessed May 8, 2021. https://indoprogress.com/2018/06/teknopolitika-kemudahan-hidup/.
[ii] Polimpung also uses “mutation” to describe the reactive development of technology and other control methods under capitalism, reactive to the plight of laborers. See Polimpung, Hizkia Yosie. 2017. “Pekerja adalah Universal: Manusia dan Subjek dalam Mutasi Kapitalisme.” IndoProgress 1-28.
[iv] Human Rights Watch. 2020. New Weapons, Proven Precedent: Elements of Models for a Treaty on Killer Robots. October 20. Accessed May 9, 2021. https://www.hrw.org/report/2020/10/20/new-weapons-proven-precedent/elements-and-models-treaty-killer-robots#_ftn1.
[vi] Ibid., 17-18. For the strategic reasoning behind why militaries and defense ministries would want to develop LAWS, see Leys, Nathan. 2018. “Autonomous Weapon Systems and International Crises.” Strategic Studies Quarterly 48-73.
[vii] For a discussion of the ambiguity of “meaningful human control” from a military-operations perspective, see Ekelhof, Merel. 2019. “Moving Beyond Semantics on Autonomous Weapons: Meaningful Human Control in Operation.” Global Policy 343-348.
[x] Ibid. For a discussion of AI and redlining, see Khan, Jeremy. 2020. A.I. and tackling the risk of “digital redlining”. February 11. Accessed May 9, 2021. https://fortune.com/2020/02/11/a-i-fairness-eye-on-a-i/.
[xiii] Ibid., 112.
[xiv] Ibid., 113.
[xv] Ibid., 114.