PUBLIC PERCEPTION, APPROVAL OF AND BLAME ATTRIBUTION IN USING MILITARY DRONES: A GATEWAY INTO PUBLIC OPINION ON AUTONOMOUS WEAPON SYSTEMS

Date

2018-12

Abstract

Within the past two decades, technological advances and a growing public policy priority to reduce human casualties on the battlefield have pushed militaries around the world to develop increasingly automated capabilities characterized by a gradual decline in human involvement. Autonomous weapon systems (AWSs), which include lethal autonomous weapons (LAWs), are considered the next transformational stage in military technology, yet public understanding of the far-reaching implications of using these war-fighting machines remains limited, especially their material impact on interstate dynamics. Land-based, seaborne, and airborne unmanned vehicles, colloquially known as drones, represent the machine-learning military technology closest to AWSs and serve as the basis from which AWSs will be developed. While true AWSs have not yet been developed and deployed, this thesis seeks to understand the public's current perception of military drones and how that perception affects approval for the use of drones in combat and blame attribution in scenarios of error, such as unintended collateral damage. It does so by combining an examination of evolving scholarly debates on public opinion and the legal accountability of employing armed drones with a survey procedure that examines how the public views drones within the context of machine autonomy versus human control, how approving it is of using drones in war, and finally how it attributes blame when presented with an erroneous outcome. Analysis of both the existing literature and the survey data suggests that the public still struggles to comprehend the capabilities of drones along a wide spectrum of autonomy. Furthermore, the study's findings underpin the current scholarly position that considerable support for drone usage exists when the subject is framed in a vacuum, without the contextual information that truly characterizes the reality of that usage. When presented with that contextual information, however, the public prefers the option of deploying a human combatant, owing to an aversion to collateral damage even when military utility is high. Finally, the public tends to hold the human element to blame when asked to assess an instance of collateral damage, regardless of the level of automation or autonomy involved. These findings are emblematic of a degree of distrust in AWSs and of incoherent, underdeveloped legal thinking within the public on the subject of accountability, which promise to complicate not just the rules of war and international legal regimes but also the interstate dynamics among AWS-wielding nations when fully autonomous warfighting platforms become a full-fledged reality.
