The limits of the notion of criminal responsibility regarding Lethal Autonomous Weapon Systems
Sarah Leeman
ANR: 229357 SNR: 2041723
First Supervisor: Dr. Merel Noorman
Second Reader: Prof. Dr. Kees Stuurman
Law and Technology
August 2020
Acknowledgements

This academic year will be one to remember. It started out like any other year, in which academic challenges alternated with social moments among new friends. This master's program allowed me to deepen my knowledge and meet amazing people from all over the world with whom I hope to be lifelong friends. In March the normal academic dynamic drastically changed when a surreal scenario was imposed on all of us. It was unknown territory for both the University and the students. I am profoundly grateful to my first supervisor, Dr. Noorman, who took the time necessary to guide me on this academic journey and to point me in the right direction at challenging crossroads.
I further wish to thank Colonel Chris De Cock, Legal Advisor of the European External
Action Service - Director EU Military Staff and Colonel Stein Westlye Johannessen, Defence
Attaché of Norway to the Netherlands and former legal advisor of the NATO Special
Operations Command, SHAPE for their profound insights during exploratory interviews.
Lastly, I would like to thank my parents. They have stood by me through every high and low. Completing this master's program would have been impossible without their support.
Table of Contents

Chapter 1: Introduction
1.1. Background of LAWS
1.2. Objective and research questions
1.3. Significance of this thesis
1.4. Preliminary remarks and limitations
1.5. Methodology
1.6. Chapter overview
Chapter 2: Defensive Autonomous Weapon Systems
2.1. The significance of increasing levels of autonomy in weapon systems
2.2. Is a definition necessary?
2.3. Characteristics and usage modalities of the Phalanx CIWS
2.3.1. Lethal character
2.3.2. Autonomous character
2.3.3. Defensive character
2.3.4. Human-machine interaction
2.4. Conclusion
Chapter 3: Criminal Responsibility
3.1. Criminal Responsibility in IHL
3.2. The principles underlying the targeting process
3.3. Conclusion
Chapter 4: Analysis of criminal responsibility for defensive LAWS
4.1. Impact of the characteristics and usage modalities of the Phalanx CIWS on the principle of distinction
4.2. Consequences for the notions of criminal individual and command responsibility
4.3. Conclusion
Chapter 5: Future prospects
5.1. A possible function creep
5.2. Legal safeguards against a function creep
5.3. Impact of offensive LAWS on the principle of distinction
5.4. Consequences for the notions of criminal individual and command responsibility and possible solutions
5.5. Conclusion
Chapter 6: Conclusion
Bibliography
Appendix
Chapter 1: Introduction

The Blitzkrieg of the German Armed Forces in World War II (WWII) confronted their opponents with an innovative doctrinal approach. An important aspect of this military doctrine was a renewed perspective on Auftragstaktik, also known as mission command. This is "a form of military tactics where the emphasis is on the outcome of a mission rather than the specific means of achieving it".1 The commander does not tell the subordinate exactly what to do and how; instead, he informs the subordinate of his intent, describing the end state of what needs to be achieved. The reasoning behind this was that the subordinate's actions largely depend on the situation in which he finds himself. In such an approach, which formed a radical rupture with WWI doctrine, more emphasis was put on individual responsibility. From a legal perspective the notion of criminal command responsibility became more important: as the subordinate received more freedom of action, the commander's legal responsibility for the behaviour of his soldiers increased. More autonomous lethal weapon systems might lead to a new, yet-to-be-determined balance between command and individual criminal responsibility, depending on the type of weapon system and the precise circumstances of employment.
In this thesis I will argue that, due to the increasing autonomy of defensive Lethal Autonomous Weapon Systems (LAWS), the current legal framework is no longer fully suited to their deployment, resulting in a creeping regulatory gap. This calls for a re-evaluation or adaptation of these notions.2 I use the term autonomy here to refer to "systems capable of operating in the real-world environment without any form of external control for extended periods of time".3
In this chapter I will introduce the focus of the thesis. First, I will give a brief background
of the technology that will be explored and explain the challenges associated with this
technology by referring to existing literature. I will then present the main research question and
1 R. Wagner, 'Agility and Self-Organisation – Success Factors for the Prussian Army in the 19th Century' International Project Management Association <https://www.ipma.world/agility-and-self-organisation-success-factors-for-the-prussian-army-in-19th-century/> accessed 3 January 2020.
2 P. Margulies, 'Making Autonomous Weapons Accountable: Command Responsibility for Computer-Guided Lethal Force in Armed Conflicts' (2016) No. 166 Roger Williams University Legal Studies Paper <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2734900> accessed 20 March 2020, p. 25.
3 M. Noorman, D. G. Johnson, 'Negotiating autonomy and responsibility in military robots' (2014) Vol. 16 Ethics and Information Technology <https://link.springer.com/article/10.1007/s10676-013-9335-0?shared-article-renderer> accessed 21 February 2020, p. 61; G. A. Bekey, Autonomous Robots: From Biological Inspiration to Implementation and Control (MIT Press 2005), p. 1.
associated sub-questions to address the potential regulatory gap. Further, I will explain the significance of the conducted research. Next, I will elaborate on the methodology used to answer the research question and its limitations. Finally, I will give a concise overview of what each chapter entails.
1.1. Background of LAWS
Throughout history, humans have used all kinds of tools to wage war: spears, swords, bows and, later, guns. Weapons evolved, and various (less sophisticated) autonomous systems have been in use since WWII.4 Human-supervised automated weapon systems have thus been around for some decades.5 Autonomy in weapon systems is not completely new, but over the last three decades its development has accelerated through fast-evolving technologies such as artificial intelligence (AI) and robotics, leading to new weapon applications. These applications range from simple algorithms that reinforce calculations to complex autonomous systems found in modern unmanned combat air systems.6 Some of these systems are able to automatically identify a target as a threat; the system will then (still) send this data back to a human operator to verify it and to approve or reject the engagement.7 The highest degree of autonomy in contemporary systems, however, is predominantly found in defensive weapon systems. Examples of such systems are the U.S. Phalanx close-in weapon system (CIWS), the surface-to-air Patriot missile battery and the Israeli Iron Dome. These systems can "autonomously perform their own search, detect, evaluation, track, engage and kill assessment functions to defend ships or particular areas against fast moving and highly manoeuvrable threats".8 This thesis will mainly focus on defensive systems, as these (in contrast to offensive systems) are already part of military inventories and used on the battlefield. This allows the systems to be analysed based
4 The German Wren torpedo's passive acoustic homing seeker effectively made it the world's first autonomously guided munition; see J. Campbell, Naval Weapons of World War Two (Naval Institute Press 2002).
5 The RQ-1 Predator was used as an intelligence, surveillance and reconnaissance platform in former Yugoslavia; see P. Springer, Military Robots and Drones: A Reference Handbook (ABC-CLIO 2013).
6 M. A.C. Ekelhof, 'Lifting the fog of targeting: "Autonomous Weapons" and Human Control through the Lens of Military Targeting' (2018) Vol. 71, No. 3 U.S. Naval War College <https://www.jstor.org/stable/10.2307/26607067> accessed 1 April 2020, p. 74.
7 M. A.C. Ekelhof, 'Lifting the fog of targeting: "Autonomous Weapons" and Human Control through the Lens of Military Targeting' (2018) Vol. 71, No. 3 U.S. Naval War College <https://www.jstor.org/stable/10.2307/26607067> accessed 1 April 2020, p. 74; F. Slijper, 'Where to Draw the Line: Increasing Autonomy in Weapon Systems – Technology and Trends' (2017) Paxforpeace <www.paxvoorvrede.nl/> accessed 1 April 2020, p. 10.
8 M. Ekelhof, 'Complications of a Common Language: Why it is so Hard to Talk about Autonomous Weapons' (2017) Vol. 22 No. 2 Journal of Conflict and Security Law <https://www.researchgate.net/publication/323416370_Complications_of_a_Common_Language_Why_it_is_so_Hard_to_Talk_about_Autonomous_Weapons> accessed 1 April 2020, p. 312.
upon available unclassified data rather than on the assumptions that would have to be made regarding the characteristics of offensive systems. Offensive systems, apart from some niche applications such as the Harpy, are still mainly in the research or development phase and will be far more diverse in their abilities.
Both types of weapon systems come with unresolved challenges to existing ethical9 and legal frameworks that require further clarification. Some scholars welcome LAWS as a means to reduce harm caused by human errors in judgement10 and thus to limit the risk of harm to war fighters, civilians and civilian cultural objects. Others see them as a threat over which humans have less of a grip: these systems might be able to go rogue, and such critics oppose further development and deployment.11 One of the essential issues in this debate is the level of human responsibility and accountability. Technologies can enable and simplify actions, but that does not mean they determine the acts of users, nor that they determine the distribution of responsibility.12 Noorman argues that "whether or not and to what extent human actors are and will be considered to be responsible for the behaviour of robotic systems is and will be the outcome of ongoing negotiations between the various human actors involved, including designers and users, as well as scholars, military authorities, human rights groups and politicians".13 This also holds true for the present case, because the analysis of human responsibility is multidisciplinary by nature. In this thesis, however, the focus will be on the legal responsibility of the user, more specifically the armed forces that deploy and use a defensive LAWS, while acknowledging that this responsibility will always be shared among a wider community of stakeholders, such as the individuals defining the desired characteristics of the capability, the designers, the political and military authorities, the manufacturers and the programmers.
9 P. Lin, G. Bekey, K. Abney, 'Robots in war: issues of risk and ethics' in R. Capurro, M. Nagenborg (eds), Ethics and Robotics (AKA Verlag Heidelberg 2009) <https://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?referer=https://www.google.com/&httpsredir=1&article=1010&context=phil_fac> accessed 1 April 2020, p. 49-67.
10 M. Sassoli, 'Autonomous Weapons and International Humanitarian Law: Advantages, Open Technical Questions and Legal Issues to be Clarified' (2014) Vol. 90 International Law Studies, U.S. Naval War College <https://digital-commons.usnwc.edu/cgi/viewcontent.cgi?article=1017&context=ils> accessed 14 November 2019, p. 310; R. Arkin, Governing Lethal Behavior in Autonomous Robots (CRC Press 2009).
11 P. Asaro, 'On banning autonomous weapon systems: human rights, automation, and the dehumanization of lethal decision-making' (2012) Vol. 94 No. 886 International Review of the Red Cross <https://www.cambridge.org/core/services/aop-cambridge-core/content/view/992565190BF2912AFC5AC0657AFECF07/S1816383112000768a.pdf/on_banning_autonomous_weapon_systems_human_rights_automation_and_the_dehumanization_of_lethal_decisionmaking.pdf> accessed 15 November 2019, p. 692.
12 M. Noorman, 'Responsibility Practices and Unmanned Military Technologies' (2014) Vol. 20 Science and Engineering Ethics <https://link.springer.com/article/10.1007/s11948-013-9484-x> accessed 2 April 2020, p. 810.
13 Ibid.
In order to determine the legal responsibility of the military user, the most appropriate legal framework is international law, and one of its most important notions is criminal responsibility as enshrined in International Humanitarian Law (IHL). Criminal responsibility is crucial to provide judicial redress for grave breaches of international humanitarian law, as provided for in the Geneva Conventions14, which can occur when the principles underlying the targeting of an opponent are not respected. These principles are distinction, proportionality and precautions in attack.
A possibly worrisome evolution in this regard is the extended and incremental use of autonomy in more offensive applications and its impact on criminal individual and command responsibility. As systems become more offensive, the question arises whether their use can still comply with the current framework of IHL, more specifically with the principles underlying the targeting process, which, if not respected, may lead to criminal responsibility. Through the targeting process the commander manages the (kinetic and non-kinetic effects of the) mission and ensures that his intent and operational concerns are reflected. Targeting directives describing the targeting process and the related procedures must, of course, be in accordance with IHL principles. The targeting process is based on these principles to comply with IHL in the context of an intention to engage (a) certain target(s), both in an offensive and a defensive mode.15 A well-developed operational targeting process is utilised by NATO.16 It is one of the most elaborate and best documented processes, resulting from the know-how of NATO's thirty Member States. It consists of six phases which may appear sequential but are in fact iterative and bidirectional.17 Phases can run simultaneously and can overlap.18
Phase 1 consists of the commander’s intent and objectives. Phase 2 is the target
development. Phase 3 is the capabilities analysis. Phase 4 is the commander’s decision, force
14 Also reflected in Article 8 of the ICC Statute.
15 M. Roorda, 'NATO's Targeting Process: Ensuring Human Control Over and Lawful Use of 'Autonomous' Weapons' (2015) No. 2015-13 Amsterdam Law School Legal Studies Research Paper <https://www.peacepalacelibrary.nl/ebooks/files/401482448.pdf> accessed 1 April 2020, p. 152.
16 NATO Standard, 'AJP-3.9 Allied joint doctrine for joint targeting' (2016) Ed. A version 1 <https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/628215/20160505-nato_targeting_ajp_3_9.pdf> accessed 3 January 2020, p. 1-1.
17 M. A.C. Ekelhof, 'Lifting the fog of targeting: "Autonomous Weapons" and Human Control through the Lens of Military Targeting' (2018) Vol. 71, No. 3 U.S. Naval War College <https://www.jstor.org/stable/10.2307/26607067> accessed 1 April 2020, p. 65.
18 Ibid; M. Roorda, 'NATO's Targeting Process: Ensuring Human Control Over and Lawful Use of 'Autonomous' Weapons' (2015) University of Amsterdam, Amsterdam Law School Legal Studies Research Paper <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2593697> accessed 2 April 2020, p. 154-156.
planning and assignment. Phase 5 is the mission planning and force execution, and phase 6 is the assessment.19 During phase 5 the targeting staff obtains the final positive identification (PID) of the targets and then executes. Target execution itself comprises seven steps: find, fix, track, target, engage, exploit and assess.20
1.2. Objective and research questions

Due to the increasing deployment of autonomous weapon systems that make use of AI and feature rising levels of automation, concerns have arisen regarding the attribution of responsibility when these machines are employed. The objective of this master's thesis is to examine how the current use of (defensive) autonomous systems affects the legal notions of criminal individual and command responsibility and, depending on the findings, to discuss what this might mean for (future) more offensive developments already in the R&D and experimentation pipeline. This research thus aims to contribute to contemporary debates on autonomous weapons through a legal analysis of responsibility based on their current, mainly defensive, use. To assess the effect of the use of defensive autonomous systems on criminal individual and command responsibility, the principle of distinction underlying the targeting process will be used as a prism.
I will specifically look into the defensive Phalanx system as a representative application of the current field of employment of autonomous weapon systems.21 Today the vast majority of LAWS are defensive, as demonstrated in the book Army of None by Paul Scharre, an author who has studied the entire spectrum of LAWS. More specifically, to investigate the effect of defensive LAWS on criminal responsibility, the Phalanx system is examined by analysing how its use affects the principle of distinction underlying the targeting process.
With regard to the more offensive applications of LAWS, I will briefly look at the Israeli Harpy, an example of an offensive LAWS that is already in use and possibly the precursor of a widening range of offensive LAWS. Due to the different usage parameters and the less controlled environment of offensive situations, this trend might change the conclusions with
19 NATO Standard, 'AJP-3.9 Allied joint doctrine for joint targeting' (2016) Ed. A version 1 <https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/628215/20160505-nato_targeting_ajp_3_9.pdf> accessed 3 January 2020, p. 2-2; D. Fleck, The Handbook of International Humanitarian Law (Oxford University Press 2013), p. 207-212.
20 Ibid.
21 P. Scharre, Army of None: Autonomous Weapons and the Future of War (W.W. Norton & Company 2018), p. 137-140.
regard to the defensive use in terms of the principle of distinction and, consequently, regarding the criminal individual and command responsibility that can result from non-compliance with this principle.
Hence, this thesis will answer the following main research question:
How does the deployment of LAWS impact the notions of criminal individual and
command responsibility?
Answering this main research question requires the examination of multiple sub-questions, which together allow an answer to the main research question to be formulated. The sub-questions are the following:
1. What is a LAWS?
2. What are the main characteristics of the Phalanx CIWS, a prime example of current
defensive applications?
3. What do the principles underlying the targeting process and the notions of criminal
individual and command responsibility entail in IHL?
4. How does the use of the (defensive) Phalanx CIWS (analysed through the principle of distinction) impact the application of the notions of criminal individual and command responsibility?
5. What do the conclusions from sub-question 4 mean for (future) offensive applications of LAWS?
1.3. Significance of this thesis

Technological advancements in military weapon systems (augmenting their level of autonomy) can impact current legal frameworks and their underlying principles and related notions. The significance of the research conducted in this thesis lies in the awareness it creates concerning the legal notion of (criminal) responsibility, which has proven its importance in holding individuals responsible for their behaviour at different moments in history (e.g. WWII, the Vietnam War, Yugoslavia). Although current (defensive) applications of LAWS still allow criminal responsibility to be attributed to some extent according to the current framework of IHL, a danger lies in the expanding offensive orientation of LAWS development. This development
could challenge existing responsibility practices22 and, more specifically, the current legal boundaries of IHL, as the meaning and significance of the underlying values, established norms and principles reach their limits and require renegotiation. Hence, this thesis will focus on existing systems and on the possible impact of more offensive systems on the attribution of criminal responsibility.
1.4. Preliminary remarks and limitations

This thesis is not about establishing whether LAWS are genuinely autonomous; it focuses on the interaction between the different characteristics of machine autonomy and the legal notions concerned. As I will examine the notions of criminal individual and command responsibility related to these weapon systems, the legal framework I will explore is IHL, as this is the law applicable when such weapon systems are deployed in situations of armed conflict, where criminal responsibility comes into play when these systems cause grave breaches of IHL. The attribution of criminal responsibility in IHL will always be an essential condition of fighting a just war, meaning that someone can be held criminally responsible for the deaths of or damage to enemies or civilians resulting from a breach of IHL during the war.23 The research covers neither criminal offences unrelated to the conflict, nor genocide, crimes against humanity or crimes against peace unrelated to a conflict.24 Further, the research focus will be on criminal individual and command responsibility, not on state responsibility. Nor will I look into the possible corporate responsibility of the manufacturer, as this form of responsibility falls outside the boundaries of IHL.
Other defensive LAWS exist, such as the Russian Arena, the Israeli Trophy and the German AMAP-ADS; however, their characteristics are similar to those of the Phalanx system. The choice to focus on defensive LAWS resulted from an iterative process. The vast majority of LAWS currently in use are defensive and have similar abilities. Focusing the research on one specific defensive system (with characteristics similar to those of the other defensive systems) thus
22 M. Noorman, 'Responsibility Practices and Unmanned Military Technologies' (2014) Vol. 20 Science and Engineering Ethics <https://link.springer.com/article/10.1007/s11948-013-9484-x> accessed 2 April 2020, p. 814.
23 R. Sparrow, 'Killer robots' (2007) Vol. 24 No. 1 Journal of Applied Philosophy <https://wmpeople.wm.edu/asset/index/cvance/sparrow> accessed 20 March 2020, p. 133-142; M. Noorman, D. G. Johnson, 'Negotiating autonomy and responsibility in military robots' (2014) Vol. 16 Ethics and Information Technology <https://link.springer.com/article/10.1007/s10676-013-9335-0?shared-article-renderer> accessed 21 February 2020, p. 811.
24 G. D. Solis, The Law of Armed Conflict: International Humanitarian Law in War (Cambridge University Press 2016), p. 331-339.
allowed more depth within the scope and size of this thesis and made the conclusions equally relevant for the other defensive systems. Broader analyses focusing more on offensive systems are possible, but they would require numerous assumptions due to the diverse categories of those systems and would therefore yield less specific conclusions.
1.5. Methodology
This thesis is mainly based on doctrinal legal research. Statutory legislation, case law and academic literature on the automation of LAWS, as well as relevant IHL, will be examined. An in-depth legal analysis will be made of the distinction principle underlying the targeting process and of the notions of criminal individual and command responsibility when defensive LAWS are used. The reason for choosing the principle of distinction (rather than proportionality or precautions in attack) is that it is seen as the 'gatekeeper' of target selection.25 The International Court of Justice considers this principle to be of cardinal importance, which emphasises its significance.26 The available academic literature also focuses more on the distinction principle, again underlining its relative importance compared to proportionality and precautions in attack. This approach was validated through exploratory interviews with Colonel Chris De Cock, Legal Advisor of the European External Action Service - Director EU Military Staff, and Colonel Stein Westlye Johannessen, Defence Attaché of Norway to the Netherlands and former legal advisor of the NATO Special Operations Command, SHAPE. Their practical experience combined with extensive theoretical knowledge allowed me to confirm or adjust certain viewpoints and to deepen or expand the research where and when necessary.
The impact of the potential shift from defensive to more offensive applications of LAWS on the notions of criminal individual and command responsibility will also be briefly analysed.
The research will examine how the use of the Phalanx CIWS as a defensive LAWS impacts the legal notions of criminal individual and command responsibility, by examining the principle of distinction underlying the targeting process. The insights gained also allow me to draw some conclusions regarding more offensive applications of LAWS that
25 G. Swiney, 'Saving Lives: The Principle of Distinction and the Realities of Modern War' (2005) Vol. 39 No. 3 The International Lawyer <https://www.jstor.org/stable/40707812> accessed 2 April 2020, p. 734; Articles 51(5)(b) and 57(2)(a)(iii) of Additional Protocol I to the Geneva Conventions of 12 August 1949.
26 Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion, I.C.J. Reports 1996, International Court of Justice (ICJ), 8 July 1996, para 78, p. 35.
are in the R&D pipeline or already deployed, like the Harpy, a system programmed to engage ground radar systems in order to suppress enemy air defences.27 Not only the US is looking into developing more offensive LAWS; Russia, Israel and China are also investing intensively in research and development. The bulk of the applications today, however, remain defensive.28
The distinction made between defensive and offensive systems is mainly driven by the fact that defensive systems are currently already part of military inventories and used in various circumstances. Their current use in existing systems such as the Phalanx creates fewer challenges regarding the IHL principles underlying the targeting process. As these systems are already deployed, it is important to research how criminal command and individual responsibility are attributed and whether they constitute a solid legal basis. These findings make it possible to check whether different challenges could arise in the case of more offensive applications, as offensive systems are currently still under development and subject to fast-evolving (classified) technological progress. The research regarding offensive LAWS is more hypothetical in nature, as assumptions would have to be made on the basis of the limited open-source information available regarding future characteristics. Defensive systems, on the other hand, made it possible to analyse validated data from already deployed systems with well-described characteristics. The conclusions about defensive systems are thus more concrete, relevant and validated.
1.6. Chapter overview
This thesis is structured as follows. After the introductory chapter (1), possible definitions of LAWS will be presented and the necessity of a univocal definition will be discussed. The main characteristics of the Phalanx as a defensive LAWS, and their link with the notions of individual and command criminal responsibility, will also be identified and discussed (chapter 2). Next, the relevant current legal framework for these notions, namely IHL, will be discussed. Not only will the history of the different legal provisions and the ideas behind the notions be elucidated; the relevant case law interpreting these provisions will also be explored (chapter 3). The following chapter focuses on the main research question and thus provides an assessment of the implications the use of the Phalanx CIWS may have on the principle of distinction underlying
27 P. Scharre, Army of None: Autonomous Weapons and the Future of War (W.W. Norton & Company 2018), p. 5, 47-48 28 P. Scharre, Army of None: Autonomous Weapons and the Future of War (W.W. Norton & Company 2018), p. 89
the targeting process, which will allow conclusions to be drawn regarding the notions of individual and command criminal responsibility as understood in the current legal framework of IHL (chapter 4). Chapter 5 will, building on the conclusions from chapter 4, elaborate on what this might mean for (future) more offensive applications. Finally, the last chapter will consist of an overall conclusion integrating the most important findings (chapter 6).
Chapter 2: Defensive Autonomous Weapon Systems
This chapter analyses the technological dimension of defensive LAWS, and more particularly the Phalanx system, as understanding its characteristics is crucial for the further research.
Firstly, the broader significance of increasing levels of autonomy of weapon systems will be
briefly discussed. Secondly, the various definitions of LAWS will be explored and the necessity
for a univocal definition for the purpose of this thesis will be brought up. Thirdly, the
characteristics of the Phalanx system such as the (non-)lethal character, the autonomy aspect,
and the defensive character will be analysed. Alongside these characteristics, the terms of use will be discussed, focusing on the impact they might have on the notions of individual and command criminal responsibility. A conclusion recapping the most important findings for the further research will wrap up this chapter.
2.1. The significance of increasing levels of autonomy in weapon systems
In recent years, the autonomy of military systems has evolved and significantly increased.29 As stated in chapter one, the quest for more autonomy is not new. However, the level of autonomy in systems is currently growing exponentially due to the rapid evolution of the underlying technologies, more specifically AI and robotics. The first autonomous systems were rather rudimentary in nature and were used for simple tasks referred to as dull, dirty and/or dangerous.30 Gradually the possible applications increased and more complex tasks could be carried out by autonomous systems. A point has now been reached where autonomous systems are able to perform lethal actions in complex environments. This brings us into a domain where especially the legal and ethical dimensions need to be scrutinized. An important aspect here is the role and responsibility of the human factor.
2.2. Is a definition necessary?
29 M. Ekelhof, ‘Complications of a Common Language: Why it is so Hard to Talk about Autonomous Weapons’ (2017) Vol. 22 No. 2 Journal of Conflict and Security Law, <https://www.researchgate.net/publication/323416370_Complications_of_a_Common_Language_Why_it_is_so_Hard_to_Talk_about_Autonomous_Weapons> accessed 1 April 2020, p. 311-312 30 N. Bhuta, S. Beck, R. Geiß, H. Liu, C. Kreß, Autonomous Weapons Systems: Law, Ethics, Policy (Cambridge University Press 2016), p. 7
Before going into the core of this chapter, which will explain what autonomy in LAWS entails, it is necessary to look into the current span of definitions of LAWS. Is there one universally accepted definition? If not, is such a definition necessary? Why (not)? As the discourse on LAWS is multifaceted and multidisciplinary, a wide variety of actors take part in it, each with a different idea of how a LAWS should be defined.31 I will look into a number of definitions to see whether there is a specific one that ties in with the research objectives of this master thesis.
A multitude of organizations are involved in the discussions regarding LAWS, such as the International Committee of the Red Cross (ICRC)32, the UN Institute for Disarmament Research (UNIDIR)33, and NGOs such as Human Rights Watch and Pax for Peace. The ICRC defines an autonomous weapon system in a similar way to Human Rights Watch (HRW), namely as “any weapon system with autonomy in its critical functions, that is, a weapon system that can select and attack targets without human intervention”.34 All the organizations mentioned above are mainly focused on the context of a potential ban of these weapons. Each definition developed by these international bodies serves a certain purpose depending on their perspective on the use of LAWS.
In 2013, informal expert meetings were initiated in the framework of the UN Convention on Certain Conventional Weapons (CCW), where states could reflect on moving towards a potential protocol limiting the use of LAWS,35 as was the case with, for example, laser
31 M. Ekelhof, ‘Complications of a Common Language: Why it is so Hard to Talk about Autonomous Weapons’ (2017) Vol. 22 No. 2 Journal of Conflict and Security Law, <https://www.researchgate.net/publication/323416370_Complications_of_a_Common_Language_Why_it_is_so_Hard_to_Talk_about_Autonomous_Weapons> accessed 1 April 2020, p. 311; R. Crootof, ‘The Killer Robots Are Here: Legal and Policy Implications’ (2015), vol. 36 no. 5 Cardozo Law Review, <https://heinonline-org.tilburguniversity.idm.oclc.org/HOL/Page?lname=&public=false&collection=journals&handle=hein.journals/cdozo36&men_hide=false&men_tab=toc&kind=&page=1837> accessed 27 December 2019, p. 1847 32 H. M. Roff, R. Moyes, ‘Meaningful Human Control, Artificial Intelligence and Autonomous Weapons’ (2016), briefing paper prepared for the Informal Meeting of Experts on Lethal Autonomous Weapons Systems, UN Convention on Certain Conventional Weapons, <http://www.article36.org/wp-content/uploads/2016/04/MHC-AI-and-AWS-FINAL.pdf> accessed 27 December 2019, p. 1 33 Ibid 34 M. Ekelhof, ‘Complications of a Common Language: Why it is so Hard to Talk about Autonomous Weapons’ (2017) Vol. 22 No. 2 Journal of Conflict and Security Law, <https://www.researchgate.net/publication/323416370_Complications_of_a_Common_Language_Why_it_is_so_Hard_to_Talk_about_Autonomous_Weapons> accessed 1 April 2020, p. 321; ICRC Expert Meeting on ‘Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons’ (2016) Versoix, Switzerland, <https://shop.icrc.org/autonomous-weapon-systems.html?___store=default&_ga=2.162034876.1219892157.1578954954-792592069.1578954954> accessed 27 December 2019, p. 54; ICRC, ‘Views of the International Committee of the Red Cross on autonomous weapon systems’ (2016) ICRC Working Paper <www.unog.ch/80256EDD006B8954/(httpAssets)/B3834B2C62344053C1257F9400491826/$file/2016_LAWS+MX_CountryPaper_ICRC.pdf> accessed 28 December 2019, p. 2 35 H. M. Roff, R.
Moyes, ‘Meaningful Human Control, Artificial Intelligence and Autonomous Weapons’ (2016), briefing paper prepared for the Informal Meeting of Experts on Lethal Autonomous Weapons Systems, UN
weapons. The High Contracting Parties initiated an open-ended Group of Governmental
Experts (GGE) during the CCW Review Conference in December 2016. The GGE was created
to further examine and discuss achievable guidance in the form of recommendations in relation
to emerging technologies in the field of LAWS. In order to draft a Protocol on a ban, they would have to agree on an unequivocal definition of LAWS. Even though the prominent discourse on LAWS takes place at the political level of the CCW, and despite frequent discussions over the past years, the terminology and definitions remain controversial and unresolved to this day.36 For this reason, the High Contracting Parties to the CCW link the discussions about a definition to the negotiations of a potential new Protocol. For them it is not just a discussion to gain a better understanding of what LAWS are, but one initiating a phase in which the States Parties discuss potential future law. Certain States Parties therefore wish to delay or push forward these discussions and prefer an ambiguous definition.37 Neither Russia, nor China, nor the US is willing to put forward or agree on a definition, as they do not want this to lead to a limiting protocol, because all three are investing significantly in these types of systems.38
The US Department of Defense (DoD) Directive 3000.09 from November 2012 was the
first policy document that mentioned autonomous weapons.39 This Directive makes a
Convention on Certain Conventional Weapons, <http://www.article36.org/wp-content/uploads/2016/04/MHC-AI-and-AWS-FINAL.pdf> accessed 27 December 2019, p. 1; A. Dahlmann, M. Dickow, ‘Preventive Regulation of Autonomous Weapon Systems: Need for Action by Germany at Various Levels’ (2019) SWP Research Paper 3, <https://www.swp-berlin.org/fileadmin/contents/products/research_papers/2019RP03_dnn_dkw.pdf> accessed 27 December 2019, p. 5; P. Scharre, ‘Lethal Autonomous Weapons and Policy-Making Amid Disruptive Technological Change’ (2017) Just Security, <https://www.justsecurity.org/47082/lethal-autonomous-weapons-policy-making-disruptive-technological-change/> accessed 20 March 2020; J. Lewis, ‘The Case for Regulating Fully Autonomous Weapons’ (2015) Vol. 124 No. 4 The Yale Law Journal, <https://www.yalelawjournal.org/comment/the-case-for-regulating-fully-autonomous-weapons> accessed 29 March 2020; P. Asaro, ‘Why the world needs to regulate autonomous weapons, and soon’ (2018) Bulletin of the Atomic Scientists, <https://thebulletin.org/2018/04/why-the-world-needs-to-regulate-autonomous-weapons-and-soon/#> accessed 29 March 2020; R. Christian, ‘Heed the Call: A Moral and Legal Imperative to Ban Killer Robots’ (2018) HRW, <https://www.hrw.org/report/2018/08/21/heed-call/moral-and-legal-imperative-ban-killer-robots> accessed 29 March 2020 36 M. Ekelhof, ‘Complications of a Common Language: Why it is so Hard to Talk about Autonomous Weapons’ (2017) Vol. 22 No. 2 Journal of Conflict and Security Law, <https://www.researchgate.net/publication/323416370_Complications_of_a_Common_Language_Why_it_is_so_Hard_to_Talk_about_Autonomous_Weapons> accessed 1 April 2020, p. 314 37 M. Ekelhof, ‘Complications of a Common Language: Why it is so Hard to Talk about Autonomous Weapons’ (2017) Vol. 22 No.
2 Journal of Conflict and Security Law, <https://www.researchgate.net/publication/323416370_Complications_of_a_Common_Language_Why_it_is_so_Hard_to_Talk_about_Autonomous_Weapons> accessed 1 April 2020, p. 319 38 ‘The EU, NATO and Artificial Intelligence’ (2019) Report ISS, <https://www.iss.europa.eu/sites/default/files/EUISSFiles/EU%20NATO%20AI%20-%20Report.pdf> accessed 28 March 2020 39 M. Ekelhof, ‘Complications of a Common Language: Why it is so Hard to Talk about Autonomous Weapons’ (2017) Vol. 22 No. 2 Journal of Conflict and Security Law, <https://www.researchgate.net/publication/323416370_Complications_of_a_Common_Language_Why_it_is_so_Hard_to_Talk_about_Autonomous_Weapons> accessed 1 April 2020, p. 322
distinction between three types of weapon systems. The first type is the semi-autonomous weapon system, which needs a human to select and authorize the engagement of specific targets.40 The second type is the human-supervised autonomous weapon system, which permits human intervention and, if necessary, termination of the engagement, even during time-sensitive attacks on platforms or installations. The third type is the autonomous weapon system, which is defined as one that, “once activated, can select and engage targets without intervention by a human operator”.41
This last definition of an autonomous weapon is also used by the UN Special Rapporteur
Christof Heyns in his report on Lethal Autonomous Robotics.42 The definition covers both
human-supervised LAWS developed for human operators to override the weapon system when
needed, and systems that can autonomously select and engage targets without intervention after their initiation.43 This definition of LAWS thus emphasizes that the level of autonomy relates to the system's capacity to act independently, not to whether it is supervised or actually used in that way.44 It also emphasizes that humans are from the outset responsible for the decision to deploy the LAWS.
There are many different definitions and terms circulating to describe LAWS in the political and diplomatic debate, and this is also the case in the academic arena. Rebecca Crootof, for instance, defines LAWS in one of her articles as “a weapon system that, based on conclusions
derived from gathered information and pre-programmed constraints, is capable of
independently selecting and engaging targets”.45 The philosopher Peter Asaro, in his article for the International Review of the Red Cross, defines LAWS as “any system that is capable of targeting and initiating the use of potentially lethal force without direct human supervision and
40 C. Saad, E. Gosal, ‘Autonomous weapon systems: how to work towards a total ban?’, The Canadian Bar Association, <https://www.cba.org/Sections/International-Law/Articles/2019/Autonomous-weapons-systems-how-to-work-towards-a> accessed 1 April 2020 41 US Department of Defense, ‘Directive 3000.09’ (2012) <www.dtic.mil/whs/directives/corres/pdf/300009p.pdf> accessed 27 December 2019 42 C. Heyns, ‘Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions’ (2013), UN General Assembly, <https://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf> accessed 28 December 2019 43 R. Crootof, ‘The Killer Robots Are Here: Legal and Policy Implications’ (2015), vol. 36 no. 5 Cardozo Law Review, <https://heinonline-org.tilburguniversity.idm.oclc.org/HOL/Page?lname=&public=false&collection=journals&handle=hein.journals/cdozo36&men_hide=false&men_tab=toc&kind=&page=1837> accessed 27 December 2019, p. 1847 44 Ibid, p. 1848 45 R. Crootof, ‘The Killer Robots Are Here: Legal and Policy Implications’ (2015), vol. 36 no. 5 Cardozo Law Review, <https://heinonline-org.tilburguniversity.idm.oclc.org/HOL/Page?lname=&public=false&collection=journals&handle=hein.journals/cdozo36&men_hide=false&men_tab=toc&kind=&page=1837> accessed 27 December 2019, p. 1847
direct human involvement in lethal decision-making”.46 These two definitions show that
different research perspectives lead to different definitions containing those parameters that are
relevant for the specific field of research. A legal and a physical analysis of LAWS thus require different approaches.
However, most definitions resemble each other in their use of terms such as ‘autonomy’, ‘target selection’, ‘attack’/’target engagement’ and ‘human intervention’. To this day, no single universal definition of LAWS has been agreed upon.47 The advantage of one single definition would be that everyone is speaking about the same thing, but if there is no common understanding of the terms used in the definition, or if they lack consistent interpretation, the definition is useless.48 Furthermore, the IHL framework, with its broadly interpretable terms, is sufficiently flexible to regulate the use of LAWS in their current form and employment and does not require one single univocal definition.
Given the ambiguity of the term autonomy, a better approach to discussing the regulation of LAWS is to look at existing autonomous capabilities and analyse how their use adheres to the underlying principles of IHL, with autonomy interpreted in a narrow sense, namely performing tasks without human intervention.
2.3. Characteristics and usage modalities of the Phalanx CIWS
The characteristics and usage of LAWS will be used as the analytical lens to conduct the research. One specific system, the US Phalanx CIWS, will be analyzed in more depth as it is representative of the current use pattern of defensive LAWS. CIWSs such as the Phalanx “are designed to defend a limited geographical zone such as the area around a ship or military base (point defence)”.49 Possible targets consist of missiles,
46 P. Asaro, ‘On banning autonomous weapon systems: human rights, automation, and the dehumanization of lethal decision-making’ (2012) Volume 94 Nr. 886 International Review of the Red Cross, p. 6 47 R. Crootof, ‘The Killer Robots Are Here: Legal and Policy Implications’ (2015), vol. 36 no. 5 Cardozo Law Review, <https://heinonline-org.tilburguniversity.idm.oclc.org/HOL/Page?lname=&public=false&collection=journals&handle=hein.journals/cdozo36&men_hide=false&men_tab=toc&kind=&page=1837> accessed 27 December 2019, p. 1837 and 1841 48 M. Ekelhof, ‘Complications of a Common Language: Why it is so Hard to Talk about Autonomous Weapons’ (2017) Vol. 22 No. 2 Journal of Conflict and Security Law, <https://www.researchgate.net/publication/323416370_Complications_of_a_Common_Language_Why_it_is_so_Hard_to_Talk_about_Autonomous_Weapons> accessed 1 April 2020, p. 316 49 V. Boulanin, M. Verbruggen, ‘Mapping the Development of Autonomy in Weapons Systems’ (2017) Stockholm International Peace Research Institute, <https://www.sipri.org/sites/default/files/2017-
rockets or aircraft of the adversary, but also surface vehicles.50 It is “a rapid-fire, computer-controlled, radar-guided gun that can defeat anti-ship missiles and other close-in threats”.51 It can also destroy incoming objects that are not adversarial or even military in nature, which could lead to a breach of IHL. However, it must be mentioned that the Phalanx CIWS is complementary to a number of other systems that also deal with threats approaching the ship and that require more human (inter)action. On land this concerns e.g. observation posts and HUMINT, and at sea e.g. sonars looking for submarine threats.52 The Phalanx CIWS is in fact a last-resort defence that comes into effect when the threat is imminent and no (human) assessment is possible.53
In the following subsections the characteristics of the Phalanx weapon system will be
further analysed.
2.3.1. Lethal character
The Phalanx system is used to eliminate imminent threats such as incoming missiles, mortar shells, etc., but also vessels or airplanes.54 Its primary aim is not to engage living beings. It is thus not intended to be lethal by design. Lethal refers to the capacity to use lethal force and take human lives.55 However, its engagement can be lethal, as it will destroy all incoming threats according to the pre-programmed parameters, thus opening the possibility of destroying manned threats or creating unintended effects and collateral damage.56 It can thus arguably be categorized as a LAWS.
2.3.2. Autonomous character
The ambiguity of the concept of autonomy has been discussed in section 2.2. Specifically for the Phalanx CIWS, most authors57 consider it a LAWS as it can function autonomously. For
11/siprireport_mapping_the_development_of_autonomy_in_weapon_systems_1117_1.pdf> accessed 20 March 2020, p. 37 50 V. Boulanin, M. Verbruggen, ‘Mapping the Development of Autonomy in Weapons Systems’ (2017) Stockholm International Peace Research Institute, <https://www.sipri.org/sites/default/files/2017-11/siprireport_mapping_the_development_of_autonomy_in_weapon_systems_1117_1.pdf> accessed 20 March 2020, p. 37 51 <https://www.raytheon.com/capabilities/products/phalanx> accessed 20 March 2020 52 Ibid 53 Ibid 54 D. Saxon, International Humanitarian Law and the Changing Technology of War (Martinus Nijhoff Publishers 2013), p. 73 55 J. Carlson, Citizen-protectors: The Everyday Politics of Guns in an Age of Decline (Oxford University Press 2015), p. 74 56 A.J. Plunkett, ‘Iwo Jima Officer Killed In Firing Exercise’ (1989) Daily Press, <http://articles.dailypress.com/1989-10-12/news/8910120238_1_iwo-jima-ship-close-in-weapons-system> accessed 20 March 2020 57 P. Asaro, ‘On Banning Autonomous Weapon Systems: Human Rights, Automation and the Dehumanization of Lethal Decision-Making’ (2012) Volume 94 Nr. 886 International Review of the Red Cross; M. Gubrud in
instance, Gubrud states that a weapon system that “once activated, can select and engage targets
without further intervention by a human operator”58 is considered autonomous. Regarding a
defensive weapon system, it is advantageous to make it autonomous, as the aim is to ensure that the decision-making process to engage an opponent, the so-called Observe, Orient, Decide and Act (OODA) loop,59 is more accurate and faster than that of the opponent.
Nevertheless, most authors do not see the system as ‘fully’ autonomous as there is still
a possibility for a human to push an ‘abort’ button to stop the firing.60 In practice, however, it will be extremely difficult for an operator to push the ‘abort’ button, as the system starts engaging a target as soon as it perceives it as an imminent threat, acting much faster than any human intervention could interrupt.
The Phalanx system automatically carries out various actions that were previously performed by multiple systems and operators. “It searches, detects, does a threat evaluation, tracks, engages and does a kill assessment”.61 It does not, however, determine its own targeting criteria, as these are programmed into the system in advance; yet once initiated, the system selects and engages targets without human intervention.62
2.3.3. Defensive character
The Phalanx system is mostly used at sea in combination with the Aegis Combat System
(ACS).63 The US Navy describes the system as follows:
The current version of the Phalanx close-in weapon system (MK15) provides on sea
ships with an inner layer point defense capability against Anti-Ship Missiles (ASM),
‘Autonomy without Mystery: Where do you draw the line?’ (2014) 1.0 Human; G. A. Bekey, Autonomous robots: From biological inspiration to implementation and control (MIT Press 2005), p. 1 58 M. Gubrud, ‘Stopping killer robots’ (2014) Vol. 70 No. 1 Bulletin of the Atomic Scientists <https://journals.sagepub.com/doi/pdf/10.1177/0096340213516745> accessed 20 March 2020, p. 32; M. Gubrud in ‘Autonomy without Mystery: Where do you draw the line?’ (2014) 1.0 Human 59 A four-step approach to decision-making which focuses on gaining information, putting it into the right context and making the most appropriate decision as quickly as possible, while keeping in mind that changes can be made as more information becomes known. <https://searchcio.techtarget.com/definition/OODA-loop> accessed 15 June 2020 60 S. Welsh, ‘Regulating Autonomous Weapons’ (2017) RealClear Defense, <https://www.realcleardefense.com/articles/2017/11/16/regulating_autonomous_weapons_112647.html> accessed 20 March 2020 61 Ibid 62 Ibid; C. Jenks, ‘False Rubicons, Moral Panic & Conceptual Cul-De-Sacs: Critiquing & Reframing the Call to Ban Lethal Autonomous Weapons’ Southern Methodist University, Dedman School of Law <https://scholar.smu.edu/law_faculty/495/> accessed 20 March 2020 63 S. Welsh, ‘Regulating Autonomous Weapons’ (2017) RealClear Defense, <https://www.realcleardefense.com/articles/2017/11/16/regulating_autonomous_weapons_112647.html> accessed 20 March 2020
aircraft, and littoral warfare threats that have penetrated other fleet defenses. Phalanx
automatically detects, evaluates, tracks, engages, and performs kill assessment against
ASM and high-speed aircraft threats. The Phalanx variant (Block 1B) adds the ability
to counter asymmetric warfare threats through the addition of an integrated, stabilized,
Electro Optic sensor. These improvements give Phalanx the added ability to counter
small high-speed surface craft, aircraft, helicopters, and Unmanned Aerial Systems
(UAS). Phalanx is the only deployed close-in weapon system capable of autonomously
performing its own search, detect, evaluation, track, engage and kill assessment
functions. Phalanx also can be integrated into existing ship combat control systems to
provide additional sensor and fire-control support to other installed ship weapon
systems.64
The quotation clearly describes the defensive nature of the Phalanx system, which makes it a prime example of a defensive LAWS.
2.3.4. Human-machine interaction
Another important factor in the case of the Phalanx CIWS is the necessity of an evolving interaction between the human, meaning the military commander and operator, and the Phalanx CIWS. Employing the Phalanx requires knowledge and skills that differ from those needed for the normal spectrum of weaponry the military is familiar with on the battlefield. The nature and
design of the human-machine interface is crucial for the ability of the LAWS to be compliant
with IHL.65 Military command structures and the skill sets of operators have to be adapted to
accommodate the unique capabilities of the Phalanx.66
This was already the case when new types of weapons, such as nuclear weapons, were introduced. The structures, procedures and training of the military involved required a thorough revision compared to the use of conventional weapons. In that specific case it was clear that political oversight and the related procedures needed to be much tighter for nuclear weapons than for conventional weapons.
64<https://www.navy.mil/navydata/fact_display.asp?cid=2100&tid=487&ct=2> accessed 20 March 2020; <https://www.public.navy.mil/surfor/Pages/Phalanx-CIWS.aspx> accessed 20 March 2020 65 P. Margulies, ‘Making Autonomous Weapon Accountable: command responsibility for Computer-Guided Lethal Force in Armed Conflicts’ (2016) No. 166 Roger Williams University Legal Studies Paper, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2734900> accessed 20 March 2020, p. 20 66 Ibid
In a similar way, a modification of military doctrine is necessary for the employment and engagement of the Phalanx. US Department of Defense Directive 3000.09 requires
that all systems, including LAWS, be designed to “allow commanders and operators to exercise
appropriate levels of human judgment over the use of force”.67 Here ‘appropriate’ “is a flexible
term that reflects the fact that there is not a fixed, one-size-fits-all level of human judgment that
should be applied to every context”.68 A US Government white paper of 2018 states:
‘Human judgment over the use of force’ does not require manual human ‘control’ of the
weapon system, as is often reported, but rather broader human involvement in decisions
about how, when, where, and why the weapon will be employed.69 This includes a human
determination that the weapon will be used with appropriate care and in accordance
with the law of war, applicable treaties, weapon system safety rules, and applicable
rules of engagement.70
Invoking individual and command criminal responsibility - which will be addressed in the next chapter - thus requires a customized approach that makes it possible to hold the military decision-makers and operators responsible for their acts.
2.4. Conclusion
This chapter started out by briefly discussing the significance of increasing levels of autonomy
in weapon systems. It was stated that, due to the evolution of the underlying technologies such as robotics and AI, weapon systems have become more sophisticated and even capable of autonomous lethal action. This contrasts with their previous use for mainly dull, dirty and dangerous tasks. Subsequently, various definitions of LAWS were explored, making it clear that for the purpose of this thesis there is no necessity to formulate one single definition, but rather to focus on the specific characteristics. IHL as a legal framework is flexible enough to regulate the use of LAWS in their current form and employment and does
67 ‘Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems’ (2019) Congressional Research Service, <https://fas.org/sgp/crs/natsec/IF11150.pdf > accessed 21 March 2020, p.1; DoD Directive 3000.09, Autonomy in Weapon Systems (“Unmanned Systems Integrated Roadmap, FY 2013-2038) (Nov. 21, 2012) 68 ‘Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems’ (2019) Congressional Research Service, <https://fas.org/sgp/crs/natsec/IF11150.pdf > accessed 21 March 2020, p.1 69 ‘Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems’ (2019) Congressional Research Service, <https://fas.org/sgp/crs/natsec/IF11150.pdf > accessed 21 March 2020, p.1 70 Ibid; U.S. Government, ‘Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems’ (2018) CCW/GGE.2/2018/WP.4, <https://www.unog.ch/80256EDD006B8954/(httpAssets)/D1A2BA4B7B71D29FC12582F6004386EF/$file/2018_GGE+LAWS_August_Working+Paper_US.pdf> accessed 21 March 2020, p. 4
not require one single univocal definition. A more detailed analysis was then made of the characteristics of the Phalanx CIWS, a defensive system currently employed by the armed forces. The most important characteristic to focus on here is autonomy, as it is considered the essential element with regard to the legal implications. Autonomy is a term that means different things to different people, which does not make it easy to come to unequivocal conclusions. The current generation of defensive autonomous systems, such as the Phalanx, is however more clear-cut in terms of its characteristics, thus facilitating the analysis of the impact of their use on the IHL notions of individual and command criminal responsibility.
Chapter 3: Criminal Responsibility
This chapter sets out the relevant legal framework and the legal notions that could be challenged by the use of LAWS with ever higher degrees of autonomy. More specifically, the legal application of the notions of individual and command criminal responsibility in IHL will be explored. Due to the rapidly evolving technology embedded in weapon systems, one could question whether these legal notions remain applicable and whether there exists a regulatory gap that would call for an urgent re-evaluation or adaptation of these notions.71 This will inter alia be examined through the principle of distinction, as it offers a testbed that links the terms of use of the weapon system to the related legal framework. This principle is a direct trigger for individual and/or command criminal responsibility when using LAWS.
Responsibility is a broad concept that has different meanings in different contexts and
that has a problematic relationship with technology.72 Noorman states in this regard:
The various strategies to hold people responsible in the face of uncertainty are part of
responsibility practices. I use the term responsibility practices here to refer to the
established ways that people, within a particular environment or community,
understand, evaluate, and ascribe responsibility. These practices involve accepted ways
of evaluating actions, holding others to account, blaming or praising, and conveying
expectations about obligations and duties. They pertain to the various kinds of
responsibility, such as accountability, role responsibility, legal responsibility, and
moral responsibility.73
In this master thesis the focus will be on the legal dimension of responsibility. In legal terms, responsibility is often confused with accountability or liability, which is not illogical, as these notions partially overlap. However, they are not synonyms. ‘Accountability’ entails “the process aimed at a public assessment of conduct in a given case in order to evaluate
71 P. Margulies, ‘Making Autonomous Weapon Accountable: command responsibility for Computer-Guided Lethal Force in Armed Conflicts’ (2016) No. 166 Roger Williams University Legal Studies Paper, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2734900> accessed 20 March 2020, p. 25 72 M. Noorman, ‘Responsibility Practices and Unmanned Military Technologies’ (2014) Vol. 20 Science and Engineering Ethics, < https://link.springer.com/article/10.1007/s11948-013-9484-x > accessed 2 April 2020, p. 811 73 M. Noorman, ‘Responsibility Practices and Unmanned Military Technologies’ (2014) Vol. 20 Science and Engineering Ethics, < https://link.springer.com/article/10.1007/s11948-013-9484-x > accessed 2 April 2020, p. 813
whether this conduct was required and/or justified”74 based on established responsibility.
‘Liability’ follows from this: it is “the quality or state of being legally obligated or
accountable; legal responsibility to another or to society, enforceable by civil remedy or
criminal punishment”.75 Liability thus refers to the attachment of legal consequences to the
conduct.76 One can be responsible but not accountable and thus not liable. Think of a situation
where someone is mentally ill: he is responsible, but neither accountable nor liable for his actions.
Furthermore, one can be held accountable for the actions of someone else, as is the case with command
responsibility, where the commander can be held accountable and liable for the direct
infringements of his subordinates. Command responsibility will be elaborated further below. This
thesis refers to criminal (individual and command) responsibility where the person responsible
can be held accountable and liable for certain conduct. This means this legal perspective is
mainly focused on a backward-looking form of responsibility.77
3.1. Criminal Responsibility in IHL

Rules and codes of conduct to delimit the conduct of war have their origins in arrangements already
made in ancient civilizations and religions, but it was Henri Dunant, the founder of
the International Red Cross, who in 1864 started the process of codifying these, often old, customs
into international humanitarian law. The rules that were written down protected wounded
combatants, in specific circumstances and under certain conditions, from disproportionate harm
and suffering.78 In the decades that followed, this first codification of the rules was
frequently amended and adapted to an evolving society and weaponry. With the horrors of
World War II in mind, legal experts gathered again in Geneva in 1949 to adopt four treaties that
74 I. Giesen, F. G. H. Kristen, ‘Liability, Responsibility and Accountability: Crossing Borders’ (2014) Vol. No. 3 Utrecht Law Review, <https://www.utrechtlawreview.org/articles/10.18352/ulr.280/galley/281/download/> accessed 27 December 2019, p. 6; R. S. B. Kool, ‘(Crime) Victims’ Compensation: The Emergence of Convergence’ (2014) Vol. 10 No. 3, <https://www.utrechtlawreview.org/articles/abstract/10.18352/ulr.281/> accessed 28 December 2019, p. 14 75<https://leg.mt.gov/content/Committees/Interim/1999_2000/environmental_quality_council/subcommittees/eminent_domain/staffmemos/definitions.pdf> accessed 2 January 2020, p. 6 <https://www.cccattorneys.com/glossary> accessed 2 January 2020 76 E. Chavannes and A. Arkhipov-Goyal, ‘Towards Responsible Autonomy – The Ethics of Robotic Autonomous Systems in a Military Context’ (2019) The Hague Centre for Strategic Studies, <https://hcss.nl/report/towards-responsible-autonomy-ethics-ras-military-context> accessed 2 January 2020, p. 60 77M. Noorman, ‘Responsibility Practices and Unmanned Military Technologies’ (2014) Vol. 20 Science and Engineering Ethics, < https://link.springer.com/article/10.1007/s11948-013-9484-x > accessed 2 April 2020, p. 814; I. van de Poel, ‘The Relation Between Forward-Looking and Backward-Looking Responsibility’ in N. Vincent, J. van den hoven, I. van de Poel (eds), Moral Responsibility: Beyond Free Will and Determinism ( Springer 2011), p. 37-52 78 Gary D. Solis, The Law of Armed Conflict: International Humanitarian Law in War (Cambridge University Press 2016), p. 7; International Committee of the Red Cross, ‘The ICRC and Geneva Convention (1863-1864)’ <https://www.icrc.org/en/doc/resources/documents/misc/57jnvt.htm> accessed 2 January 2020
reaffirmed and updated the previous treaties and expanded the rules to protect civilians. These
treaties are now collectively known as the 1949 Geneva Conventions.79 These Conventions
describe the most important rules of war. As waging war is clearly part of human nature80,
regulating it is better than doing nothing at all, and through such regulation human rights can be better
preserved as well. IHL is the specific form of regulation here. It is also often called the law of
armed conflict, meaning the rules that regulate the conduct of war (‘ius in bello’): a set of
rules and principles which aim, for humanitarian reasons, to limit the harmful effects of armed
conflict. As IHL is the legal foundation for regulating armed conflicts, it will be used as the
legal reference base for this research. Taking into account that LAWS are mainly used in armed
conflict, IHL is the most appropriate legal framework for regulating such war-linked
technologies.
Since WWII, we have seen a technological evolution that led to new weapon types
and systems requiring new rules and conventions. Today we are confronted with an ever-
accelerating technological advancement leading to sophisticated new weapon systems such as
LAWS that have the potential to change the fundamentals of the current human-centric IHL
framework. It is hereby important to distinguish between criminal individual and
command responsibility in IHL. Criminal responsibility arises when a grave breach of IHL or a war
crime has allegedly been committed; the act has to be connected to the conflict (a nexus to the
conflict).
The notion of criminal individual responsibility originates from the Nuremberg trial
where it was stated: “That international law imposes duties and liabilities upon individuals as
well as upon states has long been recognized (…) Crimes against international law are
committed by men, not by abstract entities, and only by punishing individuals who commit such
crimes can the provisions of international law be enforced”.81 Nuremberg emphasized individual
liability: not just the collective responsibility of a unit or a state, but the
responsibility of the individuals themselves who had committed wrongful acts. The Geneva
Conventions (GC) of 1949 codified the obligation to enact any legislation necessary to
79 Gary D. Solis, The Law of Armed Conflict: International Humanitarian Law in War (Cambridge University Press 2016), p. 87; International Committee of the Red Cross, ‘The Geneva Conventions of 1949 and their Additional Protocols’<https://www.icrc.org/en/doc/war-and-law/treaties-customary-law/geneva-conventions/overview-geneva-conventions.htm> accessed 2 January 2020 80 David H. Petraeus, ‘As Machines Wage War, Human Nature Endures’ (2017) Zocalo Public Square, <https://www.zocalopublicsquare.org/2017/03/29/machines-wage-war-human-nature-endures/ideas/nexus/> accessed 2 January 2020 81 International Criminal Court, ‘Applying the Principles of Nuremberg in the ICC’, (2006) Keynote Address at the Conference “Judgement at Nuremberg held on the 60th Anniversary of the Nuremberg Judgment, <https://www.icc-cpi.int/NR/rdonlyres/ED2F5177-9F9B-4D66-9386-5C5BF45D052C/146323/PK_20060930_English.pdf > accessed 27 December 2019, p. 3
search for, prosecute, bring before a court or transfer persons committing grave breaches of
IHL in article 49 GC I, article 50 GC II, article 129 GC III and article 146 GC IV. Individual
responsibility rests upon everyone (commanders included) and covers any action that leads to a
grave breach of IHL or a war crime. Grave breaches form an exhaustive list, e.g.
willful killing, torture, willfully causing great suffering, etc. (see article 50 GC I, article 51 GC
II, article 130 GC III and article 147 GC IV). War crimes are violations of IHL with a nexus
to hostilities: they entail any violation of IHL for which individual responsibility exists. The
origin of this definition dates back to the Statute of the Nuremberg Tribunal. At UN level,
war crimes are defined as ‘violations of the laws or customs of war’.82 A list of possible war
crimes can be found in Article 8 of the ICC Statute.
Persons suspected of having breached IHL can then be brought before a national
criminal court or an international criminal tribunal such as the ICC (article 25), ICTY (article
7(1)) or ICTR (article 6(1)). These articles describe what needs to be proven to hold someone
criminally responsible. The different requirements are alternatives, so they do not
have to be proven cumulatively. Article 25 of the Rome Statute (similar to the articles from the
other Statutes), included in the Appendix, establishes the principle of ‘personal jurisdiction’ and gives the
ICC the jurisdiction to convict natural persons accused of crimes (such as breaches of IHL)
within its jurisdiction.83
The notion of criminal command responsibility resulting from article 86(2) Additional Protocol
I (AP I) and enshrined in the statutes of the ICC (article 28), ICTY (article 7 (3)) and ICTR
(article 6(3)), entails that the commander is responsible “if he knew or had information which
should have enabled him to conclude in the circumstances at the time that his subordinates were
committing or about to commit a breach and that he did not take all feasible measures within
his power to prevent or repress the breach”.84 The elements that have to be proven cumulatively
82 Charter of the International Military Tribunal – Annex to the Agreement for the prosecution and punishment of the major war criminals of the European Axis (‘London Agreement’), 8 August 1945, article 6(b) 83 <https://www.casematrixnetwork.org/cmn-knowledge-hub/icc-commentary-clicc/commentary-rome-statute/commentary-rome-statute-part-3/> accessed 27 July 2020 84 Advisory Service on International Humanitarian Law, ‘Command responsibility and failure to act’, ICRC, https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=&ved=2ahUKEwiTy8idsNTqAhXNi6QKHf3bCw8QFjABegQIARAB&url=https%3A%2F%2Fwww.icrc.org%2Fen%2Fdownload%2Ffile%2F1087%2Fcommand-responsibility-icrc-eng.pdf&usg=AOvVaw3WAfvETOyfp_ipRKC-XnnZ accessed 27 December 2019, p. 1
to hold a commander responsible according to the Rome Statute, which were clearly illustrated
in the Bemba case85, are:
(a) the existence of a superior-subordinate relationship;
(b) the superior knew or had reason to know that the criminal act was about to be or
had been committed;
(c) the superior failed to take the necessary and reasonable measures to prevent the
criminal act or punish the perpetrator thereof.
The statute of the ICTR adds:
(d) the existence of acts and crimes by the subordinate
A wrongful act committed by (a) subordinate(s) is thus a prerequisite for the application of
command responsibility. The commander bears a secondary liability for an omission, which in the US would be described
as vicarious liability. The burden of proof is heavy, as all the requirements above must be
proven.
There is still ongoing discussion about what the ‘should have’ mentioned in article
86(2) AP I means and whether it entails simple or gross negligence. Command
responsibility is also conditioned by the manner in which an organization is structured and how it
functions. The commander has to manage the military chain of command and organize how
information is structured and delivered. He must make sure he has permanent situational
awareness of what is happening under his command.
The subjective standard to trigger this responsibility is ‘knew’ or ‘had information
which should have enabled them to conclude’. This was further developed in case law, as the
ICTY Celebici judgment (§383) demonstrates:
A construction of this provision in light of the content of the doctrine under customary
law leads the Trial Chamber to conclude that a superior may possess the mens rea
required to incur criminal liability where: (1) he had actual knowledge, established
through direct or circumstantial evidence, that his subordinates were committing or
about to commit crimes referred to under article 2 to 5 of the Statute or (2) where he
85 J. B. Mbokani, ‘The Doctrine of “Command Responsibility” in the Bemba Case’ (2011) International Justice Monitor, <https://www.ijmonitor.org/2011/07/the-doctrine-of-command-responsibility-in-the-bemba-case/> accessed 14 May 2020; Bemba (Case ICC-01/05-01/08), ICC Judgment, The Hague 8 June 2018
had in his possession information of a nature, which at least, would put him on notice
of the risk of such offences by indicating the need for additional investigation in order
to ascertain whether such crimes were committed or were about to be committed by his
subordinates.86
Paragraph 393, on the interpretation of article 86(2) AP I, states:
An interpretation of the terms of this provision in accordance with their ordinary
meaning thus leads to the conclusion, confirmed by the travaux préparatoires, that a
superior can be held criminally responsible only if some specific information was in fact
available to him which would provide notice of offences committed by his subordinates.
This information need not be such that it by itself was sufficient to compel the
conclusion of the existence of such crimes. It is sufficient that the superior was put on
further inquiry by the information, or, in other words, that it indicated the need for
additional investigation in order to ascertain whether offences were being committed
or about to be committed by his subordinates.87
As stated above, according to IHL these types of criminal responsibility are triggered by two
types of crimes, namely grave breaches and war crimes. There are also crimes that can occur
outside the scope of IHL: crimes against humanity, genocide and crimes against peace.
These are specific crimes which are not by definition war crimes or grave breaches if
committed unrelated to hostilities.88 This research only covers the criminal responsibility that
can be triggered by a breach of IHL, as this is the legal framework within which most of the issues with
LAWS will be dealt with.
3.2. The principles underlying the targeting process

In order to establish a link between the employment of a certain type of weapon system, such as
a LAWS, and the aspects of criminal individual and command responsibility, an intermediary
step is necessary. The principles underlying the targeting process, which is closely related to
IHL and more specifically to these aspects of responsibility,
will be used to establish this link. Targeting is the conduct of both defensive and offensive
86 Celebici (Case CC/PIU/364-E) ICTY judgment, The Hague 16 November 1998 87 Celebici (Case CC/PIU/364-E) ICTY judgment, The Hague 16 November 1998 88 Gary D. Solis, The Law of Armed Conflict: International Humanitarian Law in War (Cambridge University Press 2016), p. 102 -103
operations, namely situations of individual or unit self-defense as well as situations of
engaging high-value targets (HVTs) and larger-scale attacks against military objectives and
enemy formations. Even though the targeting process is more relevant for offensive targeting,
the principles underlying this process must also be adhered to in the case of defensive
applications. These underlying principles offer a methodological framework that provides insights
into the impact defensive LAWS may have on the notions of individual and command
responsibility. This analysis also makes it possible to assess the more offensive applications that will
become more prominent in the future.
The targeting process is a way to limit the use of force. Targeting directives function
as a complement to the Rules of Engagement (ROE), which describe the conditions under which
(lethal) force can be used. More specifically, ROE are “directives issued by competent military
authority that delineate the circumstances and limitations under which (naval, ground and air)
forces will initiate and/or continue combat engagement with other forces encountered”.89 For
this reason, targeting directives can be seen as an extra layer of ROE. The ROE are linked with
the choice of the weapon systems the commander will deploy, depending on the mission and
the specific operational circumstances of the area of operation.
Breaches of IHL by LAWS will most likely result from non-conformity with the IHL
principles of distinction, proportionality and precautions in attack,90 which underlie the
aforementioned targeting process. The targeting process is based on these very principles in order to
comply with IHL when there is an intention to engage (a) certain target(s), both in an
offensive and a defensive mode.91 As explained in chapter one, the focus will mainly be on the
principle of distinction, as it is seen as the prime ‘gatekeeper’ of target selection and a thorough
examination of all three principles would not be possible within the limits of this thesis. These
underlying principles constitute the link between the IHL provisions and possible breaches
thereof resulting in criminal responsibility. It is thus crucial to understand how greater
autonomy in weapon systems affects criminal individual and command responsibility and
whether it requires these notions to be re-evaluated and their relevance checked. If LAWS contain the
89 Joint Publication 1-02, Dictionary of Military and Associated Terms; Gary D. Solis, The Law of Armed Conflict: International Humanitarian Law in War (Cambridge University Press 2016), p. 479; ‘Department of Defense Support to Foreign Disaster Relief (Handbook for JTF Commanders and Below)’,<https://fas.org/irp/doddir/dod/disaster.pdf > accessed 1 April 2020, p. A - 6 90 P. Asaro, ‘On Banning Autonomous Weapon Systems: Human Rights, Automation and the Dehumanization of Lethal Decision-Making’, (2012) Volume 94 Nr. 886 International Review of the Red Cross, p. 692 91 M. Roorda, ‘NATO’s Targeting Process: Ensuring Human Control Over and Lawful Use of ‘Autonomous’ Weapons (2015) No. 2015-13 Amsterdam Law School Legal Studies Research Paper, <https://www.peacepalacelibrary.nl/ebooks/files/401482448.pdf> accessed 1 April 2020, p. 152
functions of being able to recognise targets, (autonomously) select and engage, this means that
parts of the targeting process are taken over by the weapon system92 leaving room for issues
regarding the attribution of criminal responsibility in those parts, keeping in mind that human
actors remain in control of how, when and where these systems are allowed to operate.93
The first of these three IHL principles that plays an essential role in the targeting
process, and the one on which this research will focus, is distinction. Article 48 AP I
explains that this principle is about distinguishing combatants and military objectives
from civilians and civilian objects: “In order to ensure respect for and protection of the civilian
population and civilian objects, the Parties to the conflict shall at all times distinguish between
the civilian population and combatants and between civilian objects and military objectives and
accordingly shall direct their operations only against military objectives”.94
The principle of distinction thus entails an assessment of whether a target is lawful or
not. The target cannot be a civilian, a civilian object or a person hors de combat.95 A person hors
de combat is a person who is not capable of waging war at a certain moment, for
example a pilot who parachutes from his destroyed or malfunctioning plane.
The ICRC explains that the principle of distinction entails three components.96 These
three components “are interrelated and the practice pertaining to each of them reinforces the
validity of the others”.97 The first component is that parties to the conflict must
continuously distinguish between combatants and civilians. Secondly, attacks may only
be directed against combatants and, thirdly, attacks may not be directed against civilians.98 The notion of a
92 M. Ekelhof, ‘Complications of a Common Language: Why it is so Hard to Talk about Autonomous Weapons’ (2017) Vol. 22 No. 2 Journal of Conflict and Security Law, <https://www.researchgate.net/publication/323416370_Complications_of_a_Common_Language_Why_it_is_so_Hard_to_Talk_about_Autonomous_Weapons> accessed 1 April 2020, p. 326 93 M. Noorman, D. G. Johnson, ‘Negotiating autonomy and responsibility in military robots’ Vol. 16 (2014) Ethics and Information Technology, <https://link.springer.com/article/10.1007/s10676-013-9335-0?shared-article-renderer> accessed 21 February 2020, p. 61 94 ICRC ‘Autonomous Weapon Systems Technical, Military, Legal and Humanitarian Aspects’ Expert Meeting Geneva March 2014, <https://www.icrc.org/en/document/report-icrc-meeting-autonomous-weapon-systems-26-28-march-2014> accessed 21 February 2020, p. 77; ICRC, Customary International Humanitarian Law, Volume I: Rules (Cambridge University Press, 2005), p. 3 95 V. Boulanin, M. Verbruggen, ‘Mapping the Development of Autonomy in Weapon Systems’, Stockholm International Peace Research Institute 2017, <https://www.sipri.org/publications/2017/other-publications/mapping-development-autonomy-weapon-systems> accessed 21 February 2020, p. 73 96 International Committee of the Red Cross, ‘Rule 1’, The principle of distinction between Civilians and Combatants, in the Customary IHL Database 97 Ibid 98 D. Lawless, ‘The problems facing the principle of distinction in international humanitarian law due to the changing nature of armed conflict – the effects of an increasing ‘civilian’ population on the battlefield for this principle’ Thesis <https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=13&ved=2ahUKEwiQ_Mbio_zoAhVMNOwKHcJkDMgQFjAMegQIAxAB&url=http%3A%2F%2Fwww.scriptiesonline.uba.uva.nl%2Fdocument%2F621757&usg=AOvVaw3wf5DOd4Ls3aaJ59qxUXBt> accessed 22 April 2020, p. 7
‘combatant’ is meant in a generic manner to cover all those who do not enjoy the protection that
civilians have. However, it must be read together with the rules regarding persons hors de
combat.99 Component one relates to the obligation of combatants to clearly indicate that they
are not civilians, for example by wearing a uniform during a military attack or operation, as
customary international law demands.100 Combatants who are not part of the armed forces of a
country must still wear a permanent distinctive sign that can be perceived from a distance.101
They must further carry their weapons openly.102 However, non-state actors/armed groups often (knowingly)
disregard this, to create confusion and to use IHL against Western armed forces.
To make matters even more complex, there are scenarios in which civilians can
become targets, namely when they directly participate in hostilities.103 The notion of direct
participation is vague, which makes it very hard to assess in practice. Components two and three
specify who can be attacked, referring to the methods and means of warfare that can lawfully
be deployed. The armed forces involved in a conflict have to guarantee that these means and
methods only attack ‘a specific and separable military objective’.104 They must also guarantee
that the means and methods are limitable in their outcomes.105 The overall
assessment that has to be made continuously is thus complex and requires excellent situational
awareness, reliable time-sensitive intelligence and behavioural insight, but above all an excellent
process and judgment to integrate all these parameters into a coherent decision.
The second principle is proportionality, which comes down to an assessment of
collateral damage. It thus seeks to limit the harm which may result from military operations. This
assessment “requires that the effects of the means and methods of warfare used must not be
disproportionate to the military advantage sought”.106 In this assessment the aim of the
99 International Committee of the Red Cross, ‘Rule 47’, Attacks against Persons Hors de Combat in the Customary IHL Database 100 International Committee of the Red Cross, ‘Rule 1’, The principle of distinction between Civilians and Combatants, in the Customary IHL Database; K. Ipsen, ‘Combatants and Non-Combatants’, in D. Fleck (ed) the Handbook on International Humanitarian Law, (Oxford University Press 2014), p. 89 101 K. Ipsen, ‘Combatants and Non-Combatants’, in D. Fleck (ed) the Handbook on International Humanitarian Law, (Oxford University Press 2014), p. 89 102 Article 44(3)(a) and (b) AP I 103 Article 51(3) AP I 104 D. Lawless, ‘The problems facing the principle of distinction in international humanitarian law due to the changing nature of armed conflict – the effects of an increasing ‘civilian’ population on the battlefield for this principle’ Thesis <https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=13&ved=2ahUKEwiQ_Mbio_zoAhVMNOwKHcJkDMgQFjAMegQIAxAB&url=http%3A%2F%2Fwww.scriptiesonline.uba.uva.nl%2Fdocument%2F621757&usg=AOvVaw3wf5DOd4Ls3aaJ59qxUXBt> accessed 22 April 2020, p. 8 105 S. Oeter, ‘Methods And Means of Combat’, in D. Fleck (ed) the Handbook on International Humanitarian Law, (Oxford University Press 2014), p. 129 106 <https://casebook.icrc.org/glossary/proportionality> accessed 2 January 2020
operation and the campaign is also kept in mind. Proportionality is defined in article
51(5)(b) AP I, which prohibits “an attack which may be expected to cause incidental loss of civilian life, injury
to civilians, damage to civilian objects, or a combination thereof, which would be excessive in
relation to the concrete and direct military advantage anticipated”.
The third principle entails the obligation to take all feasible precautions in attack. This
principle makes sure that during military operations, “constant care is taken to spare the civilian
population, civilians and civilian objects”.107 All feasible precautions have to be taken at all
times to ensure there is no incidental loss of civilian life or damage to civilian objects.
3.3. Conclusion
In this chapter IHL was identified as the most suitable legal framework for the further research.
A number of its relevant components were described and linked to the research goals of this
thesis. This makes it possible to analyse in the next chapter whether the legal notions of criminal
individual and command responsibility are still suitable and applicable when LAWS are used as
effectors, or whether they should be re-evaluated and redefined due to a regulatory gap. Responsibility
in this thesis refers to the legal situation that occurs when a person is accountable and liable for
a breach of IHL, thus triggering international criminal law. In this regard the principles
underlying the targeting process are essential for the analysis. The main focus in this thesis will
lie on the principle of distinction. It forms the best reference base to check how the use of
defensive LAWS might affect the notions of individual and command responsibility, as breaches
through the use of defensive LAWS will most likely result from non-compliance with this
underlying principle. It thus provides the link between the IHL provisions regarding the use of
(defensive) LAWS and possible breaches thereof resulting in criminal responsibility. Criminal
individual responsibility can apply to anyone for direct actions and omissions that lead to a
grave breach of IHL or a war crime. Criminal command responsibility is triggered if the
commander knew or had information from which he should have concluded that his subordinates
were committing or about to commit a breach of IHL and failed to take the right measures
to prevent this breach.
107 International Committee of the Red Cross, ‘Rule 15’, The principle of Precautions in Attack, in the Customary IHL Database
Chapter 4: Analysis of criminal responsibility for defensive LAWS
In this chapter, the impact of (the use of) defensive LAWS on the utility and enforceability of the
legal notions of criminal command and individual responsibility will be explored. More
specifically, I will look into the legal dimension of responsibility when deploying the Phalanx
CIWS. The principle of distinction underlying the targeting process will be used as the main
analytical reference base to determine the impact on criminal individual and command
responsibility when a target is engaged through the Phalanx system. Due to the defensive nature
of the Phalanx system, a breach of IHL is less likely to take place than with an
offensive use of such weapon systems. However, it remains useful to research the possible
issues with the legal framework, as the results are relevant for other defensive systems as well.
The conclusions on who will or could be held criminally responsible will also allow reflection
upon possible future offensive applications, which are currently still mainly hypothetical.
4.1. Impact of the characteristics and usage modalities of the Phalanx CIWS on
the principle of distinction

Targeting, as explained, is a methodology that supports decision-making by
linking certain identified objectives with possible desired effects.108 Normally, in the classic
targeting process, the commander and his team deliberate on the selection and
prioritisation of the targets proposed by the intelligence community, arriving at a proportional response
to them.109 Defensive LAWS such as the Phalanx, however, are not capabilities that engage
selected targets; rather, they are able, based upon pre-programmed
parameters, to defend a ship or a base camp against incoming threats. These systems identify,
select and engage incoming threats as targets based upon pre-programmed physical parameters.
In the case of defensive systems, no elaborate targeting process is utilised to engage a
target.110 This means that entire parts of the targeting process are taken over by the weapon
108 NATO Standard, ‘AJP-3.9 Allied joint doctrine for joint targeting’ (2016) Ed. A version 1, < https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/628215/20160505-nato_targeting_ajp_3_9.pdf > accessed 3 January 2020, p. 1-3 109 M. A.C. Ekelhof, ‘Complications of a Common Language: Why it is so Hard to talk about Autonomous Weapons’ (2017) Vol. 20 No. 2 Journal of Conflict & Security Law, <https://research.vu.nl/ws/portalfiles/portal/42807463/krw029.pdf> accessed 2 April 2020, p. 326 110 M. A.C. Ekelhof, ‘Lifting the fog of targeting: “Autonomous Weapons” and Human Control through the Lens of Military Targeting, U.S. Naval War College, Vol. 71, No. 3 (Summer 2018), https://www.jstor.org/stable/10.2307/26607067 > accessed 1 April 2020, p. 14
system111, which must nevertheless respect the principle of distinction underlying the abbreviated targeting
process. The context within which these systems are used and how human control is exercised
is essential to determine how the principle of distinction is affected.112 I will investigate whether
and how the Phalanx system, through its specific characteristics and usage modalities, potentially
runs into difficulties regarding the principle of distinction, and how the human responsibility of a
commander or an operator relates to these difficulties.
For acquiring and tracking targets the Phalanx CIWS uses a Ku-band (frequency range from 12 to 18 GHz) fire control radar system with a specially made mount that is able to elevate quickly.113 The LAWS is thus programmed to look, based on a number of parameters, for certain incoming threats, which are then engaged. One could say that the list of targets of opportunity, or in fact of necessity (those that form a threat), is pre-programmed into the Phalanx system.
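To make this engagement logic concrete, the decision described above can be sketched as a simple rule over pre-programmed physical parameters. This is a minimal illustrative sketch: the parameter names, thresholds and structure are hypothetical assumptions for exposition, not actual Phalanx CIWS specifications.

```python
# Hypothetical sketch of a pre-programmed engagement rule of the kind
# described in the text. All names and threshold values are illustrative
# assumptions, not real Phalanx CIWS parameters.
from dataclasses import dataclass


@dataclass
class Track:
    velocity_ms: float  # measured speed of the incoming object (m/s)
    range_m: float      # distance to the defended platform (m)
    closing: bool       # whether the object is on an approach trajectory


def should_engage(track: Track) -> bool:
    """Engage when the physical parameters match the pre-programmed
    threat profile. Note that no IHL category (combatant/non-combatant,
    friend/foe) is evaluated anywhere in this decision."""
    return (
        track.closing
        and track.velocity_ms > 300.0  # illustrative speed threshold
        and track.range_m < 5000.0     # inside the predetermined perimeter
    )


# A fast, closing object inside the perimeter is engaged automatically;
# a slow object on the same trajectory is ignored.
print(should_engage(Track(velocity_ms=680.0, range_m=3500.0, closing=True)))  # True
print(should_engage(Track(velocity_ms=120.0, range_m=3500.0, closing=True)))  # False
```

The sketch makes visible the point developed in the following paragraphs: the decision turns exclusively on physical real-time data, so any distinction between legal categories of targets must be supplied by the humans who decide when and where to activate the system.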
Before activation of the Phalanx the operator must run a number of checks. If the operator does not execute these checks with the necessary due diligence, which is his duty, and something were to go wrong because of that, his individual responsibility would be triggered by his failure to do his duty. Command responsibility may also be triggered, as it is the commander's responsibility to check on the execution of this procedure.
The Phalanx CIWS is guided by two antennas that cooperate to detect targets and send
information about the possible target (e.g. velocity, altitude, range) to the CIWS.114 The target
analysis consists of this information being processed to determine whether the detected
incoming object must be engaged.115 All these components enable the Phalanx system to “automatically search for, detect, track, engage and confirm kills using its computer-controlled radar system”.116 Looking at the criteria used, no distinction criteria based upon the different legal categories are programmed into the system. This could potentially lead to issues in terms of the principle of distinction, as the Phalanx CIWS thus does not identify an incoming object
111 Ibid; M. Roorda, ‘NATO’s Targeting Process: Ensuring Human Control Over and Lawful Use of ‘Autonomous’ Weapons’ (2015) No 2015-13 Amsterdam Law School Legal Studies Research Paper, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2593697 > accessed 2 April 2020, p. 154 - 156 112 M. A.C. Ekelhof, ‘Lifting the fog of targeting: “Autonomous Weapons” and Human Control through the Lens of Military Targeting, U.S. Naval War College, Vol. 71, No. 3 (Summer 2018), https://www.jstor.org/stable/10.2307/26607067 > accessed 1 April 2020, p. 61 113 ‘Laser quest: Phalanx, LAWS and the future of close-in weapon systems’ (2014) Naval Technology, <https://www.naval-technology.com/features/featurelaser-quest-phalanx-laws-and-the-future-of-close-in-weapon-systems-4295413/ > accessed 2 April 2020 114 <https://fas.org/man/dod-101/sys/ship/weaps/mk-15.htm> accessed 2 April 2020 115 ‘Inside the Phalanx’ (2013) <https://www.howitworksdaily.com/inside-the-phalanx-ciws/> accessed 2 April 2020 116 <https://www.doncio.navy.mil/Chips/ArticleDetails.aspx?ID=7898> accessed 2 April 2020; ‘Laser quest: Phalanx, LAWS and the future of close-in weapon systems’ (2014) Naval Technology, <https://www.naval-technology.com/features/featurelaser-quest-phalanx-laws-and-the-future-of-close-in-weapon-systems-4295413/ > accessed 2 April 2020
as a friend or foe or as a ‘combatant’ or ‘non-combatant’.117 The CIWS relies only on the aforementioned real-time physical data to determine whether the target must be engaged. This means that arming the system needs to be a deliberate decision based upon a thorough analysis of the operational environment, as the operator has only very limited possibilities to intervene and abort an engagement. Although the Phalanx Block 1B version has control stations that permit operators to track and identify targets before engagement,118 the system completes
its “detection, evaluation and response process within a matter of seconds and thus renders it
extremely difficult for human operators to exercise meaningful supervisory control once the
system has been activated other than deciding when to switch them off”.119 The principle of
meaningful supervisory or human control entails that “humans not computers and their
algorithms should ultimately remain in control of relevant decisions about (lethal) military
operations”.120
The abbreviated and sometimes implicit execution of a number of phases of the targeting
process based upon the upfront decision to activate an autonomous defensive system modifies
the traditional approach where a commander makes a decision, gives orders and subordinates
execute the lethal action (by pulling the trigger or pushing the button). When employing a defensive LAWS such as the Phalanx, the operator will in most cases not be fast enough to press the ‘abort’ button, as the time between the machine detecting a threat and engaging it will be too short.
The traditional distribution and (legal) attribution of responsibility is altered when employing a defensive autonomous system and requires a new overarching conceptual and management approach, to be put in place by the organization that decides to acquire such defensive LAWS. A clear responsibility lies here with the organization employing LAWS, as the military personnel, both commanders and operators, using these systems need to be
117 ‘Phalanx Close-In Weapon System (CIWS)’ (2006), <https://web.archive.org/web/20060820113825/http://wrc.navair-rdte.navy.mil/warfighter_enc/weapons/shiplnch/Guns/ciws.htm> accessed 2 April 2020; ‘Phalanx close-in weapon system (CIWS)’ (2006) <https://web.archive.org/web/20060613141359/http://www.raytheon.com/products/phalanx/> accessed 2 April 2020 118 ‘Phalanx Weapon System’ Raytheon Missiles & Defense, <https://www.raytheonmissilesanddefense.com/capabilities/products/phalanx-close-in-weapon-system> accessed 2 April 2020 119 ‘Ethically Aligned Design – A Vision for Prioritizing Human Well-Being with Autonomous and Intelligent Systems’, IEEE, <https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead_v2.pdf> accessed 2020, p. 124 120 F. Santoni de Sio, J. van den Hoven, ‘Meaningful Human Control over Autonomous Systems: A Philosophical Account’, Faculty of Technology, Policy and Management, Delft University of Technology, <https://www.frontiersin.org/articles/10.3389/frobt.2018.00015/full> accessed 18 June 2020, p. 1
educated and trained and made fully aware of the specifics of the technology. To examine in more depth what autonomy in defensive systems means for criminal individual and command responsibility, the focus lies on the principle of distinction, which is the legal cornerstone of the targeting process and must be upheld in both defensive and offensive situations.121 For an engagement to be lawful, the user of the defensive weapon system needs to be capable of respecting not only the principle of distinction but also the principles of proportionality and precautions in attack; non-compliance with any one of the three suffices to demonstrate unlawfulness.
Chances of a breach of the principle of distinction are noticeably more limited when deploying defensive LAWS such as the Phalanx CIWS, but not non-existent, as the Phalanx CIWS does not identify or classify an incoming object according to the IHL categories (‘combatant’ or ‘non-combatant’). There have been incidents with this system in the past. One example is an activation during an exercise in which the Phalanx system was supposed to engage a target drone simulating an adversary asset. After engagement, the remains of the partially destroyed drone bounced off the sea surface and hit the ship, which was not a target. A civilian instructor on board lost his life.122 The question when such an incident
happens is: Who is to be held criminally responsible and on which grounds? On the basis of
criminal command responsibility? Or on the basis of criminal individual responsibility?
It will be very hard to meet the burden of proof for command responsibility, as there would have to be a wrongful act by a subordinate. Even if the operator received proper education and training, can one really hold him accountable for the defensive LAWS automatically engaging a wrong target when his time to intervene is so limited? Furthermore, it will be equally hard to substantiate that the commander knew or had reason to know that a wrongful act by the operator was about to be committed. The legal doctrine of command responsibility was clearly not conceived for situations like this and is not fit for purpose. Another option would be to have recourse to individual responsibility, which requires proving a wrongful act by either the commander or the operator (or both). Here, the problem of the superior-subordinate relationship does not arise. However, wrongful intent would still have to be proven, as it is a mandatory element of individual responsibility in order to establish blame and find a person guilty, and the assessment of intent depends on the circumstances and is not always straightforward.
121 Article 57 Additional Protocol I. 122 ‘USS Antrim (FFG 20)’ <https://www.navysite.de/ffg/FFG20.HTM> accessed 2 April 2020
Based upon the analysis it becomes clear that there is a shared responsibility in case of damage resulting from indiscriminate engagement when using defensive LAWS. A possible (shared) individual responsibility lies with the weapon system purchaser, designer, manufacturer and programmer of the engagement parameters, but also with the commander who decides to activate the system and the operator supervising the engagement. In order to attribute legal responsibility to the commander and/or the operator, the employment and management of LAWS requires a different capability management approach including all lines of development123, with a special focus on training and leader development. It is impossible to assign responsibility to military personnel when the overarching conceptual and management approach is not adapted to the specificity of the employment of LAWS. This presents considerable challenges, not only to military structures and the military mind-set but also to decision-making processes and the relationship between human actors and technologies in the targeting process.124 It marks a fundamental shift in the age-old trajectory of weapons development: technology now takes the individual soldier firing the gun out of the equation.125
4.2. Consequences for the notions of criminal individual and command responsibility
As demonstrated in the previous subsection, in case of the use of autonomous defensive weapon
systems like the Phalanx CIWS, even though the traditional targeting process takes a different
abbreviated form with implicit phases, the underlying principle of distinction is still relevant.
The analysis illustrated that the scope of criminal responsibility is affected by defensive LAWS.
Since the Second World War and the Nuremberg trials thereafter, IHL has been part of a decades-long trend of increasing emphasis on individual criminal responsibility, and it is a crucial aspect of IHL that perpetrators are held personally responsible if they commit wrongful acts.126 In case
123 A military capability consists of different lines of development being doctrine, organization, training, materiel, leader development, personnel and facilities often abbreviated by the acronym "DOTMLPF", ‘Future Trends from the Capability Development Plan, (2008) European Defence Agency <https://www.eda.europa.eu/docs/documents/brochure_cdp.pdf> accessed 2 April 2020, p. 8 124 M. A.C. Ekelhof, ‘Lifting the fog of targeting: “Autonomous Weapons” and Human Control through the Lens of Military Targeting, U.S. Naval War College, Vol. 71, No. 3 (Summer 2018), https://www.jstor.org/stable/10.2307/26607067 > accessed 1 April 2020, p. 63 125 M. Wagner, ‘Taking Humans Out of the Loop: Implications for International Humanitarian Law’ (2011) No. 21 Journal of Law, Information and Science, <https://www.researchgate.net/publication/228193208_Taking_Humans_Out_of_the_Loop_Implications_for_International_Humanitarian_Law> accessed 1 April 2020, p. 155 126 L. A. Dickinson, ‘Drones, Automated Weapons, and Private Military Contractors’ (2018) Cambridge University Press, <https://www.cambridge.org/core/books/new-technologies-for-human-rights-law-and-practice/drones-automated-weapons-and-private-military-contractors/C6F1C06CBFA06E46D7A8E67A0BB3065D/core-reader> accessed 3 April 2020, p. 115
of a defensive LAWS this trend seems to persist, as the main decision to make, namely whether or not to activate the system in specific circumstances and a specific area of operations, is the commander's. The difference with a manned defensive system is that there the operator is still able to make the judgment whether or not to engage a target, taking into account his orders and considering the IHL principles specified by the rules of engagement. This means individual (criminal) responsibility is an important notion when operating manned defensive weapon systems.
However, in the case of defensive unmanned systems such as the Phalanx this aspect becomes less important, as target engagement is based upon pre-programmed physical parameters instead of an assessment of the operational parameters and IHL principles by an operator. The operator has only a limited role in a fully automated process, which puts the emphasis more on the individual responsibility of the commander than on the commander's command responsibility, as the operator, who is a subordinate, will not likely be found to have committed a wrongful act. The requirements of command responsibility are in these kinds of situations almost impossible to meet. The existence of a superior-subordinate relationship is still given. However, the knowledge requirement, namely that the commander ‘knew’ or ‘had reason to know’ that the criminal act was about to be or had been committed, is not useful here and cannot be proven. The requirement of omission, being the failure of the commander to take the necessary and reasonable measures to prevent this criminal act by the human subordinate, also does not make sense. For this secondary liability a criminal act by the operator/subordinate is a prerequisite, and this is precisely the issue. In most cases of a malfunction of the machine, it will not be possible to prove that the operator has committed a criminal act, unless the operator knew the machine was going to malfunction and did nothing about it, or deliberately reprogrammed the machine, but such situations are highly unlikely. This means the only way to hold the commander responsible via the current legal framework would be to hold him individually responsible; the command responsibility doctrine is thus not suited to these kinds of scenarios.
Within military organizations responsibilities are normally clearly and formally spread throughout a chain of command, following international law, national law and internal regulations.127 These laws and regulations indicate who has the authority to take decisions and who must be held responsible for the (non-desired) outcomes that resulted from these
127 M. Noorman, D. G. Johnson, ‘Negotiating autonomy and responsibility in military robots’ Vol. 16 (2014) Ethics and Information Technology, <https://link.springer.com/article/10.1007/s10676-013-9335-0?shared-article-renderer> accessed 21 February 2020, p. 59; M. Schulzke, ‘Autonomous weapons and distributed responsibility’ (2012) No. 26 Philosophy and Technology, http://link.springer.com/article/10.1007%2Fs13347-012-0089-0, accessed 9 February 2020, p. 205
decisions.128 “Military commanders are assigned responsibility even if they do not directly
control the outcome, because they are accountable for setting and creating the conditions under
which their subordinates act”129 or in this case for the conditions under which they activate the
LAWS. “Roles previously performed by multiple human actors have been compacted and
compounded”130 making decision-making in unmanned operations a more shared activity thus
reducing the autonomy of the operator.131 In the case of defensive LAWS the role of the commander becomes more important, not in the legal terms of command responsibility, as the subordinate's role is very limited, but in terms of the individual responsibility of the commander. There is less responsibility shared with specialized advisors than in the normal targeting process. “Commanding officers are required to sign off on their decisions and assume responsibility for the units under their command”132, but also for their choices of means and methods, such as the weapon systems that will defend their ships or ground forces, and on this basis they can be held to account for their decisions.133
However, it is important to acknowledge that an automated defensive weapon system such as the Phalanx CIWS has many stakeholders during its life cycle, and each one of them carries a certain level of individual responsibility at any given moment of its design, development, production, purchase and, finally, employment. Many system engineers and programmers spend a massive amount of time putting all the necessary components together and programming the system until it reaches the threshold of effective use in practice.134
Criminal individual responsibility will thus become more important as defensive systems become more autonomous. Ultimately, in the case of fully autonomous systems and the removal of an operator able to intervene, the main (individual) responsibility will rest upon the commander who decided to employ the system.
128 M. Noorman, D. G. Johnson, ‘Negotiating autonomy and responsibility in military robots’ Vol. 16 (2014) Ethics and Information Technology, <https://link.springer.com/article/10.1007/s10676-013-9335-0?shared-article-renderer> accessed 21 February 2020, p. 59 129 Ibid 130 M. Noorman, ‘Responsibility Practices and Unmanned Military Technologies’ (2014) Vol. 20 Science and Engineering Ethics, < https://link.springer.com/article/10.1007/s11948-013-9484-x > accessed 2 April 2020, p. 817 131 P. M. Asaro, ‘The labor of surveillance and bureaucratized killing: New subjectivities of military drone operators’ (2013) Vol. 23 No. 2 Social Semiotics, <https://www.tandfonline.com/doi/abs/10.1080/10350330.2013.777591> accessed 9 February 2020, p. 197 132 M. Noorman, D. G. Johnson, ‘Negotiating autonomy and responsibility in military robots’ Vol. 16 (2014) Ethics and Information Technology, <https://link.springer.com/article/10.1007/s10676-013-9335-0?shared-article-renderer> accessed 21 February 2020, p. 59 133 M. Schulzke, ‘Autonomous weapons and distributed responsibility’ (2012) No. 26 Philosophy and Technology, http://link.springer.com/article/10.1007%2Fs13347-012-0089-0, accessed 9 February 2020, p. 204 134 M. Noorman, D. G. Johnson, ‘Negotiating autonomy and responsibility in military robots’ Vol. 16 (2014) Ethics and Information Technology, <https://link.springer.com/article/10.1007/s10676-013-9335-0?shared-article-renderer> accessed 21 February 2020, p. 59
4.3. Conclusion
The Phalanx CIWS was selected as a representative example of a defensive LAWS as it is
broadly used by different Western countries. It is a pre-programmed autonomous system that, based upon a number of physical parameters such as velocity, will destroy all incoming threats with a cloud of conventional ammunition as soon as they enter a predetermined perimeter. The exclusively defensive use of such a LAWS results in a modified targeting process that is an abbreviated version of the traditional process for conventional offensive targeting. However, the underlying principle of distinction must still be adhered to and is used in the analysis to demonstrate the impact of the employment of defensive LAWS on the notions of criminal individual and command responsibility.
Due to the exclusively defensive application of the Phalanx, mainly against incoming unmanned missiles and mortar projectiles, non-compliance with the principle of distinction, and thus a breach of IHL, is less likely to occur, and criminal responsibility is less likely to be triggered. However, if the defensive LAWS were to cause a breach, then it will most likely be the commander who is held criminally responsible on the basis of individual responsibility rather than on the basis of command responsibility, as it will be very hard to fulfil the prerequisites linked to the commander-operator interaction and the operator as a subordinate. Indeed, the operator has little time to intervene and has no active role when supervising a defensive LAWS, which makes establishing individual responsibility for the operator hard as well. In the case of the Phalanx the role of the subordinate, being the operator, is thus so limited that it is hard for him to be implicated in such criminal offences, making command responsibility a hollow legal concept.
Chapter 5: Future prospects
The issue of autonomy in LAWS is today mainly a reality within the context of defensive
LAWS such as the Phalanx CIWS and these systems are already widely implemented on a
diversity of platforms by different nations. When engaging these defensive systems, the military
does not go through the complete classic targeting process as the targets are mainly unmanned
incoming threats such as missiles or mortar shells. This means there are not many issues
regarding (criminal) responsibility linked to potential breaches of IHL based upon the principle
of distinction or other principles. Today if something were to go wrong it would most likely be
the commander who will be held individually responsible and possibly depending the specific
circumstances also the operator. Defensive LAWS have within the scope of their current usage
profile, been ‘accepted’ by the (legal) community and are considered as being compliant with
IHL. This chapter will briefly go into the challenges of an ever-broadening employment of
autonomous systems that are also receiving increasingly offensive applications. This creates a potential function creep that raises wider and more complex legal challenges regarding criminal responsibility than those identified in the analysis of defensive systems, despite the existing safeguards.
5.1. A possible function creep
A possible function creep was already signaled during the ICRC expert meeting on autonomous
weapon systems that took place in Geneva in 2014.135 The creeping process towards more and
more autonomous functions allowing a broader use requires reflection at the political level.136
“A thorough analysis of the respective human–machine interface is particularly important.
This is the only way to ensure that the transfer of decision-making and responsibility to the
machine proceeds as desired and that human control in the targeting process is maintained”.137
A LAWS created for a particular use in a certain context could later be used in wider contexts.138
135 ICRC Expert Meeting on ‘Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons’(2016) Geneva, Switzerland,https://reliefweb.int/sites/reliefweb.int/files/resources/4221-002-autonomous-weapons-systems-full-report%20%281%29.pdf > accessed 21 February 2020 136 M. Dickow, A. Dahlmann ‘Preventive Regulation of Autonomous Weapon Sytems: Need for Action by Germany at Various Levels’ (2019) SWP Research Paper 3, <https://www.swp-berlin.org/fileadmin/contents/products/research_papers/2019RP03_dnn_dkw.pdf> accessed 27 December 2019, p. 23 137 M. Dickow, A. Dahlmann ‘Preventive Regulation of Autonomous Weapon Sytems: Need for Action by Germany at Various Levels’ (2019) SWP Research Paper 3, <https://www.swp-berlin.org/fileadmin/contents/products/research_papers/2019RP03_dnn_dkw.pdf> accessed 27 December 2019, p. 23 138ICRC Expert Meeting on ‘Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons’(2016) Geneva, Switzerland,https://reliefweb.int/sites/reliefweb.int/files/resources/4221-002-autonomous-weapons-systems-full-report%20%281%29.pdf > accessed 21 February 2020, p. 16; J. Lang, R.
As LAWS are already ‘approved’ for defensive use, they might be more easily accepted for broader applications, thus increasing the probability of violating IHL. This function creep would
mean the systems are used in situations in which they no longer fully comply with the principles
of distinction, precautions in attack and proportionality. These principles are formulated in a
legally abstract way. They depend very much on the context, which makes implementing them
in generically applicable machine rules more complicated or even impossible.139 Certain states could abuse this ambiguity, particularly when organizations such as the armed forces are under strong (political) pressure to provide tangible results on the battlefield and the technological options are available.
It is clear that the risk of function creep is present both in the context of a conventional war (the so-called NATO Article 5 scenario) and in a crisis management context. The current geopolitical context, with a more antagonistic multipolar world order, pushes countries such as the USA, China and Russia towards a faster pace of development of weapon systems that could provide them a competitive advantage on the battlefield. In addition, the death toll of peace-building and peacekeeping operations (such as those in Afghanistan and Iraq) pushes the involved countries to use new technologies that prevent the loss of life of their soldiers. Autonomous
standoff systems like unmanned vehicles facilitate a function creep as they contribute to the
false idea of the possibility of a ‘clean’ war. Politically, the engagement of LAWSs is attractive
as it minimizes the political risk (‘body bag syndrome’) when waging a war. Here it is clear that some countries, such as the USA and China, are less constrained by multilateral legal frameworks (treaties, conventions, etc.) than, for example, EU countries. In this regard, the regulation on the European Defence Fund of the European Parliament and the Council has put clear limits (under pressure of the European Parliament) on the use of European funds for research into unmanned lethal systems. The USA and China are not hampered by such limitations on their further development of LAWS, as in the past they likewise did not want to be hindered by the ratification of limiting treaties such as the Ottawa Treaty, which outlaws anti-personnel mines, or the Convention on Cluster Munitions. In a NATO context, this tension
also exists in processes such as the NATO Defence Planning Process (NDPP) or in the NATO
Science for Peace and Security (SPS) Program that “promotes dialogue and practical
van Munster & R. M. Schott, ‘Failure to define killer robots means failure to regulate them’ (2018) Danish Institute For International Studies, <https://www.diis.dk/en/research/failure-to-define-killer-robots-means-failure-to-regulate-them> accessed 21 February 2020 139 M. Dickow, ‘Preventive Regulation of Autonomous Weapon Systems: Need for Action by Germany at Various Levels’ (2019) SWP Research Paper 3, <https://www.swp-berlin.org/fileadmin/contents/products/research_papers/2019RP03_dnn_dkw.pdf> accessed 27 December 2019, p. 6
cooperation between NATO member states and partner nations based on scientific research,
technological innovation and knowledge exchange”.140
Already in 2016 Deputy Defense Secretary Robert Work stated that the US Defense Department was not going to give lethal authority to a machine. However, he further stated that this could change due to ‘authoritarian regimes’ developing such machines.141 Such autonomous systems could have unintended consequences with dire and global effects.142 A recurring line of reasoning for allowing ever more autonomous capabilities, eventually also in offensive systems, is that there “are competitors and adversaries willing to accept the risks of these technologies even if they entail unjust harms”.143 According to this kind of reasoning, one can either develop these technologies and risk inflicting such harm, or refrain and risk being vulnerable and disadvantaged.144 For states that are bound by the social contract to see to the security and well-being of their citizens, permitting such vulnerabilities and disadvantages represents its own kind of moral failure.145 This thought pattern
can be criticized: the states and citizens who can afford to possess such weapons will probably be ‘safer’, but many states and their citizens (on whose territories the weapons will probably be used) will be subjected to the risks of their use.146 States should not risk causing harm or violating international norms simply because adversaries do so147, as this would be a slippery slope leading to a situation where no one abides by IHL.148 The morally correct attitude is not to disregard this increasing level of autonomy in weapon systems simply because such
140 ‘Science for Peace and Security’ (2020) North Atlantic Treaty Organization, <https://www.nato.int/cps/en/natolive/78209.htm> accessed 22 Februari 2020 141 D. Lamothe, ‘Pentagon examining the ‘killer robot’ threat’ (2016) Boston Globe, <https://www.bostonglobe.com/news/nation/2016/03/30/the-killer-robot-threat-pentagon-examining-how-enemies-could-empower-machines/sFri6ZDifwIcQR2UgyXlQI/story.html > accessed 22 February 2020 142 A. Pfaff, ‘The Ethics of Acquiring Disruptive Military Technologies’ (2019) Vol. 3 No. 1 Texas National Security Review, <https://tnsr.org/2020/01/the-ethics-of-acquiring-disruptive-military-technologies/> accessed 20 March 2020 143 Ibid 144 Ibid 145 Podcast, <https://technologyandsociety.org/military-robots-mapping-the-moral-landscape/> accessed 20 March 2020 146 J. G. Surrey, ‘Military Robots: Mapping the Moral Landscape’ (2017) in Book Reviews, Magazine Articles, Robotics, Social Impact, <https://technologyandsociety.org/military-robots-mapping-the-moral-landscape/> accessed 20 March 2020 ; D. P. Lackey, Moral Principles and Nuclear Weapons ( Rowman & Littlefield 1984), <https://books.google.be/books?id=Zr2AhJTZzgIC&pg=PA7&lpg=PA7&dq=Social+contract++weapons&source=bl&ots=yx_QnFR1e3&sig=ACfU3U1YGdSp93-fV68vtz3ElXidv-Tjig&hl=nl&sa=X&ved=2ahUKEwiCvf_Jzb_pAhWGC-wKHUtWDk8Q6AEwCXoECAoQAQ#v=onepage&q=Social%20contract%20%20weapons&f=false> accessed 20 March 2020, p. 7 147E. C. Hirschman, ‘Social contract theory and the semiotics of guns in America’ (2014) Vol. 24 No. 5 Social Semiotics, <https://www.tandfonline.com/doi/abs/10.1080/10350330.2014.937077> accessed 21 March 2020, p. 543 148Podcast, <https://www.bnr.nl/podcast/de-strateeg/10404954/oorlog-in-de-toekomst-doet-de-mens-nog-mee>accessed 20 March 2020
risks exist, as the technology is there and is not going away.149 If necessity permits overriding moral commitments, the outcome is a normative incoherence that jeopardizes the traditional rules of international behavior, such as IHL, and places citizens’ lives and well-being on the line.150 The right approach would be to acknowledge the (unavoidable) progress of the technology and to create a multilateral ethical and legal framework that is shared as widely as possible, as has been done in the past with other potentially harmful (weaponized) technologies. This should be done, if possible, within the existing international legal constellation and by applying the existing norms and principles. If this is not feasible, an adjusted ad hoc framework has to be created and agreed upon. This makes it possible to cope with the challenges of the evolving technology and to use it within the agreed boundaries. We must act with due diligence when developing and producing such increasingly autonomous systems, as they risk the introduction of unjust means and/or unjust applications that will no longer be in conformity with IHL. The growing aversion to multilateralism of the USA, which historically was, together with Europe, an advocate of international legal regimes, is an important factor in this respect.
5.2. Legal safeguards against a function creep
One could wonder what is legally possible to avoid such a function creep. The legal mechanism developed to meet such concerns is found in Article 36 of Additional Protocol I to the 1949 Geneva Conventions (AP I), which demands that states conduct a so-called ‘weapon review’. This is a legal review that investigates the (un)lawfulness of any weapon, means or method of warfare before its implementation in an armed conflict.151 It is an obligation “to determine
whether the weapon’s employment would, in some or all circumstances, be prohibited by this
Protocol or by any other rule of international law applicable to the High Contracting Party”.152
This obligation comes into force during “the study, development, acquisition or adoption of a
new weapon, means or method of warfare”153 to “prevent the use of weapons that would violate
149 A. Pfaff, ‘The Ethics of Acquiring Disruptive Military Technologies’ (2019) Vol. 3 No. 1 Texas National Security Review, <https://tnsr.org/2020/01/the-ethics-of-acquiring-disruptive-military-technologies/> accessed 20 March 2020 150 Ibid 151 V. Boulanin, M. Verbruggen, ‘Mapping the Development of Autonomy in Weapon Systems’, Stockholm International Peace Research Institute 2017, <https://www.sipri.org/publications/2017/other-publications/mapping-development-autonomy-weapon-systems> accessed 21 February 2020, p. 73; ‘A Guide to the Legal Review of Weapons, Means and Methods of Warfare’ (2006) International Committee of the Red Cross, <https://www.icrc.org/en/publication/0902-guide-legal-review-new-weapons-means-and-methods-warfare-measures-implement-article> accessed 21 April 2020, p. 4 152 Art. 36 AP I 153 Ibid
international law”154 and to “impose restrictions on the use of weapons that would violate
international law in some circumstances, by determining their lawfulness before they are
developed, acquired or otherwise incorporated into a State’s arsenal”.155 Art. 36 requires that
all new weapons and weapon systems be tested for compliance with IHL and “any other rule
of international law applicable”.156 Some new weapon systems are tested only when adopted
by a state’s armed forces. A LAWS, however, conceptually has no application other than
use during armed conflict. “For a State producing weapons… reviews should take place
at the stage of the conception/design of the weapon, and thereafter at the stages of its
technological development (development of prototypes and testing), and … before entering into
the production contract”.157 This reflects U.S. policy, which requires at least two legal reviews:
one before deciding to start formal development of the system and another before fielding the
weapon system.158 For an autonomous weapon system, with its myriad subsystems and
sensors, each running algorithm-laden programs, Article 36 testing will require an even more
rigorous and continuous test process. For any autonomous system, testing will be lengthy,
complex and ongoing. The greater the system’s autonomy, the more rigorous the testing
should be.
“To be found lawful by a weapons review, a weapon cannot be indiscriminate by its
very nature”.159 In its normal, routine use, a weapon must be capable of discriminating between
civilians and combatants, as the principle of distinction requires.160 This is a low bar; few
weapons fail this test.161 Second, a weapon consistent with IHL’s balance between military
necessity and humanity cannot be of a ‘nature’ to ‘engender unnecessary suffering or
superfluous injury’.162 Third, a weapon can be illegal if its deleterious impacts cannot be
154 ‘A Guide to the Legal Review of Weapons, Means and Methods of Warfare’ (2006) International Committee of the Red Cross, <https://www.icrc.org/en/publication/0902-guide-legal-review-new-weapons-means-and-methods-warfare-measures-implement-article> accessed 21 April 2020, p. 933 155 Ibid, p. 933 156 Article 36 AP I 157 ‘A Guide to the Legal Review of Weapons, Means and Methods of Warfare’ (2006) International Committee of the Red Cross, <https://www.icrc.org/en/publication/0902-guide-legal-review-new-weapons-means-and-methods-warfare-measures-implement-article> accessed 21 April 2020, p. 23 158 DoD, ‘Directive 3000.09’ (2012) <www.dtic.mil/whs/directives/corres/pdf/300009p.pdf> accessed 27 December 2019, paras 1.a (5) and 1.b. (6) 159 W. H. Boothby, Weapons and the Law of Armed Conflict (Oxford University Press 2009), p. 78 160 G. D. Brown, A. O. Metcalf, ‘Easier Said Than Done: Legal Reviews of Cyber Weapons’ (2014) Vol. 7 Journal of National Security Law and Policy, <https://jnslp.com/wp-content/uploads/2014/02/Easier-Said-than-Done.pdf> accessed p. 115 161 K. Anderson, D. Reisner, M. Waxman, ‘Adapting the Law of Armed Conflict to Autonomous Weapon Systems’ (2014) Vol. 90 Naval War College’s International Legal Studies, <https://digital-commons.usnwc.edu/cgi/viewcontent.cgi?article=1015&context=ils> accessed 18 April 2020, p. 399 162 Article 35(2) AP I
‘controlled’.163 This applies mainly to biological weapons, since the path of a virus cannot be
guided once it is released.164 Since the weapons review standard is low, a properly
validated LAWS may well be able to pass the review in the future.165 There is little state
practice regarding the implementation of this provision, and only a few states are known to
perform this review.166 The interpretation of article 36 has been fiercely debated167 and the
degree to which states fulfill this obligation is unclear.168 Due to the sensitive nature of
LAWS technology, states will probably not make such a review public, as there are “strong
military and security reasons”169 for this. Under customary international law, this
obligation exists for all states, even those that are not party to Additional Protocol I.
However, most states do not have laws or processes in place to conduct this legal weapon review
for new weapons: fewer than 20 of the 174 states party to AP I have
such a mechanism.170 Art 36 is thus not a strong guarantee that this potential function
creep can be legally assessed and contained.
The next generation of LAWSs, currently already under development, will create a number of
additional ethical and legal challenges due to fast-evolving technological possibilities. AI and
163 M. N. Schmitt, J. Thurnher, ‘‘Out of the Loop’: Autonomous Weapons Systems and the Law of Armed Conflict’ (2013) Vol. 4 Harvard National Security Journal, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2212188> accessed 22 April 2020, p. 250; Article 51(4)(c) AP I 164 K. Anderson, D. Reisner, M. Waxman, ‘Adapting the Law of Armed Conflict to Autonomous Weapon Systems’ (2014) Vol. 90 Naval War College’s International Legal Studies, <https://digital-commons.usnwc.edu/cgi/viewcontent.cgi?article=1015&context=ils> accessed 18 April 2020, p. 400 165 P. Margulies, ‘Making Autonomous Weapon Accountable: command responsibility for Computer-Guided Lethal Force in Armed Conflicts’ (2016) No. 166 Roger Williams University Legal Studies Paper, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2734900> accessed 20 March 2020, p. 5-6 166 J. D. Fry, ‘Contextualized Legal Reviews for the Methods and Means of Warfare: Cave Combat and International Humanitarian Law’ (2006) Vol. 44 No. 2 Columbia Journal of Transnational Law, <https://www.researchgate.net/publication/294182434_Contextualized_legal_reviews_for_the_methods_and_means_of_warfare_Cave_combat_and_international_humanitarian_law> accessed 25 April 2020, p. 473 167 I. Daoust, R. Coupland, R. Ishoey, ‘New Wars, New Weapons? The Obligation of States to Assess the Legality of Means and Methods of Warfare’ (2002) Vol. 84 International Review of the Red Cross, <https://www.icrc.org/en/doc/assets/files/other/345_364_daoust.pdf> accessed 25 April 2020, p. 352-354; J. McClelland, ‘The Review of Weapons in Accordance with Article 36 of Additional Protocol I’ (2003) Vol. 85 International Review of the Red Cross, <https://www.icrc.org/en/doc/assets/files/other/irrc_850_mcclelland.pdf> accessed 25 April 2020, p. 414 168 M. Jacobsson, ‘Modern Weaponry and Warfare: The Application of Article 36 of Additional Protocol I by Governments’ (2007) Vol. 82 International Law Studies, <https://digital-commons.usnwc.edu/cgi/viewcontent.cgi?article=1231&context=ils> accessed 25 April 2020, p. 184-185 169 W. H. Boothby, Weapons and the Law of Armed Conflict (Oxford University Press 2009), p. 343 170 V. Boulanin, M. Verbruggen, ‘Mapping the Development of Autonomy in Weapon Systems’, Stockholm International Peace Research Institute 2017, <https://www.sipri.org/publications/2017/other-publications/mapping-development-autonomy-weapon-systems> accessed 21 February 2020, p. 73
more specifically machine learning171 will allow the systems to make autonomous decisions
based upon their own learning process and the optimization of their initial parameters. The LAWSs
will receive certain mission information from which they are able to derive possible approaches
to act on this information. An important aspect of controlling machine learning is to systematically
train, test and validate.172 First, a programmer will enter certain basic data, for instance on the
appearance of the adversary state or non-state actors. Based upon this input, the systems develop
their own approach to carrying out the task by optimizing all the related parameters
(for example the distinction between combatants and civilians) and ‘train’ themselves. This
training allows the machine to anticipate analogous situations which might not have been
included in the basic data set. To assure that the machine is performing as it should (according
to the different standards), the programmer must test the machine periodically on a
different/separate data set.173 If the machine performs the way it is supposed to, “the programmer
will validate the results by varying the test set to ensure that no fortuitous overlap between the
previous test set and the training set artificially inflated the machine performance”.174
Validation depends on keeping the percentage of mistakes low in comparison with accurate
results.175 There are two important types of mistakes, namely false positives and false negatives.
A false positive occurs when, for example, the machine wrongfully identifies someone as a
combatant whereas he or she is in fact a civilian. A false negative would in this case mean that
the machine identifies a subject as a civilian when he or she is in fact a combatant.176 This ties
171 P. Margulies, ‘Making Autonomous Weapon Accountable: command responsibility for Computer-Guided Lethal Force in Armed Conflicts’ (2016) No. 166 Roger Williams University Legal Studies Paper, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2734900> accessed 20 March 2020, p. 7 172 S. Tuffery, Data Mining and Statistics for Decision Making (John Wiley and Sons 2011), p. 304 173 L. Zhou, ‘A Comparison of Classification Methods for Predicting Deception in Computer-Mediated Communication’ (2004) Vol. 20 No. 4 Journal of Management Information Systems, <https://www.tandfonline.com/doi/abs/10.1080/07421222.2004.11045779> accessed 25 April 2020, p. 152; P. Margulies, ‘Making Autonomous Weapon Accountable: command responsibility for Computer-Guided Lethal Force in Armed Conflicts’ (2016) No. 166 Roger Williams University Legal Studies Paper, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2734900> accessed 20 March 2020, p. 8 174 P. Margulies, ‘Making Autonomous Weapon Accountable: command responsibility for Computer-Guided Lethal Force in Armed Conflicts’ (2016) No. 166 Roger Williams University Legal Studies Paper, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2734900> accessed 20 March 2020, p. 8; P. Flach, Machine Learning: The Art and Science of Algorithms that Make Sense of Data (Cambridge University Press 2012), p. 162; S. J. Russell, P. Norvig, Artificial Intelligence: A Modern Approach (Prentice Hall 2010) 175 P. Flach, Machine Learning: The Art and Science of Algorithms that Make Sense of Data (Cambridge University Press 2012), p. 56; I. H. Witten, E. Frank, M. A. Hall, Data Mining: Practical Machine Learning Tools and Techniques (Elsevier 2011), p. 174-177; P. Margulies, ‘Making Autonomous Weapon Accountable: command responsibility for Computer-Guided Lethal Force in Armed Conflicts’ (2016) No. 166 Roger Williams University Legal Studies Paper, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2734900> accessed 20 March 2020, p. 8 176 P. Margulies, ‘Making Autonomous Weapon Accountable: command responsibility for Computer-Guided Lethal Force in Armed Conflicts’ (2016) No. 166 Roger Williams University Legal Studies Paper, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2734900> accessed 20 March 2020, p. 8
in with the article 36 weapon review, because in this review the incidence of false positives
and false negatives will have to be weighed against the legal standard that a weapon may not
be indiscriminate by its ‘nature’.177
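The test-and-validate step and the two error types just described can be sketched in a few lines of code. The following is a purely illustrative sketch, not drawn from any actual weapon-review procedure: the ‘civilian’/‘combatant’ labels, the error thresholds and the function names are hypothetical, and a real certification process would of course involve far more than comparing labels.

```python
def error_rates(pairs):
    """Return (false_positive_rate, false_negative_rate) for a test set
    of (true_label, predicted_label) pairs. A false positive is a civilian
    wrongly classified as a combatant; a false negative is a combatant
    wrongly classified as a civilian."""
    fp = sum(1 for t, p in pairs if t == "civilian" and p == "combatant")
    fn = sum(1 for t, p in pairs if t == "combatant" and p == "civilian")
    civilians = sum(1 for t, _ in pairs if t == "civilian")
    combatants = sum(1 for t, _ in pairs if t == "combatant")
    fpr = fp / civilians if civilians else 0.0
    fnr = fn / combatants if combatants else 0.0
    return fpr, fnr

def validate(test_sets, max_fpr, max_fnr):
    """The system 'passes' only if every varied test set stays below both
    error thresholds, guarding against a fortuitous overlap between one
    particular test set and the training set."""
    return all(
        fpr <= max_fpr and fnr <= max_fnr
        for fpr, fnr in (error_rates(s) for s in test_sets)
    )
```

Varying the test set, as the validation step described above requires, corresponds to calling `validate` with several independently drawn test sets rather than a single one; the acceptable `max_fpr` and `max_fnr` thresholds are exactly the kind of (low) mistake percentages that a weapon review would have to weigh against the distinction standard.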
Validation and Verification (V&V) is crucial, as LAWS will only be allowed to act ever more
autonomously if they prove to be predictable and reliable in their behavior.178
Increasing levels of autonomy will require new methods of V&V to assure that a new weapon
system works as it should179 during its entire life cycle. V&V procedures are critical to
examine whether new weapon systems are safe, consistent and legal, and they determine
whether the weapon system can be certified.180 This is also confirmed by
the US DoD Roadmap, which states:
To ensure the safety and reliability of autonomous systems and to fully realize the
benefits of these systems, new approaches to V&V are required. V&V is the process of
checking that a product, service, or system meets specifications and that it fulfills its
intended purpose.181
The Technological Horizons report of the US Air Force emphasizes that new V&V approaches are
needed for LAWS to be certified, as current V&V methods are not suitable for certifying LAWS
177 Ibid 178 M. Noorman, D. G. Johnson, ‘Negotiating autonomy and responsibility in military robots’ Vol. 16 (2014) Ethics and Information Technology, <https://link.springer.com/article/10.1007/s10676-013-9335-0?shared-article-renderer> accessed 21 February 2020, p. 59 179 V. Boulanin, M. Verbruggen, ‘Mapping the Development of Autonomy in Weapon Systems’, Stockholm International Peace Research Institute 2017, <https://www.sipri.org/publications/2017/other-publications/mapping-development-autonomy-weapon-systems> accessed 21 February 2020, p. 68; S. Russell, D. Dewey, M. Tegmark, ‘Research Priorities for Robust and Beneficial Artificial Intelligence’ (2015) Future of Life Institute: Boston, <https://futureoflife.org/data/documents/research_priorities.pdf> accessed 22 February 2020, p. 108 180 V. Boulanin, ‘Implementing Article 36 weapon reviews in light of increasing autonomy in weapon systems’ (2015) No. 2015/1 SIPRI Insight on Peace and Security, <https://www.sipri.org/sites/default/files/files/insight/SIPRIInsight1501.pdf> accessed 20 April 2020, p. 15; V. Boulanin, M. Verbruggen, ‘Mapping the Development of Autonomy in Weapon Systems’, Stockholm International Peace Research Institute 2017, <https://www.sipri.org/publications/2017/other-publications/mapping-development-autonomy-weapon-systems> accessed 21 February 2020, p. 68 181 DoD, ‘Directive 3000.09’ (2012) <www.dtic.mil/whs/directives/corres/pdf/300009p.pdf> accessed 27 December 2019, p. 50
for use.182 There is thus a call for new methods of V&V to predict, understand and control these
new weapon systems.183
5.3. Impact of offensive LAWSs on the principle of distinction
Several countries are already developing and testing autonomous systems for more offensive
tasks. The research and development (R&D) and experimentation pipelines
already contain a variety of offensive applications based on existing land, air and maritime
systems. Already today, the Israeli army makes use of an unmanned aerial system called the Harpy.
The Harpy can autonomously destroy enemy radar systems within certain geographic
boundaries and is a further development of existing systems that still had a man in the loop. To
date, most countries carry out such suppression of enemy air defense with a manned airplane:
a pilot flies over an area and sees on his display an indication of an enemy
radar signal. He can then release a (fire-and-forget) missile that automatically follows the
signal of the radar and destroys the target. The Israeli Harpy used today is a drone that is sent
over a vast area and automatically destroys every radar emitting a signal. This is
clearly an offensive application that can create issues in terms of the principle of distinction,
increasing the probability of a breach of IHL and triggering criminal responsibility.
The principle of distinction becomes more problematic in the context of more offensive
applications like the Harpy, as such systems can recognize the electronic signature of the
adversary’s radars but are not yet capable of appreciating the surrounding context. The Harpy cannot
assess whether the radar is near civilians or civilian objects184 and cannot distinguish between
the different categories protected by IHL (civilians, persons hors de combat, cultural heritage
sites, …). Much also depends on the precise circumstances of use. For example, the Harpy
could be considered compliant with the principle of distinction when used in remote areas
where civilians and civilian objects are not present. However, in more urbanized and thus
populated areas, where the targeting becomes much more dynamic and complex, the system
182 W. J. A. Dahm, Air Force Chief Scientist, ‘Report on technological horizons: A vision for air force science & technology during 2010-2030’ (Office of the Chief Scientist of the U.S. Air Force 2010), <https://www.researchgate.net/publication/302305033_Technology_Horizons_A_Vision_for_Air_Force_Science_Technology_During_2010-2030> accessed 20 April 2020, p. 105 183 M. Noorman, D. G. Johnson, ‘Negotiating autonomy and responsibility in military robots’ Vol. 16 (2014) Ethics and Information Technology, <https://link.springer.com/article/10.1007/s10676-013-9335-0?shared-article-renderer> accessed 21 February 2020, p. 59 184 V. Boulanin, M. Verbruggen, ‘Mapping the Development of Autonomy in Weapon Systems’, Stockholm International Peace Research Institute 2017, <https://www.sipri.org/publications/2017/other-publications/mapping-development-autonomy-weapon-systems> accessed 21 February 2020, p. 73-74
should be able to form a more sophisticated appreciation of the civilians and civilian objects
surrounding the radars before engaging.185 According to Nathalie Weizmann,186 the system
should:
be able to evaluate a person’s membership in the state’s armed forces e.g. distinct from
a police officer- and his or her membership in an armed group (with or without a
continuous combat function), whether or not he or she is directly participating in
hostilities, and whether or not he or she is hors de combat … [It] would also need to be
able to, first, recognize situations of doubt that would cause a human to hesitate before
attacking and, second, refrain from attacking objects and persons in those
circumstances.187
This is not yet possible at the current level of AI technology. An offensive system like
today’s Harpy cannot comply with the principle of distinction on its own. A human assessment of
a congested environment, based upon a broader spectrum of sensors including human intelligence
(HUMINT), remains necessary. “Distinguishing between a fearful civilian and a threatening
enemy combatant requires a soldier to understand the emotions behind a human’s actions,
something a robot cannot do”.188 To be ethical and lawful, the distinction assessment must be
carried out, and in certain complex circumstances this will not be possible without human
intervention. If this intervention does not occur, the principle of distinction is not respected,
which could lead to detrimental outcomes. “A major IHL issue is thus that offensive LAWS
cannot discriminate between combatants and non-combatants or … combatants that are wounded,
[or] have surrendered… in a way that would satisfy the principle of distinction”.189 “Weapon
systems that would be able to assess civilian status… as part of their own independent targeting
185 Ibid, p. 74 186 Researcher for the Columbia Law School 187 N. Weizmann, ‘Autonomous Weapon Systems under International Law’ (2014) No. 8 Academy Briefing Geneva Academy, <https://www.geneva-academy.ch/joomlatools-files/docman-files/Publications/Academy%20Briefings/Autonomous%20Weapon%20Systems%20under%20International%20Law_Academy%20Briefing%20No%208.pdf> accessed 22 February 2020, p. 14 188 ‘Losing Humanity: The Case against Killer Robots’ (2012) Human Rights Watch, <https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots> accessed 22 February 2020; G. D. Solis, The Law of Armed Conflict: International Humanitarian Law in War (Cambridge University Press 2016), p. 541-542 189 N. E. Sharkey, ‘The Evitability of Autonomous Robot Warfare’ (2012) Vol. 94 No. 886 International Review of the Red Cross, <https://www.law.upenn.edu/live/files/3399-sharkey-n-the-evitability-of-autonomous-robot> accessed 22 February 2020, p. 787-788; G. D. Solis, The Law of Armed Conflict: International Humanitarian Law in War (Cambridge University Press 2016), p. 539
decisions… do not exist today and research toward such capabilities currently remains in the
realm of theory”.190
5.4. Consequences for the notions of individual criminal responsibility and command responsibility, and possible solutions
As long as a human remains in the loop, such as an operator or a commander who chooses to
deploy the weapon system, responsibility can still be attributed. However, as soon as the
operator becomes almost passive in target engagements or even disappears (in the case of
future developments), the emphasis on the individual responsibility of the commanding officer will
become ever more prominent. Because the doctrine of command responsibility is a form of
vicarious liability, meaning it requires a commander-subordinate relationship, it only comes into
play when there is still a subordinate (operator) who is directly liable. As soon as the operator
disappears, an issue arises, as the LAWS itself does not have moral agency.191 “LAWS have no
moral agency and as a result cannot be held responsible in any recognizable way if they cause
deprivation of life that would normally require accountability if humans had made the
decisions”.192 This would make the doctrine of command responsibility in its current form
inapplicable. There are three options to obtain a more coherent and enforceable legal
framework.
Firstly, the command responsibility doctrine could be adapted and redrafted in a form
that would make it usable when deploying LAWS, by focussing on a shared organization-
commander responsibility. The emphasis would lie on the decision to deploy and engage
LAWS instead of the current focus on, and requirements linked to, the superior-subordinate
relationship and the prerequisite of the subordinate committing a crime. These requirements
will no longer be workable in the future, as the role of the subordinate will be insignificant or
nonexistent and a LAWS does not have the intent or mens rea to commit a crime.
The second option would be to use, and possibly expand, the existing doctrine of individual
responsibility, where one would have to prove that the commander intentionally
190 K. Anderson, D. Reisner, M. Waxman, ‘Adapting the Law of Armed Conflict to Autonomous Weapon Systems’ (2014) Vol. 90 Naval War College’s International Legal Studies, < https://digital-commons.usnwc.edu/cgi/viewcontent.cgi?article=1015&context=ils> accessed 18 April 2020, p. 308, 310-311 191 C. Heyns, ‘Human Rights Council Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions’ (2013) No. A/HRC/23/47 United Nations, paragraph 26-126; C. Allen, W. Wallach, ‘Moral Machines: Contradiction in Terms or Abdication of Human Responsibility?’ in P. Lin, G. Bekey, K. Abney, Robot Ethics: The Ethical and Social Implications of Robots (MIT Press 2011), p. 62 192 C. Heyns, ‘Human Rights Council Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions’ (2013) No. A/HRC/23/47 United Nations, paragraph 26-126
acted in a wrongful way. Individual responsibility is a possible way forward within the current
legal framework, as one would not have to prove a superior-subordinate relationship. However,
a significant hurdle can be the need to prove wrongful intent. There are as yet no cases
in which commanders have been held responsible for incidents with LAWS, so it is hard to predict
whether this ground for accountability would hold in court. In such a case, an ad hoc assessment
of the precise circumstances, the decision making and the specificities of the system will be
important.
A third option would be to reintroduce a form of liability based upon individual
responsibility where all the actors involved in the ecosystem surrounding a LAWS could be
held responsible. It would be a form of distributed responsibility between decision makers, and
the centre of gravity would lie with the stakeholder(s) that had the biggest impact on
the decision making, depending on the precise moment and circumstances. This form of ‘collective
criminality’ already exists. It was introduced in the aftermath of WWII in Nuremberg and
developed by the ICTY in the Tadic case193 under the concept of ‘Joint Criminal Enterprise’
(JCE) as a special modality of individual criminal responsibility. However, in the ICC
jurisprudence this construction was replaced by the concept of joint control over the crime. The
doctrine of joint control over the crime is a modality of indirect co-perpetration which attributes
the actions of individuals under the control of one co-perpetrator to all other co-perpetrators in
the common plan.194 This legal doctrine is less relevant in the context of LAWS as it also
demands a superior-subordinate relationship.
The construction that the ICTY developed could be a solution in situations where the modes of
criminal liability as written down in several statutes (such as the ICTY Statute) are evidently
inadequate to cover the different intensities of responsibility of the actors in a
common war crime.195 It was a legal doctrine aimed at prosecuting high-ranking political
and military leaders as members of a group for acts committed by the group. In the case of LAWS,
it would not be a group of actors that commits the war crime with a common purpose. It would
be the LAWS that commits the crime, but each actor that contributed, for instance by
programming the weapon system, deploying it or operating it, would carry some
193 Tadic (Case IT-94-1-T) ICTY judgment, The Hague 14 July 1997 194 Dominic Ongwen case, <https://www.icc-cpi.int/CourtRecords/CR2019_00588.PDF> accessed 15 May 2020, paragraph 24 195 G. Bigi, ‘Joint Criminal Enterprise in the Jurisprudence of the International Criminal Tribunal for the Former Yugoslavia and the Prosecution of Senior Political and Military Leaders: The Krajisnik Case’ (2010) Vol. 10 Max Planck Yearbook of United Nations Law, <https://www.mpil.de/files/pdf3/mpunyb_02_bigi_14.pdf> accessed 22 April, p. 53
responsibility. Each actor involved in the process would have a share in the common
responsibility, even if not physically involved in the actual commission of the crime.196
LAWS make it difficult to pinpoint one person as the perpetrator, as the system works
autonomously. However, all actors involved might have contributed in some way to allowing the
LAWS to act in a certain way. It would be unethical not to hold someone accountable when a
LAWS causes a war crime, given that the LAWS itself cannot be prosecuted. Further,
these actors create the operating capabilities and conditions that can contribute to the LAWS
behaving unjustly, or they may fail to build in sufficient constraints.197 Therefore, it is not
illogical that all actors contributing to the functionality of the LAWS would be held (partially)
accountable.

With the scope of this thesis in mind, the focus is primarily on criminal
responsibility within the armed forces, and more specifically on the actors deciding to deploy the
LAWS and its users. If the LAWS’ actions are authorized by military commanders, they could,
depending on the circumstances, be held responsible for breaches of IHL committed by the
LAWS.198 For instance, if commanders deploy a LAWS into combat without just and
adequately formulated rules of engagement (ROE), they should be held responsible. Besides the
principle of distinction, there are thus also other factors to take into account, such as the ROE
and the environment. ROE are forward-looking199 in the sense that they describe certain
expectations of how a soldier should behave on a certain mission. ROE articulate in which
circumstances a soldier may use lethal force. In the case of LAWS, the ROE could also zoom in
on the use of these systems, specifying under which circumstances a certain type of offensive
LAWS may be engaged. It is clear that a deployment in a peacekeeping or peace enforcement
environment is more complex than in a conventional conflict (NATO Art 5 context). Describing
these circumstances in the ROE could also help to hold someone criminally responsible after an
incident if the circumstances for the use of lethal force were clearly not met. Nevertheless, the
degree of responsibility depends heavily on the specific situation. It is not possible to give a
clear, definite answer on the degree of responsibility that would cover all cases.200
196 G. Bigi, ‘Joint Criminal Enterprise in the Jurisprudence of the International Criminal Tribunal for the Former Yugoslavia and the Prosecution of Senior Political and Military Leaders: The Krajisnik Case’ (2010) Vol. 10 Max Planck Yearbook of United Nations Law, <https://www.mpil.de/files/pdf3/mpunyb_02_bigi_14.pdf> accessed 22 April, p. 59; N. Piacente, ‘Importance of the Joint Criminal Enterprise. Criminal Liability by Prosecutorial Ingenuity and Judicial Creativity?’ (2004) Vol. 2 Journal of International Criminal Justice, p. 606 197 M. Schulzke, ‘Autonomous weapons and distributed responsibility’ (2012) No. 26 Philosophy and Technology, <http://link.springer.com/article/10.1007%2Fs13347-012-0089-0> accessed 9 February 2020, p. 213 198 Ibid, p. 204 199 M. Noorman, ‘Responsibility Practices and Unmanned Military Technologies’ (2014) Vol. 20 Science and Engineering Ethics, <https://link.springer.com/article/10.1007/s11948-013-9484-x> accessed 2 April 2020, p. 814 200 Ibid, p. 213
It is clear that collective responsibility is a complex concept that extends beyond the
military actors and the context of IHL. As offensive LAWS evolve, as with any new
technology, new roles are created for a larger number of human actors. These actors can
become quite numerous, e.g. developers, procurement officers and policy makers.201 As
these actors are not necessarily part of the military and therefore not within the scope of this
thesis, I will not discuss them in extenso. Regarding developers and other actors involved in the
manufacturing process, it is valid to state that it should be possible to hold them responsible if the
LAWS’ unlawful actions result from the way it is programmed (software/hardware issues).
They should thus share in the responsibility for a war crime committed by a LAWS to
the extent that they contributed to or allowed the LAWS’ violation of IHL or failed to create the
necessary precautions against such violations.202 The legal basis for the concept of shared
responsibility, however, will be broader than IHL.
5.5. Conclusion
Current research, experimentation and testing indicate that the future use of LAWS will not
remain limited to exclusively defensive applications. Already today, there are several land, air
and maritime LAWS prototypes that possess offensive capabilities. The Israeli army is already
using the Harpy system, which can autonomously destroy enemy radar systems within a
certain bounded area. This offensive application already generates considerably more risk of
non-compliance with the principle of distinction and thus increases the risk of breaching a
principle of IHL. There lies a clear danger in such a broadening integration of a variety of
autonomous systems into defence inventories, as they gradually stretch the boundaries of future
use, thus impacting the related IHL principles and notions such as criminal responsibility.
The doctrine of command responsibility only comes into play when there is still a subordinate who is directly liable; as soon as the operator disappears, command responsibility in its current form becomes inapplicable.
There are three options to deal with future (more offensive) applications of LAWS.
A first option is to adapt the doctrine of command responsibility so that it remains usable when deploying LAWS, by focusing on a shared organization-commander responsibility. The emphasis would then lie on the decision to deploy and engage LAWS rather than on the superior-subordinate relationship.
201 M. Noorman, ‘Responsibility Practices and Unmanned Military Technologies’ (2014) Vol. 20 Science and Engineering Ethics, <https://link.springer.com/article/10.1007/s11948-013-9484-x> accessed 2 April 2020, p. 818
202 M. Schulzke, ‘Autonomous weapons and distributed responsibility’ (2012) No. 26 Philosophy and Technology, <http://link.springer.com/article/10.1007%2Fs13347-012-0089-0> accessed 9 February 2020, p. 213
A second option would be to fundamentally review the entire command responsibility doctrine, broadening or adapting it to include organizational decisions, such as the choice of which weapon systems to deploy, instead of keeping it a ground of liability that depends on the crime of a subordinate.
A third option, which emphasizes individual responsibility, is to create a legal framework based on a doctrine previously applied by the ICTY that entails a broader distribution of individual responsibility. This would create the possibility of involving a whole ecosystem of actors, unless one actor clearly made an individual mistake.
Chapter 6: Conclusion
A radical doctrinal evolution, or even revolution, has taken place in military affairs since WWI, when senior leadership almost exclusively determined all strategy and tactics. Until WWI the emphasis was undeniably on command responsibility. Before and during WWII the concept of mission command was modernized and reintroduced, emphasizing individual responsibility down to the lowest echelons. This doctrinal approach has proven its added value to this day, given the vast military theatres such as Afghanistan and Iraq and the shrinking of the old mass armies.
Technologies such as AI and robotics create the need to reflect upon the validity of the existing doctrinal approach, as they offer the possibility to engage LAWS even for lethal purposes. This reduces the impact of human military judgement and decreases the accountability of the military hierarchy: technology replaces large parts of the decision-making process and thereby limits the often subjective human control previously exercised during that process. It also reshuffles the sharing of responsibility among the different stakeholders involved in the design, production, procurement and employment of such unmanned systems, but it does not end human responsibility, despite the possible future disappearance of the man in the loop. Specifically for the military employing the system, this means that the current legal framework is no longer fully suited to these situations. This thesis addresses that problem by using the principle of distinction, which underlies the targeting process, as an analytical tool to investigate the impact of employing defensive LAWS on the legal notions of individual and command responsibility.
The main research question reads: how does the deployment of LAWS impact the notions of criminal individual and command responsibility? Defining LAWS unambiguously was not necessary to conduct the research for this thesis; it proved more useful to describe the characteristics and usage modalities of LAWS. In practice, however, current applications are mainly defensive in nature. The most widely used defensive system is the Phalanx CIWS, which was selected for this research as it offers a representative example of defensive LAWS. Phalanx is a pre-programmed defensive weapon system that destroys all incoming imminent threats with a cloud of conventional munition as soon as they enter a predefined perimeter. The research analysed how the characteristics and usage modalities of the Phalanx system influence the principle of distinction, allowing conclusions to be drawn regarding criminal individual and command responsibility with regard to defensive LAWS.
The decision to use a specific type of defensive weapon system for a certain mission, in a specific area of operations and in a specific operational context, is the most important moment where human control is exercised and where responsibility can be linked to the commander who makes the deployment decision. Should anything go wrong in a legal sense, the responsibility would fall on the commander: a system error could in this case cause a war crime, triggering the commander's individual criminal responsibility. In this way the military doctrine of mission command, which emphasizes the responsibility of the lower echelons, is no longer applicable, as the commander again carries more responsibility, as was the case during WWI. Paradoxically, this does not mean that the doctrine of criminal command responsibility becomes dominant; quite the contrary, as this doctrine is based upon the commander-subordinate relationship and presupposes the commission of criminal acts by the subordinate, who no longer plays an important role.
The research demonstrated that the command responsibility doctrine becomes impracticable in a situation where an operator has either a very limited active role or none at all. Within the current legal framework, individual criminal responsibility could offer a legal recourse, as no subordination relationship needs to be proven. This legal ground currently seems best suited to situations involving defensive LAWS.
The thesis further briefly analysed what the future use of more offensive LAWS could mean for the notions of individual and command responsibility, as it is important to acknowledge the current R&D trend towards a broader employment and more offensive applications based on existing land, air and maritime systems. The Harpy is an existing example of such a broader application of LAWS. This Israeli system can autonomously destroy radar systems of the adversary, whereas the previous generation still required an operator to pull the trigger. Such an offensive application creates far more ambiguity with regard to compliance with the principle of distinction, as the system is not capable of appreciating the potentially complex and congested environment of the radar it will attack. This could result in undesired outcomes with collateral damage if the radar system is located in a densely populated area. It also creates the possibility for non-state actors to use the legal complexity of IHL against armed forces operating in compliance with IHL, leading to serious breaches of the principle of distinction and thus invoking criminal responsibility. The current legal framework, which is able to cope with current levels of autonomy linked to defensive use, will require adjustments in order to retain its legal utility. Neither command nor individual criminal responsibility fully covers all legal dimensions of the broader employment of LAWS. One option could be an adaptation of criminal command responsibility, currently limited to an indirect responsibility for acts perpetrated by subordinates, to also cover the command decision to deploy LAWS. Another option could be
to introduce a liability framework based on individual responsibility where all the actors in the
LAWS ecosystem could bear (a part of) the responsibility. Even though a growing number of
tasks will be executed by technological components, human responsibility should remain the
basis of the new responsibility (and accountability) equation.
The use of more offensive LAWS will create friction with the current principles of IHL, but this could be partially countered by better enforcement of the Article 36 weapon review, complemented by the development of new, effective V&V procedures.
There lies a clear and present danger in an incremental broadening towards more offensive applications of LAWS. It raises the question whether the current core principles of IHL will be able to cope with such a ground-breaking evolution in a fast-evolving technological environment, or whether a thorough adaptation or even revision of the current legal foundation will be required. The present-day defensive employment of LAWS still permits the application of the current IHL framework (although it was not drawn up for this situation) and its allegedly timeless principles, but as function creep evolves towards ever more offensive applications, the need to adjust the regulatory framework will become increasingly prominent. How will we reconcile the underlying legal principles of the targeting process with offensive LAWS in a way that keeps their employment in correspondence with IHL? Where will the centre of gravity of responsibility be situated when systems become even more autonomous and offensive, able to integrate the whole or an important part of the targeting process, making them potential assessors of life and death? Such questions are possible further elaborations of the subject and could be a topic for future research.
Bibliography ___________________________________________________________________________
PRIMARY SOURCES
LEGISLATION
US legislation
US Department of Defense, ‘Directive 3000.09’ (2012) <www.dtic.mil/whs/directives/corres/pdf/300009p.pdf> accessed 27 December 2019
International treaties
‘Additional Protocol I to the Geneva Conventions’ (1977) Diplomatic Conference on the Reaffirmation and Development of International Humanitarian Law applicable in Armed Conflicts
International Committee of the Red Cross, ‘The Geneva Conventions of 1949 and their Additional Protocols’ <https://www.icrc.org/en/doc/war-and-law/treaties-customary-law/geneva-conventions/overview-geneva-conventions.htm> accessed 2 January 2020
‘The Rome Statute of the International Criminal Court’ (1998) Vol. 2187 No. 38544 United Nations Treaty Series
Charter of the International Military Tribunal – Annex to the Agreement for the prosecution and punishment of the major war criminals of the European Axis (‘London Agreement’), 8 August 1945
CASE LAW
International Criminal Court
Bemba (Case ICC-01/05-01/08), ICC Judgment, The Hague 8 June 2018
Dominic Ongwen (Case ICC-02/04-01/15), ICC Judgment, The Hague 1 February 2019
International Criminal Tribunal for the former Yugoslavia
Celebici (Case CC/PIU/364-E), ICTY Judgment, The Hague 16 November 1998
Tadic (Case IT-94-1-T), ICTY Judgment, The Hague 14 July 1997
SECONDARY SOURCES
POLICY DOCUMENTS
United Nations
ICRC Expert Meeting on ‘Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons’ (2016) Geneva, Switzerland, <https://reliefweb.int/sites/reliefweb.int/files/resources/4221-002-autonomous-weapons-systems-full-report%20%281%29.pdf> accessed 21 February 2020
C. Heyns, ‘Human Rights Council Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions’ (2013) No. A/HRC/23/47 United Nations
Official US policy documents
‘Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems’ (2019) Congressional Research Service, <https://fas.org/sgp/crs/natsec/IF11150.pdf> accessed 21 March 2020, p. 1
U.S. Government, ‘Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems’ (2018) CCW/GGE.2/2018/WP.4, <https://www.unog.ch/80256EDD006B8954/(httpAssets)/D1A2BA4B7B71D29FC12582F6004386EF/$file/2018_GGE+LAWS_August_Working+Paper_US.pdf> accessed 21 March 2020
European Union
‘Future Trends from the Capability Development Plan’ (2008) European Defence Agency <https://www.eda.europa.eu/docs/documents/brochure_cdp.pdf> accessed 2 April 2020
DOCTRINE
Books
C. Allen, W. Wallach, ‘Moral Machines: Contradiction in Terms or Abdication of Human Responsibility?’ in P. Lin, G. Bekey, K. Abney (eds), Robot Ethics: The Ethical and Social Implications of Robots (MIT Press 2011)
D. Fleck, The Handbook of International Humanitarian Law (Oxford University Press 2013)
D. P. Lackey, Moral Principles and Nuclear Weapons (Rowman & Littlefield 1984)
D. Saxon, International Humanitarian Law and the Changing Technology of War (Martinus Nijhoff Publishers 2013)
E. Di Nucci, F. Santoni de Sio, Drones and Responsibility: Legal, Philosophical and Socio-Technical Perspectives on Remotely Controlled Weapons (Routledge 2016)
G. A. Bekey, Autonomous Robots: From Biological Inspiration to Implementation and Control (MIT Press 2005)
G. D. Solis, The Law of Armed Conflict: International Humanitarian Law in War (Cambridge University Press 2016)
I. H. Witten, E. Frank, M. A. Hall, Data Mining: Practical Machine Learning Tools and Techniques (Elsevier 2011)
I. van de Poel, ‘The Relation Between Forward-Looking and Backward-Looking Responsibility’ in N. Vincent, J. van den Hoven, I. van de Poel (eds), Moral Responsibility: Beyond Free Will and Determinism (Springer 2011)
J. Campbell, Naval Weapons of World War Two (Naval Institute Press 2002)
J. Carlson, Citizen-Protectors: The Everyday Politics of Guns in an Age of Decline (Oxford University Press 2015)
K. Ipsen, ‘Combatants and Non-Combatants’ in D. Fleck (ed), The Handbook of International Humanitarian Law (Oxford University Press 2014)
L. Elliott, B. Stewart, ‘Automation and Autonomy in Unmanned Aircraft Systems’ in Introduction to Unmanned Aircraft Systems (CRC Press 2011)
M. Luck, S. Munroe, M. D’Inverno, ‘Autonomy: Variable and Generative’ in H. Hexmoor, C. Castelfranchi, R. Falcone (eds), Agent Autonomy (Kluwer 2003)
N. Bhuta, S. Beck, R. Geiß, H. Liu, C. Kreß, Autonomous Weapons Systems: Law, Ethics, Policy (Cambridge University Press 2016)
P. Flach, Machine Learning: The Art and Science of Algorithms that Make Sense of Data (Cambridge University Press 2012)
P. Scharre, Army of None: Autonomous Weapons and the Future of War (W.W. Norton & Company 2018)
P. Springer, Military Robots and Drones: A Reference Handbook (ABC-CLIO 2013)
R. Arkin, Governing Lethal Behaviour in Autonomous Robots (CRC Press 2009)
S. J. Russell, P. Norvig, Artificial Intelligence: A Modern Approach (Prentice Hall 2010)
S. Oeter, ‘Methods and Means of Combat’ in D. Fleck (ed), The Handbook of International Humanitarian Law (Oxford University Press 2014)
S. Tuffery, Data Mining and Statistics for Decision Making (John Wiley and Sons 2011)
T. B. Sheridan, Telerobotics, Automation and Human Supervisory Control (MIT Press 1992)
W. H. Boothby, Weapons and the Law of Armed Conflict (Oxford University Press 2009)
(Online) articles, papers and reports
A. Dahlmann, M. Dickow, ‘Preventive Regulation of Autonomous Weapon Systems: Need for Action by Germany at Various Levels’ (2019) SWP Research Paper 3, <https://www.swp-berlin.org/fileadmin/contents/products/research_papers/2019RP03_dnn_dkw.pdf> accessed 27 December 2019
A. Ilachinski, ‘AI, Robots, and Swarms: Issues, Questions and Recommended Studies’ (2017) CNA, <https://www.cna.org/cna_files/pdf/DRM-2017-U-014796-Final.pdf> accessed 20 March 2020
A. J. Plunkett, ‘Iwo Jima Officer Killed in Firing Exercise’ (1989) Daily Press, <http://articles.dailypress.com/1989-10-12/news/8910120238_1_iwo-jima-ship-close-in-weapons-system> accessed 20 March 2020
A. Pfaff, ‘The Ethics of Acquiring Disruptive Military Technologies’ (2019) Vol. 3 No. 1 Texas National Security Review, <https://tnsr.org/2020/01/the-ethics-of-acquiring-disruptive-military-technologies/> accessed 20 March 2020
C. Heyns, ‘Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions’ (2013) UN General Assembly, <https://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf> accessed 28 December 2019
C. Jenks, ‘False Rubicons, Moral Panic & Conceptual Cul-De-Sacs: Critiquing & Reframing the Call to Ban Lethal Autonomous Weapons’ Southern Methodist University, Dedman School of Law, <https://scholar.smu.edu/law_faculty/495/> accessed 20 March 2020
C. Saad, E. Gosal, ‘Autonomous Weapon Systems: How to Work Towards a Total Ban?’ The Canadian Bar Association, <https://www.cba.org/Sections/International-Law/Articles/2019/Autonomous-weapons-systems-how-to-work-towards-a> accessed 1 April 2020
C. W. Marra, S. K. McNeil, ‘Understanding “The Loop”: Regulating the Next Generation of War Machines’ Vol. 36 Harvard Journal of Law & Public Policy, <https://www.harvard-jlpp.com/wp-content/uploads/sites/21/2013/05/36_3_1139_Marra_McNeil.pdf> accessed 20 March 2020
D. E. Vandergriff, ‘How the Germans Defined Auftragstaktik: What Mission Command Is – AND – Is Not’ Small Wars Journal, <https://smallwarsjournal.com/jrnl/art/how-germans-defined-auftragstaktik-what-mission-command-and-not> accessed 3 January 2020
D. H. Petraeus, ‘As Machines Wage War, Human Nature Endures’ (2017) Zocalo Public Square, <https://www.zocalopublicsquare.org/2017/03/29/machines-wage-war-human-nature-endures/ideas/nexus/> accessed 2 January 2020
D. Lamothe, ‘Pentagon examining the ‘killer robot’ threat’ (2016) Boston Globe, <https://www.bostonglobe.com/news/nation/2016/03/30/the-killer-robot-threat-pentagon-examining-how-enemies-could-empower-machines/sFri6ZDifwIcQR2UgyXlQI/story.html> accessed 22 February 2020
D. Lawless, ‘The problems facing the principle of distinction in international humanitarian law due to the changing nature of armed conflict – the effects of an increasing ‘civilian’ population on the battlefield for this principle’ Thesis, <https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=13&ved=2ahUKEwiQ_Mbio_zoAhVMNOwKHcJkDMgQFjAMegQIAxAB&url=http%3A%2F%2Fwww.scriptiesonline.uba.uva.nl%2Fdocument%2F621757&usg=AOvVaw3wf5DOd4Ls3aaJ59qxUXBt> accessed 22 April 2020
D. Pavlas, C. S. Burke, S. M. Fiore, E. Salas, R. Jensen, D. Fu, ‘Enhancing Unmanned Aerial Systems Training: A Taxonomy of Knowledge, Skills, Attitudes, and Methods’ (2009) Vol. 53 No. 26 Human Factors and Ergonomics Society Annual Meeting Proceedings, <https://journals.sagepub.com/doi/10.1177/154193120905302604> accessed 2 April 2020
E. Chavannes, A. Arkhipov-Goyal, ‘Towards Responsible Autonomy – The Ethics of Robotic Autonomous Systems in a Military Context’ (2019) The Hague Centre for Strategic Studies, <https://hcss.nl/report/towards-responsible-autonomy-ethics-ras-military-context> accessed 2 January 2020
E. C. Hirschman, ‘Social contract theory and the semiotics of guns in America’ (2014) Vol. 24 No. 5 Social Semiotics, <https://www.tandfonline.com/doi/abs/10.1080/10350330.2014.937077> accessed 21 March 2020
F. Slijper, ‘Where to Draw the Line: Increasing Autonomy in Weapon Systems – Technology and Trends’ (2017) Paxforpeace, <www.paxvoorvrede.nl/> accessed 1 April 2020
G. Bigi, ‘Joint Criminal Enterprise in the Jurisprudence of the International Criminal Tribunal for the Former Yugoslavia and the Prosecution of Senior Political and Military Leaders: The Krajisnik Case’ (2010) Vol. 10 Max Planck Yearbook of United Nations Law, <https://www.mpil.de/files/pdf3/mpunyb_02_bigi_14.pdf> accessed 22 April
G. D. Brown, A. O. Metcalf, ‘Easier Said Than Done: Legal Reviews of Cyber Weapons’ (2014) Vol. 7 Journal of National Security Law and Policy, <https://jnslp.com/wp-content/uploads/2014/02/Easier-Said-than-Done.pdf> accessed 22 April
G. Swiney, ‘Saving Lives: The Principle of Distinction and the Realities of Modern War’ (2005) Vol. 39 No. 3 The International Lawyer, <https://www.jstor.org/stable/40707812> accessed 2 April 2020
H. Huang, E. Messina, J. Albus, ‘Autonomy Level Specification for Intelligent Autonomous Vehicles: Interim Progress Report’ in Proceedings of the Performance Metrics for Intelligent Systems (PerMIS) Workshop (Courtyard Gaithersburg Washingtonian Center 2003), <https://www.govinfo.gov/content/pkg/GOVPUB-C13-e1537a1db6acc39af59986f6f4722b48/pdf/GOVPUB-C13-e1537a1db6acc39af59986f6f4722b48.pdf> accessed 20 March 2020
H. M. Roff, R. Moyes, ‘Meaningful Human Control, Artificial Intelligence and Autonomous Weapons’ (2016) briefing paper prepared for the Informal Meeting of Experts on Lethal Autonomous Weapons Systems, UN Convention on Certain Conventional Weapons, <http://www.article36.org/wp-content/uploads/2016/04/MHC-AI-and-AWS-FINAL.pdf> accessed 27 December 2017
I. Daoust, R. Coupland, R. Ishoey, ‘New Wars, New Weapons? The Obligation of States to Assess the Legality of Means and Methods of Warfare’ (2002) Vol. 84 International Review of the Red Cross, <https://www.icrc.org/en/doc/assets/files/other/345_364_daoust.pdf> accessed 25 April 2020
I. Giesen, F. G. H. Kristen, ‘Liability, Responsibility and Accountability: Crossing Borders’ (2014) Vol. No. 3 Utrecht Law Review, <https://www.utrechtlawreview.org/articles/10.18352/ulr.280/galley/281/download/> accessed 27 December 2019
J. B. Mbokani, ‘The Doctrine of “Command Responsibility” in the Bemba Case’ (2011) International Justice Monitor, <https://www.ijmonitor.org/2011/07/the-doctrine-of-command-responsibility-in-the-bemba-case/> accessed 14 May 2020
J. D. Fry, ‘Contextualized Legal Reviews for the Methods and Means of Warfare: Cave Combat and International Humanitarian Law’ (2006) Vol. 44 No. 2 Columbia Journal of Transnational Law, <https://www.researchgate.net/publication/294182434_Contextualized_legal_reviews_for_the_methods_and_means_of_warfare_Cave_combat_and_international_humanitarian_law> accessed 25 April 2020
J. G. Surrey, ‘Military Robots: Mapping the Moral Landscape’ (2017) in Book Reviews, Magazine Articles, Robotics, Social Impact, <https://technologyandsociety.org/military-robots-mapping-the-moral-landscape/> accessed 20 March 2020
J. Lang, R. van Munster, R. M. Schott, ‘Failure to define killer robots means failure to regulate them’ (2018) Danish Institute for International Studies, <https://www.diis.dk/en/research/failure-to-define-killer-robots-means-failure-to-regulate-them> accessed 21 February 2020
J. Lewis, ‘The Case for Regulating Fully Autonomous Weapons’ (2015) Vol. 124 No. 4 The Yale Law Journal, <https://www.yalelawjournal.org/comment/the-case-for-regulating-fully-autonomous-weapons> accessed 29 March 2020
J. McClelland, ‘The Review of Weapons in Accordance with Article 36 of Additional Protocol I’ (2003) Vol. 85 International Review of the Red Cross, <https://www.icrc.org/en/doc/assets/files/other/irrc_850_mcclelland.pdf> accessed 25 April 2020
J. M. Wheatcroft, M. Jump, A. L. Breckel, J. Adams-White, ‘Unmanned Aerial Systems (UAS) operators’ accuracy and confidence of decisions: Professional pilots or video game players?’ (2017) Cognitive Science & Neuroscience, <https://www.cogentoa.com/article/10.1080/23311908.2017.1327628> accessed 3 April 2020
K. Anderson, D. Reisner, M. Waxman, ‘Adapting the Law of Armed Conflict to Autonomous Weapon Systems’ (2014) Vol. 90 Naval War College’s International Legal Studies, <https://digital-commons.usnwc.edu/cgi/viewcontent.cgi?article=1015&context=ils> accessed 18 April 2020, p. 399
L. A. Dickinson, ‘Drones, Automated Weapons, and Private Military Contractors’ (2018) Cambridge University Press, <https://www.cambridge.org/core/books/new-technologies-for-human-rights-law-and-practice/drones-automated-weapons-and-private-military-contractors/C6F1C06CBFA06E46D7A8E67A0BB3065D/core-reader> accessed 3 April 2020, p. 115
Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion, I.C.J. Reports 1996, International Court of Justice (ICJ), 8 July 1996
L. Zhou, ‘A Comparison of Classification Methods for Predicting Deception in Computer-Mediated Communication’ (2004) Vol. 20 No. 4 Journal of Management Information Systems, <https://www.tandfonline.com/doi/abs/10.1080/07421222.2004.11045779> accessed 25 April 2020, p. 152
M. A. C. Ekelhof, ‘Lifting the fog of targeting: “Autonomous Weapons” and Human Control through the Lens of Military Targeting’ (2018) Vol. 71 No. 3 U.S. Naval War College, <https://www.jstor.org/stable/10.2307/26607067> accessed 1 April 2020
M. Dickow, ‘Preventive Regulation of Autonomous Weapon Systems: Need for Action by Germany at Various Levels’ (2019) SWP Research Paper 3, <https://www.swp-berlin.org/fileadmin/contents/products/research_papers/2019RP03_dnn_dkw.pdf> accessed 27 December 2019, p. 23
M. Ekelhof, ‘Complications of a Common Language: Why it is so Hard to Talk about Autonomous Weapons’ (2017) Vol. 22 No. 2 Journal of Conflict and Security Law, <https://www.researchgate.net/publication/323416370_Complications_of_a_Common_Language_Why_it_is_so_Hard_to_Talk_about_Autonomous_Weapons> accessed 1 April 2020
M. Gubrud, ‘Autonomy without Mystery: Where do you draw the line?’ (2014) 1.0 Human
M. Gubrud, ‘Stopping killer robots’ (2014) Vol. 70 No. 1 Bulletin of the Atomic Scientists, <https://journals.sagepub.com/doi/pdf/10.1177/0096340213516745> accessed 20 March 2020
M. Jacobsson, ‘Modern Weaponry and Warfare: The Application of Article 36 of Additional Protocol I by Governments’ (2007) Vol. 82 International Law Studies, <https://digital-commons.usnwc.edu/cgi/viewcontent.cgi?article=1231&context=ils> accessed 25 April 2020
M. Noorman, ‘Mind the gap: a critique of human/technology analogies in artificial agents discourse’ (2009) PhD thesis, Universitaire Pers Maastricht, <https://cris.maastrichtuniversity.nl/ws/portalfiles/portal/1656284/guid-fa7527ce-ee3e-4600-bf08-99ff26d24330-ASSET1.0.pdf> accessed 21 February 2020
M. Noorman, ‘Responsibility Practices and Unmanned Military Technologies’ (2014) Vol. 20 Science and Engineering Ethics, <https://link.springer.com/article/10.1007/s11948-013-9484-x> accessed 2 April 2020
M. Noorman, D. G. Johnson, ‘Negotiating autonomy and responsibility in military robots’ (2014) Vol. 16 Ethics and Information Technology, <https://link.springer.com/article/10.1007/s10676-013-9335-0?shared-article-renderer> accessed 21 February 2020
M. N. Schmitt, J. Thurnher, ‘“Out of the Loop”: Autonomous Weapons Systems and the Law of Armed Conflict’ (2013) Vol. 4 Harvard National Security Journal, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2212188> accessed 22 April 2020
M. N. Schmitt, ‘Military Necessity and Humanity in International Humanitarian Law: Preserving the Delicate Balance’ (2010) Vol. 50 No. 4 Virginia Journal of International Law
M. Roorda, ‘NATO’s Targeting Process: Ensuring Human Control Over and Lawful Use of ‘Autonomous’ Weapons’ (2015) No. 2015-13 Amsterdam Law School Legal Studies Research Paper, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2593697> accessed 2 April 2020
M. Sassoli, ‘Autonomous Weapons and International Humanitarian Law: Advantages, Open Technical Questions and Legal Issues to be Clarified’ (2014) Vol. 90 International Law Studies, U.S. Naval War College, <https://digital-commons.usnwc.edu/cgi/viewcontent.cgi?article=1017&context=ils> accessed 14 November 2019
M. Schmitt, J. Biller, S. C. Fabey, D. S. Goddard, C. Highfill, ‘Joint and Combined Targeting: Structure and Process’ in J. Ohlin, L. May, C. Finkelstein (eds), Weighing Lives in War (Oxford University Press 2017), <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2830229> accessed 2 April 2020
M. Schulzke, ‘Autonomous weapons and distributed responsibility’ (2012) No. 26 Philosophy and Technology, <http://link.springer.com/article/10.1007%2Fs13347-012-0089-0> accessed 9 February 2020
M. Wagner, ‘Taking Humans Out of the Loop: Implications for International Humanitarian Law’ (2011) No. 21 Journal of Law, Information and Science, <https://www.researchgate.net/publication/228193208_Taking_Humans_Out_of_the_Loop_Implications_for_International_Humanitarian_Law> accessed 1 April 2020
N. E. Sharkey, ‘The Evitability of Autonomous Robot Warfare’ (2012) Vol. 94 No. 886 International Review of the Red Cross, <https://www.law.upenn.edu/live/files/3399-sharkey-n-the-evitability-of-autonomous-robot> accessed 22 February 2020
N. Lee, S. Brown, ‘Otherness and the Actor Network: The Undiscovered Continent’ (1994) Vol. 37 No. 6 American Behavioural Scientists, <https://journals.sagepub.com/doi/10.1177/0002764294037006005> accessed 24 March 2020
N. Piacente, ‘Importance of the Joint Criminal Enterprise. Criminal Liability by Prosecutorial Ingenuity and Judicial Creativity?’ (2004) Vol. 2 Journal of International Criminal Justice
N. Weizmann, ‘Autonomous Weapon Systems under International Law’ (2014) No. 8 Academy Briefing, Geneva Academy, <https://www.geneva-academy.ch/joomlatools-files/docman-files/Publications/Academy%20Briefings/Autonomous%20Weapon%20Systems%20under%20International%20Law_Academy%20Briefing%20No%208.pdf> accessed 22 February 2020
P. Asaro, ‘The labor of surveillance and bureaucratized killing: New subjectivities of military drone operators’ (2013) Vol. 23 No. 2 Social Semiotics, <https://www.tandfonline.com/doi/abs/10.1080/10350330.2013.777591> accessed 9 February 2020
P. Asaro, ‘On banning autonomous weapon systems: human rights, automation, and the dehumanization of lethal decision-making’ (2012) Vol. 94 No. 886 International Review of the Red Cross, <https://www.cambridge.org/core/services/aop-cambridge-core/content/view/992565190BF2912AFC5AC0657AFECF07/S1816383112000768a.pdf/on_banning_autonomous_weapon_systems_human_rights_automation_and_the_dehumanization_of_lethal_decisionmaking.pdf> accessed 15 November 2019
P. Asaro, ‘Why the world needs to regulate autonomous weapons, and soon’ (2018) Bulletin of the Atomic Scientists, <https://thebulletin.org/2018/04/why-the-world-needs-to-regulate-autonomous-weapons-and-soon/#> accessed 29 March 2020
P. Lin, G. Bekey, K. Abney, ‘Robots in war: issues of risk and ethics’ in R. Capurro, M. Nagenborg (eds), Ethics and Robotics (AKA Verlag Heidelberg 2009), <https://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?referer=https://www.google.com/&httpsredir=1&article=1010&context=phil_fac> accessed 1 April 2020
P. Scharre, ‘Lethal Autonomous Weapons and Policy-Making Amid Disruptive Technological Change’ (2017) Just Security, <https://www.justsecurity.org/47082/lethal-autonomous-weapons-policy-making-disruptive-technological-change/> accessed 20 March 2020
R. Christian, ‘Heed the Call: A Moral and Legal Imperative to Ban Killer Robots’ (2018) HRW, <https://www.hrw.org/report/2018/08/21/heed-call/moral-and-legal-imperative-ban-killer-robots> accessed 29 March 2020
R. Crootof, ‘The Killer Robots Are Here: Legal and Policy Implications’ (2015) Vol. 36 No. 5 Cardozo Law Review, <https://heinonline-org.tilburguniversity.idm.oclc.org/HOL/Page?lname=&public=false&collection=journals&handle=hein.journals/cdozo36&men_hide=false&men_tab=toc&kind=&page=1837> accessed 27 December 2019
R. Elio, A. Petrinjak, ‘Normative Communication Models for Agent’ (2005) Vol. 11 No. 3 Autonomous Agents and Multi Agent Systems, <https://link.springer.com/article/10.1007/s10458-004-0555-x> accessed 24 March 2020
R. Parasuraman, V. Riley, ‘Humans and automation: Use, misuse, disuse, abuse’ (1997) Vol. 39 No. 2 The Journal of the Human Factors Society, <https://www.ise.ncsu.edu/wp-content/uploads/2017/02/Parasuraman_Riley_1997_HF.pdf> accessed 20 March 2020
R. Sparrow, ‘Killer robots’ (2007) Vol. 24 No. 1 Journal of Applied Philosophy, <https://wmpeople.wm.edu/asset/index/cvance/sparrow> accessed 20 March 2020
R. S. B. Kool, ‘(Crime) Victims’ Compensation: The Emergence of Convergence’ (2014) Vol. 10 No. 3 Utrecht Law Review, <https://www.utrechtlawreview.org/articles/abstract/10.18352/ulr.281/> accessed 28 December 2019
R. Wagner, ‘Agility and Self-Organisation – Success Factors for the Prussian Army in the 19th Century’ International Project Management Association, <https://www.ipma.world/agility-and-self-organisation-success-factors-for-the-prussian-army-in-19th-century/> accessed 3 January 2020
S. Russell, D. Dewey, M. Tegmark, ‘Research Priorities for Robust and Beneficial Artificial Intelligence’ (2015) Future of Life Institute: Boston, <https://futureoflife.org/data/documents/research_priorities.pdf> accessed 22 February 2020
S. Welsh, ‘Regulating Autonomous Weapons’ (2017) RealClear Defense, <https://www.realcleardefense.com/articles/2017/11/16/regulating_autonomous_weapons_112647.html> accessed 20 March 2020
T. B. Sheridan, W. L. Verplank, ‘Human and computer control of undersea teleoperators’ (1978) Man-Machine Systems Laboratory, Department of Mechanical Engineering, MIT, <https://apps.dtic.mil/dtic/tr/fulltext/u2/a057655.pdf> accessed 20 March 2020
‘The EU, NATO and Artificial Intelligence’ (2019) Report ISS, <https://www.iss.europa.eu/sites/default/files/EUISSFiles/EU%20NATO%20AI%20-%20Report.pdf> accessed 28 March 2020
V. Boulanin, ‘Implementing Article 36 weapon reviews in light of increasing autonomy in weapon systems’ (2015) No. 2015/1 SIPRI Insight on Peace and Security, <https://www.sipri.org/sites/default/files/files/insight/SIPRIInsight1501.pdf> accessed 20 April 2020, p. 15
V. Boulanin, M. Verbruggen, ‘Mapping the Development of Autonomy in Weapon Systems’ (2017) Stockholm International Peace Research Institute, <https://www.sipri.org/publications/2017/other-publications/mapping-development-autonomy-weapon-systems> accessed 21 February 2020
W. J. A. Dahm, ‘Report on technological horizons: A vision for air force science & technology during 2010-2030’ (2010) Office of the Chief Scientist of the U.S. Air Force, <https://www.researchgate.net/publication/302305033_Technology_Horizons_A_Vision_for_Air_Force_Science_Technology_During_2010-2030> accessed 20 April 2020
Other
‘Department of Defense Support to Foreign Disaster Relief (Handbook for JTF Commanders and Below)’, <https://fas.org/irp/doddir/dod/disaster.pdf> accessed 1 April 2020, p. A-6
ICRC, ‘A Guide to the Legal Review of Weapons, Means and Methods of Warfare’ (2006) International Committee of the Red Cross, <https://www.icrc.org/en/publication/0902-guide-legal-review-new-weapons-means-and-methods-warfare-measures-implement-article> accessed 21 April 2020
‘Laser quest: Phalanx, LAWS and the future of close-in weapon systems’ (2014) Naval Technology, <https://www.naval-technology.com/features/featurelaser-quest-phalanx-laws-and-the-future-of-close-in-weapon-systems-4295413/> accessed 2 April 2020
‘Losing Humanity: The Case against Killer Robots’ (2012) Human Rights Watch, <https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots> accessed 22 February 2020
International Criminal Court, ‘Applying the Principles of Nuremberg in the ICC’ (2006) Keynote Address at the Conference “Judgement at Nuremberg” held on the 60th Anniversary of the Nuremberg Judgment, <https://www.icc-cpi.int/NR/rdonlyres/ED2F5177-9F9B-4D66-9386-5C5BF45D052C/146323/PK_20060930_English.pdf> accessed 27 December 2019
ICRC Expert Meeting, ‘Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons’ (2016) Versoix, Switzerland, <https://shop.icrc.org/autonomous-weapon-systems.html?___store=default&_ga=2.162034876.1219892157.1578954954-792592069.1578954954> accessed 27 December 2019
ICRC, ‘Rule 1’, The Principle of Distinction between Civilians and Combatants, in the Customary IHL Database
ICRC, ‘Rule 15’, The Principle of Precautions in Attack, in the Customary IHL Database
ICRC, ‘Rule 47’, Attacks against Persons Hors de Combat, in the Customary IHL Database
ICRC, ‘Views of the International Committee of the Red Cross on autonomous weapon systems’ (2016) ICRC Working Paper, <www.unog.ch/80256EDD006B8954/(httpAssets)/B3834B2C62344053C1257F9400491826/$file/2016_LAWS+MX_CountryPaper_ICRC.pdf> accessed 28 December 2019
‘Inside the Phalanx’ (2013) <https://www.howitworksdaily.com/inside-the-phalanx-ciws/> accessed 2 April 2020
NATO Standard, ‘AJP-3.9 Allied joint doctrine for joint targeting’ (2016) Ed. A Version 1, <https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/628215/20160505-nato_targeting_ajp_3_9.pdf> accessed 1 April 2020
‘Phalanx Close-In Weapon System (CIWS)’ (2006), <https://web.archive.org/web/20060820113825/http://wrc.navair-rdte.navy.mil/warfighter_enc/weapons/shiplnch/Guns/ciws.htm> accessed 2 April 2020
‘Phalanx Weapon System’ Raytheon Missiles & Defense, <https://www.raytheonmissilesanddefense.com/capabilities/products/phalanx-close-in-weapon-system> accessed 2 April 2020
Podcast, <https://www.bnr.nl/podcast/de-strateeg/10404954/oorlog-in-de-toekomst-doet-de-mens-nog-mee> accessed 20 March 2020
Podcast, <https://technologyandsociety.org/military-robots-mapping-the-moral-landscape/> accessed 20 March 2020
‘Science for Peace and Security’ (2020) North Atlantic Treaty Organization, <https://www.nato.int/cps/en/natolive/78209.htm> accessed 22 February 2020
‘USS Antrim (FFG 20)’ <https://www.navysite.de/ffg/FFG20.HTM> accessed 2 April 2020
<https://casebook.icrc.org/glossary/proportionality> accessed 2 January 2020
<https://www.cccattorneys.com/glossary> accessed 2 January 2020
<https://www.public.navy.mil/surfor/Pages/Phalanx-CIWS.aspx> accessed 20 March 2020
<https://www.raytheon.com/capabilities/products/phalanx> accessed 20 March 2020
Appendix ___________________________________________________________________________ Article 25 of the Rome Statute:
In accordance with this Statute, a person shall be criminally responsible and liable for
punishment for a crime within the jurisdiction of the Court if that person:
(a) Commits such a crime, whether as an individual, jointly with another or
through another person, regardless of whether that other person is criminally
responsible;
(b) Orders, solicits or induces the commission of such a crime which in fact
occurs or is attempted;
(c) For the purpose of facilitating the commission of such a crime, aids, abets
or otherwise assists in its commission or its attempted commission, including
providing the means for its commission;
(d) In any other way contributes to the commission or attempted commission of
such a crime by a group of persons acting with a common purpose. Such
contribution shall be intentional and shall either:
(i) Be made with the aim of furthering the criminal activity or criminal
purpose of the group, where such activity or purpose involves the
commission of a crime within the jurisdiction of the Court; or
(ii) Be made in the knowledge of the intention of the group to commit
the crime;
(e) In respect of the crime of genocide, directly and publicly incites others to
commit genocide;
(f) Attempts to commit such a crime by taking action that commences its
execution by means of a substantial step, but the crime does not occur because
of circumstances independent of the person's intentions. However, a person who
abandons the effort to commit the crime or otherwise prevents the completion of
the crime shall not be liable for punishment under this Statute for the attempt to
commit that crime if that person completely and voluntarily gave up the criminal
purpose.