Title: Autonomous Lethal Systems - international law and societal realities. Dr Chris Elliott FREng FRAeS, System engineer and barrister
Slide 1: Autonomous Lethal Systems - international law and societal realities
Dr Chris Elliott FREng FRAeS
System engineer and barrister
Pitchill Consulting Ltd
Slide 2: Alternative title - can a robot commit a war crime?
Slide 3: Continuum of uninhabited military vehicles
- effectiveness of abort command
Slide 4: Hierarchy of enablers
- technologies to enable autonomy
- capacity to engineer autonomy
- willingness to exploit autonomy
Slide 5: Legal framework
- deploying autonomous systems would raise issues of:
  - civil law, eg who is liable to pay damages for unlawful harm
  - administrative law, eg road traffic, COLREGS, civil aviation (including infringement of a neutral State)
  - the Law of International Armed Conflict (LOIAC)
- concentrate here on LOIAC and the societal factors related to it
- "In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party." Art. 36, Additional Protocol I (1977) to the 1949 Geneva Conventions
Slide 6: Basic concepts of LOIAC
- rooted in the concept of war held in the 19th and the first half of the 20th century
- addresses primarily armed conflict between sovereign States:
  - a State is either at peace or at war
  - a person is either a combatant or a non-combatant
  - civilians and prisoners of war are protected
- relies on established norms of international law and on specific codes in the successive Hague and Geneva conventions
- lags behind emerging technology, but the spirit of the law must still apply
- "In cases not covered by this Protocol or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from the dictates of public conscience." Protocol I 1977, restating the Martens Clause of the Hague Convention 1907
Slide 7: Principles of LOIAC relevant to Autonomous Lethal Systems (ALS)
- general prohibitions: do not cause unnecessary suffering or long-term environmental harm, do not violate neutral territory, do not make civilians the object of attack, distinguish combatants from non-combatants
- ALS fall between:
  - military vehicles (assumed to be inhabited by people in uniform or to carry designated markings)
  - mines, torpedoes and bombs (assumed to behave predictably after release and to disarm safely)
- weapons that are intrinsically incapable of distinguishing between civilian and military targets are illegal
- a serious breach of LOIAC (eg a war crime) is also a crime in the UK under the International Criminal Court Act 2001
Slide 8: LOIAC in the 21st century
- its assumptions are decreasingly valid in asymmetric warfare: is the UK currently at war?
- the distinction between combatant and non-combatant is blurred: how are civilians to be protected when they are used as human shields or as suicide bombers?
- LOIAC is becoming a weapon in itself: both sides claim the legal (and hence moral) high ground
- conclude that it is necessary to design autonomous systems that can comply with Rules of Engagement (RoE), but RoE might not follow the simple precepts of LOIAC
- face judgement in the Court of Public Opinion: trial by CNN?
- LOIAC as a set of standards of acceptable behaviour rather than a formal legal code? "Society expects its military to behave in a manner compatible with its moral values." (Andrew White)
Slide 9: Which of these is a war crime?
1. a system that autonomously targeted and destroyed an ambulance marked with the Red Cross symbol, which the system was unable to recognise
2. a system that autonomously targeted and destroyed a tank that was parked next to an ambulance, which was also destroyed in the blast
3. an autonomous system that suffered a technical failure and crashed into and destroyed an ambulance

- if 3 is not a war crime, it is sensible to distinguish lethal from non-lethal systems
- would the distinction between 2 and 3 hold up if the incident were shown live on CNN?
Slide 10: If 1 is a war crime, who committed it?
- the designer, who failed to include the capability to recognise international legal markings?
- the trainer (analogous to an animal trainer), who failed to teach the raw system to recognise the actual symbols in use?
- the configurer, who failed to include appropriate Rules of Engagement?
- the operator, who enabled and despatched the inadequate system?
- the commander, who permitted an inadequate system to be deployed?
- or the autonomous system itself?
Slide 11: When would deployment of an ALS be acceptable?
- asymmetric warfare charges human military personnel with almost impossible decisions
- suggest that an ALS should be no worse than a human at taking those decisions: hence a kind of military Turing test
- Alan Turing proposed a test of true artificial intelligence: it is passed when it is not possible to tell whether the replies to a series of questions are generated by a person or by a machine
- we are not there yet (and may be a long way off)
- legality (in the formal courts and in the court of public opinion) is a major barrier to the deployment of truly Autonomous Lethal Systems
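The "military Turing test" mooted above can be sketched as a blinded-judge trial. This is a toy illustration only: the scenario list, the decision policies and the pass threshold are all hypothetical placeholders, not anything proposed in the talk.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical engagement scenarios (placeholders for illustration).
SCENARIOS = [
    "tank in open ground, no civilians nearby",
    "tank parked beside a marked ambulance",
    "vehicle bearing the Red Cross symbol",
]

def human_decision(scenario):
    """Stand-in for a trained human operator's engagement decision."""
    return "hold fire" if ("ambulance" in scenario or "Red Cross" in scenario) else "engage"

def als_decision(scenario):
    """Stand-in for the autonomous system under evaluation."""
    return "hold fire" if ("ambulance" in scenario or "Red Cross" in scenario) else "engage"

def judge_accuracy(trials=1000):
    """A blinded judge sees one decision at a time and guesses whether a
    human or the machine produced it. When the two policies behave
    identically, the decision carries no information, so the judge can do
    no better than chance (about 0.5) and the system 'passes' the test."""
    correct = 0
    for _ in range(trials):
        scenario = random.choice(SCENARIOS)
        source = random.choice(["human", "machine"])
        # The decision is shown to the judge but, with identical policies,
        # it is uninformative, so the judge is reduced to guessing.
        decision = (human_decision if source == "human" else als_decision)(scenario)
        guess = random.choice(["human", "machine"])
        correct += guess == source
    return correct / trials

accuracy = judge_accuracy()
print(f"judge accuracy: {accuracy:.2f}")  # near 0.5 means indistinguishable
```

The point of the sketch is only the shape of the evaluation: the harder the judge finds it to beat chance, the closer the system is to being "no worse than a human" at those decisions.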
Slide 12: So who will be the first to deploy ALS?
- technologies to enable autonomy