Schrödinger’s Gap: Autonomous Weapons Systems and Legal Liability
This thesis examines issues surrounding the development and deployment of autonomous weapons systems (hereafter “AWS”) from two perspectives: a legal perspective (particularly within international legal frameworks such as international criminal law and international humanitarian law) and an engineering perspective (particularly concerning engineering ethics and professional responsibility). AWS have attracted considerable attention in recent years. Not only have their operational capabilities (including their accuracy, reliability, and deployability under a variety of conditions) been called into question, but so too have their legal status and the liability for any illegal acts associated with their deployment. In particular, the potential “gap” in liability associated with AWS (where a crime has been committed by an AWS without a clearly identifiable human actor to whom responsibility may be assigned) has been a source of concern and consternation among legal experts, human rights activists, engineers, and computer programmers. While the ultimate status of this “gap” will be determined by a court when the first cases involving AWS are inevitably heard, I argue that it does not truly exist, and that ultimate liability instead rests with the engineers and computer scientists responsible for developing the AWS themselves. Furthermore, I support Dr. Daniele Amoroso’s call for a requirement of “meaningful human control,” an operational requirement that would circumvent the issue of a liability “gap” altogether. The only true potential “gap” in liability that remains is a wholly artificial one: a gap wherein engineers, computer scientists, and other STEM professionals are actively sheltered from criminal liability, particularly by governments and organisations wishing to benefit from their expertise.