What is autonomy in weapons systems?
Autonomy in weapons systems has existed, to a certain degree, for decades. However, with advances in artificial intelligence, specifically in the area of machine learning, there is a trend towards increasing autonomy in various functions of weapons systems, including critical functions such as target selection and the application of force.
Systems that use machine analysis of information acquired from sensors to automatically select and engage targets, such that a human operator does not determine specifically where, when or against what force is applied, are of particular concern. In these systems, upon activation, there is a period of time during which the weapons system can apply force to a target without additional human approval.
The specific object to be attacked, and the exact time and place of the attack, are determined by sensor processing, instead of an immediate human command.
Why is increasing autonomy in weapons systems something to be concerned about?
Certain functions, including the selection and engagement of targets, are critical functions that should remain under meaningful human control.
Systems that function in this way present moral, ethical, humanitarian, operational and legal challenges, and threaten international peace, security and stability. Autonomy in weapons systems diminishes the control of the human operator and undermines accountability and responsibility in conflict.
They raise serious concerns over compliance with international human rights law and the international humanitarian law principles of distinction, proportionality, precaution and the prohibition of indiscriminate attacks, as documented by the International Committee of the Red Cross.
A selection of relevant reports is available below.
Relevant reports

Convergences in state positions on human control (2023)
| Automated Decision Research
This paper presents an examination of convergences in state positions on human control in the context of autonomy in weapons systems. Despite differences in phrasing when discussing the human element as it relates to autonomous weapons systems, there is much [...]

Targeting people and digital dehumanisation (2023)
| Automated Decision Research
This short briefing paper addresses the need for a prohibition on autonomous weapons systems designed or used to target humans, and the digital dehumanisation inherent in such systems.

Autonomous weapons and digital dehumanisation (2022)
| Automated Decision Research
This short explainer paper discusses autonomous weapons in the context of digital dehumanisation.

Background Briefing: Review of the 2023 US Policy on Autonomy in Weapons Systems (2023)
| Human Rights Watch
A new directive on autonomy in weapons systems issued on January 25, 2023 shows the United States Department of Defense (DoD) is serious about ensuring it has policies and processes in place to guide its development, acquisition, testing, fielding, and [...]

Increasing complexity: Legal and moral implications of trends in autonomy in weapons systems (2023)
| PAX
This report looks at a number of trends related to autonomy in weapons systems (artificial intelligence, automatic target recognition and swarming) and the questions they raise about how meaningful human control can be retained. It gives an introduction into each [...]

Completely outside human control? (2023)
| Article 36
This discussion paper provides commentary on the proposed prohibition in the Working Paper submitted by Finland, France, Germany, the Netherlands, Norway, Spain and Sweden to the 2022 GGE on LAWS, July 2022. The paper notes that 'The notion of a [...]

Artificial intelligence and automated decisions: shared challenges in the civil and military spheres (2022)
| Automated Decision Research
This paper provides an initial sketch of responses to AI and automated decision-making in wider society, while contextualising these responses in relation to autonomy in weapons systems.

Autonomous weapons as a solution to war crimes? (2022)
| Article 36
A common claim about autonomous weapons in the media, and amongst pundits, is that autonomous systems would not suffer human emotions and would not therefore undertake the atrocities that are driven by these emotions in conflict. A new discussion paper [...]

Armes autonomes et déshumanisation numérique [Autonomous weapons and digital dehumanisation] (2022)
| Automated Decision Research
Digital dehumanisation is the process by which humans are reduced to data, which is then used to make decisions and/or take actions that negatively affect our lives. This process strips people of their dignity, devalues [...]

Armas autónomas y deshumanización digital [Autonomous weapons and digital dehumanisation] (2022)
| Automated Decision Research
Digital dehumanisation is the process by which humans are reduced to data, which is then used to make decisions and/or take actions that negatively affect our lives. This process strips people of their dignity, [...]