Some updates on the Stop Killer Robots campaign and the use of autonomous weapons in the war in Ukraine (July 2022).
25-29 July 2022: Government Expert Group on Autonomous Weapons meets
On 25-29 July, the Group of Governmental Experts (GGE) on Autonomous Weapons Systems, convened under the Convention on Certain Conventional Weapons (formally, the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects), will meet for its second formal session of 2022. Participants will focus on formulating recommendations for the meeting of the High Contracting Parties (the States Parties to the Convention), aimed at advancing the development of a normative and operational framework on autonomous weapons. (Source: https://www.reachingcriticalwill.org/)
The Group has been considering this issue since 2017, and there is still no agreement. Russia opposes international legal controls and, for reasons related to its invasion of Ukraine, is boycotting the discussions, making a unanimous agreement impossible.
The only State that has adopted a directive on autonomous weapons is the United States, with DoD Directive 3000.09 of November 2012. Unfortunately, despite several statements on the subject by leaders of the US military, the directive neither prohibits autonomous weapons nor establishes a requirement that US weapons keep a 'human in the loop', i.e. that ultimate responsibility for the use of force must rest with a human.
A further problem, also present in DoDD 3000.09, is that despite eight years of negotiations at the UN there is still no internationally agreed definition of autonomous weapons, of lethal autonomous weapons, or of autonomous weapons supported by Artificial Intelligence. In fact, when AI is mentioned in connection with autonomous weapons, what is often meant is machine learning capability, whereas traditional software is sufficient to guarantee a high degree of autonomy for some military applications.
For instance, the Israel Aerospace Industries (IAI) Harpy is an unmanned drone built in 2004 and recognised as an autonomous weapon. When in autonomous mode, the Harpy flies over a given region for up to nine hours, waiting to detect electromagnetic emissions that match a library of enemy radar signatures; it then homes in on the source of the emissions (usually an enemy air-defence radar) and attacks. No human intervention is required. (Source: Gregory Colin Allen, Center for Strategic and International Studies (CSIS))
These considerations indicate that we are still far from an agreement on autonomous weapons, and that definitional issues will stretch the timeline even further.
Reports on the use of autonomous drones in Ukraine are as vague and imprecise as those on the KUB-BLA drone employed by Russia. But every attentive commentator agrees that, since the war in Ukraine will last a long time, drones will certainly be deployed by both sides, and in the near term.