Drones have become a weapon of choice in modern warfare due to their surveillance and reconnaissance capabilities, as well as their potential for conducting targeted strikes. They represent a step toward autonomous warfare: they share technological foundations, such as Artificial Intelligence (AI), with Lethal Autonomous Weapon Systems (LAWS) and raise the same ethical considerations.

On 23 August 2023, a drone attack on a local school in Romny, northeastern Ukraine, reportedly claimed several lives. Ambiguity over who was accountable for this lethal attack arises from the type of drone employed in the strike. Beyond its immediate toll, the incident underscores the urgent need for discussion and international agreement on the use of LAWS.

In a recent interview, Mykhailo Fedorov, Minister of Digital Transformation of Ukraine, said “There is a need for an army of drones because it helps us in real time to get quality information on the enemy,” and “it allows us to hit the enemy on ground, sea, and air.”

The rise of drone warfare marks the onset of an arms race and poses ethical and humanitarian challenges that demand global attention.

International Humanitarian Law (IHL) is a set of rules aimed at preserving human rights during war and safeguarding the dignity and protection of individuals. Autonomous weapons challenge its principles of distinction, precaution, and proportionality, which presuppose that the use of force is controlled by an operator under command and control. IHL prohibits weapons that are indiscriminate by nature and restricts conduct amounting to war crimes. The absence of human control, or an unknown source of a weapon, creates a responsibility and accountability gap.

Discussions in the Group of Governmental Experts (GGE) on LAWS, and proposals for a framework to prohibit or restrict their further use, are therefore the main disarmament agenda items under the Convention on Certain Conventional Weapons (CCW). After extensive deliberations spanning nearly a decade, on 24 May 2023 the GGE of the CCW established a framework of non-binding prohibitions and limitations concerning autonomous weapon systems.

Weapon Systems Based on Emerging Technologies in the Area of LAWS

The GGE Chair, Ambassador Flavio S. Damico of Brazil, presented the final draft of the framework on emerging technologies in the area of LAWS. The report defines autonomous weapon systems as weapons that, once activated, select and apply force to targets without human intervention. This functional definition, distinguishing autonomous from non-autonomous weapons, received support from a range of states at the GGE meeting of the High Contracting Parties.

In addition, concern is growing over the military applications of AI-based drones, drawing public and political attention to discussions on regulating autonomous weapons. States must continue the progressive development of IHL.

Many states and organizations support the negotiation of new, legally binding international rules and frameworks on autonomous weapons.

By contrast, the major powers still show little interest in establishing norms and criteria for the development and deployment of autonomous weapons. This reluctance stems from intense technological competition reminiscent of the balance of power between the Soviet Union and the US during the Cold War. Nonetheless, there is an urgent need to strengthen arms control and disarmament regimes that may eventually establish meaningful prohibitions and restrictions on the uncontrolled proliferation and misuse of autonomous weapons.

While this CCW framework lays the groundwork for potential international regulation, more specific and practical rules will be necessary to effectively address the humanitarian, legal, ethical, and security concerns associated with autonomous weapons. Some of these rules can be drawn from the concrete proposals for prohibitions and restrictions submitted by states such as the US, Austria, and Pakistan, all of which reflect concern for regulatory measures to ensure accountability for attacks on civilians.

One proposal submitted to the Group by Pakistan suggests that, under the rules on the accountability of states for internationally wrongful acts, “humans responsible for and in control of any weapons system based on emerging technologies in the area of LAWS remain accountable for the consequences of using such weapons.”

The international community should agree on international and national measures to address suspected, reported, or documented violations arising from the development, deployment, or use of such weapon systems.

It is important to note that such an accountability void, or the suspicion and ambiguity it breeds, can create space for non-state actors or third parties to exploit, flaring up a hot conflict between two adversaries. This risk undermines arms control regimes such as the CCW and, with them, peace and prosperity among nations in the international arena.

As these technologies evolve, unmanned systems, even in this early phase of LAWS, play an increasingly destructive role. There is a dire need for a meaningful normative framework in the GGE that responds to the challenges and concerns associated with unmanned vehicles and LAWS under the umbrella of IHL. Such a framework also needs to address the ambiguity of accountability for the use of force against civilians in different conflict zones, bearing in mind restrictions and regulations on the development, deployment, and use of LAWS. The Romny school attack is a sad reminder of the potential consequences of advancing technology in conflict zones, and it underscores the importance of addressing these complex issues to ensure a safer and more secure world.

Print Friendly, PDF & Email