Tejaswini

Changing the LAWs of War: The Automation of Warfare

In the words of Army General Douglas MacArthur, a military leader who served in multiple wars, “Whoever said the pen is mightier than the sword obviously never encountered automatic weapons.” The potential to inflict tremendous destruction lies within military armaments that require little human intervention, which underscores the need for a clear definition of Lethal Autonomous Weapons Systems and an examination of the ethical questions they raise. The development of these weapons is not only a display of technological progress but can also change the dynamics of warfare worldwide. Before we delve deeper into the realm of byte-sized killer robots, what are LAWS?


In 2012, the US Department of Defense issued what is widely regarded as the most precise definition of Lethal Autonomous Weapons Systems: systems that are fully autonomous, meaning they can independently select and engage targets without further intervention by a human operator. While various countries and non-state organizations have offered similar variations, the common thread is autonomy. This comes close to bleeding into the realm of Artificial General Intelligence, but reality isn’t run by Tony Stark. The influence of these weapons on future conflicts is the subject of ongoing international debate. Whether to endorse or prohibit them depends on how they are likely to develop, and the best way to predict that is by analyzing their historical context.


The first guided missile, the Fritz X, emerged as Germany’s ingenious creation in 1943. Since then, nations like the US during the Vietnam War, Israel and the Soviet Union contributed to the development of quite a few LAWS, but it wasn’t until the 1980s that their proliferation gained momentum. The weapons became more sophisticated, boasting features like extended range, precision guidance and simultaneous rocket firing. They were also incorporated into UAVs, a notable example being the Predator drones deployed in Pakistan and Afghanistan by the US after the 9/11 attacks.


Since then, the propagation of LAWS has continued, with countries like China, Russia, South Korea and Turkey shaping their own arsenals. The advent of Artificial Intelligence and Machine Learning now makes it easier for these weapons to adapt to battlefield conditions and improve their performance over time. Ukrainian forces have used such systems, including Turkish-made Bayraktar TB2 drones, to attack Russian targets with considerable success, highlighting their relevance in the present. These weapons also integrate multiple cameras, radar and lidar systems that give them better situational awareness than humans have, so much so that they might soon be able to see John Cena through computer vision. Increased precision, reduced risk to human operators and force multiplication are a few benefits with the potential to revolutionize warfare.

Whether LAWS are a curse hinges on the ethics and legalities of these systems. In the event of harm or damage, there is a glaring gap in accountability that demands attention. Precisely, who is to blame: the developer of the weapon, the country operating it, or the weapon itself? This ties into how a fully autonomous weapon may lack the human judgement we possess, which can result in greater brutality or miscalculation. To comply with International Humanitarian Law, these systems must reliably distinguish between combatants and civilians.


Considering everything mentioned above, a Group of Experts Meeting of the Convention on Certain Conventional Weapons (CCW) was held in Geneva, Switzerland, in August 2023, focused on the development and regulation of LAWS. The meeting established a mandate for future negotiations, recognized the need for international regulation and committed to the responsible development and use of LAWS. As we explore the diverse panorama of opinions that countries worldwide hold on these weapons at present, we see the contrasting viewpoints of nations and the different approaches they take on this matter.

When it comes to supporting autonomous weapons, countries like the United States and Russia take a strong stance, investing in the research and deployment of these weapons, in Russia’s case in Ukraine and Syria. They recognize their military capabilities and aim to modernize warfare while also keeping in mind the surrounding complexities, the unintended consequences that may arise and the responsible use of these weapons. Nations like Israel and Turkey, too, have shown their support.


The “Made in China” trademark continues to solidify its legacy, with China arming itself for the future by making significant investments to enhance the capabilities of autonomous weapons systems, although it has not stated its stance on the international regulation of LAWS, citing technological parity with other nations and security concerns as reasons. South Korea, on the other hand, views the production and evolution of these weapons as a deterrent to the nuclear and missile threats it faces from North Korea and advocates regulating their use for legitimate defense purposes.


Interestingly, a plethora of nations push for the prohibition of Lethal Autonomous Weapons Systems, including but not limited to Brazil, Germany, New Zealand, Pakistan and Switzerland. They argue that LAWS may not be able to comply with the principles of International Humanitarian Law and raise concerns regarding ethics, legality, arms races, accountability and the future of warfare. India takes a balanced approach, viewing these weapons as a force multiplier while supporting efforts to establish clear norms and principles for their development, deployment and testing.


The future of autonomous weapons may not look pixel perfect, and the UN’s position on LAWS is likely to evolve as the technology behind these weapons advances and the number of countries deploying them grows. In 2023, the UN Secretary-General called for a legally binding instrument to prohibit LAWS that cannot comply with IHL, garnering widespread support. After all, if we have come from landmines, the autonomous blast from the past, to weapons integrated with artificial intelligence, it will not be long before even killer robots need rules to abide by. It is imperative to recognize the gravity of the decisions we make today, for they shape our tomorrow; warfare concerns not only technological progress but also human judgement.
