Autonomous weapons without a human in the loop are coming. It is argued that emerging technologies are ‘enablers’ and ‘force multipliers’ for a state’s conventional and nuclear forces. To date, most of these technologies, both offensive and defensive, are controlled by human beings for the simple reason of avoiding accidents and inadvertent escalation between rivals. The Ukrainian drone company, Sakar, is reported to have claimed the fielding of a fully autonomous weapon. Such a weapon is programmed in a way that it may decide to kill on the battlefield without human involvement. Autonomous weapons capable of finding and selecting targets raise moral, legal and ethical questions about their regulation, especially when such weapons cannot distinguish between combatants and non-combatants in different warfighting scenarios, risking accidents and escalation that military rivals may not desire in the first place.

In contemporary times, both major and smaller powers aspire to acquire the latest technology to stay relevant and dominant internationally. States value technologies that confer military advantages vis-à-vis their potential adversaries. States also seek to develop effective countermeasures for every technology. This creates an unending arms race in the field of emerging technologies, and those who lag behind in acquiring nascent technologies may suffer. This is because ‘technological opportunism’ entails that states should make use of the latest technologies for both offensive and defensive purposes.

As automated technologies are integrated with military and nuclear deterrent forces, concern among competing states also grows over how and when to regulate the automation of emerging technologies. Given cost-benefit considerations, states may not move as quickly as expected in striking a legally binding treaty prohibiting the use of autonomous weapons. It took many years for nuclear weapons states to craft the Non-Proliferation Treaty (NPT). Today, the NPT is considered one of the largest treaties, comprising 191 states parties, despite its weaknesses and loopholes, especially its discriminatory structure between the haves and the have-nots. It likewise took many years for states to conceptualize, draft and bring the Chemical and Biological Weapons Conventions into force. Meanwhile, states retain the option of withdrawing from or abrogating treaties in extraordinary circumstances. For example, North Korea withdrew from the NPT to test its nuclear capability, while both the US and Russia withdrew from a number of bilateral treaties to acquire and deploy deterrent forces prohibited by those treaties.

The regulation of autonomous weapons must begin with the major powers. Since the US and the UK declared in 2022 that they would always keep a human in the loop for decisions on the use of nuclear weapons, other powers may follow suit. Major powers should likewise ensure human involvement as they regulate the automation of emerging technologies.

They need not only to adopt best practices by regulating access to and the application of autonomous weapons, but also to regulate such technologies so that non-combatants are not killed during a military crisis. The rules and regulations for fielding autonomous weapons must be devised in a controlled way so that accidents and miscalculations between rivals are prevented. It is reported that automated technologies fielded at sea, on the ground and in the air, programmed through AI, could kill targets even when this may be undesirable.

In summary, the automation of emerging technologies needs to be regulated through legally and politically binding treaties. Regulating autonomous weapons only after the major powers have already acquired them could be less productive, resulting in turn in an arms race, crisis instability, and the risk of conflict between rivals.

Published in The Express Tribune, March 13th, 2024.

