                           By Adya Madhavan
Lethal autonomous weapons systems, or LAWS, may be on the brink of becoming central to military operations. However, the mechanisms to govern such capabilities are struggling to keep up. By their very nature, LAWS are difficult to define and challenging to restrict.
Can’t define, can’t proscribe
The definitional conundrum stems from the desire not to subject emergent military capabilities to undue restrictions: if you can't define it, you can't proscribe it. As a result, nations struggle to agree on the nitty-gritty, such as the level of autonomy these systems should have and the contexts in which they can operate autonomously. A common thread in most definitions, though, is that LAWS are systems with the highest possible level of autonomy, in that they can both select and strike a target independently of human intervention.
Given their ability to process real-time sensor data rapidly during missions and simultaneously reduce the risk to human actors by operating autonomously, LAWS potentially provide significant military advantages.
Developing autonomous and semi-autonomous systems is well within the realm of possibility for countries that already have advanced AI capabilities, and it is understandable that they do not want limitations placed on them. Moreover, AI, which is the underlying technology behind autonomous systems, is widely diffused across both the civil and military sectors, making it difficult to verify compliance with any binding restrictions against the development of LAWS.
LAWS come with new risks
However, LAWS create new risks, ranging from misidentification to civilian casualties to inadvertent escalation. The very attributes that promise precision, speed, and efficiency can also produce unintended effects.
As a result of these factors, nations find themselves grappling with three concerns when they think about LAWS.
First, autonomous weapons systems should comply with their domestic laws and military ethos.
Second, there must be clear human accountability for unintended consequences.
And finally, they need other countries to practise the same clear accountability when deploying lethal autonomous weapons in battle.
Every country approaches emerging technologies with a fine balance between its own technological developments and the risks posed by the developments of other countries.
For India, given its volatile relations with Pakistan and China, LAWS in the neighbourhood would compound pre-existing security threats. China is widely acknowledged as being a global leader in developing many advanced technologies, including AI. As the biggest importer of Chinese arms, Pakistan, too, could potentially be equipped with Chinese autonomous weapons. On the other hand, as India continues to ramp up its own investment in AI, its stance should leave the door open to developing its own autonomous systems.
Thus far, India has maintained a cautious stance on LAWS, abstaining from taking any strong position on the need for a legally binding instrument or on the restrictions proposed on LAWS. However, given the rapid technological strides being made globally in the applications of AI, and India's own security environment, this position arguably no longer reflects India's interests. Given India's involvement in conversations about AI, it should shape the discourse in alignment with principles of responsible use that reflect its own strategic and geopolitical challenges.
Countries need to engage on an accountability mechanism
Considering the unlikelihood of an outright ban on LAWS, it is imperative that their development and deployment are characterised by appropriate levels of human oversight.
While completely autonomous weapons may be recent developments, oversight over semi-autonomous weapons is not new to militaries. ‘Fire and forget’ missiles and landmines have been a part of warfare for several decades. However, their limited operational context restricts their degree of autonomy, therefore reducing the scale of oversight needed.
Similarly, as more advanced autonomous weapons come into being, nations should not attempt to prevent their development. Instead, they should focus on clearly defining these weapons' operational contexts and on establishing clear chains of accountability, so that they are built and used responsibly.
(Adya Madhavan is a Research Analyst at Takshashila Institution working on advanced military technologies and geopolitics.)
Views are personal and do not reflect the stand of this publication.