Author: Trisha Ray

Published on Feb 22, 2019

In the absence of clear norms on human accountability and attribution for autonomous weapons, we could see states like Pakistan deploy LAWS for operations outside their borders.

Machine-driven weapons need an international system of accountability

In March 2014, hundreds of unidentified gunmen in camouflage appeared on the streets of Crimea and began taking over local government buildings. Russia initially denied any connection to the “little green men”, as they came to be known, and only on the first anniversary of the Crimean occupation did President Vladimir Putin admit that they were Russian military personnel. Ethical issues and the violation of the Geneva Conventions notwithstanding, the logic behind this tactic was straightforward and compelling: Aid pro-Russian forces while creating enough uncertainty about Russian involvement to forestall NATO retaliation and global backlash.

As new military technologies mature, states will seek tools that offer similar deniability. Lethal Autonomous Weapons Systems (LAWS) — which can detect, select and attack targets without human intervention — are one such avenue.

International rules around LAWS are relatively underdeveloped, and in the absence of clear norms on human accountability and attribution for autonomous weapons, we could see states like Pakistan deploy LAWS for operations outside their borders.

LAWS present several benefits for “middle powers”: They increase the reach and effectiveness of forces, reduce casualties and enable persistent presence in vast, inaccessible terrains. Countries like India or South Korea, which operate in complicated geostrategic contexts, can therefore use LAWS to effectively police and protect their territory. On the flip side, LAWS can be used by state and non-state actors in asymmetric tactics. This could take three forms: A state could directly deploy LAWS against an adversary state; a state could equip proxies such as insurgent or terrorist groups with autonomous weapons units; or a non-state actor could steal or otherwise illegally acquire autonomous systems or units.

Given this destabilising potential, external state actors that actively aid insurgencies and terrorist organisations will be tempted to deploy autonomous systems and claim they are stolen or rogue units.

While LAWS are still in the development stage and remain out of reach for most states — let alone non-state groups — owing to high costs and a shortage of skilled AI talent and operators, it is not a stretch of the imagination to envision a future in which autonomous weapons are within the reach of any state or non-state actor that wants them.

Even in the absence of comprehensive international framework agreements on LAWS, however, stakeholders in the emerging LAWS ecosystem should push for the creation of export controls and rules.

Relevant private technology companies — some of which, like Google, have already taken the lead in developing internal ethical guidelines for AI technologies — should, with buy-in from state actors, establish an export control group to create guidelines for LAWS and component technology sales. These guidelines must include basic stipulations on accountability in cases of theft or hacking. Suppliers must be able to prove that they have the necessary physical and non-physical safeguards in place to protect their LAWS technologies, and the AI and weapons industry must craft specific standards for such safeguards. Autonomous systems themselves could assist with export controls through persistent surveillance of LAWS manufacturing facilities, although it may be difficult to get actors to agree to such measures.


This commentary originally appeared in The Indian Express.

The views expressed above belong to the author(s).

Author

Trisha Ray

Trisha Ray is an associate director and resident fellow at the Atlantic Council’s GeoTech Center. Her research interests lie in geopolitical and security trends in ...