(AI)mageddon: Who is Liable When Autonomous Weapons Attack?
Militaries are increasingly incorporating autonomous targeting and decision-making into machines. While earlier autonomous features, like maintaining a drone's stability in flight, are only tangentially connected to the process of killing, others, like targeting algorithms, are directly implicated in the act.
This is of particular concern when it comes to assigning responsibility and liability for the actions of an armed machine. Autonomous features, often branded as Artificial Intelligence, lend themselves to an obscured chain of responsibility: errors can arise in the sensors, the code, the algorithmic process, the orders given by human controllers, or from emergent behavior.
Janet Abou-Elias and Lillian Mauldin, of Women for Weapons Trade Transparency, write that accountability and international cooperation are vital to mitigate the harms from lethal decisions by machines on the battlefield.
To address the pressing need for accountability in autonomous weapons systems (AWS), policymakers, legal experts and international organizations must work together to strengthen legal frameworks. This includes drafting and agreeing to clear regulations that delineate responsibility for AI-driven actions in warfare, ensuring that all stakeholders are held accountable for any violations. Implementing these measures will undoubtedly be challenging, as resistance from powerful defense lobbies and the inherent difficulty of achieving international consensus pose likely barriers.
International cooperation is crucial to bridge the legal gaps surrounding AI in warfare. Only through consensus-building efforts can global standards of transparency, accountability and oversight be established and upheld. By learning from other industries regulating AI, such as the automotive sector's efforts to govern autonomous vehicles, and adapting those lessons to a military context, the international community can better safeguard against the harms of AI technologies in warfare.
Read the full piece at the Fair Observer.