UNITED NATIONS: Pakistan has called on a “handful of states” to drop their opposition to negotiations on a legally binding instrument aimed at prohibiting autonomous weapons powered by artificial intelligence (AI) software, known as lethal autonomous weapons systems (LAWS), observing that they pose a threat to international security.
From drones and missiles to tanks and submarines, semi-autonomous weapons systems have been used for decades to eliminate targets in modern-day warfare — but they all have human supervision.
Nations such as the United States, Russia and Israel are now investing in developing LAWS which can identify, target, and kill a person all on their own — but to date, there are no international laws governing their use.
“The development of LAWS, or killer robots, has emerged as perhaps the defining concern on the international arms control agenda, along with WMDs (weapons of mass destruction),” Ambassador Khalil Hashmi, permanent representative of Pakistan to the UN in Geneva, told the First Committee of the General Assembly, which deals with disarmament and international security.
“Yet,” the envoy added, “rather than addressing these fundamental concerns, a handful of states continue to stall meaningful progress on the normative track and overtly oppose the development of internationally-agreed legal norms, rules or regulations to govern the design, development and use of these weapon systems.”
Speaking in a thematic debate, Ambassador Hashmi said the military application of new and emerging technologies outpaced the application of existing principles and norms.
LAWS increased the dangers arising from the use of new weapons systems even as they reduced or eliminated the risk of human casualties for the user, he said in New York.
The possession of those systems increased the propensity of their use and the likelihood of symmetric and asymmetric responses, thereby lowering the threshold for armed conflict, the Pakistani envoy said. As a result, risks and threats to peace, security, and stability at the global and regional levels were growing.
Ambassador Hashmi said LAWS did not comprise one or two types of weapons, but a capability category with layers of unpredictability and cascading, destabilising impacts on regional and international security.
Those weapons also amplified aspects of force multiplication and asymmetry, he said, adding that their growing autonomy, based on machine-learning algorithms, and the increased speed of their actions during operations further reduced the predictability of their behaviour.
Nations would be tempted to mitigate the military advantage of adversaries by asymmetrical means, the Pakistani envoy said. A spiral of reprisals perpetuating or expanding a conflict would then be a foregone conclusion.
Ambassador Hashmi also said that the weaponisation of ICTs (information and communication technologies) and cyberspace posed serious risks to peace, security and stability at the international as well as regional levels.
“The ability to act anonymously without traditional geographical limitations, coupled with the ability to mass produce cyber weapons cheaply, makes them extremely attractive and dangerous,” he said.
“Several States are developing ICTs as instruments of warfare to achieve political objectives.”