Human Rights Impacts of Autonomous Weaponry

Summary

This submission examines how lethal autonomous weapons systems violate international human rights and humanitarian law, and why they must be regulated.

The Australian Human Rights Commission (Commission) has provided its submission in response to the United Nations Human Rights Council Advisory Committee's call for input on new and emerging technologies in the military domain (NTMD).

Lethal autonomous weapons

Lethal autonomous weapons systems (LAWS) can be understood as weapons that independently select and attack targets. 

The Commission calls for a full ban on LAWS where the technology is incompatible with international human rights law, international humanitarian law and international law more broadly.

Use of LAWS

Some of the most extensively documented uses of LAWS in active conflict zones have been in the Libyan civil war and the Russia-Ukraine war.

There is evidence of Russian forces using the POM-3 'Medallion' anti-personnel mine in conflict. This mine has a seismic sensor that detects movement within a 16-metre radius and triggers detonation.

In Libya, LAWS were deployed as drones that struck targets without any connection between the operator and the munition, in what has been described as a 'fire, forget and find' capability.

UN action

In 2023, the UN Secretary-General's New Agenda for Peace called for the prohibition of LAWS, recommending that States develop a legally binding instrument that bans LAWS.

The UN Secretary-General, António Guterres, and the President of the International Committee of the Red Cross, Mirjana Spoljaric, have also made a joint appeal for States to 'urgently establish new international rules on autonomous weapons systems, to protect humanity'.

Since then, the First Committee of the UN General Assembly adopted its first-ever resolution on autonomous weapons on 1 November 2023. The resolution stressed the 'urgent need for the international community to address the challenges and concerns raised by autonomous weapons systems'.

Proportionality 

One of the core principles of international humanitarian law is the principle of proportionality.

This principle prohibits attacks that are 'expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated'.

An attack is proportionate only if the anticipated military advantage outweighs the expected harm to civilians.

LAWS rely upon artificial intelligence (AI) and facial recognition technologies (FRT) to independently identify targets and conduct attacks.

AI cannot understand the intrinsic value of human life, and so cannot undertake the weighing exercise that proportionality requires.

If the principle of proportionality is not adhered to in warfare, the human rights of civilians are not protected.

UN Secretary-General António Guterres has previously declared that machines determining proportionality in life-or-death situations are 'politically unacceptable and morally repugnant'.

Independent review

Article 36 of Protocol I to the Geneva Conventions provides that States have an obligation to carry out legal reviews of new weapons to ensure that armed forces conduct hostilities in accordance with international law.

This process is an internal one predicated on good faith, as States are not obliged to disclose the outcomes of their reviews.

Given the growing use of LAWS in conflict zones at the time of writing, and mounting pressure from the global community to regulate and prohibit LAWS, the Article 36 review function must be strengthened.

Investigation and reporting measures applied to NTMD also need strengthening. Introducing a new special procedure to advise the Human Rights Council on NTMD would improve transparency.