
Australia and Lethal Autonomous Weaponry

Technology and Human Rights

Summary

Learn more about Australia's approach to lethal autonomous weapons systems and what this means for human rights.

The Australian Human Rights Commission (Commission) has provided a submission to the Joint Standing Committee on Foreign Affairs, Defence and Trade inquiry on the Department of Defence's Annual Report 2022-23. 

Lethal autonomous weapons systems 

Lethal autonomous weapons systems (LAWS) are not easily defined, but can generally be understood as weapon systems that are capable of independently selecting and attacking targets. 

The Commission has previously noted the human rights concerns raised by LAWS in a global speech at RightsCon Costa Rica, an independent submission to the United Nations, and a co-authored submission on behalf of 24 member National Human Rights Institutions.

The Commission has now considered LAWS from a national perspective in this submission. 

International humanitarian law

One of the core principles of international humanitarian law (IHL) is the principle of proportionality. An attack is proportionate if the anticipated military advantage outweighs the expected incidental harm. This requires weighing the value of human life against strategic objectives. 

It is arguably impossible for AI to comply with the proportionality rule, because AI cannot understand the intrinsic value of human life. 

UN Secretary-General António Guterres has previously stated that machines determining proportionality in life-or-death situations is ‘politically unacceptable and morally repugnant’.

Distinction 

Another foundational rule of IHL is the rule of distinction, which seeks to minimise the impact of armed conflict on civilians by prohibiting targeting civilians and indiscriminate attacks.

LAWS can respond faster than humans because of their data-processing capabilities. However, the underlying technologies are not yet sophisticated enough to make such distinctions reliably during time-sensitive operations.

For example, AI and facial recognition technology may struggle to identify combatants in asymmetrical warfare, or to recognise when someone is in the process of surrendering. 

Australia's position 

In 2018, Australia stated that it was premature to regulate LAWS. Half a decade later, Australia has not departed from this position, despite significant advances in, and uses of, the technology. 

Australia also relies too heavily on Article 36 reviews to ensure that LAWS comply with IHL. While such reviews are an important safeguard, the Commission's submission highlights the shortcomings of this approach in isolation.

Recommendations

The Commission's submission makes four important recommendations ranging from better engagement with human rights expertise to adopting a national definition of LAWS. To learn more about these recommendations, please read the submission.