Military Technologies and Human Rights

Summary

Learn more about how lethal autonomous weapons systems and other military technologies threaten human rights.

The Australian Human Rights Commission (Commission) and the Danish Institute for Human Rights have jointly prepared this submission on behalf of the 24 members of the NHRI Digital Rights Alliance (Alliance).

The submission responds to the United Nations Human Rights Council Advisory Committee's call for input on new and emerging technologies in the military domain (NTMD).

Lethal autonomous weapons

Lethal autonomous weapons systems (LAWS) can be understood as weapons that independently select and attack targets. 

The Alliance calls for a full ban on LAWS where the technology is incompatible with international human rights law, international humanitarian law and international law.

Geneva Conventions

Article 1 of Protocol I to the Geneva Conventions calls on all States to take measures to ensure that international humanitarian law is respected and given full effect.

Transparency on the development and capabilities of NTMD is fundamental to responsibility and accountability.

Article 36 of Protocol I to the Geneva Conventions obliges States to carry out legal reviews of new weapons to determine whether their use would be prohibited by international law. However, Article 36 contains no mechanism to ensure the accountability of those reviews.

Strengthened investigatory and reporting measures on NTMD are needed. Such measures should be independent of States and could take the form of a new Special Rapporteur on New and Emerging Technologies in the Military Domain.

Universality and inalienability 

The universality of human rights is the cornerstone of international human rights law: human rights are not granted by any State and are inherent to all, regardless of their personal characteristics or circumstances.

Human rights are also inalienable and cannot be taken away, except in specific situations and according to due process.

It is often difficult for NTMD to comply with international human rights and humanitarian law, or for due process to be observed in their use.

NTMD, especially those utilising artificial intelligence (AI), will challenge the principles of universality and inalienability. These two principles must remain at the forefront of all human rights discussions of NTMD.

Liability

State and individual responsibility is a prerequisite to ensuring accountability for violations of international human rights and humanitarian law.

Such responsibility is often absent for NTMD, especially those integrating AI, making accountability difficult to ensure.

How, then, can international human rights and humanitarian law apply to ensure accountability if a technology is responsible for the loss of, or harm to, life?

For international human rights and humanitarian law to apply to many NTMD, especially those utilising AI, the Committee must consider where legal liability should fall.