ICRC President: “We must adopt a human-centered approach to the development and use of new technologies”

Ladies and gentlemen,

At present, the ICRC classifies more than 120 armed conflicts around the world, more than a threefold increase over the past three decades. Moreover, today's conflicts come with multidimensional ripple effects, both locally and across regions.

Panelists overuse terms like "crossroads" or "critical juncture" to draw attention. However, political jargon should not distract us from the reality that the ICRC and other humanitarian actors confront on today's battlefields.

New and emerging technologies are changing the way armed conflicts are fought and making them more complex. The use of cyber operations in conjunction with traditional weapons, and the increasing role of artificial intelligence in weapons systems and decision-making, feature prominently in discussions about the evolution of warfare.

These new technologies may well appear to offer military advantages, such as the ability to act at speeds far exceeding human capabilities, or to carry out attacks in communications-denied environments, such as the maritime spaces that are so key in this Asia-Pacific region.

At the same time, these new technologies pose new dangers for societies. They increase the risk of unintended escalation; they raise concerns about weapons proliferation; and they aggravate the already unbearable suffering experienced by victims of armed conflicts.

I want to highlight three concerning developments:

  • First, trends indicate the future use of autonomous weapons systems against a wider range of targets, over longer time periods, and with fewer possibilities for humans to intervene.

  • Second, artificial intelligence is influencing and accelerating military decisions about who or what is targeted in armed conflict in ways that surpass human cognitive capacity and therefore undermine the quality of decision-making. This poses additional risks to civilians.

  • Third, cyber operations are used to disable essential civilian services such as water and electricity and to obstruct the work of medical services and humanitarian operations.

Complex AI – such as generative models and other forms of machine learning – is increasingly relevant across these developments, as militaries seek to leverage the range of tools being developed at an astonishing rate. Integrated into these various weapons and methods of warfare, AI can exacerbate concerns by introducing additional sources of unpredictability and speeding up operational tempo beyond human control, on battlefields that are already characterized by extreme volatility. In doing so, it creates challenges for commanders and combatants in upholding their obligations under international humanitarian law (IHL).

Geopolitical tensions or perceived military necessity cannot justify setting aside humanitarian considerations – in fact, it is precisely in such difficult times that preserving humanity matters most.

We must adopt a human-centered approach to the development and use of new technologies to ensure that victims of armed conflicts continue to be protected. We cannot allow a situation where new technologies of warfare serve to replicate, and indeed amplify, unlawful or otherwise harmful effects at faster rates and on larger scales.

Therefore, I am making two calls on all States.

First, to make IHL a priority, and reaffirm their commitment to its universally accepted principles. More specifically,

  • Conclude a new treaty regulating autonomous weapon systems. Such a treaty must prohibit unpredictable autonomous weapons and those that target humans, and place strict constraints on all others. Together with the Secretary-General of the United Nations, I have called on States to urgently negotiate and conclude, by 2026, a treaty regulating autonomous weapon systems.

  • When it comes to the use of AI in systems supporting military decision-making, human judgment must remain central, especially in decisions posing risks to people's lives and dignity.

  • With regard to cyberspace, States must collectively determine how to protect societies against digital threats. Many States have expressed support for the concrete recommendations made by the ICRC's Global Advisory Board on digital threats. We now need to work together to implement them.

Second, acknowledge that compliance with IHL is a means to invest in peace. The Geneva Conventions have been ratified by all States and therefore represent a universal consensus. They firmly oppose the notion of military victory at all costs or by all available means.

Any armed conflict comes with horror, despair and a terrible human as well as environmental cost. Future warfare will be no different, whether fought with traditional weapons or new technologies. There are better alternatives. Preventing or reversing arms races, and regulating the development of new technologies of warfare, are effective steps that support humanity’s desire to live in peace.

As we embrace the promise of innovation, we also have to remain vigilant in upholding our moral high ground and our values. The decisions that the international community makes today about these new technologies will shape the future of warfare for generations to come, and the lives of billions of people around the world.

Through cooperation, and by upholding and advancing international agreements, we can build a future where technology puts humanity at its centre and serves the cause of peace.

Thank you.

Source: ICRC