New technologies such as cybertechnology, biotechnology and artificial intelligence (AI) promise major advances in many areas of society. Because they have a wide range of both civilian and military applications, they can be grouped under the term dual-use technologies: on the one hand, they enable numerous non-military, commercial applications (in the field of AI, for example, agricultural drones, care robots or chatbots); on the other hand, they can also be put to military use (in the AI field, for instance, warfare with autonomous weapons or image-recognition systems used to support military decisions).
Dual-use technologies are associated with various risks and uncertainties, which stem above all from the difficulty of predicting how and by whom these technologies might be used in the future, and whether technologies currently used only for civilian purposes could one day also be used, or misused, for military ends. Dual-use technologies therefore pose major challenges for international arms control.
This is particularly evident in the case of artificial intelligence: there is no doubt that it offers numerous commercial applications from which great benefits are expected. At the same time, AI also enables significant military innovations. These applications, which can give individual states major strategic advantages, include autonomous weapon systems.
(Partially) autonomous weapon systems are already in use today. The strategic competition between major military powers such as the USA and China in particular has fuelled an AI race that is playing out in both the civilian and the military sector. Internationally, however, there is considerable dissent over whether and how such systems should be regulated. The international campaign for a ban on lethal autonomous weapon systems has so far been unsuccessful, not least because of resistance from major military powers. Strategic incentives for the military use of AI often clash with ethical, moral and international-law reservations.
This example illustrates the problem that dual-use technologies pose for international arms control: traditional instruments of arms control policy – such as quantitative arms limitation or the prohibition of specific types of weapons – are often unsuitable for regulating innovations based on dual-use technologies, because strictly banning or restricting a technology per se is generally neither possible nor desirable. Arms control in the dual-use area therefore always faces the challenge of defining the limits of legitimate use of these technologies. Regulation is further complicated by the large number of actors with interests in the development or use of particular applications, including private and public actors, international NGOs and national interest groups. In the field of AI, moreover, the key stakeholders include not only traditional defence companies but also large technology corporations and start-ups.
Comparability with analogue phenomena
It is a fundamental characteristic of artificial intelligence that it can be described as a dual-use technology, a general-purpose technology or an enabling technology. It is therefore not a specific application (for which a concrete analogue phenomenon would exist for comparison), but a technology that is used in many different areas – including the defence sector. The transformational potential of artificial intelligence is therefore often compared to major technological innovations such as the harnessing of electricity or the industrial revolution. As a problem for arms control, AI is thus best compared with other technologies with high dual-use potential. These include newer fields such as biotechnology, as well as nuclear technology, which has been used for civilian and military purposes for decades.
Social relevance
The problem outlined above is of the utmost importance for security policy. Arms control in general has been made considerably more difficult by recent international conflicts and by the growing rivalry between the USA on the one hand and countries such as China and Russia on the other. It can also be assumed that the relevance of technologies with high dual-use potential will continue to grow in the future.