Beyond Ukraine: AI and the Next US-Russia Confrontation

This latest crisis should be seen as an opportunity to resolve long-standing challenges and to put new agreements in place to avoid future conflicts.

February 14, 2022
Ukrainian soldiers take part in exercises at the Ukrainian Ground Forces training centre near Honcharivske in Chernihiv region, Ukraine, December 3, 2018. (Valentyn Ogirenko/REUTERS)

Two titans from the Cold War era seem set to go another round, this time over the prospect of Ukraine’s membership in the North Atlantic Treaty Organization (NATO), which the United States calls a sovereign Ukrainian decision and Russia opposes vehemently. Whatever the outcome of the current standoff, another confrontation between the United States and Russia that merits closer attention is brewing — one that may fundamentally reshape the US-Russia security relationship in the not-so-distant future.

Both states are heavily committed to the use of artificial intelligence (AI) in military systems and operations, including logistics, command and control, and intelligence collection and analysis, as well as to the development of more autonomous weapons. As tensions rise, these countries are likely to employ capabilities that are enhanced by AI and machine learning in cyberattacks and misinformation and disinformation campaigns. Rising political temperatures might well encourage fast-tracking of more autonomous military systems as each side seeks to gain the advantage.

The United States and Russia have already tested several autonomous systems. Russia has made important advances in autonomous tanks, while the United States has demonstrated a number of capabilities, including swarming munitions capable of destroying a surface vessel with a coordinated drone swarm. At the moment, the United States is at the forefront of the development of autonomous systems and military AI applications. However, Russia has approached China to partner with it in building its AI readiness, and such a partnership could be a game changer.

One crucial concern is that the growing autonomy and use of AI in decision making in existing weapons platforms and in cyberspace will lead to the deployment of immature systems; the result could be accidents that escalate conflict.

At the same time, both Russia and the United States have prevented progress on new international norms and agreements on the development and use of autonomous systems that could help to avoid dangerous situations. The United States and its allies are developing norms on responsible military uses of AI, but little dialogue with potential adversaries has taken place. And so, the competition between the great powers is allowed to grow unchecked.


Understanding the Role of AI in Contemporary Conflicts

A key challenge is that AI itself is not a weapon but a range of functions and technologies. As Michael C. Horowitz has noted, AI can be defined as an enabling technology: “AI is not a single widget, unlike a semiconductor or even a nuclear weapon.” In other words, AI is many technologies and techniques. Such an interpretation helps us to understand the range of current and potential uses of this technology in defence applications, including in cyberattacks and digital information warfare, as well as in more autonomous weapons systems.

The United States and other NATO members are already bolstering Ukrainian cyber defences in expectation of cyberattacks by Russia on Ukraine’s infrastructure, including its electrical grid, communication systems and government departments. Yet Russia could use AI to overwhelm Ukrainian systems in conjunction with misinformation and disinformation campaigns, some of which might involve manufactured or manipulated images or video, known as deepfakes.

While cyber and misinformation tactics are not new, AI and machine learning are offering novel ways to engage in them. And, while Western media and analysts focus on Russian cyber techniques, it is almost certainly true that the United States and other NATO members are also pursuing some of these same tactics against Russia.

Most states publicly agree that crossing a certain threshold of cyber activity, as in the targeting of critical infrastructure, could justifiably be perceived as akin to an actual physical attack. Still, it seems possible that some activities short of this threshold are taking place and contributing to the escalation of conflict. UN discussions on responsible behaviour in cyberspace have sought to build shared norms, but more work needs to be done for concrete rules to be established.

Once again, AI-enabled cyber activities pose additional challenges. Information regarding AI-enabled cyber weapons and defences is highly classified, but given what we know about research projects by US defence institutions, it seems highly likely that the United States has developed several such capabilities. Even less is known about the intersection of Russian AI and cyber capabilities, but it is also likely that Russia has focused its AI research and development on the cyber domain.

How to Control Autonomous Systems

Despite constant assurances from various states that autonomous weapons will not be a concern for many years, systems like uncrewed aircraft, ground vehicles and sea vessels have been developed in recent years. Particularly concerning are the advances in swarming technologies and the use of loitering munitions, some of which come with claims that they can function autonomously. Rather than the futuristic robots seen in science-fiction movies, these systems use previously existing platforms that leverage AI technologies.

The ongoing research and development of more autonomous systems increases the potential for the escalation of their use and misuse. Concerns about such momentum have led to calls for an international agreement that regulates the use of AI-enhanced technology.

Russia and the United States are among the group of countries that have hampered progress on such discussions at the Convention on Certain Conventional Weapons (CCW), the forum for multilateral discussions on this issue. During the December 2021 CCW Group of Governmental Experts meeting and the CCW Review Conference, Russia made it clear that it did not see the need for any new legally binding agreements to address the emergence of autonomous weapons systems. The United States and several other countries, including India and Israel, also stonewalled any attempt to reach a new agreement. In general, states with more technologically advanced militaries have been unwilling to accept any constraints on the development of AI technology.

The current Ukraine crisis and rising tensions between the United States and Russia could make any new agreements on emerging technologies in warfare much more difficult. Certainly, the current atmosphere will add an additional challenge for the UN discussions on autonomous weapons that are set to take place in March 2022 in Geneva. But perhaps the potential for escalation, miscalculation and misperception that the current crisis has once again brought to the fore will prompt a greater number of states to push for international dialogue and agreement to ensure responsible state behaviour.

Diplomacy and arms control, as the Ukraine crisis has shown, are critical to minimizing the risks of political instability and crisis escalation between the United States and Russia. This latest crisis, therefore, should be seen as an opportunity to resolve long-standing challenges to the extent possible and, crucially, to put new agreements in place to avoid future confrontations.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Branka Marijan is a CIGI senior fellow and a senior researcher at Project Ploughshares.