Autonomous weapons talks once again put to the test in Geneva

The opening session of the CCW review conference, held from 12 to 17 December 2021. NGOs, together with a growing number of countries, have been calling for a new legal instrument to regulate autonomous weapons. (Credit: Kasmira Jefford)

UN talks on autonomous weapon systems – or so-called "killer robots" – will restart in Geneva this week. It will be the last chance before a high-level meeting in November for countries to thrash out more concrete plans on how to govern their use.

When Brazilian diplomat Flavio Soares Damico arrived in Geneva to take over as chair of UN talks on autonomous weapons in February, he quickly discovered he had a problem on his hands.

Countries gathered for their first session of the year on 7 March – just two weeks after Russia’s invasion of Ukraine – and the political temperature in the room was high.

No sooner had the five-day meeting begun than it ground to a halt after Russia complained that its experts were unable to attend because of travel restrictions, putting it in an unfair position.

With Russia blocking consensus, countries were unable to proceed with the meeting. After two days of deadlock and heated exchanges between states, and with time quickly running out, the Brazilian ambassador decided to move the talks to an informal format.

The next session of the so-called Group of Governmental Experts (GGE) kicks off today, and Damico is hopeful that this round of talks will get off to a more productive start. Over 100 countries, including Russia, will be attending.

“I prefer to see the glass half full rather than half empty, and over the months I still see that there is a very keen interest on the part of members to make progress,” he told Geneva Solutions.

Looking for a breakthrough

The thorny issue of how to deal with rapidly developing, machine-controlled weapons systems has been debated by negotiators in Geneva for more than eight years, and except for 11 guiding principles adopted in 2019, little has been agreed so far.


Advocacy groups and a large number of countries have been calling for a new international treaty or instrument to ensure that weapons systems such as tanks, submarines or drones equipped with the latest technology, such as facial recognition software, never operate without human oversight.

Others, however, believe existing international humanitarian law (IHL) is sufficient and does not need to be supplemented with new rules.

Among the countries that agree a new treaty or framework is needed on top of IHL, opinions are divided over whether it should be legally binding.

“We are not convinced that you need anything on top of existing IHL... or that any treaty could be flexible enough to keep pace with the technology as it develops after you've negotiated,” Aidan Liddle, the UK’s ambassador on disarmament affairs told Geneva Solutions.

Instead, the UK has submitted two proposals: one for an “IHL manual” that would clarify how existing rules apply to autonomous weapons, and another, put forward jointly with Australia, Canada, Japan, the Republic of Korea and the US, setting out a series of “principles and good practices”.

In total, seven written proposals have been submitted by individual countries, including Russia and China, as well as by various groups of states – something rights groups recognised as a positive step in this year’s discussions compared with 2021.

“Interestingly, there is quite a lot of overlap between these proposals, which is really encouraging to see,” said Ousman Noor, government relations manager at the Campaign to Stop Killer Robots.

One of these areas is how to characterise autonomous weapons, which still lack a precise definition. “There's now a broad agreement around this being systems that select and engage targets with autonomous functionality,” Noor said.

Countries are also in greater agreement over what types of weapons should be banned – for example, those designed to attack without “meaningful human control”, where a person makes the decision, or those that cannot distinguish between civilians and combatants.

Reaching consensus

In his draft report, published last week, Damico said he has tried to incorporate each member state’s views, with the aim of adopting the report by the end of the week and paving the way for more intensive discussions next year.

This would already be a major step forward from last year’s discussions, when countries failed to approve the chair’s report.

“What I'm trying to do is consolidate progress achieved so far and from there, start the negotiation on what I call a framework. It might be a code of conduct or a legally binding agreement, depending on what people want to do about it.”

However, for Noor and the Campaign to Stop Killer Robots, the report does not go far enough, stopping short of recommending that countries begin formal negotiations on a treaty.

“Unless you negotiate, it’s just discussions. And for us, if you don’t negotiate here, you’re suggesting we need to be negotiating in a different forum,” he told Geneva Solutions.

An example of this is the Mine Ban Treaty, which was negotiated outside the UN by a group of countries led by Canada.

For Damico, the reality is much more complicated. “Multilateralism is always something that does not lead to optimal results. It's something that is slower and that needs to take on board different perceptions; no one actually carries the day,” he said.

Push recommendations too far and you risk excluding certain countries and failing to reach a consensus.

“You always have a result that is the product of inputs coming from different directions, and from time to time you have to resort to some sort of constructive ambiguity, to try to move things forward.”
