Think about a weapon that, once activated, can select and engage targets without further human intervention. This might sound like science fiction, but autonomous weapons systems, also called killer robots, are already being deployed. The Turkish STM Kargu II drone was used in Libya in 2020, and it was “programmed to attack targets without requiring data connectivity between the operator and the munition.”
Autonomous weapons systems, or AWS, are also likely to be deployed in the Russia-Ukraine conflict if it does not end soon. Russia began serial production of combat robots that can fight on their own in 2021 and has used them in large-scale strategic drills.
The challenge is that although international humanitarian law applies, these weapons present unique features, such as autonomy, that call into question a long-standing paradigm of human agency and are subject to no specific international regulation.
The international community is aware of the hurdles that autonomous weapons present, and a United Nations group of governmental experts has been debating them since 2014, under the umbrella of the Convention on Certain Conventional Weapons. The group’s decisions are made by consensus, which often means facing insurmountable obstacles, given the competing interests of the leading military powers at the table.
After adopting 11 general guiding principles in 2019, the governmental experts faced a crisis of credibility for their failure to find paths toward global governance of AWS or to reach other substantive agreements. The situation was described as succumbing to the tyranny of the consensus rule, since countries such as Russia have impeded progress.
In the first session of 2022, held in March, the group of governmental experts remained, in the words of the chair, like a dog chasing its tail; it was not even able to adopt an agenda, as Russia repeatedly said it was being discriminated against by the other countries in the session. Russia asserted that its experts could not attend the Geneva meeting because of visa requirements and the closure of air routes after Russia’s Feb. 24 invasion of Ukraine. On that ground, Russia used the consensus rule to block any progress by the experts.
Yet momentum is building for international discussions on AWS. The second session, at the end of July, set the stage for major discussions and concrete proposals. As voiced by the United States delegation during the meeting, “There are empowering areas of consensus emerging from our discussion.”
One is a two-track approach: discussing which AWS are unacceptable and prohibited per se, and which are not prima facie illegal but require some regulation. Another issue is that AWS require a degree of human-machine interaction, but countries diverge on what this relationship requires from humans.
To boost consensus, the chair, Brazilian Ambassador Flávio Soares Damico, presented a draft report that was debated by the governmental experts. The chair summed up that “criticism came from two different wings, one that believes that we have to strengthen what was originally drafted by us and another view that we need to go in a more liberal way and not be so restrictive was also made present.”
Among the issues in the draft report was state responsibility: ensuring that internationally wrongful acts, including conduct involving AWS, entail the international responsibility of countries. Until the draft report was written, the issue had barely been voiced in previous meetings, as the discussions focused on human responsibility, leaving states’ roles in the shadows.
During the group of experts’ discussions, the issue was raised by Cuba and France, for instance, and it surfaced in the proposal on “Principles and Good Practices on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems.” Even with more ambitious proposals on state responsibility, such as Argentina’s call for state responsibility for the acts of nonstate actors, the issue received broad acceptance on the floor.
Despite a rather watered-down final report resulting from the second session on AWS, the main breakthrough, on state responsibility, emerged, with the recognition that “every internationally wrongful act of a state, including those potentially involving weapons systems based on emerging technologies in the area of AWS, entails international responsibility of that state, in accordance with international law.”
This is no small feat, given diplomacy’s snail’s pace and technology’s soaring speed, and the fact that holding countries responsible for their actions is a primary pillar of international law.
One final note: Russia and India made a serious attempt to silence the voices of civil society by taking the floor and raising points of order to keep those representatives from making proposals on a text the experts were considering. On the last day of the meeting, as discussions ran over the regular hours and the group had to move to another room, Russia again tried to bar civil society’s participation. However, the chair secured the organizations’ participation, with the support of all the other delegations that took the floor. Civil society participation is key to providing expertise and a more democratic viewpoint to the group of governmental experts.
Unfortunately, only universities from the global North and mainly NGOs from the region took the floor. The global South needs to be more present in these discussions.
Lutiana Barbosa has been a federal public defender in Brazil since 2010. She is a Ph.D. candidate in international law at the Federal University of Minas Gerais, working on autonomous weapons systems and the international responsibility of states. She is a member of EDHIA, a research group on ethics, human rights and artificial intelligence at the National School of the Federal Public Defender’s Office.
Gustavo Macedo is a professor of international relations and a postdoctoral fellow at the Institute of Advanced Studies at the University of São Paulo. He was previously a visiting scholar at Columbia Global Policy Initiative at Columbia University. He was also the editor of the report “Making Atrocity Prevention Effective,” presented to the United Nations Office on Genocide Prevention and the Responsibility to Protect, in 2018.