Note: Ethics In Tech board member Brett Wilkins is a member of Collective 20, an international group of socialist activist/writers.
In our first-ever interview, Collective 20 sat down for a socially-distanced conversation with Mary Wareham, advocacy director of the Arms Division at Human Rights Watch (HRW) and global coordinator of the Campaign to Stop Killer Robots, which is working toward an international ban on lethal autonomous weapons.
Wareham played a key role in the implementation of the Ottawa Treaty outlawing anti-personnel landmines around the world with the International Campaign to Ban Landmines (ICBL), work for which she was a joint recipient of the 1997 Nobel Peace Prize. She also served as advocacy director for Oxfam in her home country of New Zealand, leading the successful effort to ban cluster bombs in the 2008 Convention on Cluster Munitions.
Earlier this year, Wareham spoke at a virtual event on surveillance and AI systems hosted by the nonprofit group Ethics In Tech (EIT), where C20 member and EIT board member Brett Wilkins was also a speaker.
This time, Wareham and Wilkins talked about lethal autonomous weapons and Wareham’s new HRW report, “Stopping Killer Robots,” an incisive survey of country positions and policies regarding lethal autonomous weapons.
C20: You fought successfully against landmines and cluster munitions. Why take on lethal autonomous weapons?
MW: I’ve been working on and off since 1998 in the Arms Division of Human Rights Watch and we’ve been involved in pushing international law forward to protect civilians on several fronts. It was landmines when I joined the arms division and right after that treaty was negotiated I was involved in the process of establishing a monitoring initiative to report on treaty compliance by all of the countries. And then I did the same with cluster munitions in the 2000s.
Lethal autonomous weapons were on a list of concerns that the Arms Division wanted to work on over the years but we didn’t have the resources or the time, and finally we got to 2011 and we were watching developments with growing horror, realizing we had to do something. And in 2012 I moved from New Zealand to the US because it didn’t make much sense to be traveling halfway around the world, and at that time I thought offices were important, although that’s now changing a bit, isn’t it?
C20: What is HRW's approach to the issue of killer robots?
MW: HRW approaches it from the legal standpoint of international human rights law and international humanitarian law. We thoroughly explored all of the concerns that have been raised about incorporating autonomy into weapons systems. Any single one of those concerns would be reason enough to wonder why we would want to embark on the path toward autonomous weapons, and when you put them all together it really is quite a compelling case for regulation.
C20: What about unmanned aerial drones? They're not autonomous (yet), but aren't they a more serious threat?
MW: I think the work against killer robots and drones will come together. People see from the use of armed drones that there is a shocking lack of accountability. Eventually the work against drones that [Code Pink co-founder] Medea Benjamin and others have been leading and our work against killer robots will come together, just because that’s the way that technology is advancing. I think Medea recognizes that too and that’s why she wanted to make sure that Code Pink was involved from the beginning of the Campaign to Stop Killer Robots. Jody Williams was also a key proponent of the campaign.
C20 note: Jody Williams was also awarded the 1997 Nobel Peace Prize for her work banning and clearing landmines with the ICBL.
Collective 20 member Medea Benjamin says she joined the Campaign to Stop Killer Robots because she believes that “instead of making killing easier, cheaper and with no consequences for the killers, we should be doing just the opposite: banning autonomous weapons.”
“I traveled to Yemen, Pakistan and Afghanistan to talk to families whose loved ones had been killed by our drones,” Benjamin wrote for this piece. “It was devastating to hear their stories and to understand just how evil drone warfare is, with US ‘pilots’ sitting in ergonomic chairs in an air-conditioned room in a US base pressing a button to blow up people thousands of miles away. And then to learn that some of the best scientific minds in our country are working to make drones that are autonomous made my skin crawl.”
C20: How close are countries to actually fielding fully operative lethal autonomous weapons systems?
MW: We’re on the borderline. There is so much going on, so many investments being made, that it’s hard to keep track of all of them. We try to be clear that we are not going after existing weapons systems, like Israel’s Iron Dome and US Phalanx [shipboard defense system]. We’re going after systems like the South Korean [Samsung] SGR-A1 sentry robot on the border with North Korea, the IAI Harop and Harpy loitering munitions that Israel has developed and is selling to Azerbaijan and other countries, the long-range anti-ship missiles (LRASM) that Lockheed Martin has been developing in the US, the BAE Systems Taranis autonomous fighter aircraft being worked on in the UK and tested in the Australian desert and the drone swarms that China and the US have been investing in.
C20: What do you make of the United States' stated position of neither encouraging nor prohibiting the development of autonomous weapons systems?
MW: During the Obama administration, the best thing the US did was to let talks go forward. The US allowed multilateral talks on killer robots to begin back in 2013, and they put it in the Convention on Conventional Weapons because they thought they could control it. And then we had the policy directive, which we had some inkling was underway when we were releasing our killer robots report in November 2012, and it’s been misinterpreted ever since. I still have issues interpreting it. It’s definitely not the clearest policy directive. The big takeaway is that it curbs use but it green-lights development, as you can see with all the various investments that have been made across the board.
As for Trump, we were hoping that he wouldn’t notice this. It’s very difficult to achieve anything with people like [former national security advisor] John Bolton in the administration. There isn’t a very high level of interest in the Trump administration on this issue, they just seem disengaged. Under Obama the US would help broker consensus, there were extensive meetings with civil society, but now the US is like a shadow of its former self. I’ve had UN diplomats thank me in the last year for not pushing too hard because if you push too hard you’re going to get an even worse result. If you think this lack of engagement is bad, you really don’t want us to be engaging under this administration, is what they’re saying.
C20 note: President Barack Obama, an inexplicable Nobel Peace Prize recipient himself, was often called the “drone warrior-in-chief” for his enthusiastic embrace of unmanned aerial drones in the war against terrorism. Obama, who boasted that he was “really good at killing people,” bombed more countries than his predecessor, George W. Bush.
C20: Can you speak to China’s stance of both supporting a ban on the use of killer robots while not backing a ban on their development and manufacture?
MW: If China is going to be on a list of countries that support a ban, it’s going to be with an asterisk. The most important thing about China is that it has been quite clear since at least 2016 that it regards international law as insufficient. It has doubts, it has questions and that said, it does see a need for a legally binding instrument — a treaty — and has given the example of the preemptive ban on blinding lasers. And that’s a lot more than any of the other members on the security council have done yet. I think that really freaked everyone out as well because other countries realized that China had a goal and they didn’t.
Then there’s the United States and Russia, who are just kind of messing with the process, and not really participating in a serious way, especially the Russians, and for China that’s an opportunity to step up and fill a leadership gap.
C20: Can you talk more about that?
MW: China has looked very carefully at what the developing countries have been saying about the issue of killer robots and it acknowledges the widespread opposition from those countries, including in the non-aligned movement. And so China seems to have sided with the developing countries, which are the majority. The whole question at the end of the day, though, is how sincere any of this is, and whether China would really participate in a treaty negotiation aimed at a preemptive ban.
C20: What about the United Nations? What role does it play in regulating killer robots?
MW: Any treaty would eventually involve the UN at the end of the process, and there are definitely functions for the UN to perform. But you can negotiate a treaty wherever you want; that's what we've learned from our experience, from a football stadium in Dublin to a massive conference hall in Oslo. It doesn't really matter; you don't have to be at the UN in order to negotiate.
C20: Is it too late to regulate lethal autonomous weapons?
MW: It's never too late! I still feel guilty that there were hundreds of thousands of landmine victims before the landmine treaty was concluded in 1997, but since then the numbers have been steadily declining, and it's a similar situation with cluster munitions. So when I felt down about all the suffering we document at HRW from the other weapons being used, I would turn my attention to killer robots, and that would give me a bit of hope for the future, that we would be able to act before it's too late. And we're still trying to do that.
C20: Do you think we are close to something similar to the Mine Ban Treaty or the Convention on Cluster Munitions?
MW: There’s a lot more work to be done. The burden of proof is on the states that are developing killer robots. They have to explain how human control functions in a weapons system and in the use of force and make the case for exempting it from prohibition. That’s how it’s worked in the past treaties. It’s another reason why the US should realize it’s in its interest to participate and not sit it out. But I do think that we’re close.
C20: What is the Campaign to Stop Killer Robots doing in terms of working toward a ban?
MW: We are an international coalition of 165 groups in 65 countries. Some of them are single activists, like the one in Moscow, Russia, who is operating under quite trying circumstances, and others are nonprofit organizations that people can volunteer with, like Amnesty International and Human Rights Watch, and church groups like Pax Christi. We've just launched a site for children and youth who are interested in learning more about killer robots.
C20: How can activists, or even everyday people, get involved in the fight against killer robots?
MW: Here in the United States there needs to be political pressure and engagement just to explain to representatives in Congress why this is an issue of concern. Of course we can't do all of the outreach on the Hill that we need to do, and we're outnumbered by defense lobbyists. We have to reach them through face-to-face contact and correspondence, but also through the media. Having well-known activists with us helps a lot. Noam Chomsky, along with thousands of AI experts, roboticists and academics, signed our 2015 letter [calling for a worldwide ban on autonomous weapons].
C20 note: Collective 20 member Noam Chomsky calls killer robots “the latest horror on the agenda” and says that “to add to the lethal mix” of global crises “is tragic irresponsibility.”
“Fully automated systems of murder and destruction — killer robots — should be banned by international agreement, quickly and decisively, along with other monstrosities that perverse intelligence might devise,” Chomsky, a pioneer in the field of cognitive science, wrote in response to a request for input on this piece.
MW: Under pressure, we did see some questioning last year of budget requests when it came to AI and tech. There have been some critical questions asked, especially around cost. This is a lot of money that’s not going to health care, education and other areas.
We also really appreciate original journalism on any issues of concern. I really like the work that Jack Poulson is doing at Tech Inquiry. He’s really been scrutinizing all of the tech company contracts with the Pentagon and providing original research. The more of that the better, because it helps to shine a light and show people the different linkages and the way money is pouring in.
C20: What role can tech workers play in the campaign?
MW: We see the government courting tech companies through programs like the Silicon Valley Outreach Program. We also see tech workers speaking out when the government offers funding for tech companies that will work with them. Or when the government says [cooperation is] for disaster relief, or humanitarian reasons, or transportation, or communications, but it’s really for fighting war. There is definitely a greater realization that a lot of what the government is asking tech companies and workers to contribute to could very likely be used in weapons systems, and only a couple of them seem to admit it. Look at Palantir.
We just need to keep building up the campaign, bringing in new people and developing our own arguments. We need to focus on offering a positive vision for the future that everyone can get behind, and I don't think we can do that by just talking about the need to retain human control over the use of force. We've got to acknowledge the broader digital landscape and all the tech activism going into it right now, because it's a whole array of different challenges. This is just one of them, and it happens to be at the sharpest point, taking human life, but there are so many more relevant concerns.