The ethics of using killer robots is adding a new subspeciality to bioethics. This week the United Nations Human Rights Council debated the use of lethal autonomous robots (LARs) in Geneva. UN special rapporteur Christof Heyns, a South African legal expert, called for a moratorium while legal and ethical issues are worked out.
"War without reflection is mechanical slaughter," he said. "In the same way that the taking of any human life deserves - as a minimum - some deliberation, a decision to allow machines to be deployed to kill human beings deserves a collective pause worldwide."
Mr Heyns says that LARs could be as big a step in warfare as gunpowder or nuclear weapons. But it would be qualitatively different, since machines, not humans, would be deciding whether or not to kill. He also worries that LARs could make war more likely, as nations would not feel inhibited by the fear of wasting the lives of their own soldiers. In a report to the Human Rights Council, he writes:
“There is a qualitative difference between reducing the risk that armed conflict poses to those who participate in it, and the situation where one side is no longer a ‘participant’ in armed conflict inasmuch as its combatants are not exposed to any danger. LARs seem to take problems that are present with drones and high-altitude airstrikes to their factual and legal extreme.”
One important issue posed by LARs is the lack of a clear chain of responsibility. Drones are also mechanical killers, but at the moment the decision to kill is taken by a human being. Mr Heyns says:
"Their deployment may be unacceptable because no adequate system of legal accountability can be devised. LARs can potentially be also used by repressive governments to suppress internal domestic opponents.

"Do we want a world in which we can be killed either as combatants or as collateral damage by robots with an algorithm which takes the decision? It's this issue of diminishing human responsibility that concerns me."
Human Rights Watch has been campaigning to ban the use of killer robots. It issued a report last November calling for a pre-emptive ban.