Any views expressed in this article are those of the author and not of Thomson Reuters Foundation.
(New York) - All governments should support international talks to address the threat posed by fully autonomous robotic weapons, Human Rights Watch said today. Human Rights Watch and the Harvard Law School International Human Rights Clinic on October 21, 2013, issued a question-and-answer document about the legal problems posed by these weapons.
Representatives from the Campaign to Stop Killer Robots, including Human Rights Watch, will present their concerns about fully autonomous weapons at a United Nations event in New York on October 21.
"Urgent international action is needed or killer robots may evolve from a science fiction nightmare to a deadly reality," said Steve Goose, arms director at Human Rights Watch. "The US and every other country should support holding international talks aimed at ensuring that humans will retain control over decisions to target and use force against other humans."
Fully autonomous weapons - also called "lethal autonomous robotics" or "killer robots" - have not yet been developed but technology is moving toward increasing autonomy. Such weapons would select and engage targets without further intervention by a human.
In recent months, fully autonomous weapons have gone from an obscure issue to one that is commanding the attention of many governments, international institutions, and groups around the world.
Earlier in October, Austria, Egypt, France, Pakistan, and other countries called for international talks on fully autonomous weapons during the UN General Assembly First Committee on Disarmament and International Security in New York. France, as chair of the next meeting of the Convention on Conventional Weapons, has been conducting consultations to build support for adding fully autonomous weapons to the convention's work program.
In a report issued in May 2013, the UN special rapporteur on extrajudicial, summary, or arbitrary executions, Christof Heyns, called on governments to institute immediate moratoriums on fully autonomous weapons. His report also suggested that a high-level panel of experts consider the issue. At a UN Human Rights Council debate on the report on May 29, more than two dozen countries spoke on the issue for the first time, and all agreed that the prospect of fully autonomous weapons requires urgent international action.
On October 16, 272 engineers, computing and artificial intelligence experts, roboticists, and professionals from related disciplines issued a statement calling for a ban on fully autonomous weapons. They cast doubt on the notion that robotic weapons could meet legal requirements for the use of force "given the absence of clear scientific evidence that robot weapons have, or are likely to have in the foreseeable future, the functionality required for accurate target identification, situational awareness, or decisions regarding the proportional use of force."
"We are seeing significant interest in tackling the issue of fully autonomous weapons, and now it's time to act," Goose said. "The only viable solution will be a pre-emptive ban on the development, production, and use of these weapons."
In November 2012, Human Rights Watch and the Harvard Law School International Human Rights Clinic issued "Losing Humanity: The Case against Killer Robots," a 50-page report outlining numerous legal, ethical, policy, and other concerns with fully autonomous weapons. The new question-and-answer document clarifies and expands on some of the issues the report raised.
Most governments are in the process of determining their policy position on fully autonomous weapons and have not spoken publicly. One exception is the United States. The Defense Department issued a directive on November 21, 2012, that, for now, requires a human being to be "in-the-loop" when decisions are made about using lethal force, unless department officials waive the policy at a high level.
The US policy directive, while positive, is not a comprehensive or permanent solution to the potential problems posed by fully autonomous systems, Human Rights Watch said. The policy of self-restraint it embraces may also be hard to sustain if other nations begin to deploy fully autonomous weapons systems.
Human Rights Watch is the initial coordinator of the Campaign to Stop Killer Robots, announced in April by an international coalition of civil society groups. The campaign is working pre-emptively to ban weapons that would be able to select and attack targets without any human intervention.
The coalition says that this prohibition should be achieved through an international treaty, as well as through national laws and other measures, to enshrine the principle that decisions to use lethal force against a human being should always be made by a human being.