'No' to ban on killer robots

Date Published: 03.12.2014

Lethal Autonomous Weapon Systems (LAWS) should be regulated rather than banned, says Wadham Fellow Tom Simpson in a new Policy Memo for the Blavatnik School of Government.

Simpson argues against the growing campaign to ban LAWS, or ‘killer robots’, saying that they may well reduce suffering and death in war, and instead sets out recommendations for how policymakers should regulate them.

The policy memo, published in November 2014, claims that technological development will see LAWS become widespread in the near future.

According to the paper: “The characteristic forerunners of this technology are drones armed with missiles. At present, a pilot flies the system remotely; intelligence analysts assess the target information provided; and military commanders supported by legal advice make the decision to attack. Current R&D programmes aim to automate these processes for systems deployable on land, at sea, underwater and in the air. In time, a system’s movement, target acquisition, and ultimately decision to kill, could be made autonomously.”

Simpson and his co-author, Professor Vincent Müller, set out five recommendations for policymakers to establish guidelines and systems to regulate LAWS:

  • Establish an international technical standards agency for LAWS
  • Establish national technical standards and licensing bodies for LAWS
  • Extend war crimes legal instruments to the illegitimate use of LAWS
  • Permit the distribution of LAWS only when there is better-than-human performance
  • Permit the use of LAWS for killing only when there is compelling military reason

Simpson writes: “There is pressing need for a regulatory regime to govern LAWS, both legally and in terms of technical standards. This is a moral responsibility of governments and international organisations. The deployment of LAWS is acceptable only if that system reduces risk to both combatants and civilians. Risk reduction is the overall beneficial outcome that justifies their development. It is not permissible to reduce risk to soldiers by increasing that to civilians.”

The most significant moral question concerns the attribution of responsibility for killings by LAWS. When things go wrong and civilians are killed, who is to blame? Simpson and Müller say: “We propose that responsibility for the effects of LAWS should be attributed in exactly the same way as any other technological system. Consider medicines. These have generally predictable results but with a risk of negative side-effects. So drugs are tested during development and only then licensed for prescription. When prescribed in accordance with the guidelines, neither the doctors nor drug companies are responsible for side-effects (though liability for defective manufacturing, defective design or failing to warn still holds). Instead, the body that licenses the medicine, such as the FDA or NICE, is responsible for ensuring overall beneficial outcomes. Apply this to LAWS. The same division of responsibility occurs between engineers, military users, and the government.”

The memo draws on research set out in two forthcoming publications: Autonomous Killer Robots Are Probably Good News (Müller and Simpson) and Just War and Robots' Killings (Simpson and Müller).
