
The Need for a Preemptive Prohibition on Fully Autonomous Weapons

By Stephen Goose

The future of warfare will increasingly be shaped by ever more autonomous weapons systems. But these weapons cross a fundamental moral line: once they are fielded, life-or-death decisions on the battlefield will be delegated to machines. According to Stephen Goose, this is tantamount to an attack on human dignity.

In his essay, the author asks whether the principles of international humanitarian law (IHL) can actually be respected when fully autonomous weapons are used. Such weapons are able to select and engage targets without further human intervention: the decision to act is taken by the weapons system itself, not by a human operator. But can a machine make ethical decisions?

Other concerns relate to the reliability of the technology and to the proliferation of such weapons systems, both of which pose risks to civilians and soldiers alike. Goose therefore calls for a preemptive ban on fully autonomous weapons. He sees the principle of human control over the final decision to use force as a central argument against their deployment; without it, there is a danger of war being stripped of its moral dimension.

Goose supports his argument by pointing out that non-governmental organizations and even governments are now calling for national moratoria on fully autonomous weapons systems.
