Are Robot Warriors Headed Into Battle?

Some countries are developing autonomous killing machines.

A robot is pictured in front of the Houses of Parliament and Westminster Abbey as part of the Campaign to Stop Killer Robots in London.

In a world of robotic cars and robotic assembly lines, could warriors be next? Some countries, including the U.S., Israel, and the U.K., are developing lethal autonomous robots that would mimic human soldiers and could be sent into battle.

Several robot models have been designed to carry machine guns and advanced rifles. The company iRobot, which makes the Roomba vacuum cleaner, has developed a robot armed with a Taser to stun enemies. The U.S. Navy has tested an unmanned drone that can take off from and land on an aircraft carrier and deploy two tons of ordnance.

Proponents see benefits, including the ability to wage combat with an indefatigable military, as well as the advantage of fewer casualties. But international organizations are wary of the risks such technology may pose. National Geographic asked Christof Heyns—a human rights lawyer and the United Nations' Special Rapporteur on extrajudicial, summary, or arbitrary executions—about lethal autonomous robots (LARs), as these warrior robots are known, and whether developers should tread lightly.

You've called for a moratorium on lethal autonomous robots. What concerns you about this technology?

Can robots make the kinds of decisions that are required in war, [such as] distinguishing between combatants and civilians? They could make it easier for states to go to war, [and they] could be used by despots to repress their own people. Who would be responsible if a robot goes wild? Is it acceptable that machines kill people?

Many countries have long had lethal automatic instruments of warfare like missiles and bombs, not to mention the increasing use of drones. What would make these robots so different?

Bombs are not autonomous, and neither are drones. This is a crucial distinction and the very basis of the concern about LARs. Drones have a human in the loop; LARs don't, so with LARs the machine makes the decision whether, and whom, to kill.

Despite the legal, human, and foreign policy risks, is there an upside to automated warriors?

Yes, those who promote them say they can be more precise in their targeting and as a result reduce civilian casualties. They can also obtain [tactical] information [about a particular enemy] that we may not be able to get or process ourselves. But then again, they can make it easier to go to war.