A robot is pictured in front of the Houses of Parliament and Westminster Abbey as part of the Campaign to Stop Killer Robots in London, April 23, 2013.

Photograph by Luke MacGregor, Reuters

Daniel Stone

National Geographic

Published June 4, 2013

In a world of robotic cars and robotic assembly lines, could warriors be next? Some countries, including the U.S., Israel, and the U.K., are developing lethal autonomous robots that would mimic human soldiers and could be sent into battle.

Several robot models have been designed to carry machine guns and advanced rifles. The company iRobot, which makes the Roomba vacuum cleaner, has developed a robot that can fire a Taser at an enemy. The U.S. Navy has tested an unmanned drone that is capable of taking off from and landing on an aircraft carrier and can carry two tons of ordnance.

Proponents see benefits, including the ability to wage combat with an indefatigable military and the prospect of fewer casualties. But international organizations are wary of the risks such technology may pose. National Geographic asked Christof Heyns, a human rights lawyer and the United Nations' Special Rapporteur on extrajudicial, summary, or arbitrary executions, about lethal autonomous robots (LARs), as these warrior robots are known, and whether developers should tread lightly.

You've called for a moratorium on lethal autonomous robots. What concerns you about this technology?

Can robots make the kind of decisions that are required in war, [such as] distinguishing between combatants and civilians? They could make it easier for states to go to war, [and] thus could be used by despots to repress their own people. Who would be responsible if a robot goes wild? Is it acceptable that machines kill people?

Many countries have long had lethal automatic instruments of warfare like missiles and bombs, not to mention the increasing use of drones. What would make these robots so different?

Bombs are not autonomous, and neither are drones. This is a crucial distinction and the very basis of the concern about LARs. Drones have a human in the loop. LARs don't, so with LARs, the machine decides whether and whom to kill.

Despite the legal, human, and foreign policy risks, is there an upside to automated warriors?

Yes, those who promote them say they can be more targeted and as a result reduce civilian casualties. They can also obtain [tactical] information [about a particular enemy] that we may not be able to get or process. But then again, they can make it easier to go to war.

2 comments
Hamlyn Terry

Why not have the robot war where you live? Do you not know that the Sahara (الصحراء الكبرى) is not a dustbin, but part of many populated sovereign countries?

C. Dufour

Here's what we should do:

Every country makes one giant battle robot, we send them all to the middle of the Sahara Desert, and let them loose on each other. Everyone gets to blow off some steam and no one dies!

Switzerland can be the referee!
