Robot apocalypse avoidable

Large-scale robot uprisings were once limited to sci-fi novels and Hollywood thrillers. However, many see them as a very real possibility in the not-so-distant future. Some Americans have always mistrusted — or even feared — advanced robotics, an attitude likely inspired by the treatment of robots in popular culture.

Many people’s darkest fears seemed to be confirmed by an in-depth report on the future of military robotics released by the U.S. Office of Naval Research. Created by the Ethics and Emerging Technology department of California State Polytechnic University, it is one of the first serious studies of its kind, dealing with practical ethical issues raised by the deployment of robots into combat zones.

Dangers do exist, but it is unlikely that robots will get anywhere close to posing a threat to human civilization. A popular concern is that robots will learn to act completely on their own, overwrite their programming and establish their own goals and objectives. The report deals with this hypothetical phenomenon, but states that programmers will likely avoid designing such robots should the technology be developed, so it is not a problem “for the foreseeable future.”

Autonomous robots with the ability to react and learn in combat situations will soon exist, but serious moral and ethical concerns must be addressed first. Robots must be able to distinguish between civilians, enemies and friendly soldiers in a changing environment.

By their very nature, military robots must be able to kill, but they must also know when not to kill. It may not be possible for a machine to draw the line between killing efficiently and committing an atrocity. Another issue is who would shoulder the blame for such an error: the designer, the robot or the government that deployed it?

Whether or not creating killer robots is ethical, it certainly seems inevitable. A concern the report raises is that the race to create the most advanced robots may make ethical guidelines and safety measures a secondary concern. The report warns that if ethics are ignored, “there is little hope that the early generations of such systems and robots will be adequate, making mistakes that may cost human lives.”

Military robots today are all ultimately controlled by humans, but America is on the verge of robot autonomy — in which robots’ actions are not dictated by human control. The report details nearly autonomous robots used around the world for combat, surveillance and defense. On the cutting edge is a security system in South Korea developed by Samsung that is “capable of interrogating suspects, identifying potential enemy intruders, and autonomous firing of its weapon.”

The U.S. is at the forefront of military robotics, using robots routinely in Iraq and Afghanistan. Congress has mandated that by 2010 one-third of all deep-strike aircraft must be unmanned, and that by 2015 one-third of all military ground vehicles must be unmanned. Robot deployment in the Middle East increased from 162 units in 2004 to 5,000 in 2007, according to dailytech.com. This haste has already led to unforeseen malfunctions.

In April 2008, the Army recalled its TALON SWORDS robots from active deployment. The SWORDS are small, mobile robots equipped with machine guns, grenade launchers or anti-tank rockets. The machines can move through combat zones on their own, but a human operator must remotely pull the trigger for them to fire. The military has disclosed little information regarding the recall, but initial reports claimed the machines malfunctioned and began targeting friendly soldiers.

A far worse incident occurred in October 2007 in South Africa, where a semi-autonomous robotic cannon malfunctioned, killing nine friendly soldiers and wounding 14 others. Incidents like these highlight the need for caution in the years to come.

The world should not rush into the age of autonomous robots, but should instead move carefully. Robots will not become our overlords, but the potential for small-scale tragedies is too great if programmers and militaries do not instill effective systems of ethics into their robot warriors.

Michael Hardcastle is a freshman majoring in mass communications.