
The Rise of Killer Drones That Can Think for Themselves

The U.S. military may be a decade or so away from deploying an army of pilotless drones capable of collaborating and killing without any human guidance.

Charles River Analytics garnered a contract to develop the “Adversary Behavior Acquisition, Collection, Understanding, and Summarization (ABACUS)” system, which will combine intelligence gathered from informants, UAV surveillance, and wiretapped phone calls to formulate “intent-based threat assessments of individuals and groups.”

In case that doesn’t work, another firm, Modus Operandi, Inc., is using its Army contract to design “Clear Heart,” a program that determines “the likelihood of adversarial intent” based on an individual’s behavior and apparent emotions.

Rules of Engagement

Since drones enable soldiers to assassinate individuals by remote control from thousands of miles away, their increased use in Iraq and Afghanistan—along with countries such as Pakistan, Yemen, and Somalia, where the U.S. has not officially declared war—has generated considerable controversy over ethics and international law. The introduction of autonomous killing machines further complicates these already shaky rules of engagement.

Although it's not applied consistently, an international legal framework does exist to hold people accountable for human rights violations and war crimes. No parallel legal structure, however, regulates the behavior of autonomous robotic weaponry, which is advancing faster than our ability to understand it. For example, if a drone malfunction leads to civilian deaths, who is held responsible? The machine? The programmer? The commander who approved the machine's use? This is where it gets confusing.

Ronald Arkin, director of Georgia Tech’s Mobile Robot Laboratory and author of Governing Lethal Behavior in Autonomous Robots, argues that lethal robots can and should be programmed to make ethical decisions and follow international law in warfare. He even suggests that robots will behave more ethically than humans because they lack emotions, meaning they won’t make reckless decisions that harm civilians out of anger, vengeance, or fear of death.

However, Arkin’s hypothesis has yet to be proven. In the meantime, a group of robotics specialists and human rights advocates formed the International Committee for Robot Arms Control (ICRAC) out of serious concern that robotic weapons technology is proliferating and advancing without any international framework or doctrine to govern it. The ICRAC is calling on the international community to institute an arms-control regime to reduce the dangers associated with killer machines.

According to the Unmanned Aircraft Systems Flight Plan 2009-2047, humans will retain the authority to override the system during a mission. That’s good news, considering that every so often, a robot goes crazy and spontaneously empties its magazine—a not-so-hypothetical scenario that took place in South Africa in 2007, when an antiaircraft cannon mysteriously malfunctioned, killing 9 soldiers and seriously injuring 14. Nonetheless, Peter Singer, author of Wired for War, told the BBC, “We can turn the system off, we can turn it on, but our power really isn't true decision-making power. It's veto power now.”

Lethal Robots on the Rise

As robotic technology quickly advances, humanity's gradual surrender of control to autonomous thinking machines isn’t inherently bad. Used appropriately, robotics could save lives in search and rescue operations, medical equipment and treatments, or the destruction of mines and other explosives, all of which are already taking place. However, these tasks are far different from the increasingly likely possibility of weaponized autonomous systems assassinating human targets.

When UAVs were first deployed in 2001, there were 50 drones in the Pentagon's unmanned arsenal. Fast forward ten years, and the Pentagon’s inventory has soared to over 7,000 unmanned vehicles that come in a variety of shapes and sizes—and that doesn't even include the 15,000 driverless vehicles on the ground. According to National Defense Magazine, there are over 2,000 unmanned robots deployed alongside human ground troops in Afghanistan.