04/30/2013 12:08 EDT | Updated 06/30/2013 05:12 EDT

How Automated Wars Rob Us Of Humanity

Militaries, and the U.S. military in particular, envision a future where weapons do the thinking -- that is, the planning, target selection and engagement. If we are not careful about the increasing push toward autonomous weapons, we may lose our very humanity in the process.

Getty Images
Protesters against the use of drone strikes by the U.S. military hold a model of a drone aircraft during the "March On Wall Street South" rally in Charlotte, North Carolina, ahead of the Democratic National Convention, on September 2, 2012. (ROBYN BECK/AFP/Getty Images)

Hannah Arendt once used the phrase "the banality of evil" to describe the character of Adolf Eichmann's acquiescence in committing atrocities for the Nazi regime. What this phrase means, in Eichmann's case, is that it was his "sheer thoughtlessness -- something by no means identical with stupidity -- that predisposed him to become one of the greatest criminals of that period." Indeed, Arendt observed "that such remoteness from reality and such thoughtlessness can wreak more havoc than all the evil instincts taken together." That evil is in this sense banal means that there is no thought -- no decision -- to be (or to act) evil. The evil is commonplace, and it is a lack of thinking that results in the most horrific of actions. Eichmann's most dangerous quality, then, was that he threw away what it meant to be human -- his capacity for rational thought and for reflection on right and wrong, good and evil.

We are at a similar juncture with regard to a "lack of thinking." In our case, however, it concerns the delegation of thinking to a machine, and a lethal machine in particular. What I mean here is that militaries, and the U.S. military in particular, envision a future where weapons do the thinking -- that is, the planning, target selection and engagement. Already the U.S. military services field capabilities that enable weapons to seek out and cue targets, such as the F-35 Joint Strike Fighter and some targeting software on tanks like the M1 Abrams, as well as weapons that seek out targets and automatically engage them, like the Phalanx and Counter Rocket, Artillery and Mortar (C-RAM) systems.

The U.S. decision to rely on unmanned aerial vehicles, or "drones," attests to the appeal of fighting at a distance with automated technology. The drones currently in combat operations, such as the Predator and Reaper, show the ease with which killing by remote control can be accomplished. While drones are certainly problematic from a legal and moral standpoint with regard to targeted killings, human beings still ultimately control this type of technology. Human pilots are in the "cockpit," and for better (or worse) human beings are making the targeting decisions.

The worry, however, is that militaries plan to push autonomy further than the F-35 Joint Strike Fighter (which is far more autonomous than the Predator or Reaper) to "fully autonomous" weapons. Moreover, while we might try to push this worry aside and claim that such weapons are a long way off, or too futuristic, we cannot deny the middle term between now and "fully autonomous" weapons. In this middle term, warfighters will become increasingly dependent on such technologies to fight. Indeed, we already see this in "automation bias" -- the over-reliance on information generated by an automated process as a replacement for vigilant information seeking and processing. With increased dependence on the technology, this automation bias will only grow, leading to a degeneration not only of strategic thinking in the services but, as in the case of Eichmann, of thinking more generally.

The evil here is that through the banality of autonomy, we risk not only creating a class of unthinking warfighters, but also letting the entire business of making war become so removed from human judgment and critical thinking that it, too, becomes commonplace. In fact, it might become so banal, so removed from human agency, that even the word "war" starts to lose meaning. For what would we call a conflict where one side, or both, hands over the "thinking" to a machine, doesn't risk its soldiers' lives, and perhaps doesn't even place human beings outside its own borders to fight? "War" does not really seem to capture what is going on here.

The danger, of course, is that conflicts of this type might not only perpetuate asymmetric violence but also further erode the very foundations of humanity. In other words, if we are not careful about the increasing push toward autonomous weapons, we risk vitiating the thinking, judging and thus rational capacities of humanity. What was once merely automation bias becomes the banality of autonomy, and in an ironic twist, humans lose their own ability to be "autonomous."

The human warfighter is now the drone.
