Saturday, March 20, 2010

Predator Drones: The Immoral Use of Autonomous Machines

James Fetzer / 18 March 2010

"A robot may not injure a human being or, through inaction, allow a human being to come to harm" -- Isaac Asimov's "First Law of Robotics"

Madison, WI (OpEdNews) -- As a former Marine Corps officer, I am not opposed to weapons of war as a matter of principle. It's not the efficiency of predator drones that bothers me but the uncertainty about the identity of the actual targets and the collateral damage they cause. Predator drones are deadly, but my concern is with whether their use is moral, especially when consideration is given to the political context involved. How many wedding parties are we going to take out because a drone saw group behavior that it had been programmed to hit? How often do we have sufficient information to know that we are actually targeting insurgents, not innocents? A recent report, "Over 700 killed in 44 drone strikes in 2009," for example, has calculated that 140 innocents are being killed for every "insurgent."

We are now invading Pakistani air space in our relentless determination to take out those who oppose us. From the point of view of the countries that we have invaded and occupied, they might be more aptly described as "freedom fighters." Since we invaded these countries in violation of international law, the UN Charter, and the US Constitution, we appear to be committing crimes against humanity. We cannot know our conduct is immoral, however, unless we know the nature of morality. Consider what philosophers usually refer to as consequentialist and non-consequentialist theories. Under consequentialism, for example, an action is right when it produces as much GOOD (usually taken to be happiness) as any available alternative. But the problem remains of deciding FOR WHOM that happiness ought to be produced.

"Rethink Afghanistan" is a ground-breaking, full-length documentary focusing on the key issues surrounding this war. By releasing this film in parts for free online, the producers are able to stay on top of news of the war as it continues to unfold.

According to Ethical Egoism, for example, an action is right if it brings about as much happiness for you personally as any available alternative. Consequences for others simply don't count. So Ted Bundy, John Gacy, and Jeffrey Dahmer, for example, are home free morally speaking, though few juries would be likely to be impressed by the argument that killing gave them more happiness than any available alternative. So Ethical Egoism does not adequately solve the problem. According to Limited Utilitarianism, by contrast, an action is right when it brings about as much happiness for your group as any available alternative. This is good news for The Third Reich, the Mafia, and General Motors. If no available alternative(s) would produce more happiness for Nazis than territorial acquisition, military domination, and racial extermination, then those qualify as moral actions if this theory is true. Predator drones are good if their use benefits your interests. The consequences for others, once again, simply don't matter.

The rockets launched from drones are highly destructive. This image shows the site of a missile attack in North Waziristan, Pakistan in March 2009.

Classic Utilitarianism is the only consequentialist theory that requires taking into account the effects actions have upon everyone rather than upon some special class. But if a social arrangement with a certain percentage of slaves, say 15%, would bring about greater happiness for the population as a whole because the increase in happiness of the masters outweighed the decrease in happiness of the slaves, then that arrangement would qualify as moral, necessarily! So, if theories that qualify manifestly immoral behavior as "moral" ought to be rejected, perhaps a non-consequentialist approach can do better. According to what is known as Deontological Moral Theory, in particular, actions are moral when they involve treating other persons with respect. More formally expressed, it requires that other persons should always be treated as ends (as intrinsically valuable) and never merely as means (instrumentally). Let us adopt this standard here.

When we are talking about a so-called "autonomous machine," then the question becomes whether or not such an entity is even capable of understanding what it means for something to be a person or to treat it with respect. There are ways to guarantee killing the enemy within a target zone, namely, by killing everyone in it. And there are ways to avoid killing the wrong target, namely, by killing no one in it. The problem is to kill all and only the intended targets. But is that possible? This becomes extremely problematic in the case of unconventional warfare. In principle, persons are entitled to be treated with respect by following rules of due process, where no one is deprived of life, liberty, or property without having the opportunity to defend themselves. In the case of the use of predator drones, however, the only processes utilized by autonomous machines are those that accrue from the target identification criteria with which they are programmed.

These machines, like other tools including computerized systems, are inherently amoral, neither moral nor immoral from a deontological point of view. They, like other digital machines, have no concept of morality, of personhood, or of mutual respect. They are simply complex causal systems that function on the basis of their programs. Were these conventional wars involving well-defined terrain and uniformed combatants, their use would in principle be no different than high-altitude bombing or artillery strikes: even though the precise identity of our targets is not always known in cases of that kind, we know who they are with high probability. In unconventional wars like these, however, our information is partial, sketchy, and all too often wrong. We are killing about 140 innocents for every intended target!

At least 55 strikes by unmanned drones have occurred since President Obama's inauguration. There were only 45 during the Bush era.

We are taking out citizens of Iraq, Afghanistan, and now Pakistan, which, alas, if research on 9/11 is well founded, have never threatened us. So we really have no business being there at all. Yet to this day we continue to hear about the threat from al-Qaeda and from Osama bin Laden, who appears to have died in 2001. We are depriving the citizens of other countries of their life, liberty, and property with virtually no semblance of due process. We once believed it was better for ten guilty men to go free than for one innocent man to be punished. We now practice the policy that it is better for 140 civilians to die than for one suspected "insurgent" to live. We have come a long way from Isaac Asimov's "First Law."
