Can Robots Be Ethical?

No, says Robert Newman.

Should the driverless vehicles being developed by Apple, Google and Daimler be programmed to mount the pavement to avoid a head-on collision? Should they be programmed to swerve to hit one person in order to avoid hitting two? Two instead of four? Four instead of a lorry full of hazardous chemicals? Driverless cars programmed to select between these options would be one example of what the science journal Nature has taken to calling ‘ethical robots’. Another is the next generation of weapons. If drones weren’t bad enough, the US Defence Department is developing Lethal Autonomous Weapons Systems (LAWS). These select their own kill list using a set of algorithms, and need no human intervention, at however remote a distance. Autonomous drones in development include tiny rotorcraft smaller than a table-tennis ball, which will be able to float through homes, shops and offices to deliver a puncture to the cranium.
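
To make the reductiveness of this concrete, here is a minimal sketch, in Python, of the kind of casualty-count comparison such a car might be imagined to run. Every name, option, and number below is a hypothetical illustration, not any manufacturer's actual logic.

```python
# Hypothetical sketch of a harm-minimising swerve decision.
# Nothing here reflects any real manufacturer's code.

from dataclasses import dataclass

@dataclass
class Option:
    label: str
    expected_casualties: float  # crude, hypothetical estimate

def choose_manoeuvre(options: list[Option]) -> Option:
    """Pick the option with the fewest expected casualties.

    The whole 'ethical' decision collapses into a single numeric
    comparison, which is exactly the reduction the article questions.
    """
    return min(options, key=lambda o: o.expected_casualties)

options = [
    Option("stay in lane: head-on collision", 4.0),
    Option("mount the pavement: hit two pedestrians", 2.0),
    Option("swerve across the road: hit one pedestrian", 1.0),
    Option("hit the lorry of hazardous chemicals", 50.0),  # arbitrary weight
]
print(choose_manoeuvre(options).label)
# -> swerve across the road: hit one pedestrian
```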

In July 2015, Nature published an article, ‘The Robot’s Dilemma’, which claimed that computer scientists “have written a logic program that can successfully make a decision… which takes into account whether the harm caused is the intended result of the action or simply necessary to it.” (I find the word ‘successfully’ chilling here; but not as chilling as ‘simply necessary’.) One of the scientists behind the ‘successful’ program argues that human ethical choices are made in a similar way: “Logic is how we… come up with our ethical choices.” But this can scarcely be true. To argue that logic is how we make our ethical decisions is to appeal to what American philosopher Hilary Putnam describes as “the comfortable eighteenth century assumption that all intelligent and well-informed people who mastered the art of thinking about human actions and problems impartially would feel the appropriate ‘sentiments’ of approval and disapproval in the same circumstances unless there was something wrong with their personal constitution” (The Collapse of the Fact/Value Dichotomy and Other Essays, 2002). However, for good or ill, ethical choices often fly in the face of logic. They may come from emotion, natural cussedness, vague inkling, gut instinct, or even imagination. For instance, I am marching through North Carolina with the Union Army, utterly logically convinced that only military victory over the Confederacy will abolish the hateful institution of slavery. But when I see the face of the enemy – a scrawny, shoeless seventeen-year-old – I throw away my gun and run sobbing from the battlefield. This is an ethical decision, resulting in decisive action: only it isn’t made in cold blood, and it goes against the logic of my position.
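
For readers curious what the quoted program might amount to, here is a minimal sketch of the intended-versus-‘simply necessary’ test that the Nature piece attributes to the researchers. It is written in plain Python rather than their logic-programming formalism, and every predicate is a hypothetical stand-in, not their actual code.

```python
# Hypothetical sketch of a doctrine-of-double-effect filter.
# These boolean predicates stand in for whatever the researchers'
# logic program actually infers; they are not its real code.

def permitted_by_double_effect(goal_is_good: bool,
                               harm_is_intended: bool,
                               harm_is_side_effect: bool) -> bool:
    """Allow an action only if any harm it causes is a foreseen
    side effect ('simply necessary' to the action), never the
    intended result of it."""
    if not goal_is_good:
        return False
    if harm_is_intended:
        return False
    return harm_is_side_effect

# The same harm is 'permitted' or 'forbidden' depending solely on
# how the program labels the machine's intent.
print(permitted_by_double_effect(True, False, True))  # -> True
print(permitted_by_double_effect(True, True, False))  # -> False
```

Note that nothing in such a test touches the sentiments, instincts, or imagination that, as argued above, ethical choices may actually spring from.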

The Descartes System © Peter Pullen 2015 peterpullen.weebly.com

Unable to do the difficult human business of working through incompatible ideas, an ethical robot’s program simply excludes them. If, for example, the US Defence Department wants LAWS to be a ‘successful’ weapons system, its ethical data entry had better exclude the International Covenant on Civil and Political Rights (1966), with its provisions on the security of persons, procedural fairness, and rights of the accused – or else the drone might turn tail and direct its fire at those who gave wings to its eternal mission of unlimited extra-judicial killing.

Talk of procedural justice brings us to a philosopher whose work, I think, offers valuable insights into the problems associated with ‘ethical robots’. Interrogating senior Nazis as an intelligence officer in World War II led Stuart Hampshire to argue for the primacy of procedural justice over all other concepts of justice. There can be no justice in the broad sense without justice in the narrow, procedural sense, he said. Even if the outcome of a jury trial is identical to the outcome of a kangaroo court, due process leaves one verdict just and the other unjust.

It’s the same with automated law. All the gigabytes in the world will never make a set of algorithms a fair trial. Rather, justice entails being judged by flesh-and-blood citizens in a fair process – flesh and blood because victims increasingly demand that the court consider their psychological and emotional suffering. By its very nature, justice cannot be impersonal and still be just. “Use every man after his desert,” Hamlet snaps at Polonius, “and who should ’scape whipping?”

Delegating ethics to robots is unethical not just because robots do binary code, not ethics, but also because no program could ever process the incalculable contingencies, shifting subtleties, and complexities entailed in even the simplest case to be put before a judge and jury. And yet the law is another candidate for outsourcing, to ‘ethical’ robot lawyers. Last year, during a BBC Radio 4 puff-piece on the wonders of robotics, a senior IBM executive explained that while robots can’t do the fiddly manual jobs of gardeners or janitors, they can easily do all that lawyers do, and will soon make human lawyers redundant. However, when IBM Vice President Bob Moffat was himself on trial in the Manhattan Federal Court, accused in the largest hedge-fund insider-trading case in history, he inexplicably reposed all his hopes in one of those old-time human defence attorneys. A robot lawyer might have saved him from being found guilty of two counts of conspiracy and fraud, but when push came to shove, the IBM VP knew as well as the rest of us that the phrase ‘ethical robots’ is a contradiction in terms.

© Robert Newman 2015

Robert’s BBC Radio 4 series, Robert Newman’s Entirely Accurate Encyclopaedia of Evolution, begins transmission on October 8th at 11.30 pm. The similarly-titled book, The Entirely Accurate Encyclopaedia of Evolution, is published on October 1st by Freight.
