The Biggest Ethical Problems with Driverless Cars

Thursday, 21 August 2014 - 2:16PM

"The car stopped at stop signs. It glided around curves. It didn't lurch or jolt. The most remarkable thing about the drive was that it was utterly unremarkable." 

This quotation comes from a recent Reuters report on a test drive of Google's new driverless car. Several companies are developing this technology concurrently, and self-driving cars are set to be tested on UK roads starting in early 2015. If the technology works as intended, it could bring immense benefits to society. Aside from the obvious ones, saving drivers time and effort, it could allow the elderly and disabled to regain some measure of self-sufficiency, it would make drunk driving a moot point (not to mention texting and driving), and all the early data suggests that it would dramatically reduce the number of accidents, perhaps even virtually eliminating them. That said, there is also a host of potential ethical problems to consider before the cars are released to the general public.

1) The Tunnel Problem

The Tunnel Problem, posed by philosopher Jason Millar, is a variation on the famous trolley problem. A robotic car is driving on a narrow road when a child falls into the middle of the road just before the car enters a tunnel. The car then has two options: it can continue straight and kill the child, or it can swerve into the wall of the tunnel and kill the driver. For argument's sake, death is assumed to be certain for either the child or the driver.

This is an ethical dilemma that is nearly impossible for a human to resolve, let alone a robot without any capacity for moral reasoning. Humans do design the car's software, but Millar argues that they do not have the right to make an ethical decision on behalf of millions of drivers. That would be paternalistic, in the sense that it would take away drivers' right to make their actions reflect their own morals. He also compares the choice to an end-of-life decision, since it determines how and when a person will die. Just as a terminal patient has the right to refuse life-saving measures, drivers should have the right to sacrifice themselves for another living being, if they so choose.

"We should expect drivers' preferences to demonstrate their personal moral commitments. It could be that a very old driver would always choose to sacrifice herself to save a child. It might be that a deeply committed animal lover might opt for the wall even if it were a deer in his car's path. It might turn out that most of us would choose not to swerve. These are all reasonable choices when faced with impossible situations. Whatever the outcome, it is in the choosing that we maintain our personal autonomy."

However, his own data show that nearly as many people do not want moral autonomy in this situation. In a poll asking who should make the decision in the Tunnel Problem, the first 138 responses showed that 46% of respondents believed the driver should make the decision, while 12% thought the car's designers should program the car to make a set decision in a given situation, and 31% wanted to shift the responsibility to lawmakers.

"That so many people were willing to trust a life and death situation to politicians and lawmakers really surprised me," Mr. Millar says. "Many of them said they wanted a standard behaviour so that people would know what to expect in that situation, while others simply wanted someone else to make the decision and take it off their hands."

Incidentally, what would drivers decide if the choice were theirs? The same poll showed that a full two-thirds of respondents would continue straight and kill the child.

2) The difference between laws and ethics

Early discourse has assumed that these cars will be programmed to obey the law, but is that necessarily ethical? Certain laws, such as those against speeding or jaywalking, are broken constantly, often because there is no substantial moral value in obeying them in a particular situation. The engineers behind Google's driverless car seem to recognize this issue to some extent: they recently announced that their car would be programmed to go up to ten miles per hour over the speed limit in order to keep pace with other cars on the road. But this mostly solves a logistical problem rather than an ethical one, as the cars will still not be able to exercise any kind of "judgment" when speeding is the most ethical response to a situation. In a medical emergency, for example, one might well want to drive more than ten miles per hour over the speed limit, but that kind of discretion would not be in the machine's repertoire. On the other hand, infallibly following a fixed set of rules makes the cars more predictable and should reduce the number of accidents overall, so there is an argument that hard-and-fast rules are, in fact, the most ethical solution.
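
To illustrate how rigid such a rule is, here is a minimal sketch in Python of a speed policy of this kind. The function and variable names are invented, and the sketch reflects only the behaviour described above, not Google's actual software.

    # Hypothetical sketch of a fixed speed policy: match surrounding traffic,
    # but never exceed a hard margin over the posted limit. All names are
    # invented for illustration.
    SPEED_MARGIN_MPH = 10  # hard ceiling above the posted limit

    def target_speed(posted_limit_mph: float, traffic_speed_mph: float) -> float:
        """Match traffic speed, capped at the posted limit plus the margin."""
        ceiling = posted_limit_mph + SPEED_MARGIN_MPH
        return min(traffic_speed_mph, ceiling)

    # Traffic flowing at 78 mph in a 65 mph zone is capped at 75 mph.
    print(target_speed(65, 78))  # 75

Note what is missing: the function has no input through which a medical emergency, or any other context, could justify exceeding the ceiling. That absence is the whole ethical point.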

3) Can robots have moral responsibility?

This is as much a legal issue as an ethical one, as many are beginning to ask who would be liable in case of an accident involving autonomous vehicles. "There's also the problem of who's culpable when a car crashes. If we maintain current standards of product liability, then the fault will tend to lie with the manufacturer, but we may also shift to a system where we consider the robot at fault," said Millar. "Holding the robot responsible may be less satisfying for those with a mind for punitive justice."

But in addition to being a legal quandary, this question goes to the heart of all of these problems. These machines will be given an enormous amount of responsibility: they will be entirely entrusted with people's safety and often their lives. In many ways they are far better equipped to handle that responsibility than humans are, but it will be both unsettling and problematic to hand it to an entity that cannot claim to have morals or moral responsibility.
