Google's Driverless Cars Could Be Treated as Human Drivers Under the Law

Wednesday, 10 February 2016 - 12:17PM
Driverless cars have already become a reality, in the sense that the technology exists. Google's self-driving car has been in development for years, and in the past two years a working prototype has been built without a steering wheel or pedals. But it has remained an open question whether these cars, once ready, will actually be allowed on the roads, especially if the human occupant has no way to control the car or is absent altogether. Now Google has had a major legal breakthrough: it has been officially informed that its autonomous operating system could be recognized as a "driver" under the law.

In a response to a letter from Google, made public today, the U.S. National Highway Traffic Safety Administration (NHTSA) said that it would interpret the words "driver" and "operator" in many provisions of the Federal Motor Vehicle Safety Standards (FMVSS) as referring to the system actually operating the car, rather than to the human occupants (if there are any).
"If no human occupant of the vehicle can actually drive the vehicle, it is more reasonable to identify the driver as whatever (as opposed to whoever) is doing the driving," they said in their response. "In this instance, an item of motor vehicle equipment, the Self-Driving System, is actually driving the vehicle."

So what does this mean? It doesn't mean, of course, that driverless cars will be treated as human beings in the sense that they can go to jail for recklessness or vehicular homicide. The provisions mentioned in the letter mostly concern various parts of the car, such as turn signals, brakes, and hazard lights, that must comply with federal requirements written with the word "driver" in them. For example, the standards state that "Each vehicle shall have one or more visual brake system warning indicators, mounted in front of and in clear view of the driver." In this case, the indicator would only need to be "visible" to the self-driving operating system; in other words, the SDS would need to be notified in some way. As a result of this interpretation of the law, driverless cars can potentially meet NHTSA standards and be allowed on the roads.

However, there will clearly be many ethical and liability issues to work out before autonomous cars can be made available to the public, as it doesn't make sense to treat a machine the same way one would an actual ethical agent. If there is an accident and someone is killed, responsibility will be a thorny issue. The letter mentions the option of treating the word "driver" as meaningless in this context before choosing instead to interpret it as the operating system, but if I had to guess, something like that first option is essentially what will happen when it comes to liability. As long as the car doesn't malfunction and follows the rules of the road, accidents will probably be treated as just that, and no human will be held responsible. That sounds a little scary, but in the big picture, it is probably much less scary than having far more fallible (and sometimes intoxicated) human drivers on the road.
