Self-Driving Uber Detected the Pedestrian It Killed – But Still Didn't Stop

Tuesday, 08 May 2018 - 10:59AM
(Image credit: Pixabay/Pexels composite)
In March of this year, 49-year-old Elaine Herzberg was struck and killed by a self-driving Uber while walking her bike across a street in Tempe, Arizona. Although a safety operator was in the vehicle at the time, the car was reportedly in autonomous mode. The crash was the first pedestrian fatality attributed to a fully autonomous vehicle, and it has raised critical questions about the safety of the technology.

According to new reports, the car's object detection and avoidance features were working just fine; the software may simply have chosen not to stop.

The sensors in the self-driving vehicle detected that Herzberg was crossing the street, but because of a now-obvious flaw in the software, she was identified as trash or debris. The software decided the "false positive" wasn't worth reacting to, so it told the car to ignore the flag and continue on course, which it presumably would not have done had Herzberg been recognized as a pedestrian (this isn't Stephen King's Maximum Overdrive).
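
To make the reported failure mode concrete, here is a minimal sketch, in Python, of the kind of decision logic being described. Every name, threshold, and rule in it is an assumption for illustration only; none of it comes from Uber's actual software.

```python
# Hypothetical sketch of the reported failure mode. The class names,
# confidence threshold, and "ignore anything non-critical" rule are all
# assumptions for illustration; nothing here is from Uber's codebase.

from dataclasses import dataclass

# Object classes the planner is assumed to treat as safety-critical
CRITICAL_CLASSES = {"pedestrian", "cyclist", "vehicle"}

@dataclass
class Detection:
    label: str         # classifier's best guess, e.g. "pedestrian" or "debris"
    confidence: float  # confidence in that label, 0.0 to 1.0
    in_path: bool      # does the object intersect the planned trajectory?

def should_brake(det: Detection, min_confidence: float = 0.8) -> bool:
    """Decide whether one detection should trigger emergency braking."""
    if not det.in_path:
        return False
    if det.label in CRITICAL_CLASSES and det.confidence >= min_confidence:
        return True
    # Anything else is written off as a false positive and ignored.
    # This is the branch that lets a mislabeled pedestrian through:
    # a rock-solid sensor return, discarded because of its label.
    return False

# A person with a bike, mislabeled as debris: the car keeps going.
print(should_brake(Detection(label="debris", confidence=0.9, in_path=True)))      # False
print(should_brake(Detection(label="pedestrian", confidence=0.9, in_path=True)))  # True
```

The point of the sketch is that the danger doesn't live in the sensors. A perfectly solid sensor return still gets discarded if the label attached to it falls outside the list of things the planner reacts to.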

One immediate question: how big does an object have to be before the software decides to avoid it? A tree limb the size of a person plus a bike would cause significant damage to anything short of a Tesla Semi. Herzberg should, by all rights, have been avoided regardless of how she was labeled.
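
One way to frame that question is a fallback rule keyed to physical size rather than classification. Again, a minimal sketch with an invented threshold, not anything Uber actually runs:

```python
# Hypothetical size-based fallback, with an invented 0.5 m threshold.
# A label-agnostic rule like this would brake for anything person-sized
# in the car's path, however the classifier labeled it.

def should_brake_on_size(height_m: float, width_m: float,
                         min_size_m: float = 0.5) -> bool:
    """Brake for any in-path obstacle larger than min_size_m in either
    dimension, regardless of its classification."""
    return max(height_m, width_m) >= min_size_m

# A person walking a bike stands roughly 1.7 m tall: well over the threshold.
print(should_brake_on_size(height_m=1.7, width_m=0.6))  # True
```

Under a rule like this, a person-sized return in the car's path triggers braking no matter what the classifier thinks it is looking at.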

More important, however, is the question of how exactly a self-driving car decides what is human or animal and what is not. The slightest glitch in identification can, as we have already witnessed, have devastating consequences. Where does a machine draw the line?

Uber would not comment to Mashable on the software report, but the company said it is cooperating with an investigation by the National Transportation Safety Board. "Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon," a spokesperson said.

With everything from Uber rides to Amazon Prime deliveries moving closer to full automation, it's important to ensure we're not putting human lives at risk for the sake of convenience. Relatively speaking, one self-driving fatality is dwarfed by the number of people killed by human drivers every day, but when the driver is a computer, assigning or absolving blame becomes a lot trickier.
