The Uber Accident's Impact


The New York Times reports that on the night of March 18, a self-driving car operated by Uber struck and killed a pedestrian, forty-nine-year-old Elaine Herzberg, on a public road in Tempe, Arizona. It is the first pedestrian fatality caused by an autonomous vehicle, and it has sparked controversy in technological, legal, and civic circles.

Dashcam footage released the following Thursday showed that Herzberg, who was walking her bike across the road without any reflective gear, was nearly invisible before impact. Tempe police have suggested the pedestrian may be at fault.

Even if limited visibility made the pedestrian hard to see, a self-driving vehicle should be better equipped than a human driver to detect obstacles. Autonomous cars tested on public roads carry cameras, radar, and lidar, which maps objects with laser pulses rather than radio or sound waves.

Even if the HDR cameras could not see the pedestrian, the other sensors should have detected her. Either the sensors failed, or the system detected her but did not slow down. Neither scenario casts a good light on Uber.

Although I am generally optimistic about the future of automated transportation, I believe this accident reveals gross negligence in Uber’s development and testing protocol. If self-driving cars are to be trusted as safer than human drivers, we must hold the companies that build them to a higher standard in their efforts to mitigate harm.

I’m not saying we should fight the inevitable. Self-driving cars will become ubiquitous regardless of anyone’s feelings. The National Highway Traffic Safety Administration reports that ninety-four percent of automotive accidents are the fault of human drivers. Who can argue against a driver that can’t drive drunk, get distracted, or become tired?

This issue may seem distant and irrelevant, but one day you will have to decide whether to buy a self-driving vehicle. The policies made in the meantime will shape your relationship with the technology of everyday transportation.

The question is not whether we will get to that future, but how. I propose that lawmakers hold Uber responsible for the Tempe accident as a matter of product liability, the same way a traditional auto manufacturer would be held responsible for a faulty engine design.

Ultimately, if Uber’s vehicle could not detect a pedestrian while traveling only forty miles per hour, Uber has no business testing its vehicles on public roads. Uber’s response so far has been to temporarily pull all self-driving cars from the roads, and according to CNBC, Arizona governor Doug Ducey ordered Uber to suspend all testing operations in the state on March 26.

It’s hard to say at this point whether there will be long-term repercussions from the Tempe accident. When Uber resumes testing, what decisions will have been made to ensure an accident of this caliber won’t happen again?

Incidents like these make us question the underlying motives of progressive tech companies: Do they value human life? Do they weigh the external impact they might have? Do they take the precautions they should? At the very least, Uber and its competitors will need to seriously reevaluate their commitment to safety at every stage of the development and testing process.