Yesterday, self-driving cars claimed their first victim.
On Sunday night, around 10 p.m., a 49-year-old woman was struck by an Uber self-driving vehicle while walking her bicycle across the street outside of a crosswalk in Tempe, Arizona. The accident took place while the Uber was traveling northbound on Mill Avenue where it intersects Curry Road. The woman was rushed to the hospital, but died from her injuries on Monday, March 19, 2018.
As multiple news sources are reporting, this is the first known pedestrian death caused by an autonomous vehicle on a public road. Will it be the last? Probably not. As more driverless cars are built, tested, and eventually put on the open road, more can go wrong, though we sincerely hope it doesn't.
In response to this tragedy, Uber has suspended its self-driving vehicle testing in Phoenix, San Francisco, Pittsburgh, and Toronto—all of its North American tests. “Our hearts go out to the victim’s family. We are fully cooperating with local authorities in their investigation of this incident,” Uber spokeswoman Sarah Abboud expressed in a statement.
Though the vehicle was in “full autonomous mode,” according to Tempe police, there was an operator behind the wheel who had the ability to take control at any moment. But the self-driving car did not yield to the pedestrian, and the driver apparently had no time to react. The full cause of the crash is currently being investigated by several agencies, including a team from the National Transportation Safety Board, but this accident will almost certainly lead to stricter regulation of driverless cars. That is not necessarily a bad thing if it means more oversight and saved lives.
One thing is certain about this case: the car itself can’t be blamed, legally. Machines have absolutely no liability in court, and will not pay for a victim’s medical bills, lost wages, or other expenses. Similar to when a medical device malfunctions, the “fault” for the injuries it caused would fall on the company that made the device, or the doctor who didn’t place the device properly. In short, human or corporate error is still to blame when machines make mistakes and people are hurt.
Online commentators have been asking some hard questions on the New York Times’ website: “Will someone face manslaughter or reckless driving charges for this incident?” “Who gets sued, human driver or Uber or both?” “Who was negligent?” “In the time since reading this, how many people died from human-driven cars?” “Does this incident show that self-driving cars are not safer than humans?”
Though this fatal accident showed us all that self-driving cars can be lethal, the numbers show that people are still the leading cause of car crashes in the United States. Many industry experts believe that, in the long run, autonomous vehicles will actually be safer than human-driven vehicles because they will never be distracted and will always obey traffic laws.
But when it comes to braking hard for a kid chasing his ball into the road, can soulless machines be programmed well enough to do what every good, defensive driver instinctively knows to do? It’s all about expecting the unexpected, and we at Jurewitz Law Group hope that everyone behind the driverless car industry takes this tragedy to heart and uses it to build safer systems.
Ross Jurewitz is the founder and managing lawyer of the Jurewitz Law Group, a San Diego personal injury attorney law firm. These San Diego injury accident lawyers specialize in helping people seriously injured in a variety of accidents throughout San Diego County and California.