March 19, 2018 by David Gambrill
Uber has suspended all testing of its self-driving cars after an Uber car in autonomous mode reportedly struck and killed a woman in Phoenix, Arizona.
Uber suspended its testing of autonomous vehicles in Toronto, Phoenix and Pittsburgh after the woman was struck and killed late Sunday or early Monday morning. Citing police sources, a report by the Associated Press says the Uber vehicle was in autonomous mode, with a driver in the car, when the woman was struck while crossing the street outside of a crosswalk.
The woman later died from her injuries in hospital.
“Some incredibly sad news out of Arizona,” Uber CEO Dara Khosrowshahi posted on Twitter. “We’re thinking of the victim’s family as we work with local law enforcement to understand what happened.”
Many questions have yet to be answered about the incident, including when it happened, whether a bike may have been involved (and, if so, how it affected the vehicle's sensors), and where and how the pedestrian was crossing the street, Jonathan Grnak, a lawyer at Danson Recht LLP, told Canadian Underwriter on Monday.
Reports that there was a driver in the automated vehicle at the time of the crash raise issues about shared responsibility, Grnak said.
“My gut feeling is that, in the context of the levels of autonomy, with 0 being no autonomy and Level 5 being complete autonomy, we are at a stage where — between Levels 2 and 3, when you turn off the ability of the car to drive itself — you are going to have issues with: 1) driver attentiveness, and 2) liability,” Grnak said.
“During testing, Ontario has passed regulations saying that it doesn’t matter whether the vehicle is in autonomous or manual mode, the driver is responsible. It’s status quo in relation to liability, at least in Ontario.”
But the issue may get thornier in the future as the levels of autonomy increase, as noted in the Insurance Institute of Canada paper, Automated Vehicles.
“In the future, when vehicles collide, insurers and the courts need to determine the shared responsibility of the drivers, vehicle owners, automakers, vendors, software engineers, and vehicle maintenance professionals,” the author of the Institute’s paper, Paul Kovacs, wrote. “Starting now, long-standing experience that drivers were responsible for most collisions must give way to a more complex reality of shared responsibility.”
The incident in Phoenix is thought to be the first time a pedestrian has been killed in a crash involving a self-driving car.
In July 2016, a Tesla driver was killed in the first known fatal crash involving a self-driving car. According to the driver of the truck involved in the Florida crash, the Tesla driver was reportedly watching a Harry Potter movie at the time of the collision.
A U.S. National Transportation Safety Board report found the automated vehicle technology in the Tesla incident was not designed to detect crossing traffic on the highway, such as the truck the Tesla struck.
“Contributing to the car driver’s overreliance on the vehicle automation was its operational design, which permitted his prolonged disengagement from the driving task and his use of the automation in ways inconsistent with guidance and warnings from the manufacturer,” the National Transportation Safety Board report found.