Canadian Underwriter

Uber suspends testing of automated cars after pedestrian killed


March 19, 2018, by David Gambrill



Uber has suspended all testing of its self-driving cars after an Uber car in autonomous mode reportedly struck and killed a woman in Phoenix, Arizona.

Uber suspended its testing of autonomous vehicles in Toronto, Phoenix, Pittsburgh and San Francisco after a woman in Phoenix was struck and killed late Sunday or early Monday morning. Citing police sources, an Associated Press report says an Uber vehicle was in autonomous mode, with a driver in the car, when the woman was struck while crossing the street outside of a crosswalk.

FILE – In this Monday, Sept. 12, 2016, photo, a group of self-driving Uber vehicles position themselves to take journalists on rides during a media preview at Uber’s Advanced Technologies Center in Pittsburgh. (AP Photo/Gene J. Puskar, File)

The woman later died from her injuries in hospital.

“Some incredibly sad news out of Arizona,” Uber CEO Dara Khosrowshahi posted on Twitter. “We’re thinking of the victim’s family as we work with local law enforcement to understand what happened.”

Many questions have yet to be answered about the incident, including when it happened, whether a bike may have been involved (and how that might have affected the vehicle’s sensors), and where and how the pedestrian was crossing the street, Jonathan Grnak, a lawyer at Danson Recht LLP, told Canadian Underwriter on Monday.

Reports that there was a driver in the automated vehicle at the time of the crash raise issues about shared responsibility, Grnak said.

“My gut feeling is that, in the context of the levels of autonomy, with 0 being no autonomy and Level 5 being complete autonomy, we are at a stage where — between Levels 2 and 3, when you turn off the ability of the car to drive itself — you are going to have issues with: 1) driver attentiveness, and 2) liability,” Grnak said.

“During testing, Ontario has passed regulations saying that it doesn’t matter whether the vehicle is in autonomous or manual mode, the driver is responsible. It’s status quo in relation to liability, at least in Ontario.”

But the issue may get thornier in the future as the levels of autonomy increase, as noted in the Insurance Institute of Canada paper, Automated Vehicles.

“In the future, when vehicles collide, insurers and the courts need to determine the shared responsibility of the drivers, vehicle owners, automakers, vendors, software engineers, and vehicle maintenance professionals,” the author of the Institute’s paper, Paul Kovacs, wrote. “Starting now, long-standing experience that drivers were responsible for most collisions must give way to a more complex reality of shared responsibility.”

The incident in Phoenix is thought to be the first time that a pedestrian has died in an incident involving a self-driving car.

In May 2016, a Tesla driver was killed in the first known fatal crash involving a self-driving car. The Tesla driver was reportedly watching a Harry Potter movie at the time of the collision with a truck in Florida, according to the driver of the truck involved in the crash.

A U.S. National Transportation Safety Board (NTSB) report found the autonomous vehicle technology in the Tesla incident was not designed to detect crossing traffic on the highway, such as the truck with which the Tesla collided.

“Contributing to the car driver’s overreliance on the vehicle automation was its operational design, which permitted his prolonged disengagement from the driving task and his use of the automation in ways inconsistent with guidance and warnings from the manufacturer,” the NTSB report found.



1 Comment
  1. Panda says:

    Just like drones, they are not counting on the fact that people will mess with them. They will shoot at drones, and bullets land somewhere. They will be aggressive with self-driving cars and try to get them to screw up. And that doesn’t even count the unpredictability factor of humans, nor the extremely poor quality and inconsistency of LOTS of roads. SDCs and drones are ludicrous with the low level of AI so far achieved, esp. given that roads were never designed to be navigated by computers!
