August 2, 2016 by Terry Pedwell - THE CANADIAN PRESS
OTTAWA – Until Canadians own cars that truly drive themselves, they can forget getting off the legal hook if they’re in an accident with a vehicle that still has a steering wheel, suggests a report from Canada’s biggest law firm.
Under Canada’s common-law legal system, driving in semi-autonomous mode isn’t much different from operating a vehicle with cruise control, says the brief issued by Borden Ladner Gervais.
“As long as a driver with some ability to assume or resume control of the vehicle is present, there would seem to be a continuing basis for driver negligence and liability as they presently exist,” said the report entitled Autonomous Vehicles, Revolutionizing Our World, published this week on the firm’s website.
The report comes as the federal government contemplates developing regulations for automated vehicles. Ottawa set aside $7.3 million over two years in the spring budget to improve motor vehicle safety, with part of that money earmarked for developing new rules for self-driving cars.
But until fully autonomous vehicles hit the consumer market, there’s not much need to enact new laws, says BLG partner and report author Kevin LaRoche.
“With regards to driver liability, common law, coupled with the current legislation, may be sufficient to address liability involving all levels of autonomous vehicles, short of fully autonomous vehicles which do not require any level of human control,” LaRoche wrote.
“For fully autonomous vehicles, it would seem that legislative amendments would be required to clarify whether the owner would be vicariously liable and under what circumstances.”
Several jurisdictions have allowed testing of fully autonomous cars, buses and trucks. Ontario launched a program in January – under specific restrictions – to let auto manufacturers and high-tech companies try out their driverless inventions on the province’s roadways. None of the carmakers had applied for a testing permit under the program as of early July.
But with semi-autonomous vehicles – such as the Tesla Model S – already being sold to consumers, few jurisdictions have yet set legislative parameters around where, when and how to drive them.
Ontario uses the SAE standard to define categories of driving automation on a scale from zero to five, with zero representing no automation features and five being full automation.
Level three vehicles operate with conditional automation, which requires a driver to pay attention to the road and take over control if the vehicle encounters a problem that can’t be handled fully by automated systems.
Germany’s federal transport ministry said recently it was working on a draft law to govern SAE level three and four cars.
The National Highway Traffic Safety Administration in the United States is working on new guidelines, but currently regulates autonomous vehicles under a slightly different system that was adopted in 2013.
Regardless of which scale is used, unless the car has no steering wheel, the driver will always face potential liability in an accident, with the scope depending on the circumstances of the mishap, said BLG partner Robert Love.
“There’s always going to be, we believe, that element of saying, ‘Did the driver act appropriately, prudently, in the circumstances of either engaging or disengaging whatever feature it happens to be?’” said Love.
It will be up to Canadian judges to decide, however, who is ultimately responsible for causing an accident in Canada – and that could also include the carmaker, he said.
Lawyers and legislators in the U.S. may already have their first test case in Florida following a recent fatal crash involving a Tesla and a tractor-trailer.
While investigators have revealed few details about the exact circumstances of the crash, there have been reports that the driver may have been distracted by a movie playing in his car.
The question for a judge may ultimately revolve around whether the driver was at fault for failing to pay attention to road hazards, or whether the sensors connected to the Tesla’s Autopilot system failed to detect the white truck as it turned into the path of the car.
And it’ll be the judge who ends up apportioning the blame, if there is any to be had, LaRoche predicted.
“Both parties will ultimately be before the court.”