A Tesla Model X similar to this was involved in a crash in the US where the driver was killed while the car was in autopilot mode. Picture: TESLA MOTORS

US electric car maker Tesla has said that one of its vehicles was under the control of its Autopilot software when it crashed on a California highway two weeks ago, killing the driver.

The disclosure on Friday marks the second confirmation in less than two weeks of a fatal accident involving a car that was effectively driving itself, following the death of a pedestrian who was struck by an Uber vehicle in Arizona.

The accidents are set to become the first serious test of whether regulators have been right to release imperfect versions of driverless car technology on to public roads, even when there are people behind the wheel who are meant to take control in dangerous situations.

Tesla said more lives were being saved by releasing the technology now, arguing that its cars are involved in fatal accidents at a rate 3.7 times lower than other vehicles.

The US National Transportation Safety Board said early last week that it was investigating the crash of one of Tesla’s Model X SUVs on Highway 101, the main artery that links San Francisco with Silicon Valley.

The severe damage to the vehicle, which hit a concrete barrier acting as a lane divider, made it difficult to tell at first whether the Autopilot system had been engaged. Concerns about the crash contributed to a sharp sell-off in Tesla’s shares.

The car maker said that computer logs recovered from the vehicle showed that Autopilot had been in use. The software is an advanced driver-assistance system designed to handle tasks such as lane-keeping on highways. Tesla CEO Elon Musk has promoted the technology as capable of handling almost all aspects of a car journey, adding to the impression that the company is at the forefront of driverless car technology.

The first fatal accident involving a car under the control of the Autopilot technology occurred in Florida two years ago, when a Tesla Model S drove into the side of a truck that was turning across the road in front of it.

In the latest accident, Tesla said computer records showed that the driver — identified by police as local resident Wei Huang — had not had his hands on the wheel in the six seconds before the crash. The concrete barrier was in view 137m before the collision, the company said, adding that "the vehicle logs show that no action was taken" to avoid the collision.

The accident is likely to prompt renewed scrutiny of driverless systems that rely on human intervention when the technology fails. Videos released by police in Arizona showed that Uber’s test driver did not look at the road for five seconds before its car hit and killed a pedestrian crossing the road in front of it.

Arizona’s regulators, who have been the most permissive in allowing driverless car tests on the state’s roads, immediately suspended Uber’s licence to test its vehicles. Tesla’s accident, meanwhile, presents a challenge for regulators in California, the US state where the largest number of car companies have been licensed to test the technology.

Because Autopilot is a lower level of automation already built into Tesla's production vehicles, it does not require a special driverless car test licence. But the accident will draw fresh attention to the advanced sensors and other technologies that are meant to allow the cars to operate largely under their own control in certain situations.

© The Financial Times 2018
