Rescue workers attend the scene where a Tesla electric SUV crashed into a barrier on US Highway 101 in Mountain View, California. Picture: KTVU FOX 2 VIA REUTERS

Washington — The Tesla Model X that crashed in California earlier this year while being guided by its semi-autonomous driving system sped up to about 114km/h in the seconds before the vehicle slammed into a highway barrier, investigators said Thursday.

A US National Transportation Safety Board (NTSB) preliminary report on the March 23 accident in Mountain View raises new questions about the capabilities of Tesla’s semi-autonomous driving system and the actions of the driver. His hands were detected on the steering wheel for only 34 seconds of the final minute before impact, and he had set the car’s cruise control at 120km/h, the report said.

The investigation is the latest to shine a spotlight on potential flaws in emerging autonomous driving technology. In Tesla’s case, the company has touted its system as having self-driving capabilities, even though its vehicles have failed to stop for stationary objects in the road in several cases and owners are warned to remain attentive.

Another National Transportation Safety Board probe of a self-driving Uber car that killed a pedestrian on March 18 in Arizona found that the car’s sensors picked up the victim, but the vehicle was not programmed to brake for obstructions.

Walter Huang, a 38-year-old engineer who worked at Apple, died in the March 23 crash when his Model X struck the barrier as he was using the driver-assistance system known as Autopilot. The car’s computer did not sense his hands on the steering wheel for six seconds before the collision, according to the safety board.

Tesla shares, which had been up 3.3%, fell during the day, closing down 1.09% at $316.02 in New York trading.

The preliminary report did not include conclusions about what caused the crash. "All aspects of the crash remain under investigation as the NTSB determines the probable cause, with the intent of issuing safety recommendations to prevent similar crashes," the report said.

A Tesla spokeswoman declined to comment on the safety board’s report and pointed to a March 30 company blog post. In that post, the company said the driver had about five seconds and 150m of unobstructed view of the highway barrier but took no action to avoid the collision, citing vehicle logs.

"Tesla Autopilot does not prevent all accidents — such a standard would be impossible — but it makes them much less likely to occur," the company wrote in the post. "It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists."

An attorney hired by Huang’s family, Mark Fong, said the National Transportation Safety Board report appears to contradict Tesla’s earlier characterisation of the accident and bolsters the family’s view that the car’s systems failed. "The Autopilot system should never have caused this to happen," said Fong of Minami Tamaki.

Huang had been using Tesla’s Autopilot system continuously for nearly 19 minutes before the accident. The system issued two visual alerts and one auditory alert telling the driver to place his hands on the steering wheel, but those came more than 15 minutes before the crash, according to the report.

The safety board did not report any alerts in the moments leading up to the crash.

The Tesla was following a lead vehicle at about 104km/h eight seconds before the crash. A second later, the car began to steer left while still following the lead vehicle. Four seconds before the crash it was no longer following the lead vehicle, the safety board said.

The Model X then accelerated from 99km/h to about 114km/h in the final three seconds before impact. The Autopilot’s cruise control system, which is designed to match the speed of a slower vehicle ahead of it, was set at 120km/h.

The Tesla collided with a so-called crash attenuator, a device covering the concrete barrier that is designed to absorb a vehicle impact and lower the risk of damage and injuries. The attenuator had been damaged 11 days earlier in a previous accident and had not been repaired, according to the National Transportation Safety Board. The barrier sits in the median where the highway splits into two directions.

No pre-crash braking or evasive steering movement was detected, according to the safety board’s summary of performance data recorded by the car.

Huang, found belted in his seat after the crash, was removed by bystanders before being taken to a nearby hospital, where he died from his injuries. The impact was so violent it tore off the front section of the vehicle.

While Tesla tells drivers they must keep their hands on the steering wheel and monitor the semi-autonomous system, the car can follow traffic, steer and control speed in some situations.

The safety board originally announced it was looking into a fire that erupted in the car’s battery, which was damaged in the impact. The agency is also investigating a fire in a fatal Tesla crash on May 8 in Fort Lauderdale, Florida.

The Model X’s battery pack was sheared open by the crash and erupted in flames, which firefighters extinguished with about 75l of water and foam. The battery reignited on March 28 in an impound lot, the National Transportation Safety Board said. The safety board has examined the risks of battery fires for more than a decade.

Investigators expanded their probe of the Mountain View crash to include the vehicle’s automation after the company revealed the Autopilot system was switched on.

Two consumer advocacy groups charged on May 23 that Tesla’s promotional material on Autopilot was deceptive. A Tesla website says its vehicles have "full self-driving hardware". The site also contains a video of a car navigating streets without human input, with text saying "the car is driving itself".

In Tesla’s May earnings call, CEO Elon Musk dismissed the notion that Autopilot users involved in accidents had the mistaken belief that the system was capable of fully autonomous driving. Driver complacency is more of a challenge, he said.

"When there is a serious accident, almost always — in fact, maybe always — the case that it is an experienced user and the issue is more one of complacency. Like, they get too used to it," Musk said on a conference call with analysts. "That tends to be more of an issue. It is not a lack of understanding of what Autopilot can do. It’s actually thinking they know more about autopilot than they do."

Tension in the National Transportation Safety Board’s Mountain View inquiry boiled over on April 11, when Tesla released information about the accident without first clearing it with investigators, prompting the agency to take the unusual step of removing the car company from official participation.

Tesla had issued comments blaming the driver of the Tesla SUV. "While we understand the demand for information that parties face during an NTSB investigation, uncoordinated releases of incomplete information do not further transportation safety or serve the public interest," safety board chairman Robert Sumwalt said in a statement.

Musk hung up on Sumwalt as the chairman explained the removal, according to the safety board chief.

While the action was a rebuke to the company and cut it out of the information loop on some matters, the National Transportation Safety Board retains legal authority to obtain information from Tesla engineers about how the car performed in the accident.

Issues that arose last year in a separate safety board investigation of a Tesla accident are likely to be raised again in the latest case.

The safety board found in September that Tesla’s design of its Autopilot contributed to a fatal 2016 crash in Florida. An Ohio man had driven for 37 minutes before the accident while only occasionally putting his hands on the steering wheel. Sensors on Tesla cars, which help the vehicles stay within highway lanes and brake when a car stops ahead, were not designed to detect the large truck, the National Transportation Safety Board found.

The board recommended that regulators find better ways to measure driver attentiveness, such as using scanners that monitor where a person’s eyes are looking. While Tesla updated its vehicle’s software to make it more difficult to drive without hands on the wheel, it has rejected this scanning technology.

The safety board in the earlier accident said the drivers of both the truck and the Tesla were also partly responsible for the crash.

The safety board is also investigating two other incidents involving Teslas: an accident near Los Angeles in January in which a Model S on Autopilot struck a fire truck parked on a freeway, and an August crash in Lake Forest, California, in which a Model X hit a garage and its battery caught fire.

The National Highway Traffic Safety Administration is also investigating the case of a Model S driven by a woman that struck a stopped fire truck on a South Jordan, Utah, roadway on May 11. Autopilot was engaged and did not brake for the truck as the driver was looking at her phone, according to police.

Bloomberg
