The Tesla Model X in the Mountain View crash also collided with a Mazda3 and an Audi A4, before the batteries burst into flame
The NTSB report into the March 2018 crash that killed Walter Huang has blamed a litany of failures in Tesla’s Autopilot system for the fatal accident.
Huang was killed when his Model X veered into a concrete barrier on the central reservation of a Mountain View road. Huang had earlier complained to his wife that the Tesla had a tendency to veer towards the crash barrier at that spot.
“System performance data downloaded from the Tesla indicated that the driver was operating the SUV using the Traffic-Aware Cruise Control (an adaptive cruise control system) and Autosteer system (a lane-keeping assist system), which are advanced driver assistance systems in Tesla’s Autopilot suite,” the report states.
The investigation also reviewed previous crash investigations involving Tesla’s Autopilot to see whether there were common issues with the system.
The NTSB findings and recommendations on the fatal Walter Huang crash are now available (PDF here: https://t.co/ERvmDSho26). Here are a few of what I believe are the most consequential:
— E.W. Niedermeyer (@Tweetermeyer) February 25, 2020
In its conclusion, the report identified a series of safety issues, including shortcomings in US highway infrastructure. It also found a number of issues with Tesla’s Autopilot system and with the regulation of what it called “partial driving automation systems”.
One of the biggest contributors to the crash was driver distraction, the report concludes, with the driver apparently running a gaming application on his smartphone at the time of the crash. But at the same time, it adds, “the Tesla Autopilot system did not provide an effective means of monitoring the driver’s level of engagement with the driving task, and the timing of alerts and warnings was insufficient to elicit the driver’s response to prevent the crash or mitigate its severity”.
This is not an isolated problem, the investigation continues. “Crashes investigated by the NTSB [National Transportation Safety Board] continue to show that the Tesla Autopilot system is being used by drivers outside the vehicle’s operational design domain (the conditions in which the system is intended to operate). Despite the system’s known limitations, Tesla does not restrict where Autopilot can be used.”
But the primary cause of the crash was Tesla’s system itself, which misread the road.
“The Tesla’s collision avoidance assist systems were not designed to, and did not, detect the crash attenuator. Because this object was not detected,
(a) Autopilot accelerated the SUV to a higher speed, which the driver had previously set by using adaptive cruise control;
(b) The forward collision warning did not provide an alert; and,
(c) The automatic emergency braking did not activate. For partial driving automation systems to be safely deployed in a high-speed operating environment, collision avoidance systems must be able to effectively detect potential hazards and warn of potential hazards to drivers.”
The report also found that monitoring driver-applied steering wheel torque is an ineffective way of measuring driver engagement, and recommended the development of higher performance standards. It added that US authorities’ hands-off approach to driving aids such as Autopilot “essentially relies on waiting for problems to occur rather than addressing safety issues proactively”.
Tesla is one of a number of manufacturers pushing to develop fully self-driving vehicles, but the technology remains a long way from completion.