A fatal crash that killed the driver of a Model S is being investigated by the National Highway Traffic Safety Administration (NHTSA), Tesla announced on Thursday. Tesla’s Autopilot software was engaged at the time of the accident. The company said this is the first known fatality in over 130 million miles of driving with Autopilot activated.
Tesla described how the accident occurred in a blog post.
What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.
Before Autopilot can be engaged, Tesla explicitly tells drivers that the technology is still beta software and merely an assist feature, not something to rely on fully while driving. Accordingly, Tesla’s visual warnings instruct drivers to keep their hands on the wheel at all times so they can take over from the software if needed. The system also checks frequently to ensure a driver’s hands are on the wheel.
Tesla expanded further on the Model S’s Autopilot features and why human intervention is still required in certain situations.
As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.
It’s unclear whether the driver of the Model S had their hands on the wheel or was alert to the situation on the road. Many pundits have criticized Tesla’s Autopilot feature for giving drivers the false impression that the software is doing more than it really is. The NHTSA’s investigation will determine whether Tesla’s software “worked according to expectations.”
In other words, was this human error, or is Tesla’s software to blame?