The National Transportation Safety Board released a preliminary report on the fatal March 2019 collision of a Tesla Model 3 with a semi-trailer in Delray Beach, Fla., stating that the Autopilot driver-assist system was active at the time of the crash.
The 2018 Tesla Model 3 EV was southbound in the right through lane of the 14000 block of State Highway 441 (US 441) in Delray Beach, Palm Beach County, Florida, when it struck an eastbound 2019 International truck-tractor in combination with a semitrailer.
As the Tesla approached, the combination vehicle pulled out of a private driveway and traveled east across the southbound lanes of US 441. The truck driver was trying to cross the highway's southbound lanes and turn left into the northbound lanes. According to surveillance video in the area and forward-facing video from the Tesla, the combination vehicle slowed as it crossed the southbound lanes, blocking the Tesla's path.
The Tesla struck the left side of the semitrailer. The roof of the Tesla was sheared off as the vehicle underrode the semitrailer and continued south. The 50-year-old male Tesla driver died as a result of the crash. The 45-year-old male driver of the combination vehicle was uninjured.
The driver engaged Autopilot about 10 seconds before the collision. From less than 8 seconds before the crash to the time of impact, the vehicle did not detect the driver's hands on the steering wheel. Preliminary vehicle data show that the Tesla was traveling about 68 mph when it struck the semitrailer. Neither the preliminary data nor the videos indicate that the driver or the ADAS executed evasive maneuvers.
Autopilot also was active at the time of a 2016 fatal crash in Florida involving a semi-trailer and several other serious crashes since that incident.
Either Autopilot can’t see the broad side of an 18-wheeler, or it can’t react safely to it. This system can’t dependably navigate common road situations on its own and fails to keep the driver engaged exactly when needed most. Yet Tesla claims that it’s leading the way toward lifesaving, self-driving cars. If Tesla really wants to be a leader on safety, then the company must restrict Autopilot to conditions where it can be used safely and install a far more effective system to verify driver engagement.—David Friedman, Vice President of Advocacy for Consumer Reports
Consumer Reports said that taking these steps would mean embracing the NTSB's safety recommendations arising from the 2016 crash, which Tesla has failed to fully address despite those recommendations being issued more than a year and a half ago.
In light of this latest tragic death and those preceding it, a disturbing pattern is becoming clearer—these kinds of crashes appear to be happening with some frequency in Teslas and not in other brands with similar technology. The National Highway Traffic Safety Administration should investigate whether Tesla Autopilot is the outlier it appears to be. If it is, then it poses an unreasonable risk to consumers’ safety and is defective.—David Friedman
Consumer Reports urges all drivers to pay close and consistent attention to the driving task, including when using a vehicle with driving automation features.
CR has previously called on Tesla to improve the safety of its Autopilot system. Last month, following an event for investors, CR urged Tesla to stop treating its customers like guinea pigs and instead demonstrate a driving automation system that is substantially safer than what is available today, based on evidence that is transparently shared with regulators and consumers; backed by rigorous simulations, track testing, and the use of safety drivers in real-world conditions; and validated by independent third-parties.
CR also urged the company to focus in the meantime on making sure that proven crash avoidance technologies on Tesla vehicles, such as automatic emergency braking with pedestrian detection, are as effective as possible.