
NTSB investigation of Model 3 fatal crash in March finds Autopilot was active

The National Transportation Safety Board released a preliminary report on the fatal March 2019 crash of a Tesla Model 3 with a semi-trailer in Delray Beach, Fla., stating that the Autopilot driver-assist system was active at the time of the crash.

The 2018 Tesla Model 3 EV was southbound in the right through lane of the 14000 block of State Highway 441 (US 441) in Delray Beach, Palm Beach County, Florida, when it struck an eastbound 2019 International truck-tractor in combination with a semitrailer.

As the Tesla approached a private driveway, the combination vehicle pulled out of that driveway and traveled east across the southbound lanes of US 441. The truck driver was trying to cross the highway’s southbound lanes and turn left into the northbound lanes. According to surveillance video in the area and forward-facing video from the Tesla, the combination vehicle slowed as it crossed the southbound lanes, blocking the Tesla’s path.

The Tesla struck the left side of the semitrailer. The roof of the Tesla was sheared off as the vehicle underrode the semitrailer and continued south. The 50-year-old male Tesla driver died as a result of the crash. The 45-year-old male driver of the combination vehicle was uninjured.


The driver engaged the Autopilot about 10 seconds before the collision. From less than 8 seconds before the crash to the time of impact, the vehicle did not detect the driver’s hands on the steering wheel. Preliminary vehicle data show that the Tesla was traveling about 68 mph when it struck the semitrailer. Neither the preliminary data nor the videos indicate that the driver or the ADAS executed evasive maneuvers.

Autopilot also was active at the time of a 2016 fatal crash in Florida involving a semi-trailer and several other serious crashes since that incident.

Either Autopilot can’t see the broad side of an 18-wheeler, or it can’t react safely to it. This system can’t dependably navigate common road situations on its own and fails to keep the driver engaged exactly when needed most. Yet Tesla claims that it’s leading the way toward lifesaving, self-driving cars. If Tesla really wants to be a leader on safety, then the company must restrict Autopilot to conditions where it can be used safely and install a far more effective system to verify driver engagement.

—David Friedman, Vice President of Advocacy for Consumer Reports

Consumer Reports said that taking these steps would mean embracing the NTSB’s safety recommendations arising from the 2016 crash, which Tesla has failed to fully address despite those recommendations being made more than a year and a half ago.

In light of this latest tragic death and those preceding it, a disturbing pattern is becoming clearer—these kinds of crashes appear to be happening with some frequency in Teslas and not in other brands with similar technology. The National Highway Traffic Safety Administration should investigate whether Tesla Autopilot is the outlier it appears to be. If it is, then it poses an unreasonable risk to consumers’ safety and is defective.

—David Friedman

Consumer Reports urges all drivers to pay close and consistent attention to the driving task, including when using a vehicle with driving automation features.

CR has previously called on Tesla to improve the safety of its Autopilot system. Last month, following an event for investors, CR urged Tesla to stop treating its customers like guinea pigs and instead demonstrate a driving automation system that is substantially safer than what is available today, based on evidence that is transparently shared with regulators and consumers; backed by rigorous simulations, track testing, and the use of safety drivers in real-world conditions; and validated by independent third-parties.

CR also urged the company to focus in the meantime on making sure that proven crash avoidance technologies on Tesla vehicles, such as automatic emergency braking with pedestrian detection, are as effective as possible.



Teslas seem to have certain difficulties seeing large trucks/trailers sideways?

Roger Pham

Another winner of the Darwin Award. "Fool me once, shame on you... Fool me twice, shame on me."

Brian P

The collision prevention logic has to be sorted out first, including the defensive-driving logic ... and only then, after an extensive validation period to prove the system as fault-free as possible, should the car be allowed to steer itself under limited circumstances. The other manufacturers understand this. Tesla has put the cart before the horse.

There are SO many things that Tesla could do:
- Fixing obstacle detection and evasion is the obvious one. Until it is bulletproof, no auto-steer, and with that, no auto-navigate. It's fine for this to be proven out in limited circumstances, limited locations, or limited conditions. Disallow operation otherwise!
- Defensive driving needs to be implemented. The system needs to recognize other vehicles on a conflicting path, and do something about it.
- Stop sign and red traffic signal detection needs to be bulletproof before hands-off operation is allowed on any road that has stop signs or traffic signals.
- Any time it is in automatic operation, it needs to have a plan for a graceful exit in case the circumstances allowing automatic operation come to an end and the driver doesn't respond. It follows from this, that for automatic operation to be allowed, it has to be able to respond to all foreseeable panic-emergency situations, because the driver cannot be expected to take over quickly.
- The car knows where it is. It knows what sort of road it's on. Disallow self-steer operation unless the car is known to be on a road suitable for its use and in conditions suitable for its use, and with a pre-planned graceful exit in the event that the driver does not respond to a request to take over. (See: Cadillac SuperCruise) It's fine for this to start out in limited circumstances and gradually expand as other locations and other conditions get proven out.
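The "graceful exit" behavior described above can be pictured as a small supervisory state machine: when the conditions that permit automatic operation end, the system requests a takeover, and if the driver does not respond within a deadline, it falls back to a minimal-risk maneuver (e.g. slowing and pulling over) rather than simply disengaging. The sketch below is purely illustrative; the class name, mode names, and the 8-second deadline are assumptions, not any manufacturer's actual logic:

```python
from enum import Enum, auto

class Mode(Enum):
    AUTOMATED = auto()          # system is steering, conditions OK
    TAKEOVER_REQUESTED = auto() # conditions ending, driver alerted
    MINIMAL_RISK = auto()       # driver unresponsive: slow and pull over
    MANUAL = auto()             # driver has taken over

class GracefulExitSupervisor:
    """Illustrative supervisor: never drop out of automation without
    either a responsive driver or a planned minimal-risk maneuver."""

    def __init__(self, takeover_deadline_s: float = 8.0):
        self.mode = Mode.AUTOMATED
        self.takeover_deadline_s = takeover_deadline_s
        self._request_time = None

    def step(self, t: float, in_domain: bool, driver_responding: bool) -> Mode:
        # Leaving the operational domain triggers a takeover request.
        if self.mode is Mode.AUTOMATED and not in_domain:
            self.mode = Mode.TAKEOVER_REQUESTED
            self._request_time = t
        if self.mode is Mode.TAKEOVER_REQUESTED:
            if driver_responding:
                self.mode = Mode.MANUAL
            elif t - self._request_time >= self.takeover_deadline_s:
                # Driver never responded: execute the pre-planned exit.
                self.mode = Mode.MINIMAL_RISK
        return self.mode
```

The key design point is that `MINIMAL_RISK` is a reachable, fully planned state, so an unresponsive driver never leaves the vehicle in an undefined condition.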

I'm not optimistic about Level 5 (full self driving in all conceivable circumstances, the no-steering-wheel scenario) ever happening. Level 4 with planned graceful exits will probably happen on pre-planned repeated routes and in decent weather. Level 3 is extremely dangerous due to lack of ability of humans to pay attention and immediately take over in circumstances when they are generally not doing anything (i.e. they'll be sleeping, or texting, not watching the road). Level 2 is proving to be dangerous when it masquerades as something it is not.


"Consumer Reports urges all drivers to pay close and consistent attention to the driving task,"

What has happened to the world? "Not driving with due care and attention" is a criminal offence in the UK - not an optional extra.
