The National Highway Traffic Safety Administration (NHTSA) investigated the functionality and safety of Tesla’s (NASDAQ: TSLA) Autopilot system, yielding significant findings on driver engagement, crash rates, and system design.
The investigation focused on extensive crash analysis, human factors, and vehicle evaluations related to Tesla’s Autopilot system. This system, categorized as an SAE Level 2 (L2) Advanced Driver Assistance System (ADAS), has been the subject of scrutiny due to concerns over its control authority and driver engagement technologies.
Tesla’s response to these concerns culminated in the filing of a Defect Information Report on December 12, 2023. The filing initiated a recall, applicable to all Tesla models equipped with any version of Autopilot, intended to address issues highlighted during the investigation, particularly insufficient system controls that could lead to driver disengagement and avoidable crashes.
During the investigation, NHTSA and subject matter experts conducted a comprehensive examination, including multiple information requests to Tesla and peer manufacturers, crash analyses, technical meetings with Tesla, and hands-on vehicle evaluations. These efforts revealed alarming trends in crash occurrences involving Tesla vehicles operating with Autopilot engaged.
“These insufficient controls can lead to foreseeable driver disengagement while driving and avoidable crashes,” the report said.
Key findings from the crash analysis identified three primary categories of incidents:
- Frontal Plane Crashes: Involving 211 incidents, these crashes typically occurred when Tesla vehicles struck obstacles or other vehicles directly ahead. The investigation highlighted instances where hazards were visible to an attentive driver well before impact, suggesting a failure in maintaining driver engagement.
- Yaw Loss of Control: 145 crashes were attributed to Autopilot’s inability to maintain directional control, particularly in low-traction environments, leading to road departures and subsequent collisions.
- Inadvertent Override: 111 crashes occurred due to drivers inadvertently deactivating Autosteer while Tesla’s Traffic-Aware Cruise Control (TACC) remained engaged, often resulting in single-vehicle departures from the roadway.

Furthermore, the investigation revealed gaps in Tesla’s telematic data, which hindered accurate assessment of crash rates and comparison with peer L2 systems. Unlike other systems, Autopilot demonstrated resistance to manual steering inputs, potentially discouraging driver involvement in the driving task.
The report noted that “Tesla is not aware of every crash involving Autopilot even for severe crashes because of gaps in telematic reporting” and “Tesla’s telematics also do not fully account for the difference in crash report trends with other L2 systems.”
The investigation underscored the importance of clear communication and terminology in promoting responsible driver-automation interaction, saying “the term ‘Autopilot’ does not imply an L2 assistance feature, but rather elicits the idea of drivers not being in control. This terminology may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation.”
“Devastating for Tesla”
In a series of posts, X user Montana Skeptic delves into the implications of NHTSA’s findings, describing them as “devastating for Tesla in numerous ways.” The thread suggests that the information provided by the investigation could serve as valuable evidence for attorneys handling claims related to injuries or fatalities involving Tesla vehicles.
1/ I've taken a look at the NHTSA's most recent update by its Office of Defects Investigation. It's devastating for Tesla in numerous ways.
— Montana Skeptic @MontanaSkeptic@masto.ai (@montana_skeptic) April 29, 2024
Moreover, the thread posits that the investigation’s revelations could empower Tesla owners seeking reimbursement for what they perceive as defective products, particularly Enhanced Autopilot and Full Self-Driving features. Additionally, it suggests that investors who bought Tesla stock based on claims of superior safety may now find themselves facing contradictory evidence.
Echoing these comments, another observer asserts that the investigation’s findings are detrimental to Tesla’s position, suggesting that the data presented in the investigation report could significantly bolster the cases of plaintiffs.
This report could not be worse for $tsla, because it means that every lawyer filing an injury case for a non-driver involved w/ a Tesla crash involving Autopilot or FSD starts in a winning position w/ a jury. Tesla insurance just got a whole lot less profitable.
— KSalberta @k_salberta@masto.ai @ksalberta.bsky.soc (@k_salberta) April 26, 2024
Of particular concern are the statistics related to frontal crashes and incidents involving FSD, which are described as alarming and likely to provoke strong reactions from juries. The posts imply that Tesla may face substantial financial liabilities as a result of these findings, especially considering the potential impact on insurance profitability and corporate liability.
Likewise, the observer also highlighted the possibility of shareholder lawsuits, suggesting that Tesla’s claims regarding the safety and future potential of Autopilot and FSD could be construed as fraudulent.
Information for this story was found via the NHTSA and the sources mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.