DETROIT — The National Highway Traffic Safety Administration (NHTSA) has raised concerns that Tesla's messages to drivers imply its vehicles can operate autonomously, contradicting the company's own owner's manuals and its discussions with the agency, both of which stress that the electric vehicles require constant human oversight.
In an email sent in May, Gregory Magno, a division chief with NHTSA's Office of Defects Investigation, urged Tesla to bring its messaging in line with that guidance. The request accompanied a letter seeking information for an investigation into crashes involving Tesla's "Full Self-Driving" (FSD) system in conditions of poor visibility, such as fog and sun glare. The investigation was opened in October after reports of four crashes attributed to those conditions, including one in Arizona that killed a pedestrian.
The inquiry reflects ongoing scrutiny surrounding Tesla’s naming conventions for its partially automated driving technologies, including “Full Self-Driving” and “Autopilot.” Critics, including Transportation Secretary Pete Buttigieg, have expressed concerns that these terms mislead consumers into believing they can rely on the system for complete autonomy. The communications from NHTSA further prompt questions about whether Tesla’s FSD technology is genuinely capable of safe operation without human intervention on public roads, a prospect Tesla CEO Elon Musk has often suggested.
Musk has indicated plans for autonomous versions of the Model Y and Model 3 to be operational without human drivers as early as next year, while robotaxis without traditional controls are expected to debut in 2026 in areas like California and Texas. A request for comments was sent to Tesla regarding these communications.
In the email, Magno noted that Tesla had previously provided the agency with details about a free trial of its FSD system and had underscored that both the owner's manual and the user interface stress the importance of driver vigilance. However, he cited multiple posts from Tesla's account on X, the social media platform owned by Musk, that suggest FSD can function independently.
Magno wrote that these posts could lead drivers to treat the FSD system as a fully autonomous solution rather than one that requires ongoing attention and occasional intervention. One post, for example, touted a user who relied on FSD to reach a hospital during a medical emergency shortly after the free trial began.
Furthermore, Tesla’s website mentions that full usage of FSD and Autopilot requires meeting standards of reliability and receiving regulatory approval. This statement, however, was paired with a video clip featuring a driver casually sitting with their hands off the wheel, accompanied by the implication that the car was independently managing the drive.
Regarding the investigation into low-visibility driving capabilities, Magno wrote that it will assess whether the system adequately informs users when conditions exceed its operational limits. He also emphasized that Tesla's communications may not effectively convey how, when, or where FSD can be reliably used.
The letter specifically requested that Tesla outline the visual and audio warnings provided to drivers concerning situations where the FSD system may struggle with visibility issues. The NHTSA has set a deadline of December 18 for Tesla to respond, although the automaker is able to request an extension.
The timeline for concluding the investigation may coincide with potential changes in the regulatory landscape due to the anticipated administration of President-elect Donald Trump, who has expressed interest in placing Musk in a position to oversee governmental efficiency. Concerns have arisen among auto safety advocates that Musk’s influence could hinder oversight of Tesla’s technologies amidst ongoing investigations.
Musk has even entertained the notion of contributing to the creation of national safety standards for self-driving vehicles. Critics worry that such involvement would lead to conflicts of interest, raising alarms about democracy and regulatory integrity. Michael Brooks, director of the Center for Auto Safety, voiced deep concerns about a scenario where a business leader directly affects the regulations governing their industry. “That’s a huge problem for democracy, really,” he stated.