In October, NHTSA opened an investigation into 2.4 million Tesla vehicles equipped with FSD software after four reported collisions, including a fatal 2023 crash, that occurred in conditions of reduced visibility such as sun glare, fog, and airborne dust.
In an email dated May 14 and made public later, NHTSA told Tesla that its social media posts could lead the public to view FSD as a robotaxi rather than as a partial automation and driver assistance system that requires continuous driver attention and occasional intervention.
NHTSA referenced Tesla's posts on X, including a story about an individual who used FSD to travel 13 miles (21 km) to an emergency room during a heart attack, as well as another post showcasing a 50-minute FSD drive home from a sporting event.
The agency stated, "We believe that Tesla’s postings conflict with its stated messaging that the driver is to maintain continued control over the dynamic driving task," and requested that Tesla reassess its communications.
Tesla, which had discussions with NHTSA in May regarding these social media posts, asserted that its owner's manual and other materials clarify that the vehicle is not autonomous and that drivers must remain attentive.
Tesla did not provide an immediate comment on Friday. Elon Musk serves as the CEO of Tesla and is the owner of X, the social media platform previously known as Twitter.
On Friday, NHTSA released a letter, dated Monday, asking Tesla to answer questions related to its investigation by December 18, including concerns about the driver assistance system's "potential failure to perform, including detecting and responding appropriately in specific situations where there is reduced roadway visibility that may limit FSD’s ability to safely operate."
NHTSA stated that its investigation will evaluate whether the feedback or information provided by the system is sufficient for drivers to make timely decisions when the system's capabilities are exceeded.
A 71-year-old woman was killed in Rimrock, Arizona, after she exited her vehicle following a rear-end collision involving two other cars. She was struck by a Tesla operating in Full Self-Driving mode whose driver was contending with sun glare; the driver was not charged.
In December 2023, Tesla agreed to recall more than 2 million vehicles in the United States to add new safeguards to its Autopilot advanced driver-assistance system, in response to NHTSA's ongoing assessment of whether those safeguards are adequate.