The National Highway Traffic Safety Administration (NHTSA) is investigating Tesla’s Full Self-Driving (FSD) feature following its involvement in four crashes. These incidents occurred in conditions of reduced visibility, with either the beta or supervised version of FSD in use. One crash, in November 2023 in Arizona, involved a Tesla Model Y and resulted in a pedestrian fatality, as reported by TechCrunch. The other three crashes, which took place between March and May of this year and involved Model 3 electric vehicles, included one that resulted in an injury.
The NHTSA cited sun glare, fog, and airborne dust as factors that reduced visibility in these cases. The agency’s Office of Defects Investigation (ODI) is evaluating FSD’s ability to “detect and respond appropriately to reduced roadway visibility conditions.” The investigation aims to determine whether other similar incidents have occurred with FSD active and to assess any modifications Tesla has made to the system that might affect its performance in such conditions. The review will scrutinize the timing, purpose, and capabilities of any system updates, as well as Tesla’s assessment of their safety impact.
In April, the NHTSA concluded an investigation into numerous crashes involving Tesla’s Autopilot system, determining that 13 of these incidents were fatal. The agency found that in many instances, drivers were “not sufficiently engaged” and that the warnings issued by Autopilot when Autosteer was active did not effectively ensure that drivers remained attentive to the driving task.
Recently, Tesla CEO Elon Musk announced that starting next year, the Model 3 sedan and Model Y SUV should be able to operate without supervision in California and Texas. At the same event, Musk introduced the Cybercab, a two-seat robotaxi with no steering wheel or pedals, which the company plans to start producing by 2027.
Tesla does not maintain a media relations department and could not be reached for comment.