Auto safety regulators said Thursday that they have opened a new investigation to determine whether Tesla’s Full Self-Driving software works safely in low-visibility conditions.
The probe launched by the National Highway Traffic Safety Administration comes in response to four reports of crashes involving the system. Full Self-Driving is available in most Teslas in the United States, about 2.4 million vehicles, though not all of their owners have opted to buy the FSD package. About half of Tesla owners were using the FSD system in the first quarter of this year, and that figure was still increasing, Tesla CEO Elon Musk said in April.
All Teslas do include a less robust driver-assist system called Autopilot.
The four reported crashes involved Tesla vehicles that entered areas where roadway visibility was poor because of glare, fog or airborne dust while Full Self-Driving was active, NHTSA said. In one of those crashes, the vehicle struck and killed a pedestrian, the agency said; another involved a reported injury.
Tesla did not immediately respond to a request for comment.
The investigation marks a continuation of years of regulatory scrutiny of Tesla’s driver-assistance systems, which regulators have found were active in numerous crashes. Tesla recalled more than 2 million vehicles in December after a lengthy investigation, and in April, NHTSA said it had opened a new probe into whether the recall had worked.
The investigation announced this week covers Model S and Model X vehicles from 2016 through 2024, Model 3s from 2017 through 2024, Model Ys from 2020 through 2024, and Cybertrucks from 2023 through 2024.