The National Highway Traffic Safety Administration (NHTSA) of the United States has stated that its three-year investigation into the safety of the autonomous driving system in Tesla cars identified at least 13 accidents, resulting in a total of 23 deaths, and 54 more that caused serious injuries, in which “the foreseeable misuse of the system by the driver apparently played a role,” according to the British newspaper The Guardian.
Additionally, NHTSA says it has also found evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s operational capabilities,” leading to a “critical safety gap.”
Despite its name, Autopilot, Tesla’s driver assistance system, offers only Level 2 (L2) automation on a scale where L5 is the maximum: the point at which a vehicle can drive under any conditions without driver intervention.
NHTSA has also expressed concern that the name Autopilot “may lead drivers to believe that automation has greater capabilities than it does and invite drivers to place too much trust in automation.”
Along the same lines, Elon Musk’s company acknowledged in December that the controls of the Autopilot software “may not be sufficient to prevent driver misuse” and could increase the risk of an accident.
The core problem is that when the system warns drivers that they must pay attention or take the wheel, they can simply acknowledge the warnings and then ignore them; the system neither shuts off nor forces them to take control of the vehicle.
That is why Tesla recalled more than 2 million of its vehicles in the United States (nearly all of its vehicles on U.S. roads) in December, with the goal of adding safeguards to ensure that drivers pay attention while using its driver assistance system.
According to Tesla, this update made visual alerts more prominent and disabled Autosteer, its lane-centering steering feature, if drivers failed to respond to inattention warnings. Tesla even said it would restrict the use of Autopilot for a week if significant misuse by the driver was detected.
After this software update, the NHTSA opened a new investigation to determine whether the fix was adequate, following some twenty new accidents that, according to the agency’s analysis, involved vehicles running the new version of Autopilot.
Tesla said it disagreed with the NHTSA’s analysis but would nonetheless roll out a new software update, this time over-the-air, that “will incorporate additional controls and alerts to further encourage the driver to comply with their continuous driving responsibility.” The investigation covers Model Y, X, S, 3 and Cybertruck vehicles manufactured in the United States between 2012 and 2024 and equipped with the Autopilot system.
It is worth recalling that in February 2023 Tesla had already recalled 362,000 U.S. vehicles to update their software after the NHTSA said they did not adequately comply with road safety laws and could cause accidents. And in early April, Mexico’s consumer protection office ordered a software review and fix for more than 4,000 Tesla vehicles “due to a possible risk” arising from the small font size of the visual warning indicators of the braking and parking systems.