A U.S. investigation into Tesla vehicles that crashed into parked emergency vehicles while using partially automated driving systems has moved a step closer to a possible recall.
The National Highway Traffic Safety Administration (NHTSA) announced Thursday that it is upgrading the probe to an engineering analysis, another sign of increased scrutiny of the electric vehicle maker and of automated systems that perform at least some driving tasks. An engineering analysis is the final stage of an investigation; in most cases, NHTSA decides within a year whether to seek a recall or close the probe.
Documents posted by the agency Thursday raise serious concerns about Tesla's Autopilot system. The agency found that Autopilot is being used in areas where its capabilities are limited, and that many drivers do not take action to avoid crashes despite being warned by the vehicle.
According to the agency, there have been 16 crashes involving emergency vehicles or trucks with warning signs, resulting in 15 injuries and one death.
NHTSA opened its investigation last August after a string of crashes in which Teslas using the Autopilot or Traffic Aware Cruise Control systems hit vehicles at scenes where first responders had used flares, flashing lights or cones to warn of hazards.
The probe covers 830,000 vehicles, nearly everything the Austin, Texas, carmaker has sold in the U.S. since 2014.
The agency said investigators will examine additional data and vehicle performance to "examine the extent to which Autopilot or associated Tesla systems might exacerbate human factors or behavioral safety risks, undermining driver supervision effectiveness."
In the majority of the 16 crashes, the Teslas sent forward collision alerts to the drivers. In about half of the cases, automatic emergency braking intervened to slow the cars. According to NHTSA documents, Autopilot gave up control of the Teslas less than one second before the crash.
NHTSA said in documents detailing the engineering analysis that it is also looking into crashes with similar patterns that did not involve emergency vehicles or trucks with warning signs.
In many cases, the agency found, drivers had their hands on the steering wheel but still failed to take action to avoid a collision. This suggests, the agency said, that drivers may be complying with Tesla's driver-engagement strategy.
Investigators noted that a driver's use or misuse of the monitoring system, or operation of a vehicle in an unintended manner, does not necessarily preclude a system defect.
Before pursuing a recall, the agency must determine whether a safety defect exists.
The agency reviewed 191 crashes but removed 85 of them because other drivers were involved or there was not enough information. In the remaining 106 crashes, the main cause appears to have been Autopilot being used in areas where it has limitations or in conditions that can interfere with its operation, for example on roads other than limited-access highways, or in low-visibility conditions such as rain, snow or ice.