Tesla’s 2 million car Autopilot recall is now under federal scrutiny


A 2014 Tesla Model S driving on Autopilot rear-ended a Culver City fire truck that was parked in the high-occupancy vehicle lane on Interstate 405.

Tesla’s lousy week continues. On Tuesday, the electric car maker posted its quarterly results showing precipitous falls in sales and profitability. Today, we’ve learned that the National Highway Traffic Safety Administration is concerned that Tesla’s massive recall to fix its Autopilot driver assist—which was pushed out to more than 2 million cars last December—has not actually made the system that much safer.

NHTSA’s Office of Defects Investigation has been scrutinizing Tesla Autopilot since August 2021, when it opened a preliminary investigation in response to a spate of Teslas crashing into parked emergency responder vehicles while operating under Autopilot.

In June 2022, the ODI upgraded that investigation into an engineering analysis, and in December 2023, Tesla was forced to recall more than 2 million cars after the analysis found that the car company had inadequate driver-monitoring systems and had designed a system with the potential for “foreseeable misuse.”

NHTSA has now closed that engineering analysis, which examined 956 crashes. After excluding crashes where the other car was at fault, where Autopilot wasn’t operating, or where there was insufficient data to make a determination, it found 467 Autopilot crashes that fell into three distinct categories.

First, 211 were frontal crashes in which the Tesla hit a car or obstacle despite “adequate time for an attentive driver to respond to avoid or mitigate the crash.” Another 111 Autopilot crashes occurred when the system was inadvertently disengaged by the driver, and the remaining 145 happened in low-grip conditions, such as on a wet road.

As Ars has noted time and again, Tesla’s Autopilot system has a more permissive operational design domain than any comparable driver-assistance system that still requires the driver to keep their hands on the wheel and their eyes on the road. NHTSA’s report adds that “Autopilot invited greater driver confidence via its higher control authority and ease of engagement.”

The result has been disengaged drivers who crash, and those crashes “are often severe because neither the system nor the driver reacts appropriately, resulting in high-speed differential and high energy crash outcomes,” NHTSA says. Tragically, at least 13 people have been killed as a result.

NHTSA also found that Tesla’s telematics system has plenty of gaps in it, despite the closely held belief among many fans of the brand that the Autopilot system is constantly recording and uploading to Tesla’s servers to improve itself. Instead, it only records an accident if the airbags deploy, which NHTSA data shows only happens in 18 percent of police-reported crashes.

The agency also criticized Tesla’s marketing. “Notably, the term ‘Autopilot’ does not imply an L2 assistance feature but rather elicits the idea of drivers not being in control. This terminology may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation,” it says.

But now, NHTSA’s ODI has opened a recall query to assess whether the December fix actually made the system any safer. From the sounds of it, the agency is not convinced it did, based on additional Autopilot crashes that have occurred since the recall and on its own testing of the updated system.

Worryingly, the agency writes that “Tesla has stated that a portion of the remedy both requires the owner to opt in and allows a driver to readily reverse it,” and it wants to know why subsequent updates have addressed problems that should have been fixed by the December recall.


