NHTSA reminds Tesla to cough up data for Autopilot probe

An investigation by America’s National Highway Traffic Safety Administration (NHTSA) into the safety of Tesla Autopilot has led to a threat of fines if Elon Musk’s electric car company doesn’t hand over the data requested. 

If Tesla doesn’t comply with the order in the NHTSA’s July 3 letter [PDF], the agency said it could impose fines of up to $26,315 per violation per day – capping out at $131.5 million.

That’s not to suggest Tesla has been withholding the data US highway regulators have asked for. Documents from the investigation indicate Tesla has turned over information several times already. The NHTSA told The Register that fine warnings are a standard part of such letters, regardless of which manufacturer receives them.

Among the data requested by the NHTSA is a full rundown of information on vehicles included in the investigation, which is a lot: “All Tesla vehicles, model years 2014–2023, equipped with [Autopilot] at any time.”

The NHTSA wants to know the software, firmware and hardware versions of every Tesla that falls within its investigative purview, whether each vehicle has a cabin camera installed, when each was admitted into Tesla’s full-self-driving beta, and the dates of its most recent software/firmware/hardware updates.

That, as mentioned in the original engineering assessment document [PDF] filed in June of last year, covers an estimated 830,000 vehicles. And the NHTSA wants it all by July 19 – a little over two weeks after it sent the letter.

Cars on autopilot are one thing, but people?

The NHTSA’s investigation of Autopilot goes back to 2021, following a series of accidents in which ostensibly self-driving-ish Teslas plowed into vehicles stopped on the sides of freeways. 

After ten months of digging, the NHTSA upgraded its investigation to an engineering assessment – the first step toward a possible recall of the affected vehicles.

At the time, the NHTSA said it found reasons to investigate “the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.” 

In February, the agency revealed Tesla was voluntarily conducting an update of some 362,758 Teslas equipped with the full-self-driving beta because Autopilot software was causing them to ignore stop signs and generally “act unsafe around intersections.” 

For what it’s worth: Tesla’s Autopilot and the full-self-driving suite of features aren’t truly autonomous self-driving systems. They’re more driver-assist or super-cruise-control.

Tesla meanwhile admitted in February that the US Department of Justice had kicked off a criminal investigation into the same Autopilot issues as the NHTSA.

According to NHTSA data presented last year, some 70 percent of reported crashes involving driver-assist software were in Teslas. More broadly, since the NHTSA began collecting level 2 automated driver-assist accident data in 2019 (Tesla Autopilot is a level 2 ADAS system, no matter what Musk et al claim), Tesla vehicles using Autopilot have been involved in 799 accidents.

The data includes 22 fatal ADAS level 2 accidents since data collection began – 21 of which involved Teslas.

Tesla didn’t respond to requests for comment, and the NHTSA told us it can’t share information regarding ongoing investigations – including whether we can expect this years-long process to wrap up anytime soon. ®
