Tesla hasn’t released Autopilot safety data in a year – something to hide?

Tesla hasn’t released an Autopilot safety report in about a year. It’s not clear why, but it is disappointing: the company is being opaque with its self-driving data while missing its own timelines to deliver on its promises.

Since 2018, Tesla has tried to benchmark improvements in Autopilot safety by releasing a quarterly report comparing the number of miles per accident with Autopilot engaged versus without it. The data was always limited and was criticized for not accounting for the fact that accidents are more common on city streets and undivided roads than on highways, where Autopilot is mostly used.

However, the report was still helpful for comparing Tesla’s numbers against themselves over time, and it did show some incremental improvements. But then Tesla suddenly stopped releasing the quarterly reports in 2022 without any explanation.

In January 2023, the company resumed publication, releasing reports for the first three quarters of 2022. A few months later, it released the Q4 report, but it has since stopped again, leaving the latest data almost a year old.

It’s not clear why Tesla is no longer releasing the data quarterly, as it had done for years.

For the last few years, Tesla hasn’t had a press relations team in the US to field questions like, “Why haven’t you released an Autopilot safety report in almost a year?” So we have to speculate.

Electrek’s Take

As we previously reported, the data is far from perfect: Autopilot is primarily used on highways, while the NHTSA data covers accidents everywhere and includes all vehicles, including older, poorly maintained vehicles, which tend to be involved in accidents more often than newer vehicles like Teslas.

It’s possible that Tesla concluded the comparison against the NHTSA data wasn’t that meaningful, but the report was still useful for tracking Tesla’s own numbers over time and seeing improvements. The other explanation is that there have been no improvements over the last year, and Tesla is trying to hide that.

That’s a real possibility, especially considering Tesla’s history of trying to be very opaque about its Autopilot and FSD Beta data.

While we had very good access to self-driving data from Waymo, Cruise, and other programs, thanks to the California DMV’s oversight of self-driving testing, Tesla has managed to avoid that reporting by arguing that its Full Self-Driving (FSD) Beta is not a self-driving test program but a level 2 driver-assist system.

Tesla’s unwillingness to be more open about releasing data is concerning. Instead, CEO Elon Musk has often simply suggested that people watch videos of FSD Beta drives to track progress, but that’s a very limited dataset.

It is also a problem that the most popular videos, and the ones promoted by Musk and the Tesla community, are often the ones that make FSD Beta look its best.

When talking to a broader array of FSD Beta testers, you get a much wider range of opinions than a YouTube search would suggest. The consensus is that Tesla’s computer vision system is truly impressive and the driving behavior is good, but the system still often makes dangerous mistakes, and the path to level 4 or 5 self-driving capability is less than clear.

That’s why it would be nice to have some data to track and see improvements over time toward Tesla actually delivering on a promise it has been making to new buyers since 2016.

Why no data, Tesla? Why?