How bad is Tesla Autopilot’s safety problem? According to thousands of alleged Tesla customers in the U.S. and around the world, pretty bad.
A huge data dump based on a whistleblower’s leak of internal Tesla documents shows that problems with Tesla’s automated driving technology may be far more common than media reports and regulators have let on, according to the German newspaper Handelsblatt, which published an article about it Thursday.
The reportedly leaked files add to the troubling anecdotes that have appeared in the media and on social media over the years about Tesla’s Autopilot and the experimental technology it has branded as Full Self-Driving. They spotlight Tesla’s attempts to keep safety complaints secret, and what appears to be a strategy to limit customer communications that might end up in lawsuits.
Tesla CEO Elon Musk did not respond to a request from The Times for comment.
Here are four of the biggest takeaways from the article about the leak.
1. The files include thousands of alleged customers’ complaints and descriptions of crashes.
In an article titled “ ‘My autopilot almost killed me,’ ” Handelsblatt said it received 100 gigabytes of data and 23,000 files, including 3,000 entries about customers’ safety concerns and descriptions of more than 1,000 crashes. The complaints cover Teslas manufactured from 2015 to March 2022, the article said. The files contain more than 2,400 complaints about sudden acceleration and more than 1,500 complaints about braking problems, including unintentional emergency braking and so-called “phantom stops,” where the car suddenly brakes for no apparent reason, according to the article.
Customer phone numbers were included in the files, the article said. Handelsblatt said it contacted dozens of them, who confirmed the complaints were legitimate. A Michigan man reported that his Tesla “suddenly braked hard, as hard as you can imagine. I was pushed into the seat belt and the car almost came to a stop. Then another car hit me from behind.”
Beyond verifying complaints with customers, Handelsblatt showed the files to the Fraunhofer Institute for Secure Information Technology, which concluded there is no reason to believe “the data set does not come from IT systems belonging to or in the environment of Tesla.”
2. Tesla systematically avoids communicating with customers in writing.
The files, according to Handelsblatt, include “precise guidelines” for communicating with customers. Employees are instructed that unless lawyers are involved, they should not send customers written versions of their reviews but instead pass them on “VERBALLY to the customer.”
“Do not copy and paste the report below into an email, text message, or leave it in a voicemail to the customer,” the guidelines say, according to the article.
“They never sent emails, everything was always verbal,” the article quoted a California doctor as saying; she reported that her Tesla accelerated on its own in the fall of 2021 and crashed into two concrete pillars.
Some customers did receive written responses, including one who complained about phantom braking and was told the Autopilot system was behaving “absolutely normally” and that he should reread the manual, according to the article.
Tesla has a long history of trying to cover up customer complaints about safety problems. As far back as 2016, the National Highway Traffic Safety Administration had to announce that customers were allowed to publicize safety issues after reports that Tesla was requiring customers to sign nondisclosure agreements to qualify for warranty repairs on problematic Model S suspension systems.
3. Tesla threatened legal action over the leak.
When confronted about the leak and the safety concerns, Handelsblatt says, Tesla lawyers demanded that the news organization send the company a copy of the data and delete all other copies, and said the company planned to take legal action “for the theft of confidential and personal data.”
4. These are the data that could force regulators to step up.
The alleged files will likely play a role in existing wrongful death lawsuits against Tesla alleging fundamental safety problems with its technology, and could prompt state and federal regulators to finally take action.
In 2021, the National Highway Traffic Safety Administration dismissed similar customer complaints about unintended acceleration and blamed “driver error.”
Several of the U.S. safety regulator’s investigations into Tesla have been continuing for years, including a probe into an apparent tendency for Tesla cars to crash into fire trucks, ambulances and police cars parked on highways with their lights flashing.
The California Department of Motor Vehicles is investigating whether the name of Tesla’s $15,000 Full Self-Driving option violates state law and its own regulations against marketing vehicles as autonomous when they are not. The DMV will not say why that investigation has been going on for more than two years without resolution.
In China, regulators are already taking action. Just two weeks ago, Tesla was forced to issue an urgent software update on almost every car it has sold in China to address problems with unintended sudden acceleration.
Musk has been promising actual full self-driving since 2016 but has yet to deliver.
The Handelsblatt article is available behind a paywall, with an English translation.
This story originally appeared in Los Angeles Times.