Tesla withheld data, lied, misdirected police to avoid blame in Autopilot crash

Aug 4, 2025 - 17:15

Tesla was caught withholding data, lying about it, and misdirecting authorities in the wrongful death case involving Autopilot that it lost this week.

The automaker was undeniably covering up for Autopilot.

Last week, a jury found Tesla partially liable for a wrongful death involving a crash on Autopilot. I explained the case and the verdict in this article and video.

But we now have access to the trial transcripts, which confirm that Tesla was extremely misleading in its attempt to place all the blame on the driver.

The company went as far as to actively withhold critical evidence that explained Autopilot’s performance around the crash.

Tesla withheld the crash‑snapshot data that its own server received within minutes of the collision

Within about three minutes of the crash, the Model S uploaded a “collision snapshot” (video, CAN-bus streams, EDR data, and more) to Tesla’s servers, the “Mothership”, and received an acknowledgement. The vehicle then deleted its local copy, leaving Tesla as the only entity with access.

What ensued was a years-long battle to get Tesla to acknowledge that this collision snapshot existed and was relevant to the case.

The police repeatedly attempted to obtain the collision snapshot data, but Tesla led the authorities and the plaintiffs on a journey of deception and misdirection that spanned years.

Here, in chronological order, is what happened based on all the evidence in the trial transcript:

1 | 25 Apr 2019 – The crash and an instant upload Tesla pretended never happened

Within ~3 minutes of the crash, the Model S packaged sensor video, CAN‑bus, EDR, and other streams into a single “snapshot_collision_airbag-deployment.tar” file and pushed it to Tesla’s server, then deleted its local copy.

We know that now, thanks to forensic evidence extracted from the onboard computer.

The plaintiffs hired Alan Moore, a mechanical engineer who specializes in accident reconstruction, to forensically recover data from the Autopilot ECU (computer).

Based on the data, Moore was able to confirm that Tesla had this “collision snapshot” all along, but “unlinked” it from the vehicle:

“That tells me within minutes of this crash Tesla had all of this data … the car received an acknowledgement … then said ‘OK, I’m done, I’m going to unlink it.’”
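
To make that sequence concrete, here is a minimal illustrative sketch, in Python, of an upload-acknowledge-unlink flow like the one Moore describes. The file path, endpoint URL, and function name are hypothetical stand-ins, not Tesla’s actual code.

    import os
    import requests  # third-party HTTP client (pip install requests)

    # Hypothetical values for illustration; the real path and endpoint are unknown.
    SNAPSHOT = "/var/snapshots/snapshot_collision_airbag-deployment.tar"
    UPLOAD_URL = "https://mothership.example.com/collision-snapshots"

    def upload_and_unlink(path):
        # Push the snapshot to the server in a single request.
        with open(path, "rb") as f:
            response = requests.post(UPLOAD_URL, data=f, timeout=60)
        # Only after the server acknowledges receipt is the local copy deleted
        # ("unlinked"), leaving the server as the sole holder of the data.
        if response.status_code == 200:
            os.unlink(path)

    upload_and_unlink(SNAPSHOT)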

The plaintiffs tried to obtain this data, but Tesla told them that it didn’t exist.

Tesla’s written discovery responses were shown during the trial to prove that the company acted as if this data were not available.


2 | 23 May 2019 – Tesla’s lawyer scripts the homicide investigator’s evidence request

Corporal Riso, a homicide investigator with the Florida Highway Patrol (FHP), sought Tesla’s help in retrieving telemetry data to aid in reconstructing the crash.

He was put in contact with Tesla attorney Ryan McCarthy and asked if he needed to subpoena Tesla to get the crash data.

Riso said of McCarthy during the trial:

“He said it’s not necessary. ‘Write me a letter and I’ll tell you what to put in the letter.’”

At the time, he didn’t see Tesla as an adversary in this case and thought that McCarthy would facilitate the retrieval of the data without having to go through a formal process. However, the lawyer crafted the letter to avoid sending the police the full crash data.

Riso followed the instructions verbatim. He said during the trial:

“I specifically wrote down what the attorney at Tesla told me to write down in the letter.”

But McCarthy specifically crafted the letter to omit the collision snapshot, which includes bundled video, EDR, CAN-bus, and Autopilot data.

Instead, Tesla provided the police with infotainment data containing call logs and a copy of the Owner’s Manual, but not the actual crash telemetry from the Autopilot ECU.

Tesla never mentioned that, by that point, it had already had this data for more than a month.


3 | June 2019 – A staged “co‑operation” that corrupts evidence

Tesla got even more deceptive when the police specifically tried to collect the data directly from the Autopilot computer.

On June 19, 2019, Riso physically removed the MCU and Autopilot ECU from the Tesla.

Again, the investigator thought at the time that Tesla was cooperating with the investigation, so he asked the company how to get the data out of the computer. He said at the trial:

“I had contacted Mr. McCarthy and asked him how I can get this data off of the computer components. He said that he would coordinate me meeting with a technician at their service center, the Tesla service center in Coral Gables.”

Tesla arranged for Riso to meet Michael Calafell, a Tesla technician, at the local service center in Coral Gables with the Autopilot ECU and the Model S’s MCU, the two main onboard computers.

To be clear, Tesla already had all this data on its servers and could have simply sent it to Riso, but instead, it lured him into its service center with the evidence in his custody.

What ensued was pure cinema.

Michael Calafell, who testified that he had never been tasked with extracting data from an Autopilot ECU before, connected both computers to a Model S in the shop to access them, but then claimed that the data was “corrupted” and couldn’t be accessed.

Riso said during his testimony:

“I brought the center tablet [MCU] and the flat silver box [Autopilot ECU] with multicolored connectors to the Tesla service center.”

“I watched Mr. Calafell the whole time. The evidence was in my custody. I did not let it out of my sight.”

However, the situation got a lot more confusing when Calafell swore in an affidavit that he didn’t actually power up the ECU that day, June 19, only the MCU.

Only years later, when Alan Moore, the forensic engineer hired by the plaintiffs, managed to gain access to the Autopilot ECU, did we learn that Tesla undeniably powered up the computer on June 19 and that the data was accessible.


4 | 2019 – 2024 – Repeated denials and discovery stonewalling

Through years of communications with the police, the plaintiffs, and the court, during both the investigation and the later discovery process for the lawsuit, Tesla never mentioned that the data everyone was seeking, which explained how Autopilot perceived the crash, had been sitting on its servers the whole time.

The facts are:

  • Tesla had the data on its servers within minutes of the crash
  • When the police sought the data, Tesla redirected them toward other data
  • When the police sought Tesla’s help in extracting it from the computer, Tesla falsely claimed it was “corrupted”
  • Tesla invented an “auto-delete” feature that didn’t exist to try to explain why it couldn’t originally find the data in the computer
  • When the plaintiffs asked for the data, Tesla said that it didn’t exist
  • Tesla only admitted to the existence of the data once presented with forensic evidence that it had been created and transferred to its servers.

5 | Late 2024 – Court orders a bit‑for‑bit NAND‑flash image

By late 2024, the court allowed the plaintiffs to have a third-party expert examine the Autopilot ECU to try to recover the data that Tesla claimed was corrupted.

The court allowed the forensic engineers to make a bit-for-bit NAND flash image: a complete, sector-by-sector copy of the data stored on a NAND flash memory chip, including all data, metadata, and error correction code (ECC) information.
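
For context, here is a minimal sketch in Python of how such an image is typically taken once the flash storage is accessible as a raw device. The device path and file names are assumptions; real NAND forensics often involves chip-off reads and ECC handling that this simple loop glosses over.

    import hashlib

    SOURCE = "/dev/nand0"             # hypothetical raw flash device
    DEST = "autopilot_ecu_image.bin"  # the resulting bit-for-bit image
    BLOCK = 4096                      # copy in fixed-size sectors

    sha1 = hashlib.sha1()
    with open(SOURCE, "rb") as src, open(DEST, "wb") as dst:
        while True:
            block = src.read(BLOCK)
            if not block:
                break
            dst.write(block)    # every sector is copied, including "deleted" data
            sha1.update(block)  # hash the image as it is read, for later verification

    print("image SHA-1:", sha1.hexdigest())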

The engineers quickly found that all the data was there despite Tesla’s previous claims.

Moore, the forensic engineer hired by the plaintiffs, said:

“Tesla engineers said this couldn’t be done… yet it was done by people outside Tesla.”

Now, the plaintiffs had access to everything.


6 | Feb‑Mar 2025 – The forensic “treasure‑trove” reveals the file name & checksum

Moore was astonished by all the data found through cloning the Autopilot ECU:

“For an engineer like me, the data out of those computers was a treasure‑trove of how this crash happened.”

By contrast, the data that Tesla had provided was not as easily searchable, the videos were grainy, and it was missing key alerts and timestamps about Autopilot and its decision-making leading up to the crash.

On top of the recovered data being far more useful, Moore found, in unallocated space, metadata for ‘snapshot_collision_airbag-deployment.tar’, including its SHA-1 checksum and the exact server path.
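
A recovered checksum matters because it lets an examiner prove that a file later produced is bit-for-bit identical to the one the car uploaded. Here is a minimal sketch of that verification in Python, with a placeholder for the recovered digest:

    import hashlib

    def sha1_of(path):
        # Compute the SHA-1 digest of a file, reading in 1 MB chunks.
        h = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Placeholder for the digest recovered from the ECU metadata.
    recovered_digest = "0000000000000000000000000000000000000000"

    if sha1_of("snapshot_collision_airbag-deployment.tar") == recovered_digest:
        print("Match: this is the exact file the vehicle uploaded in 2019.")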


7 | May 2025 – Subpoenaed server logs corner Tesla

Armed with the newly found metadata, the plaintiffs were able to subpoena Tesla’s AWS logs.

Tesla still fought them but, facing a sanctions hearing, finally produced the untouched TAR file plus access logs showing it had been stored since 18:16 PDT on 25 Apr 2019, the same three-minutes-after-the-crash timestamp Moore had highlighted.

The automaker had to admit that it had had the data all along.

During the trial, Mr. Schreiber, attorney for the plaintiffs, claimed that Tesla used the data for its own internal analysis of the crash:

“They not only had the snapshot — they used it in their own analysis. It shows Autopilot was engaged. It shows the acceleration and speed. It shows McGhee’s hands off the wheel.”

Yet, it gave access neither to the police nor to the family of the victim, who had been trying to understand what happened to their daughter.


8 | July 2025 Trial – The puzzle laid bare for the jury

Finally, this entire situation was laid bare in front of the jury last month and certainly influenced its verdict.

The jury was confronted with clear evidence of Tesla trying to hide data about the crash, and then, they were shown what that data revealed.

The data recovered made a few things clear:

  • Autopilot was active
  • Autosteer was controlling the vehicle
  • No manual braking or steering override was detected from the driver
  • There was no record of a “Take Over Immediately” alert, despite the car approaching a T-intersection with a stationary vehicle in its path.
  • Moore found logs showing Tesla systems were capable of issuing such warnings, but the system did not issue one in this case.
  • Map and vision data from the ECU revealed:
    • Map data from the Autopilot ECU included a flag that the area was a “restricted Autosteer zone.”
    • Despite this, the system allowed Autopilot to remain engaged at full speed.

Moore commented on the last point:

“Tesla had the map flag. The car knew it was in a restricted zone, yet Autopilot did not disengage or issue a warning.”

This was critical to the case, as one of the plaintiffs’ arguments was that Tesla dangerously let owners use Autopilot on roads it was not designed to operate on, since it was specifically trained for highways.

The National Transportation Safety Board (NTSB) had even warned Tesla about this, and the automaker still didn’t geofence the system.

The NTSB had written to Tesla:

“Incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed (the vehicle’s operational design domain).”

The driver was responsible for the crash, and he admitted as much. He admitted to not using Autopilot properly and not paying attention at the time of the crash.

However, the main goal of the plaintiffs in this case was to assign part of the blame for the crash to Tesla for not preventing such abuse of the system despite the clear risk.

The logic is that if Tesla had implemented geofencing and better driver monitoring, the driver, McGee, would never have been able to use Autopilot in this case, which could have prevented him from putting himself in the situation that led to the crash.

That’s on top of Autopilot failing at what Tesla has repeatedly claimed it could do: stop those crashes from happening in the first place.

Electrek’s Take

Tesla fans need to do a quick exercise in empathy right now. The way they are discussing this case, such as claiming the plaintiffs are just looking for a payout, is truly appalling.

You should put yourself in the family’s shoes. If your daughter died in a car crash, you’d want to know exactly what happened, identify all contributing factors, and try to eliminate them to give some meaning to this tragic loss and prevent it from happening to someone else.

It’s an entirely normal human reaction. And to make this happen in the US, you must go through the courts.

Secondly, Tesla fans need to do a quick exercise in humility. They act like they know exactly what this case is about and assume that it will “just be thrown out on appeal.”

The truth is that unless you have read the entire transcripts and seen all the evidence, you don’t know more about it than the 12 jurors who unanimously decided to assign 33% of the blame for the crash to Tesla.

And that’s the core of the issue here. They want to put all the blame on the driver, while the plaintiffs were simply trying to assign part of the blame to Tesla, and the jurors agreed.

The two sides are not that far off from each other. They both agreed that most of the blame goes to the driver, and even the driver appears to agree with that. He admitted to being distracted and he quickly settled with the plaintiffs.


This case was only meant to explore how Tesla’s marketing and deployment of Autopilot might have contributed to the crash, and after looking at all the evidence, the jury agreed that it did.

There’s no doubt that the driver should bear most of the responsibility, and there’s no doubt that he didn’t use Autopilot properly.

However, there’s also no doubt that Autopilot was active, that it didn’t prevent the crash despite Tesla claiming it is safer than humans, and that Tesla was warned to use better geofencing and driver monitoring to prevent abuse of the system like this.

I think assigning 33% of the blame to Tesla in this case is more than fair.
