
Tesla withheld data, lied, misdirected police to avoid blame in Autopilot crash

Ajedi32

The fact that Tesla doesn't have a process for making crash data available to investigators is pretty indefensible IMO, given they're retaining that data for their own analysis. Would be one thing if they didn't save the data for privacy reasons, but if they have it, and there's a valid subpoena, they obviously need to hand it over.

For context though, note that this crash occurred because the driver was speeding, using 2019 Autopilot (not FSD) on a city street (where it wasn't designed to be used), bending down to pick up a phone he had dropped on the floor, and keeping his foot on the gas, which overrode the automatic braking: https://electrek.co/2025/08/01/tesla-tsla-is-found-liable-in... The crash itself was certainly not Tesla's fault, so I'm not sure why they were stonewalling. I think there's a good chance this was just plain old incompetence, not malice.

michael1999

The article explains that the crash snapshot shows:

- hands off the wheel
- Autosteer had the steering wheel despite a geofence flag
- no take-over warnings, despite approaching a T intersection at speed

Letting people use autopilot in unsafe conditions is contributory negligence. Given their marketing, that's more than worth 33% of the fault.

That they hid this data tells me everything I need to know about their approach to safety. Although nothing really new considering how publicly deceitful Musk is about his fancy cruise-control.

kibwen

> I think there's a good chance this was just plain old incompetence, not malice.

The meme of Hanlon's Razor needs to die. Incompetence from a position of power is malice, period.

ahmeneeroe-v2

This doesn't acknowledge reality. Tesla has a position of power, but that doesn't mean Tesla is free from incompetence or can ever be free from it.

sorokod

As a condensed variant, yes.

A slightly more nuanced version is that incompetence from a position of power is a choice.

Ajedi32

That seems contrary to my experience. Large, powerful bureaucracies are often highly incompetent even when said incompetence works strongly against their own interests.

I guess you could go even more nuanced and say sometimes incompetence from a position of power is a choice, but now the statement seems so watered down as to be almost meaningless.

cnst

> https://electrek.co/2025/08/01/tesla-tsla-is-found-liable-in...

> Update: Tesla’s lawyers sent us the following comment about the verdict:

> Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator – which overrode Autopilot – as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver – from day one – admitted and accepted responsibility.

---

Personally, I don't understand how people can possibly be happy with such verdicts.

Recently, in 2025, DJI got rid of their geofences as well, because it's the operator's responsibility to control their equipment. IIRC, DJI had the support of the FAA in removing the geofencing limitations.

These sorts of verdicts, which blame the manufacturer for operator errors, are exactly why we can't have nice things.

It's why we get WiFi and 5G radios, and boot loaders, that are binary-locked, with no source code availability, and which cannot be used with BSD or Linux easily, and why it's not possible to override anything anymore.

Even as a pedestrian, I'm glad that Tesla is fighting the good fight here. Because next thing I know, these courts will cause the phone manufacturers to disable your phone if you're walking next to a highway.

ahmeneeroe-v2

I agree. This hurts competent people who want to have responsibility and the freedom that brings.

simion314

The article claims that the software should have been geofenced in that area but Tesla failed to do that, and that the software should have triggered collision warnings but did not. So there were things Tesla wanted to hide.

Ajedi32

I don't necessarily disagree, but I personally find these "but you theoretically could have done even more to prevent this"-type arguments to be a little dubious in cases where the harm was caused primarily by operator negligence.

I do like the idea of incentivizing companies to take all reasonable steps to protect people from shooting themselves in the foot, but what counts as "reasonable" is also pretty subjective, and liability for having a different opinion about what's "reasonable" seems to me to be a little capricious.

For example, the system did have a mechanism for reacting to potential collisions. The vehicle operator overrode it by pushing the gas pedal. But the jury thinks Tesla is still to blame because they didn't also program an obnoxious alarm to go off in that situation? I suppose that might have been helpful in this particular situation. But exactly how far should they legally have to go in order to not be liable for someone else's stupidity?

elAhmo

As long as there is no criminal liability for the people doing this, nothing will change. This is pocket change for the company, a rounding error, as Tesla's valuation has risen significantly since this happened in 2019, six years ago.

goosejuice

This seems pretty dumb of Tesla, since I find the withheld data rather moot to the conclusion of fault in the accident. The obstruction of justice is damning.

Autopilot is cruise control. Once you understand this, claiming that Tesla is partially at fault here doesn't match the expectations we already apply to other driver-assistance tech. Just because Tesla has the capability of disabling it doesn't mean they have to.

This all comes down to an interpretation of marketing speak. If you believe "autopilot" is misleading, you'd agree with the jury here; if you don't, you wouldn't. I'm no lawyer and don't know the full scope of requirements for Autopilot-like features, but it seems that Tesla is subject to unfair treatment here given the number of warnings you have to completely ignore and take no responsibility for. I've never seen such clear warnings on any other car with similar capabilities. I can't help but think there's maybe some politically driven bias here, and I say that as a liberal.

Happy to be convinced otherwise. I do drive a Tesla, so there's that.

burkaman

Do you think Tesla spends more time and money on making their warnings convincing, or making their marketing convincing? If a person is hearing two conflicting messages from the same group of people, they'll have to pick one, and it shouldn't be surprising if they choose to believe the one that they heard first and that was designed by professionals to be persuasive.

In other words, if you bought the car because you kept hearing the company say "this thing drives itself", you're probably going to believe that over the same company putting a "keep your eyes on the road" popup on the screen.

Of course other companies have warnings that people ignore, but they don't have extremely successful marketing campaigns that encourage people to ignore those warnings. That's the difference here.

abrouwers

I might push back on "autopilot is cruise control." To me, Tesla is marketing the feature much differently. Either way, looking up the definitions of each:

"Auto Pilot: a device for keeping an aircraft or other vehicle on a set course without the intervention of the pilot."

"Cruise Control: an electronic device in a motor vehicle that can be switched on to maintain a selected constant speed without the use of the accelerator."

MBCook

It IS fancy cruise control.

That is not how it’s marketed at all.

goosejuice

In both cases, they are driver assistance. A pilot is responsible for, and must monitor, an autopilot system in a plane. We license drivers and pilots, and the responsibility is placed on them to understand the technology before using it and putting themselves and others at risk.

Would Boeing or John Deere be responsible for marketing language, or just the instruction manual? We know the latter is true. Is there any evidence of the former? Intuitively, I'd say it's unlikely we'd blame Boeing if a pilot was misled by marketing materials. Maybe that has happened, but I haven't found anything of the sort (please share if you're aware of a case).


gamblor956

> Would Boeing or John Deere be responsible for marketing language or just the instruction manual. We know the latter is true

Actually, the former is true. Courts and juries have repeatedly held that companies can be held responsible for marketing language. They are also responsible for the contents of their instruction manual. If there are inconsistencies with the marketing language it will be held against the company because users aren't expected to be able to reconcile the inconsistencies; that's the company's job. Thus, it's irrelevant that the small print in the instruction manual says something completely different from what all the marketing (and the CEO himself) says.

The "autopilot is limited" argument would have worked 20 years ago. It doesn't today. Modern autopilots are capable of maintaining speed, heading, takeoff, and landing so they're not just pilot assistance. They're literally fully capable of handling the flight from start to finish. Thus, the constant refrain that "autopilot in cars is just like autopilot in planes" actually supports the case against Tesla.

m463

I'm reminded of Vitamin Water...

the Center for Science in the Public Interest filed a class-action lawsuit

The suit alleges that the marketing of the drink as a "healthful alternative" to soda is deceptive and in violation of Food and Drug Administration guidelines.

Coca-Cola dismissed the allegations as "ridiculous," on the grounds that "no consumer could reasonably be misled into thinking Vitaminwater was a healthy beverage"

goosejuice

Interesting case but I'm not sure it's apples to apples.

One, you don't need a license to buy a non-alcoholic beverage. Two, while the FDA has clear guidelines around marketing and labeling, I'm not aware of any regulatory body having clear guidelines around driver-assistance marketing. If they did, it wouldn't be controversial.

MBCook

> given the amount of warnings you have to completely ignore and take no responsibility for.

The article says no warnings were issued before the crash.

So which warning did the driver miss?

goosejuice

The one you accept when you first turn it on. And the numerous ones you ignored/neglected to read when using features without understanding them.

This is the responsibility of a licensed driver. I don't know how a Mercedes works, but if I crash one because I misused a feature clearly outlined in their user manual, Mercedes is not at fault for my negligence.

gamblor956

Tesla's not being treated unfairly. It advertised Autopilot as having more capabilities than it actually did. Tesla used to sell Autopilot as fully autonomous. ("The driver is only there for legal reasons.")

And it didn't warn users about this lack of capabilities until it was forced to do so. Those warnings you're talking about were added after this accident occurred as part of a mandated recall during the Biden administration.

iknowstuff

If this is the $300M jury case, they will 100% win on appeal. The driver is clearly responsible for driving, and there's never a moment of doubt about that with Autopilot.

mook

Note that the driver wasn't found to be fault-free (they got something like two thirds of the blame), so it's unclear why appeals would overturn this.

freejazz

Appeal of what and on what grounds? Are you an attorney or are you just making this up?

freejazz

> but it seems that Tesla is subject to unfair treatment here given the amount of warnings you have to completely ignore and take no responsibility for.

Lol, is this for real? No amount of warnings can waive away their gross negligence. Also, the warnings are clearly completely meaningless, because nothing changes when they are ignored.

> Autopilot is cruise control

You're pointing to "warnings" while simultaneously saying this? It seems a bit lacking in self-awareness to think a warning should carry the day, but that calling cruise control "autopilot" is somehow irrelevant.

> I can't help but think there's maybe some politically driven bias here

Look only to yourself, Tesla driver.

goosejuice

First of all I stated my bias.

What part of how autopilot is marketed do you find to be gross negligence?

I would ask, what is the existing definition of autopilot as defined by the FAA? Who is responsible when autopilot fails? That's the prior art here.

Additionally, if the NTSB failed to clearly define such terms and allowances for marketing, is that the fault of Tesla or of the governing body?

I'm pretty neurotic about vehicle safety, and I still don't think this clearly points to Tesla being in the wrong in how they market these features. At best it's subjective.

thebruce87m

> they result in nothing changing if they are ignored.

That’s not true

> Do I still need to pay attention while using Autopilot?

> … Before enabling Autopilot, you must agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your vehicle.” Once engaged, Autopilot will also deliver an escalating series of visual and audio warnings, reminding you to place your hands on the wheel if insufficient torque is applied. If you repeatedly ignore these warnings, you will be locked out from using Autopilot during that trip.

> If you repeatedly ignore the inattentive driver warnings, Autosteer will be disengaged for that trip. If you receive several ‘Forced Autopilot Disengagements’ (three times for vehicles without a cabin camera and five times for vehicles with a cabin camera), Autosteer and all features that use Autosteer will be temporarily removed for approximately one week.

https://www.tesla.com/en_gb/support/autopilot
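
As a rough illustration of that escalation policy (a hypothetical sketch with assumed thresholds, not Tesla's actual code):

```python
from dataclasses import dataclass

@dataclass
class AutopilotNag:
    """Hypothetical sketch of the escalation policy the support page
    describes; illustrative only."""
    warnings_this_trip: int = 0
    forced_disengagements: int = 0
    has_cabin_camera: bool = True

    def on_insufficient_torque(self) -> str:
        # Escalating visual/audio warnings, then disengage for the trip.
        self.warnings_this_trip += 1
        if self.warnings_this_trip >= 3:  # the exact threshold is an assumption
            self.forced_disengagements += 1
            return "autosteer_disengaged_for_trip"
        return "escalating_warning"

    def week_long_lockout(self) -> bool:
        # Per the quoted page: 3 strikes without a cabin camera, 5 with one,
        # before Autosteer is removed for roughly a week.
        limit = 5 if self.has_cabin_camera else 3
        return self.forced_disengagements >= limit
```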

freejazz

And you don't respond to your own point about it being called Autopilot despite it not being an autopilot.

>> If you repeatedly ignore the inattentive driver warnings, Autosteer will be disengaged for that trip. If you receive several ‘Forced Autopilot Disengagements’ (three times for vehicles without a cabin camera and five times for vehicles with a cabin camera), Autosteer and all features that use Autosteer will be temporarily removed for approximately one week.

There are videos of people on autopilot without their hands on the wheel...

keepper

"it's never the crime... its the cover up". So in this case, they are kinda screwed.

I've owned two Teslas (now a Rivian/Porsche EV owner). Hands down, Tesla has the best cruise control technology on the market. Therein lies the problem. Musk constantly markets this as self-driving. It is NOT. Not yet, at least. His mouth is way, way, way ahead of his tech.

Heck, stopping for a red light is a "feature", even though the car is perfectly capable of recognizing one and stopping. This alone should warrant an investigation, and it's one that I, as a highly technical user, completely fell for when I first got my model 7 delivered... I ran through a red light trying out Autopilot for the first time.

I'm honestly surprised there aren't more of these lawsuits. I think there's a misinterpretation of the law by those defending Tesla. The system has a lot of legalese safeguards and warnings. But the MARKETING is off. WAY OFF. And yes, users listen to marketing first.

And that ABSOLUTELY counts in a court of law. You folks would also complain about an obtuse EULA, and while this isn't completely apples to apples, Tesla absolutely engages in dangerous marketing speak around "autopilot", eliciting a level of trust from drivers that isn't there and that they should not be encouraging.

So sorry, this isn't a political thing (and yes, disclaimer: also a liberal).

Signed... a former Tesla owner waiting for "right around the corner" self-driving since 2019...

goosejuice

> ABSOLUTELY counts in a court of law

Are there clear guidelines for the labeling and marketing of these features? If not, I'm not sure how you can argue as much. If it were so clearly wrong, it should have been outlined by regulation, no?

throwanem

I would like to see how the decision to implement automated deletion of the onboard snapshot was justified.

jeffbee

There aren't enough details in the somewhat hyperbolic narrative to really say, but if I were going to create a temporary archive of files on an embedded system for diagnostic upload, I would also delete it afterward, because that's the nature of temporary files and nobody likes ENOSPC. If their system had deleted the inputs to the archive, that would seem nefarious, but this doesn't, at first scan.
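
For the curious, a minimal sketch of the benign version of that pattern (hypothetical names and uploader, not Tesla's code): the tarball is a disposable transport artifact, and only it gets unlinked.

```python
import os
import tarfile
import tempfile

def upload_crash_snapshot(source_dir: str, upload_fn) -> None:
    """Bundle diagnostic files into a temporary tarball, upload it, and
    remove only the tarball. The source files stay on disk untouched."""
    fd, tar_path = tempfile.mkstemp(suffix=".tar.gz")
    os.close(fd)
    try:
        with tarfile.open(tar_path, "w:gz") as tar:
            tar.add(source_dir, arcname="snapshot")
        upload_fn(tar_path)  # hypothetical uploader, e.g. an HTTPS POST
    finally:
        os.remove(tar_path)  # delete the transport artifact, not the inputs
```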

AlotOfReading

The main reasons to store data are for safety and legal purposes first, diagnostics second. Collision data are all three. They need to be prioritized above virtually everything else on the system and if your vehicle has had so many collisions that the filesystem is filled up, that's a justifiable reason to have a service visit to delete the old ones.

If I were implementing such a system (and I have), I could see myself deleting the temporary file without much thought. I would still have built a way to recreate the contents of the tarball after the fact (it's been a requirement from legal every time I've scoped such a system). Tesla not only failed to do that, but avoided disclosing that any such file had been transferred in the first place, so the plaintiffs wouldn't know to request it.
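
A sketch of what that legal requirement can look like in practice (my own illustration, not any vendor's real system): persist a manifest of names, sizes, and hashes alongside the upload, so the archive's contents stay reconstructable and verifiable after the tarball is gone.

```python
import hashlib
import json
import tarfile

def record_manifest(tar_path: str, manifest_path: str) -> None:
    """Write a manifest (name, size, SHA-256) for every file in the
    archive, so its exact contents can be reproduced and verified
    long after the tarball itself has been deleted."""
    entries = []
    with tarfile.open(tar_path, "r:*") as tar:
        for member in tar.getmembers():
            if not member.isfile():
                continue
            data = tar.extractfile(member).read()
            entries.append({
                "name": member.name,
                "size": member.size,
                "sha256": hashlib.sha256(data).hexdigest(),
            })
    with open(manifest_path, "w") as f:
        json.dump(entries, f, indent=2)
```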

adolph

The tar file is a convenient transport mechanism for files that presumably exist in original form elsewhere within the system. (All bets are off if the sources for the tar were changed afterward.)

Given that storage is a finite resource, keeping the tar around after it was confirmed in the bucket would be pure waste.

thrill

When a vehicle crash occurs, that embedded system should no longer be treating the data as "temporary"; it is now civil, and potentially criminal, evidence, and it should be preserved. Going to the effort of creating that data, uploading it to a corporate server, and then having programming that explicitly deletes it from the source (the embedded system) certainly reads as nefarious without easily verifiable evidence to the contrary. A company that has acted this way has done nothing to earn treatment as anything other than a hostile party in court. Any future investigation involving a company with such a history needs to act swiftly, with the immediate and heavy hand of the court behind it, if it expects any degree of success.

int0x29

I would love to see what you need so much disk space for after the car has crashed and the airbags have deployed. If that event fires, the car is going into the shop to have its airbags replaced at a minimum. Adding a service step to clear up /tmp after a crash is fairly straightforward.

throwanem

"Their system" is a car, sold as a consumer product, which has just experienced a collision removing it indefinitely from normal operation. Reconsider your analysis.

jeffbee

Yes? But the article doesn't say that Tesla deleted the EDR data; it says they uploaded the EDR file in an archive format, then deleted the uploaded entity. Which strikes me as totally normal.

detourdog

What about not handing the tarball to the police looking for the data? Do you see that as a problem?

adolph

It seems they waited for a subpoena. Would you prefer automakers send the police a notification anytime the car records a traffic infraction, or maybe they should just set up direct billing for municipalities?

jeffbee

That's obviously problematic. I am only commenting on the belief in a conspiracy of programmers here. The overwhelmingly most likely reason a temporary file would be unlinked after use is that that's what any experienced systems programmer does as a matter of course.
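
For illustration, here's that unlink-after-use habit in idiomatic Python (hypothetical paths and uploader):

```python
import tarfile
import tempfile

def snapshot_and_upload(source_dir: str, upload_fn) -> None:
    # NamedTemporaryFile unlinks the file when the block exits,
    # whether or not the upload succeeded: no manual cleanup step.
    with tempfile.NamedTemporaryFile(suffix=".tar.gz") as tmp:
        with tarfile.open(tmp.name, "w:gz") as tar:
            tar.add(source_dir, arcname="snapshot")
        upload_fn(tmp.name)  # hypothetical uploader
    # the temporary file no longer exists here
```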

1970-01-01

Tesla was having issues with log-file writes destroying their flash chips. They can argue they have precedent for deleting data, but not for hiding it.

throwanem

I would be fascinated to entertain arguments for how the future write life of a flash memory chip, meant for storing drive-time telemetry in a wrecked car, merits care for preservation.

Hamuko

Wouldn't creating an archive on the filesystem and then deleting the archive cause more writes than just creating it without a delete?
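
For what it's worth, a sketch of an approach that avoids both the extra writes and the on-disk artifact, assuming the snapshot fits in RAM (illustrative only, not how Tesla does it):

```python
import io
import tarfile

def build_snapshot_in_memory(paths: list[str]) -> io.BytesIO:
    """Build the gzipped tar entirely in RAM: no extra block writes
    to wear the flash, and no on-disk artifact to delete afterwards."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        for path in paths:
            tar.add(path)
    buf.seek(0)
    return buf  # hand this straight to the HTTP client for upload
```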


eptcyka

For ENOSPC, I'd put telemetry data on an entirely different partition, or even a different device, so as to isolate the rest of the filesystem.

declan_roberts

It's difficult to tell from the article because of how much the terms are used interchangeably, but was it FSD or Autosteer that was driving the car when it crashed?

My autosteer will gladly drive through red lights, stop signs, etc.

And the fact that we have telemetry at all is pretty amazing. In most car crashes there's zero telemetry. Tesla is the exception, even though they did the wrong thing here.

indoordin0saur

This was in 2019 so I don't think FSD was a thing yet.

eptcyka

Autopilot, not autosteer. Wording here is important.

sbassi

"...the investigator thought that Tesla was being collaborative with the investigation at the time..."

So this is also a failure of the investigator.

duxup

I suppose it isn't, but the sheer scale of the effort seems like it should be criminal.

elil17

It is absolutely criminal. Whether it gets prosecuted is a different matter - prosecutors love giving corporate interests a free pass

buyucu

Six months ago it had no chance of being prosecuted. But if the fallout between Elon and Trump is as bad as it looks from the outside, there might be justice after all.

bcrosby95

Lying to the police is illegal, which it seems like Tesla (employees) did many times.

0cf8612b2e1e

I am rather sick of AI-generated hero images, but this one made me laugh.

addandsubtract

I think it's wild that a legit article would use an image like that. Sure it's funny, but save it for social media, not a news source that's supposed to be based on facts.

breadwinner

It was probably made using Musk's own Grok, as other AIs disallow depicting real people.

steveBK123

The longer the farce goes on, the more I think the laggards in the self-driving car industry are trying to wait out regulators rather than actually get good enough.

That is: gamble that GOP alignment leads to regulatory capture, such that the bar is lowered enough for them to declare the cars safe.

AlotOfReading

Despite what you hear from certain media voices, there are effectively no performance-based regulatory barriers in the US. You can claim any autonomy level you want at any time, and aside from the small number of states whose permit process rises above a rubber stamp (literally just California), regulators react to headlines of your system failing, not to its actual performance.

Even California's system is lax enough that you can drive a Tesla semi through it.

hn_throwaway_99

I get this may be off topic, but does anyone think these cheesy, bad AI-generated headline images help the article's point of view, or heck even make it more engaging?

It just looks stupid to me in a way that makes me more likely to discount your post.

its-summertime

OpenGraph requires an image for your article regardless of whether it is useful, for the sake of embeds in Facebook and Discord (and others).

mh-

There's nothing requiring that image to be unique. Lots of sites just provide a higher-resolution favicon.

npteljes

This one is such a bad photoshop, too! The box's text is clearly AI-generated with an older model, and the "Autopilot crash data" is superimposed on it with an image editing tool. Really cheap-looking.

akudha

I haven't done it myself, so I don't have any data to back this up, but I suppose it works, at least in the short term; why else would so many websites, video creators, copywriters, email newsletter writers, etc. use it?

Negative, cheesy, clickbait, rage inducing etc headlines do seem to get more clicks. There is a reason why politicians spend more time trash talking opponents than talking positively about themselves. Same goes with attack ads.

1970-01-01

It started with YouTube thumbnails and leaked out from there. It just gets more clicks compared to a legitimate photo.

hn_throwaway_99

I can get it for some rando YouTube video.

For an article that is supposed to at least smell like journalism, it looks so trashy.

perihelions

Journalism? It's literally just a blog: a very successful car-influencer blog that has in the past earned six-figure payments from Tesla itself[0] for its very successful shilling of Teslas.

Journalism is a thing of its own; blogs aren't it.

[0] https://www.thedrive.com/news/24025/electreks-editor-in-chie... ("Electrek’s Editor-in-Chief, Publisher Both Scoring $250,000 Tesla Roadsters for Free by Gaming Referral Program": "What happens to objective coverage when a free six-figure car is at play? Nothing good." (2018))

kergonath

I’m with you there. Depending on other factors on the page, it can be a smell or a red flag.

KolmogorovComp

People like to think they love literature, yet they read tabloids. People like to say they want better information, yet they get their news from social media.

I have no doubt a majority of people will say they despise these pictures, just as they despise YouTube thumbnails, yet the cold numbers tell the opposite story.


cryptoegorophy

If you're not familiar with Electrek, this is their business in a nutshell: Tesla bad = clicks; Tesla good = no clicks. It's stupid, and it works, as you can clearly see with this particular example here on HN.

buyucu

Of course they did. This is how Tesla has been operating for a very long time.