
Tesla Cybertruck Drives Itself into a Pole, Owner Says 'Thank You Tesla'

MiscIdeaMaker99

I've owned a Model 3 for years now, and FSD is scary as hell. We haven't paid for it -- and we won't -- but every time we get a free trial of it (most recently this past fall), I give it a whirl, and I end up turning it off. Why? Because it does weird shit like slow down at an intersection with a green light. I don't feel like I can trust it at all, and it makes me more anxious than just using standard auto-steer and cruise control (which still ghost-brakes sometimes). I don't get why anyone uses FSD.

jvanderbot

Don't even get me started. Here's a list of things my model Y regularly does:

- Try to accelerate to 45mph in a parking lot b/c it was within 10ft of the road

- Decelerate from highway speeds suddenly to 30mph, as though it saw something it might hit (I stopped it at 30-ish and hit the gas)

- Decelerate to 50mph because of "emergency vehicles" even though there were no vehicles around (sometimes it mistakes lights that strobe b/c they are seen through median dividers as "emergency lights")

- Take up two lanes because they gradually separated and the car thinks it should stay evenly between the left and right divider line

- Choose absolutely bonkers limits, like 30mph on two-lane country highways

- Stop on the highway with a big red screen and a message that says "Take control now fatal error"

- Not so much a problem any more, but when I was first getting used to it, it would beep a message at me, then scold me for looking at the message (and not the road), then ask me to do some kind of hand grip on the wheel to prove I'm paying attention, but I have to look at the message to figure out what it wants.

My wife tells me "Just keep your foot on the gas to keep up the speed and your hands on the wheel to keep it in line" and I am just left wondering what FSD is for

robwwilliams

Drive from Memphis to Nashville in my “long range” Tesla 3 that has an amazing almost-160-mile range at 70 mph. FSD would periodically do something crazy and then ask me why I disengaged, adding further excitement to my drive.

I am now absolutely convinced that we will have full self-driving from Tesla when we have a beautiful wall all the way from the east to the west coast along both the Mexican and Canadian borders. Both will be beautiful.

rmu09

My Tesla Model 3 on "autopilot" (just keep speed) will ghost-brake if it sees some cars merging into an adjacent lane. Really dangerous; nobody expects a car decelerating from 130 km/h down to 50 on the nearly empty Autobahn. My previous car (a VW) got that right. Overall it is a nice car (with a massive and increasing brand-toxicity problem), but how can one trust "full self driving" if it can't/doesn't even keep speed in supposedly trivial cases where the driver has control?

edoceo

> what FSD is for

Hype & Marketing

ben_w

Eeesh. Yeah, I was afraid it might still be that bad after 10 years of promises ranging between "this year" and "within three years".

At the time, 2016, I trusted their promotional video showing it driving hands-free; I'm not going to make the mistake of taking them at their word again after it was revealed to have not been as it appeared: https://www.businessinsider.com/tesla-faked-video-in-2016-pr...

> I am just left wondering what FSD is for

The vision and promise, or the actually demonstrated use case?

The demonstrated use case is to charge people more money for the same product.

The vision? That is exactly what Musk keeps saying: in principle, a self-driving car never gets tired or drunk, so it can be safer than the mean human even if it only operates at the level of the median human. And it wouldn't need to be limited to median human level, as the whole fleet could learn from every member, so gain experience a million times faster than any human.

But at this point, I'm sufficiently skeptical of all of this, that I think they (and everyone else) should be banned from direct observation of the entire fleet's cameras — it's a huge surveillance network operating on every public road and several private ones.

LeoPanthera

Why on earth are you using FSD in parking lots?

tqi

What else would the "Full" in "Full Self Driving" mean?

esotericimpl

Why are you using full self driving as part of your driving?

Weird ask imo.

nprateem

It's for the share price. Just like it was for Uber.

beretguy

I hope you guys got rid of your Teslas. Preferably under a press. They are a danger to people around you.

brightball

Not based on their safety record.

recursive

They'd probably just replace it with another car, defeating the purpose.

vuln

Just like every other vehicle on the road.

shepherdjerred

I test drove a model 3 ~3 years ago and FSD was terrifying. I had no idea what it could or couldn’t do. IMO Hyundai (and others with similar features) have it perfect with adaptive cruise + active lane assist. I know exactly what it can do, it does 90% of the driving on long trips, and it doesn’t do so much that I’m tempted to put too much trust in it.

ipython

What’s amazing to me is that FSD to this day cannot recognize active school zones. Even my 6-year-old Audi's onboard cameras can do that. If you put FSD on and go through a school zone, the Tesla will happily zip through at full speed, completely ignoring it.

Gigachad

It's terrifying that companies are allowed to beta test buggy software out in the real world by shooting huge machines with sharp pointy corners through schools.

fredfoobar

It's so surprising to hear about these issues with FSD, and it makes me nervous even though I haven't encountered any problems in v13. I regularly use it back and forth between work and home, mostly during rush hour, with a lot of difficult merges and weird situations.

qubitcoder

I agree. I've used FSD v13 on a Model Y with hardware version 4 for a couple of months now. Checking my mileage, that's over 2,000 miles, most of which was with FSD enabled (road trips on interstates, highways, backroads, two-lane country roads without lane markings, etc.). It's been absolutely fantastic.

Even my parents and sister use FSD v13 regularly now in their Teslas.

It's come a long way from the early days when I first started testing it.

It makes me wonder how many people are using Autopilot (included as standard) instead of FSD on a newer Tesla with the new AI hardware?

It's pretty wild to be able to start from park. Tap a button, and go.

Just the other day, it managed merging onto the interstate and then immediately changing 7 lanes to the left to merge onto the next interstate exit heading north. It performed flawlessly.

unregistereddev

I suspect part of the difference is what kind of roads you are on. Whenever I'm in the Bay Area (or Southern California in general), I'm amazed by the quality of the roads. The pavement is even and smooth and the lines are crisp, fresh paint that is easy to see.

Meanwhile in the Midwest, we have potholes, uneven roads, sometimes roads with different surfaces mixed together (gray concrete with black asphalt patches). Lines are often badly worn by the weather and road salt and can be quite difficult to see.

I strongly suspect, though without evidence, that FSD has more problems on roads that are in poor condition.

brightball

It’s a little hit or miss (pun intended). I’ve used it off and on in my model 3 for the past 2 years and the rate of improvement has been significant.

Still though it has quirks.

On long trips, I LOVE it. LOVE it. Being able to just tap in and relax, make phone calls, listen to an audiobook, etc is so nice. The first time I ever used it I had to leave early from the All Things Open conference in Raleigh because I was getting sick. Having it essentially drive me home for 5 hours when I wasn’t well, including stopping to charge, was a huge relief.

It’s also great in traffic jams where you’d otherwise be dealing with stop and go traffic until you get through it. Just tap in and relax til you’re on the other side.

Day to day driving, it’s a little more iffy. I’ve dealt with seemingly random slowdowns on otherwise empty roads. It feels odd especially because it’s sudden.

Early on it would have difficulty on roads without well marked lines too.

I’ve never felt like it was going to run into an object though. Usually it errs on the “too cautious” side and I just take over to get where I’m going quicker.

jimnotgym

> Being able to just tap in and relax, make phone calls, listen to an audiobook, etc is so nice.

My 12 year old Ford Focus does that

karlgkk

It's crazy, because any negative criticism of FSD will have a ton of fanboys pouring out of the walls to tell you how great it is, how great the latest update is, how your anecdotal "evidence" is not typical, etc.

Except all you have to do is go try it and it becomes clear to any layperson that it's probably getting there but, and this is really crucial, it's not there yet.

fredfoobar

I don't get it; what do you expect them to do? Just reinforce your view? The data is pretty clear on how many people use FSD without issues. What's equally weird is you guys preemptively smearing folks who defend FSD. I don't get it; nothing will make you guys change your mind, I guess.

academe

For me, evidence would work. That's why California's regulations requiring companies to report miles driven, number of accidents, and disengagements are so nice. It's a standard to compare, measure, and regulate against. And when Tesla hides away from such reporting and instead moves to unregulated Texas markets, it makes me predisposed to think they are shying away from gathering exactly the evidence that would convince me it's safe. If it works for you, great. But this difference makes me happy to live in California.

mbesto

> the data is pretty clear how many people use FSD without issues.

Serious question - what data? And who is supplying that? Tesla? And what is the emotional human equivalent for the level of confidence that we should assume is "safe"? 1% error rate? 0.5%? 0.00001%?

vel0city

> the data is pretty clear how many people use FSD without issues.

What data? The data Tesla chooses to share with you?

It wouldn't surprise me to find out this incident had FSD disengage moments before colliding with the pole, thus continuing 100% FSD safe driving.

bdangubic

exactly... this is HN of course so I expect nothing less. my favorite is when I frequently (as recently as a couple of days ago) get comments like "go see some videos on youtube before commenting like that" :D soooo funny. the thing is absolute garbage but elon can sell garbage better than anyone that ever lived

decimalenough

Which is why the fan boys always tell you that it's the next version that will fix all the bugs.

agumonkey

Interestingly common trait in fan-boyism: the <thing> is always just a few steps away from being right.

belter

It's like C++ :-)

beretguy

It's "the next version will fix all the bugs" all the way down.

babypuncher

I feel like FSD has been "getting there" since before even Tesla started marketing it. I remember Google's early self driving cars and everyone thought they were only a few years away from being practical.

I think FSD definitely has utility, but not in the hands of laypeople. There are still far too many edge cases that it just doesn't handle well, and your average person can't be trusted to stay alert and attentive while using a feature so heavily marketed as not needing either of these things.

scottlamb

> I remember Google's early self driving cars and everyone thought they were only a few years away from being practical.

...and that turned out to be a little optimistic, but they really are "there" now IMHO in San Francisco. I rode in one for the first time last year. Subjectively, I felt safer moving through San Francisco traffic in the self-driving car than I do when I'm driving there myself or when I'm being driven by a human in a Lyft. It was attentive, cautious, and smooth, and I got there in a reasonable time with no fuss. And crucially, I see a notable lack of stories about it making dangerous decisions, despite the total passenger miles.

Why is Waymo there now and not Tesla? I think a combination of factors, including: (a) the head start, (b) the willingness to use LIDAR and RADAR to overcome limitations, (c) the focus on self-driving (they design and operate self-driving systems; they don't manufacture electric cars), (d) the service model (easier problem to focus on a mapped region with good weather and monitor everything vs. sell a car expected to work anywhere/anytime without (as much?) telemetry), (e) frankly, caring more about safety and less about hype. Of those differences, the "head start" one is shrinking relatively speaking, but the others will likely remain significant enough that I don't expect to trust Tesla's systems any time soon.

shiftpgdn

I like that you've built a strawman for anyone who might disagree with you. "Oh there aren't any positive reviews they're just fanboys." You may as well write "Anyone who disagrees with me is wrong."

leesec

I'm on AI4 v13 and haven't had a safety intervention in several thousand miles. It's incredible and extremely smooth.

axus

I always slow down a little bit when going through intersections, just in case I need to react to someone illegally putting themselves in my path.

It was this guy's fault for not monitoring the car, but also Tesla's for using a doublespeak name like Full Self Driving.

If FSD is a statistically significant risk factor for injury compared to Teslas that don't use it, it should be banned.

dexzod

> Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.

This could have easily killed an innocent pedestrian or bicyclist. How is this the best safety engineering? If FSD failed, there should have been some secondary system to detect an imminent collision and apply the brakes.

benhurmarcel

Clearly the owner showed that this aspect is not important to them when they ordered the Cybertruck.

porphyra

> If the FSD failed there should have been some secondary system to detect an imminent collision and apply brakes.

There actually is. The Automatic Emergency Braking functions separately from FSD and can prevent collisions in some cases. It doesn't work 100% of the time so I wouldn't rely on it, but at least it works as well as or better than competitors' systems.

Freedom2

The US has no pedestrian safety regulations at all for car design. Some have been proposed but it's 2025 and still nothing enacted.

spiderfarmer

And with the current administration’s attitude towards consumer protection you’ll never see a meaningful change in safety regulation.

dhosek

Given the last three weeks, I would expect rules that are actively pedestrian-hostile.

sandworm101

It is passive-aggressive sarcasm. If you say mean things about Tesla on X, there is a chance you may be banned/sued/delisted, especially if it involves a crash. So everything has to be couched in false praise. Nobody really thinks the Cybertruck does better in a crash than a Merc or BMW. It's just something said by the poster in order to get their story to a wider audience.

renewedrebecca

I hope that's the case. It almost read like someone trying to still believe in God right after their mom died.

lcnPylGDnU4H9OF

It does not seem to be the case.

> Big fail on my part, obviously. Don't make the same mistake I did. Pay attention. It can happen. I follow Tesla and FSD pretty closely and haven't heard of any accident on V13 at all before this happened. It is easy to get complacent now - don't.

> I do have the dashcam footage. I want to get it out there as a PSA that it can happen, even on v13, but I'm hesitant because I don't want the attention and I don't want to give the bears/haters any material.

https://x.com/MrChallinger/status/1888546351572726230

ambicapter

Full tweet is below and it doesn't sound like sarcasm. He even says he doesn't want to give the haters ammunition.

i_am_jl

>He even says he doesn't want to give the haters ammunition

That part was what made me question if it was real! "Don't want to give the haters ammo" at the tail end of a story about how his $100k pickup truck drove into a lamp post.

What exactly does he think he's doing?

decimalenough

Passive safety is the art of engineering cars so that when they do crash, the occupants are unharmed.

What you're asking for, though, is definitionally impossible: obviously the cameras didn't detect the obstacle, so FSD or no, they can't react to it. The actual solution would be to do what every other car maker with self-driving pretensions does and augment the cameras with LIDAR or other sensors.

diggan

> Passive safety is the art of engineering cars so that when they do crash, the occupants are unharmed.

Judging by the (illegal in Europe) design, passive safety is the only safety the Cybertruck has, and the safety of others has absolutely zero importance. Fits with how the rest of the world sees the typical American as well, so maybe not a big shocker.

> What you're asking for, though, is definitionally impossible

Why is it impossible for the car to stop (legally, obviously) if it fails to merge, or even after hitting the curb, instead of continuing straight ahead like nothing happened?

halyconWays

>passive safety is the only safety Cybertruck has

Not even remotely true

scottyah

Europe's safety is optimized for its environment: mostly narrow, crooked, and crowded streets with a lot of pedestrians. Most use cases for a pickup truck that's only sold in North America are in the part of America where you're much more likely to crash into a tree, deer, fence post, etc than you are a person.

whyenot

> Passive safety is the art of engineering cars so that when they do crash, the occupants are unharmed.

Passive safety usually is defined as reducing the risk of injury or death to vehicle occupants in an accident AND also protecting other road users. You left off the second part.

nashashmi

Post has 8M views as of this writing. Owner doesn't want this message to go viral because then Tesla gets flak for it. Takes the blame. Wants to share the message of FSD's fallibility with everyone. Praises Tesla for safety.

My head hurts with how oxymoronic this is. My best guess is he wants to critique Tesla without triggering the ego and arrogance of its owner. "Thank you sir for doing great work and for fixing this problem in the future."

ikanreed

And my guess is there's a touch of ideology to this person that questioning Tesla and FSD's fundamental safety would hurt. "I screwed up" does a lot less to cause cognitive dissonance than "Something I believe is wrong"

There's a lot of possible flavors to that ideology, it COULD be right wing political affinity, but it also could be a belief that technology is superior to human judgement, or that self driving cars are the future, or it could just be that spending 6 figures on an ugly pickup wasn't a waste of money.

sirbutters

The very fact that this is a Cybertruck owner already tells you you're dealing with a fanboy (aka cult member). Edit: an oversimplification of course, but not far off from the truth.

averageRoyalty

You can buy weird or unique cars without being into a certain culture or group.

msikora

Not Cybertruck IME

ikanreed

It costs over $100,000. That is a decision you do not make for no reason.

MisterTea

> My best guess is he wants to critique tesla without triggering the ego and arrogance of its owner.

Well I imagine that since Musk was handed the keys to various government agencies and installed his henchmen you can see why you want to tread lightly and kiss the ring. Such a wonderful future this is becoming.

sertraline

[flagged]

xboxnolifes

I think you're reading way too much into how people think.

boringg

I think he's trying to get special attention without passing blame on to the company. The best way to get help is through kindness (at the outset), not accusations (even if it is the company's fault).

generalizations

Worth reading the actual tweet, not just the article's truncation of it

> Soooooo my @Tesla @cybertruck crashed into a curb and then a light post on v13.2.4.

> Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.

> It failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down or turn until it had already hit the curb.

> Big fail on my part, obviously. Don't make the same mistake I did. Pay attention. It can happen. I follow Tesla and FSD pretty closely and haven't heard of any accident on V13 at all before this happened. It is easy to get complacent now - don't.

> @Tesla_AI how do I make sure you have the data you need from this incident? Service center etc has been less than responsive on this.

> I do have the dashcam footage. I want to get it out there as a PSA that it can happen, even on v13, but I'm hesitant because I don't want the attention and I don't want to give the bears/haters any material.

> Spread my message and help save others from the same fate or far worse.

https://x.com/MrChallinger/status/1888546351572726230

weaksauce

How is this not a cult? If a Toyota or Mercedes veered off into a light pole, would the owner write the same tweets?

Unless this is sarcasm that, in the current times, can all too easily be construed as serious.

scottyah

Most FSD users are in it because they see how great things could be if self-driving existed, and are willing to put in the risk to get the algorithms trained.

barbazoo

The risk is mostly on the people outside without any say on the matter.

davidw

The problem is that it's not just them putting in the risk: it's everyone else around them.

baggachipz

I used to be. Now I just want my damn money back.

boringg

Right, the whole point was not to draw attention to all the negative news. Which is funny, because this thread is immediately full of complaints.

msikora

"Big fail on my part, obviously" WTF??? Big fail to buy a Tesla, and even bigger to use FSD. With my almost-20-year-old car I don't have to worry about any of this BS.

nuancebydefault

The cybertruck owner is clearly only interested in their own safety. Luckily, in my country cybertrucks are not allowed on the road, for other people's safety.

luma

Weirdly, they seem more concerned for Tesla the company than they are for themselves.

InitialLastName

One doesn't buy a truck like that out of concern for other people.

nuancebydefault

Driving such a beast seems to give a very "I'm worth more than others" feeling. I'm puzzled; what drives this behavior?

Gigachad

Cybertrucks seem like the modern day Harley motorbikes. A very expensive way to signal anti social personality.

FirmwareBurner

>in my country cybertrucks are not allowed on the road

Not in my EU country either; however, they have already been spotted on the streets with valid license plates. There are loopholes everywhere, usually if you classify it as a commercial utility vehicle for a business instead of a passenger car. There are plenty of people with money and no scruples.

smitelli

Some cars, when I see photos of them smashed up, I get very sad. NA Miata, Corvette C4, etc. A totaled Cybertruck, honestly, good riddance. It is an extraordinarily difficult vehicle to love.

Very glad to hear no pedestrians got hit. Really hope the driver takes some kind of lesson away from this experience.

spprashant

I really doubt they are taking any lessons from this. This is the author's second Cybertruck crash in a month.

If I didn't know better, I'd think they are trying to farm engagement.

thih9

>This is the author's second Cybertruck crash in a month.

Does anyone have a source? If true then this defense of Tesla that we see now is even more bizarre.

spprashant

This is a pro-Tesla/Elon account, but the screenshots are legitimate.

https://nitter.net/WholeMarsBlog/status/1889098514061492517#...

The whole thing is funny because the guy who is so vehemently defending Tesla and FSD, despite totaling his car, is being targeted by Tesla fanboy accounts for being a fraud. Twitter is really bottom-of-the-barrel garbage.

dralley

>This is the author's second Cybertruck crash in a month.

JFC

qwerpy

I drive one and I love it! I wanted a large, robust self-driving family EV and as a bonus it looks unique compared to the blobs that everyone else drives.

It's sad that we CT drivers seem to be caught in a crossfire between Tesla and Tesla/Elon haters, when all we want to do is enjoy our cars.

lcnPylGDnU4H9OF

> It's sad that we CT drivers seem to be caught in a crossfire between Tesla and Tesla/Elon haters, when all we want to do is enjoy our cars.

The good news is that it’s just the ones who drive their truck with the now-known-to-be-faulty FSD turned on who are caught in the crossfire. (I’d say they are actually and should be in the “crosshairs”, but that’s moot.) Doesn’t have to be you.

godelski

There was a post on BlueSky about this the other day. Someone linked a picture of the intersection: https://bsky.app/profile/pickard.cc/post/3lhtkghk6q224

It is worth noting that this picture is a reply to a screenshot of someone saying the following:

> I've lived in 8 different states in my life and most roads I've seen do everything they can to prevent human error (or at least they do once the human has shown them what they did wrong). The FSD should not have been fooled this easily, but the environment was the worst it could have been, also.

(Tweet source: https://x.com/MuscleIQ2/status/1888695047044124989)

I point this out because I think the biggest takeaway here is how often people will bend over backwards to reach the conclusion that they want, rather than update their model to fit the new data (akin to Bayesian updating, for you math nerds). While this example is egregious, I think we should all take a hard look at ourselves and question where we do this too. There's not one among us that isn't resistant to changing our beliefs, yet doing so is probably one of the most important things we can do if we want to improve things.

If we have any hope of not being easily fooled by hype, of differentiating real innovation from cons, of avoiding joining Cargo Cults, then this seems to be a necessity. It's easy to poke fun at this dude, but are we all certain that we're so different? I would like to think so, but I fear making such a claim is repeating the same mistake I/we are calling out.

arghandugh

This is because Tesla‘s implementation hasn’t worked, doesn’t work, can’t work, won’t work, won’t ever work, and has been a decade-long intentional fraud from a con artist that was designed to pump up a meme stock. THAT worked.

padjo

And how

hooo

I am no fanboy, but I have it on a 2023 model Y. It works incredibly well. It is actually already amazing. You all are just blind from Elon hate.

arghandugh

You are a fanboy, and it's "actually" not amazing according to basic statistics, and you are blinded by your previous emotional investments, and everyone on and around the road you drive on is physically threatened by your bad choices.

Please don't comment here further.

hooo

What statistics? Let’s see it

smallpipe

More proof that self-driving cars with human backups should never be allowed on public roads. They will be used unsafely because the design encourages such behaviour.

luma

Specifically, SAE level 3 should be explicitly prohibited. Humans have proven, over and over, that they can’t handle that level of alertness while also not driving.

mlyle

I'm not sure level 3 is a problem. In L3, the car is required to initiate handover and be able to give you a long time to take over. This may or may not end up OK. (Witness the Mercedes eyes-off, hands-off Traffic Jam Assist). If the car can do something reasonable 85% of the time when the human is unavailable, dead, or drunk and can't take over in 25 seconds that would probably be OK.

Level 2+, though, is a big worry. It fails enough to be dangerous, but many of these systems fail too little for humans to effectively monitor them.

standardUser

Waymo has been operating near-flawlessly for years now in some busy, complicated cities. I don't see self-driving tech as the problem. It's been proven to work. I see irresponsible companies as the problem.

mbesto

Waymo has conditionals (type of car, what roads it will drive on, range, etc) on how and where it can operate, FSD's conditionals are much less stringent.

I'm still waiting for Waymo to safely drive in the snow.

standardUser

> I'm still waiting for Waymo to safely drive in the snow.

I'm not, at least not especially. Technology doesn't need to be flawless to transform society. We put up with lots of limitations when we live in places with serious winters. Why should driverless cars be impervious?

luma

Waymo is level 4 while FSD is level 3. Waymo won't drive in the snow, because they know they cannot fully automate it safely in all situations. FSD will yolo any situation and just suddenly hand you the wheel whenever it likes, resulting in situations like the linked article.

olyjohn

You didn't read where he said "self driving cars with human backups"

standardUser

If it needs a human backup then that's not a self-driving car though is it? That's an already-obsolete concept.

thinkingtoilet

I wouldn't say "never", however it's clear we're not there yet.

curiouser3

if you want to see the true horror, check out https://comma.ai. $2000 and plugs into most cars made in the past few years, works by using cracked security for the cars it is "compatible" with. These people are on the road next to you with a car being driven by a single smartphone camera. They sell it as "chill driving" but they have a discord where people just flash custom FSD firmware.

marssaxman

By "cracked security" do you mean something more than the fact that it plugs into the CAN bus?

At least they are not pretending to offer anything more than level-2 adaptive cruise control and lane centering.

bko

Ah yes, the compensating behavior theory all over again. Replace "seat belts" with "driver assist"

> This paper investigates the effects of mandatory seat belt laws on driver behavior and traffic fatalities. Using a unique panel data set on seat belt usage rates in all U.S. jurisdictions, we analyze how such laws, by influencing seat belt use, affect traffic fatalities. Controlling for the endogeneity of seat belt usage, we find that it decreases overall traffic fatalities. The magnitude of this effect, however, is significantly smaller than the estimate used by the National Highway Traffic Safety Administration. Testing the compensating behavior theory, which suggests that seat belt use also has an adverse effect on fatalities by encouraging careless driving, we find that this theory is not supported by the data. Finally, we identify factors, especially the type of enforcement used, that make seat belt laws more effective in increasing seat belt usage.

[0] http://www.law.harvard.edu/programs/olin_center/papers/pdf/3...

MyOutfitIsVague

It's clearly not comparable, as wearing a seat belt might make you feel more safe, but it doesn't actively encourage you to pay less attention to the road by its design. The act of driving is roughly the same with or without seat belts. Driver assist or self driving drastically alters how you drive.

mingus88

And yet FSD and even lane assist are going to be safer than the driver near you who is scrolling and typing away on their smartphone

Don’t mistake my post as a defense of FSD or Tesla. They’ve been lying about their capabilities for what feels like a decade.

I don’t want to see FSD and human drivers share a road. I want all cars to be meshed and communicating their intents with vehicles around them to avoid collisions. We will never see that in our lifetime

ryandrake

It's terrible that cell phone distraction is not prosecuted as harshly as DUI. Consequently, it's totally normalized. Get drunk and plow your car into a bunch of people, killing them? Many states treat this as a serious felony, up to and including charges like "DUI Murder." Plow into the same bunch of people while scrolling Instagram? It's "Oopsie-doopsie! Accidents happen!" The worst you'll get is something like "negligent vehicular manslaughter" which is less than a year in jail.

toxik

Judging by how often people press the emergency stop button on the escalators where I live, I fear that relying on the sincerity of strangers (and their cars) is maybe not a viable solution.

okanat

Allocating dedicated FSD roads is a terrible future. It will basically kill cities. I find myself agreeing with most of the arguments here: https://youtube.com/watch?v=040ejWnFkj0

marssaxman

We already have dedicated FSD roads, and they work quite well; we just usually call them "subways" or "light rail".

I don't see much point in building additional FSD roads for the inefficient, non-platooned, low-capacity, rubber-wheeled trolleycars Tesla makes, though.

wkat4242

I wish people would just write down their arguments instead of making a 53 minute video of it. I don't have time to watch all that :(

mmastrac

At some point these reckless drivers need to start going to jail. I realize it's not going to happen in the US because the government has been captured, but there's clearly some missing messaging where these drivers don't get the point that they need to be paying attention and not tweeting on their phone while their car drives into a lamppost.

fredfoobar

why stop at FSD users? this should apply to all drivers in general. if they cause an accident they need to go to jail.

mmastrac

... they do already, if the infraction is deemed serious enough?

huijzer

Interesting comments from X:

Snowball: "So FSD failed but you still managed to find a way to praise Tesla. You failed too for not taking over in time. But your concern isn't for the lives of third parties that You and FSD endangered. No, you are worried about Tesla getting bad publicity. You have misplaced priorities."

Jonathan Challinger (the driver who crashed): "I am rightly praising Tesla's automotive safety engineers who saved my life and limbs from my own stupidity, which I take responsibility for. [...]"

Fair points from both sides I think.

stretchwithme

I think it would be wise to physically test as many corner cases as possible under extreme conditions. At night, in the snow, going down a hill, birds flying across the road at the same moment a baby robot crawls on to it.

nomel

This, obviously, is one of the ongoing efforts and goals of Tesla AI, and the reason they collect so much data. There's some talk about it in the Lex Fridman podcast(s)[1].

[1] https://www.youtube.com/watch?v=cdiD-9MMpb0

karlgkk

Fine, that's great and all if you're into that.

But if that's what you need to build a FSD product, then you shouldn't be releasing the existing FSD product onto public streets.

dkenyser

I'm assuming OP is suggesting that Tesla needs to test these conditions, not the end user on a public road where innocent lives are at risk...

I could be wrong though.

TechDebtDevin

Yeah, and they also really need videos of the inside of your garage, and videos of you with your family in the garage while the car isn't running, apparently.