
Tesla blows past stopped school bus and hits kid-sized dummies in FSD tests

AlecSchueler

Why are Tesla-related posts still being flagged? Mr Musk stepped out of his governmental role, so criticism of his assets is no longer unavoidably political. My understanding was that, for the past few months, criticism of Tesla was seen as a political action and that many here don't want inflammatory political discussions about the current US administration, but what's the current reason for flagging? This is surely tech/business news through and through.

colpabar

The comments become unreadable because everyone just argues over musk, so people just flag the whole thing.

2rsf

> "requires a fully attentive driver and will display a series of escalating warnings requiring driver response."

I understand the reasoning behind it, but watching the video () of the test shows that the car did not warn the driver, and even if it had, it was going so fast that a driver would have had almost no time to respond.

Disclaimer- I have never used FSD before

() https://dawnproject.com/the-dawn-project-and-tesla-takedowns...

denniebee

> but watching the video () of the test shows that the car did not warn the driver

The warnings occur when you look away or don't touch the steering wheel for a while. Not saying that Tesla is without error (it isn't), just clarifying what the warnings are for.

hulitu

> The warnings occur when you look away

So they are useless. My car warns me even if I don't look.

jlbooker

> So they are useless. My car warns me even if i don't look.

No, they serve a very specific purpose -- (attempting) to ensure the driver is at the controls and paying attention.

Don't confuse the FSD attention "nag" warnings with collision warnings. The collision warnings will sound all the time, with and without FSD enabled and even if you're not looking at the road. If you don't start slowing down quickly, Automatic Emergency Braking (AEB) will slam on the brakes and bring the car to a stop.
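
To make the distinction concrete, here is a minimal sketch of why the two alert paths are independent. This is illustrative Python only, with invented names and thresholds, not Tesla's actual logic:

    # Illustrative sketch only -- invented names and thresholds, not Tesla's
    # actual code. The point: the attention "nag" and the collision warning
    # are separate, independent paths.

    def attention_nag(seconds_since_driver_input):
        # Escalating reminders that the *driver* must stay engaged.
        # Fires based on driver behavior, regardless of what's on the road.
        if seconds_since_driver_input > 30:
            return "TAKE_OVER_OR_FSD_DISENGAGES"
        if seconds_since_driver_input > 10:
            return "APPLY_STEERING_PRESSURE"
        return None

    def collision_alert(distance_m, closing_speed_ms):
        # Forward collision warning / AEB path. Fires based on the *road*,
        # whether or not FSD is engaged or the driver is looking.
        if closing_speed_ms <= 0:
            return None
        ttc = distance_m / closing_speed_ms  # time to collision, seconds
        if ttc < 1.0:
            return "AEB_FULL_BRAKING"
        if ttc < 2.5:
            return "FORWARD_COLLISION_WARNING"
        return None

    print(attention_nag(12))        # APPLY_STEERING_PRESSURE
    print(collision_alert(20, 10))  # ttc 2.0 s -> FORWARD_COLLISION_WARNING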

reaperducer

> So they are useless. My car warns me even if I don't look.

Heck, my car not only warns you, it slams on the brakes for you.

Scared the heck out of me when it happened, but it saved me from hitting something I didn't realize was so close.

Molitor5901

I think the idea of self-driving needs to be strongly re-evaluated. There are countless videos of people in their Teslas driving down the road... from the back seat. FSD is simply not feasible right now, but it seems that when people let Tesla Take the Wheel, they are easily duped into assuming it will always work - when it doesn't.

Until there are federal standards and rigorous testing of self-driving vehicle technologies they should not be installed, or advertised.

alexey-salmin

Regardless of one's stance on Tesla, it's sad to see this post flagged.

locococo

Ignoring a stop sign, not even slowing down, that's a major safety flaw.

I am wondering if there is a safety certification body for self driving technology. If not, one is needed because consumers can't be expected to be aware of all the limitations of the latest update they have installed.

There must be basic safety standards these systems need to meet, a disclaimer can't be the solution here.

antennafirepla

There isn’t. We won’t regulate until there is public outcry over tens or hundreds of deaths.

addandsubtract

Or you pay Trump more money than Elon.

potato3732842

The real problem is that it didn't recognize and stop for the stop signs on the school bus. The child is basically an afterthought, designed to appeal to the emotions of those whom logic fails. Even if no kids materialized, the way a bus stop works (bus stops, then kids cross) means that detecting the kid really shouldn't be the primary trigger for stopping in this situation; the stop sign needs to be recognized and acted upon. Any ability to detect and stop for pedestrians is secondary to that.

b3orn

I don't agree with this. Not hitting pedestrians should not just be an afterthought. Of course the car should recognise the stop sign, but there are cases in which stop signs are obstructed or missing, and in those cases pedestrians should still not be hit by a car.

alexey-salmin

Yes, but recognizing a pedestrian when he jumps in front of your car is useless -- you don't have time to stop anyway.

What you WANT to recognize is conditions when such an event is possible (obstructed vision) and to slow down in advance even if you don't see/detect any pedestrians at the moment.

This obviously includes the case with the school bus and the stop sign but, as you correctly point out, is not limited to that. There are more cases when a pedestrian, especially a child, can jump under your car from behind a big vehicle or an obstacle.

Recognizing these situations and slowing down in advance is a characteristic trait of a well-intentioned, experienced driver. Though I think that most of the time it's not a skill you have straight out of driving courses; it takes time and a few close calls to burn it into your subconscious.
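
A minimal sketch of that heuristic, assuming a simple kinematic model with invented numbers (not any vendor's actual planner): cap your speed so you can stop within the distance to the nearest spot a pedestrian could emerge from.

    # Hypothetical defensive-driving heuristic, not any shipping planner:
    # cap speed so the car can stop within the distance to the nearest
    # occlusion a pedestrian could step out from (parked bus, blind corner).
    import math

    def max_safe_speed(occlusion_distance_m, decel_ms2=6.0, latency_s=0.3):
        # Largest v such that v*latency + v^2 / (2*decel) <= distance.
        a, b, c = 1 / (2 * decel_ms2), latency_s, -occlusion_distance_m
        return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

    # A school bus whose far edge is 15 m ahead:
    print(f"{max_safe_speed(15.0) * 3.6:.0f} km/h")  # ~42 km/h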

cameldrv

There are a lot of videos Waymo has posted of split-second swerves they've had to do in SF and Austin. It looked to me like a combination of hard braking and swerving could have avoided the collision. Now, to be fair to Tesla, the dummies in this test didn't look very realistic, but not even slowing down for the school bus shows that FSD is not close to being ready for unsupervised use.

BobaFloutist

At 25 mph, which I would hope would be the speed limit on roads next to schools, slamming on the brakes even seconds before colliding with children can make an enormous difference in how fast the car is going when it hits the kid.

Speed is the dominant factor in collision severity (along with weight), and modern brakes are incredibly good.
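
Some quick back-of-the-envelope math, assuming roughly 9 m/s² of braking (plausible for a modern car on dry pavement), shows how much even last-moment braking helps:

    # Impact speed after partial braking: v^2 = v0^2 - 2*a*d.
    # Assumes ~9 m/s^2 deceleration, plausible for modern cars on dry pavement.
    def impact_speed_mph(initial_mph, braking_distance_m, decel_ms2=9.0):
        v0 = initial_mph * 0.44704  # mph -> m/s
        v_sq = v0 * v0 - 2 * decel_ms2 * braking_distance_m
        return max(v_sq, 0.0) ** 0.5 / 0.44704

    # At 25 mph, braking just 5 m before the child:
    print(f"{impact_speed_mph(25, 5):.0f} mph")  # ~13 mph instead of 25

Since kinetic energy scales with the square of speed, a 25-to-13 mph reduction sheds roughly three quarters of the impact energy.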

Not to mention that the car, with its 360-degree sensors, could safely and efficiently swerve around the children even faster than it can brake, as long as there's not a car right next to you in another lane -- and even if there is, hitting another car is far less dangerous to the people in it than hitting the children is to the children.

These things should be so much better than we are, since they're not limited by unidirectional binocular vision, but somehow they're largely just worse. Waymo is, at best, a bit better. On average.

potato3732842

>I don't agree with this. Not hitting pedestrians should not just be an afterthought.

You're disagreeing with something I didn't say. There's a difference between afterthought and the primary initiator of the stop in a situation like this.

>Of course the car should recognize the stop sign, but there are cases in which stop signs are obstructed or missing, and in those cases pedestrians should still not be hit by a car.

The surprise pedestrian test is one that any vehicle can be made to fail by sheer physics. Avoiding errant pedestrians like in the video will likely only come as a byproduct of better situational behavior by self driving vehicles. The overwhelming majority of drivers know to ignore the speed limit if the situation is rife with pedestrians or otherwise sus and are generally fine with missing/obstructed stop signs. I don't know what route self driving software will take to approximate such behavior but it likely will need to.

ChoGGi

> The surprise pedestrian test is one that any vehicle can be made to fail by sheer physics.

There's different degrees of failure as well, did the Tesla try to brake beforehand or apply brakes after hitting the doll?

lawn

Bah. The car should slow down if the view is restricted or when it's passing a bus, especially a school bus, regardless of whether there's a stop sign or not.

ryandrake

Merely slowing down is not enough. If the car doesn't come to a complete stop and remain stopped while the bus has its stop sign extended, it's driving illegally.

instaclay

They're saying if there's a school bus inside of a narrow corridor, that a prudent driver would slow down and use caution IN ANY CIRCUMSTANCE.

They're obviously not arguing that the car shouldn't stop with a sign deployed.

Arguing from a point of bad faith doesn't advance the discussion.

BobaFloutist

I want them to drive legally, but I also want them to be able to react to objects even if they don't have signs on them.

Signs are often obstructed by trees or are simply inadequate for safe driving, even when combined with the rules of the road. Any even "partially" automated driving should be able to trivially recognize when its view is compromised and proceed accordingly.

angusb

This has done the rounds on other platforms. A couple of important points:

- the failure here is that the car didn't stop for the bus on the other side of the road with the extended stop sign. (Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid)

- the FSD version for the robotaxi service is private and wasn't the one used for this test. The testers here only have access to the older public version, which is supposed to be used with human supervision

- the Dawn Project is a long-time Tesla FSD opponent that acts in bad faith - they are probably relying on a false equivalence between FSD beta and robotaxi FSD

Nevertheless this is a very important test result for FSD Supervised! But I don't like that the Dawn Project is framing this as evidence for why the robotaxi FSD should not be allowed, without acknowledging that they tested a different version.

fabian2k

I don't see why Tesla would deserve the benefit of the doubt here. We cannot know how well the actual Taxi software will work, I think it is fair to extrapolate from the parts we can observe.

angusb

Re: extrapolation: I agree with that, but remember there's sampling error. The crashes/failures go viral, but the lives saved get zero exposure or headlines. I don't think that means you can just ignore issues like this, but I think it does mean it's sensible to augment the data point of this video by imagining the scenarios where the self-driving car performs more safely than the average human driver.

fabian2k

I absolutely do think that self-driving cars will save many lives in the long run. But I also think it is entirely fair to focus on the big, visible mistakes right now.

This is a major failure; missing a stop sign and a parked school bus are critical mistakes. If you can't manage those, you're not ready to be on the road without a safety driver yet. There was nothing particularly difficult about this situation; these are the basics you must handle reliably before we even get to all the trickier situations those cars will encounter in the real world at scale.

locococo

No! Ignoring a stop sign is such a basic driving standard that it's an automatic disqualification. A driver that misses a stop sign would not have my kids in their car. They could be the safest driver on the racetrack; it does not matter at that point.

dzhiurgis

Also, they've repeatedly tested closer and closer distances until the Tesla failed, aka p-hacking.

reaperducer

> I agree with that, but remember there's sampling error.

Ma'am, we're sorry your little girl got splattered all over the road by a billionaire's toy. But, hey, sampling errors happen.

locococo

I think their test is valid and not in bad faith, because it demonstrates that Tesla's self-driving technology fails to meet basic safety standards.

Your argument that a newer version is better simply because it's newer does not convince me. The new version could still have that same issue.

angusb

> Your argument that a newer version is better

I actually didn't say that and am not arguing it formally - I said what I said because I think that the version difference is something that should be acknowledged when doing a test like this.

I do privately assume the new version will be better in some ways, but have no idea if this problem would be solved in it - so I agree with your last sentence.

Topfi

> […] I think that the version difference is something that should be acknowledged when doing a test like this.

Did they anywhere refer to this as Robotaxi software rather than FSD, the term Tesla itself has chosen?

interloxia

It is also a failure that it just keeps cruising and continues to drive over the thing/child that was hit.

angusb

that should have been in my list, you're right

mxschumacher

It's always the next version that will go from catastrophic failure to perfection. This card has been played 100 times over the last few years.

jantissler

Exactly what I wanted to add. Every single time there is hard evidence of what a failure FSD is, someone points out that they didn't use the latest beta. And of course they provide zero evidence that this newer version actually addresses the problem. Anyone who knows anything about software and updates understands how new versions can actually introduce new problems and new bugs …

locococo

Exactly this.

bestouff

> Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid

That's why you slow down when you pass a bus (or a huge American SUV).

sokoloff

In this case, you stop for the bus that is displaying the flashing red lights.

Every driver/car that obeys the law has absolutely no problem avoiding this collision with the child, which is why the bus has flashing lights and a stop sign.

ethbr1

I'm honestly shocked that all versions of Tesla assisted/self-driving don't have a hardcoded stopped schoolbus exception to force slow driving.

Worst case, you needlessly slow down around a roadside parked schoolbus.

Best case, you save a kid's life.
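
A minimal sketch of what such a hardcoded exception could look like, as a rule layer that can only lower the planner's speed (invented detection labels, not any shipping system's policy):

    # Hypothetical rule layer -- invented labels, not any vendor's actual code.
    # It runs after the learned planner and can only make the car slower.
    def speed_cap_mps(detections, planned_speed_mps):
        if "school_bus_stopped_lights_flashing" in detections:
            return 0.0                           # legally required full stop
        if "school_bus" in detections:
            return min(planned_speed_mps, 4.0)   # crawl past any school bus
        return planned_speed_mps

    print(speed_cap_mps(["school_bus"], 11.2))   # 4.0 -> worst case, you crawl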

pas

If I put up a stop sign somewhere, is that legal? Or is there some statute (or at least a local ordinance) that says yellow buses on such-and-such a route can have moving stop signs?

Sharlin

Exactly. If your view is obstructed by something, anything, you slow down.

chneu

Every time Tesla's FSD is shown to be lacking, someone always says "well, that's not the real version, and these people are biased!"

ryandrake

Don't forget the standard "And the next version surely will be much better!"

Zigurd

The simulation of a kid running out from behind the bus is realistic, and it points out another aspect of the problem with FSD. It didn't just pass the bus illegally. It was going far too fast while passing the bus.

As for being unrepresentative of the next release of FSD, we've had, what, eight or ten years of "it's going to work on the next release".

BobaFloutist

>(Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid)

Shouldn't that be the one case where self driving system has an enormous natural advantage? It has faster reflexes, and it doesn't require much, if any, interpretation or understanding of signs or predictive behavior of other drivers. At the very worst, the car should be able to detect a big object in the road and try to brake and avoid the object. If the car can't take minimal steps to avoid crashing into any given thing that's in front of it on the road, what are we even doing here?
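
On the reflex advantage, the arithmetic is stark; assuming a commonly cited ~1.5 s human perception-reaction time versus a (charitably) ~0.2 s sensing-to-braking latency for a machine:

    # Distance covered before braking even begins, at 25 mph (~11.2 m/s).
    # 1.5 s is a commonly cited human perception-reaction time; 0.2 s is an
    # assumed, optimistic machine sensing/actuation latency.
    speed_ms = 25 * 0.44704
    for label, t in [("human ~1.5 s", 1.5), ("machine ~0.2 s", 0.2)]:
        print(f"{label}: {speed_ms * t:.1f} m before the brakes engage")
    # human ~1.5 s: 16.8 m; machine ~0.2 s: 2.2 m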

jofzar

Before anyone says it's the driver's responsibility: that only holds while there is still a driver.

https://www.reddit.com/r/teslamotors/comments/1l84dkq/first_...

ndsipa_pomu

I don't know why this is flagged unless it's just the Tesla/Musk association. I thought that self-driving vehicles are a popular topic on HN.

lawn

It's obviously because of Musk. Anything that paints him in a bad light is flagged ad nauseam.

sigmoid10

Why on earth can't this be done by normal testing agencies? Why do things like "Tesla Takedown" have to participate in it? Even if the test was 100% legit, that connection to mere protest movements taints it immediately. It's like when oil companies publish critical research on climate change, or Apple publishing research that AI is not that good and their own fumblings should not be seen as a bad omen. This kind of stuff could be factually completely correct, and most rational people would still immediately dismiss it due to conflict of interest. All this will do is flame up fanboys who were already behind it and get ignored by people who weren't. If the real goal is to divide society, this is how you do it.

sitkack

You mean the ones gutted by said owner of said company?

Comparing "Tesla Takedown" with ExxonMobile is way too quick, you should have said Greenpeace. I'd say that TT has to do this, Is part of the point.

cannonpr

In normal times, perhaps, today…

https://www.theverge.com/news/646797/nhtsa-staffers-office-v...

When regular, in-theory-bipartisan mechanisms fail, protest is all you have left.

fragmede

> NHTSA staffers evaluating the risks of self-driving cars were reportedly fired by

Elon Musk, who also owns Tesla.

ethbr1

[flagged]

philistine

You seem to ignore the historical reasons why testing agencies exist in the first place. There wasn't a consensus back then that they even needed to exist. People like Ralph Nader needed to hammer the point again and again and again to will those standards into existence. The pressure groups fighting against Tesla are doing the same harsh, difficult job that Nader did.

https://en.wikipedia.org/wiki/Unsafe_at_Any_Speed%3A_The_Des...

randomcarbloke

Well, quite. It's an indictment of America's institutions if investigations held in the public interest must be conducted by third parties and not the establishment.

What other hidden dangers slip by without public knowledge?

ethbr1

If only there were some proven way to hold abusive power to account in the public consciousness. https://en.m.wikipedia.org/wiki/The_Jungle

ecocentrik

For the same reason that nonprofit consumer advocacy and safety organizations like Consumer Reports and the Center for Auto Safety exist. Lobbyists and wealthy private individuals exert a lot of influence over the operations of publicly funded oversight agencies. They have the power to censor these agencies or, as we've seen recently, fire all their staff, defund them, or close them completely.

As for the fanbois and f*cbois, they have always existed and will always exist. They are the pawns. Smart mature people learn to lead, direct, manipulate, deflect and ignore them.

ndsipa_pomu

In my view, the onus to conduct tests and prove the system is safe should fall on the manufacturer, and those tests should meet the requirements of the regulating authority (presumably NHTSA in the U.S.) and of the appropriate cities/states if they agree to allow limited public-road usage.

bjord

In theory, that sounds great, but in practice, why should we trust manufacturers to reliably test their own products?

ndsipa_pomu

I agree, but it's fairly common practice. I think a "trust, but verify" approach should be used, and board members should be jailed for attempts to fool the regulators (cf. the VW emissions fraud).

orwin

I've read somewhere that the NHTSA people working on mandatory tests for self driving were fired by DOGE.

lawn

Tesla has been lying for years. Why would you trust them to conduct their own tests? That's completely backwards.

Independent tests are what's needed, and preferably done by agencies who won't get gutted because they find something that's inconvenient.

Eisenstein

Which testing agencies are doing these tests?

sigmoid10

sjsdaiuasgdia

That's crash testing, which is about the safety of the people inside the vehicle.

This test is about whether Tesla's self driving technology is safe for the people outside the vehicle.

fragmede

Or cigarette companies having doctors show that cigarettes don't cause cancer. It turns out that the bar for science is really, really high, and for stuff that's less obvious and takes time and lots of money and effort, it's really hard to clear. You'd think I was a loon if I told you gravity wasn't a decided matter of physics, since we're all glued to this Earth every day, but we still don't know what dark matter is or what its ramifications for the theory of gravity are. Science still doesn't know how to test psychedelic medicine because it's obvious to the control group that they're on the placebo. That doesn't mean we should lower the bar for science, just that we should be aware of shortcomings in our beliefs.

So if you want to get into the details and figure out why a video from "Tesla Takedown" should or should not be believed, I'm all ears, but I'm some random on the Internet. I don't work at NHTSA or anywhere that could effect change based on the outcome of this video. It's not going to affect my buying decisions one way or another; it'll only divide people who have decided this matters to them and can't get along with others.

ndsipa_pomu

It seems crazy to me that the U.S. allows beta software to be used to control potentially lethal vehicles on public roads. (Not that I have much faith in human controlled lethal vehicles)

null

[deleted]

chrisco255

[flagged]

q3k

Advance what? Car dependence?

chrisco255

Advance robotics, automation and artificial intelligence as well as the state of the art in transportation. And yes, that obviously includes automobile transportation. What year are you living in? 1892? Do you realize how impossible modern society would be without cars?

chris_wot

Surely you jest? The downside is it kills and maims people, including children. That’s not an acceptable trade off.

Most countries have authorities who feel the same way. If you think killing and maiming is “advancing”, then you have a weird view of society.

maleldil

I don't think they're joking.

> If you think killing and maiming is “advancing”, then you have a weird view of society.

Classic accelerationism. "Some of you may die, but it's a sacrifice I am willing to make."

chrisco255

You do understand that tens of thousands die on the road in the U.S. every year as the status quo, right? The status quo is that human error kills that many. That's the upside, and the need to test and iterate in actual real-world environments is obvious. That's why FSD is supervised. Very simple.

ndsipa_pomu

[flagged]

tomhow

> You're either stupid or being disingenuous if you can't think of any other approach

Please don't comment like this here, no matter how right you are or think you are.

ethbr1

The thing armchair auto-focused urbanists don't understand is the infeasible cost of changing urban layout: it accretes over decades/centuries.

It's trivially easy for a city to paint itself into a corner from which it's (financially) impossible to escape, with 'best intentions' at every step.

Next thing they know, major employment hubs are too far apart, with major buildings that cannot be relocated, without contiguous rights-of-way, and... they're fucked.

anArbitraryOne

Hopefully the dummies were from the board of directors