Tesla drives into Wile E. Coyote fake road wall in camera vs. Lidar test
258 comments · March 16, 2025 · desixavryn
vasco
If anything is misrepresenting, it's the names "Autopilot" and "Full Self Driving" themselves, for two things that are neither autopiloting nor full self driving.
razemio
I think I already had this discussion on HN. I am not sure why most people think autopilot means hands off the wheel without looking. There literally is no autopilot which does not require constant attention from a human. This is true for planes, ships and cars. Hell, it's true even for most drones.
FSD is very much correctly named. It does not say you can go to sleep. It just means the vehicle is capable of full self driving, which is true for most conditions and something most cars are not capable of. How would you have named it?
PostOnce
> I am not sure why most people think autopilot is hands of the wheel without looking.
Well, because they named it autopilot.
Autopilot means it pilots... automatically. Automatic pilot. Not Manual Pilot, not with intervention, automatically.
They could have named it "driver assist", "partial-pilot", "advanced cruise control", "automatic lanekeeping" or anything else, but they named it Autopilot. That's Tesla's fault.
sorenjan
It's not unreasonable for people to think that "autopilot" means something that automatically pilots a vehicle. According to a dictionary, automatic means "having the capability of starting, operating, moving, etc., independently". Whether or not that's how actual autopilots in airplanes and ships work is irrelevant; most people aren't pilots or captains. Tesla knew what they were doing when they chose "autopilot" instead of "lane assist" or similar, like their competitors did. It sounds more advanced that way, in line with how their CEO has been promising full autonomous driving "next year" for a decade now.
It's also worth noting that the recent ship collision in the North Sea is thought to have happened because the autopilot was left on without proper human oversight, so even trained professionals in high-stakes environments make that mistake.
oxfordmale
Mercedes has gained approval to test Level 4 autonomous driving. Level 4 is considered fully autonomous driving, although the vehicle retains a traditional cockpit, and the driver can request control at any time. If a Level 4 system fails or cannot proceed, it is required to pull the vehicle over and bring it to a complete stop under its own control.
I would argue that it is getting very close to what people think autopilot can do. A car that, under certain circumstances, can drive for you and doesn't kill you if you don't pay constant attention.
digitalPhonix
> There literally is no autopilot, which does not require constant attention from a human. This is true for planes, ships and cars.
Can’t comment on ships, but autopilots in planes definitely match SAE level 3:
- Hands off
- Do not require constant attention
- Will alarm and indicate to the user when attention is required
i.e. “when the feature requests, you must drive [fly]” from the SAE nomenclature: https://www.sae.org/blog/sae-j3016-update
And this is autopilot that’s been in commercial aviation for decades (so long that it’s started filtering down to general aviation).
crooked-v
> It does not say you can go for a sleep.
That means, by definition, it's not "full" self driving.
akmarinov
> It just means, the vehicle is capable of full self driving, which is true for most conditions and something most cars are not capable of.
But that’s not true.
It’s not capable of fully driving itself, hence why the supervised part was added
ryandvm
I think you're ignoring that most car drivers are not versed in the tools and jargon of the air and shipping industries. It really isn't relevant what "autopilot" means in various professional contexts, what matters is what somebody with a high school education (or less) thinks that "autopilot" means.
jbs789
This is an interesting point. Maybe the problem is that most people don’t drive boats or planes, so they are not familiar with the experience in those contexts. I think you’re right: from the boating standpoint, an “autopilot” means you set a heading and the sails/rudder are adjusted to get there.
quink
Here's what the official Tesla website has to say about FSD vs. Autopilot:
> In addition to the functionality and features of Autopilot and Enhanced Autopilot, Full Self-Driving capability also includes:
> > Traffic and Stop Sign Control (Beta): Identifies stop signs and traffic lights and automatically slows your vehicle to a stop on approach, with your active supervision.
> > Upcoming: Autosteer on city streets
Since I don't see a stop sign or a traffic light, I cannot imagine how that makes any difference, or how it can in any way be considered a complete f*k up, or how it's a "HUGE misrepresentation of facts". These things, copied here verbatim from the manufacturer's website, are completely irrelevant to what was being tested. It's like arguing that a crash test is invalid because the crash test dummy had a red shirt instead of a yellow one.
quink
Furthermore:
> Active Safety Features
> > Active safety features come standard on all Tesla vehicles made after September 2014 for elevated protection at all times. These features are made possible by our Autopilot hardware and software system [...]
No mention of FSD anywhere in that section. Tesla fanboys, pack it in.
TheAlchemist
The good old argument about the "latest" version on the "latest" hardware... As Tesla influencers have been saying for the past 5 years 'it will blow your mind'.
In the very first phrase he says "I'm in my Tesla on Autopilot"...
modeless
The video title says "self driving car". This is clearly dishonest when they intentionally did not test the feature named "self driving" in the car shown in the video thumbnail, nor disclose the fact that such a feature does exist and is significantly better than what they tested.
This video is a real missed opportunity. I would love to see how FSD would handle this and I hope someone else takes the opportunity to test it. In fact, testing FSD is such a trivially obvious idea that the fact that it's not even mentioned in the video makes me suspicious that they did test it and the results didn't match the narrative they wanted.
sschueller
If he doesn't know the difference how is the average car buyer that sees Elon sell such features supposed to know?
bryanlarsen
It's $8000 for FSD. You're going to know whether you bought it or not.
jayd16
You won't know you haven't, which is the whole point.
Hamuko
What if I buy second hand? I heard there's quite a few Teslas available on the used market.
zaptrem
To be clear, this is like buying a car that has traffic aware cruise control available as an option, but turning it down, then insisting the TACC is broken and dangerous because it doesn’t work on your car.
tredre3
That's a poor analogy, because what Mark was testing was emergency braking and collision avoidance, which is part of Autopilot.
https://www.tesla.com/support/autopilot#active-safety-featur...
XorNot
Why would FSD be any more capable? Like why would anyone expect that? I get how this could happen, but this isn't advanced navigation it's basic collision avoidance of things in front of the car, something I'd expect autopilot to do at a bare minimum.
hokumguru
They use completely different machine learning models for each of the two features
nickthegreek
So braking that won’t hit children is soft-locked behind a paywall? I don’t think that is a great argument for the prospective buyer.
iknowstuff
Entirely different software stack.
XorNot
Like I said, I get how this could happen. But it is wild for a company claiming to have "almost" solved FSD not to have deployed basic collision avoidance - a core, fundamental capability of FSD - in a car they claim is completely capable of FSD with its current hardware.
Prickle
That should also be entirely irrelevant.
It's emergency braking. There shouldn't be differing results based on whether you paid an extra $8,000 USD or not.
FSD does not advertise better emergency braking as a feature. (Last I checked, anyway.)
jiggawatts
It has better depth-from-vision estimation that not only uses stereo vision, but can also combine multiple frames.
In theory, it can handle the painting on a wall scenario. In theory.
I’d like to see it tested!
This wasn’t it.
losvedir
For a while, FSD was a totally different system from Autopilot. FSD was the giant new neural net training approach from end to end for all kinds of driving, while Autopilot was the massive explicitly tuned C++ code base for highway driving from the early days. The goal was to merge them, but I haven't followed it closely lately, so I don't know if they ever did.
iknowstuff
It wasn't even on Autopilot when he crashed it. At best, he was testing automatic collision avoidance while pressing the accelerator, not autonomy.
mbreese
You see how this is worse, right? If automatic collision avoidance doesn’t work, why would you expect FSD to do better? (Or more to the point - why would a prospective buyer think so?)
And if collision avoidance doesn’t work better, then why isn’t FSD enabled on all cars — in order to fulfill this basic safety function that should work on all Teslas. Either way you look at it, this isn’t good. Expecting owners to buy an $8K upgrade to get automatic stopping to work is a bit much. Step one - get automatic stopping to work, then we can talk about me spending more money for FSD.
(And yes, I’m still bitter that my radar sensor was disabled).
iknowstuff
Tesla's automatic collision avoidance is the best on the market, lol. You can trivially google it.
Whatarethese
It was. It disengaged when it knew it couldn't prevent the crash. You can see it in AP before the crashes.
root_axis
So are you suggesting the 8k upgrade can detect some dangerous obstacles that the standard version can't? I doubt that.
timeon
I'm afraid HW4 is still not lidar-like.
modeless
Adding to the weirdness of this video, it appears Mark Rober faked his footage to make it look like he was using a Google Pixel to record screen video, but he was actually using an iPhone as can be seen in the screen reflection. And he put the "G" logo in the wrong orientation in the faked shot.
Also it's weird that he's acting like he's so special for having seen the inside of Space Mountain as if it's some kind of secret. Millions have seen it all lit up. Back when the PeopleMover/Rocket Rods attractions were running it was a common sight, as the track ran through Space Mountain and sometimes it would be under maintenance with the lights on. And of course in emergency situations they turn the lights on as well.
Another one: he claims they use thin curtains to project ghosts on in the Haunted Mansion which is true, but while he's talking about it he shows footage of a part of the ride that uses Pepper's ghost which is a completely different (and more interesting) technique. Some of the ghosts he shows while talking about it could not be achieved with the curtain method.
Come to think of it, Pepper's ghost could fool lidar. Maybe that's why he didn't talk about it even though it would have been more interesting. It would have been inconvenient for his sponsor. Someone setting up a reflective surface across a road is probably about as likely in the real world as a Wile E. Coyote-style fake road wall.
dgrin91
I watched the video. The Wile E. Coyote fake wall stuff is a gimmick meant to draw kids in. That, however, is par for the course for his videos; they are designed to hook kids into engineering with silly things and secretly teach real engineering before getting to the punch line.
In this case, the real engineering lesson is that Tesla's choice of relying only on visual cameras has fundamental issues that cannot be solved with cameras alone. Namely, visually blocking elements such as heavy rain, fog, or even blinding lights pretty much cannot be handled by camera-only sensors.
(though I guess one "solution" would be for the system to say I can't see enough to drive, so I'm not going to, but then no one would buy the system)
nashashmi
It is also a promotion for lidar tech, telling future engineers to be cautious of camera-vision-only driving systems. I agree with Mark, but not because of the tests he intentionally created to make lidar look better.
I really do hope camera-only tech will do better than this. But I also hope that lidar technology will eventually make it better. Right now, lidar needs much heavier computing power to be reasonably capable.
_aavaa_
Fog and heavy rain are not “tests he intentionally created to make Lidar look better.”
nashashmi
The extreme fog and heavy rain are dramatic conditions that no existing driver can navigate. These conditions make a strong case for involving advanced tech, i.e. a better LiDAR. And that is what happened here: a scenario that no driver without assistive tools can pass, and neither can a Tesla.
Do you remember when Tesla had radar? The idea was brilliant at a time when LiDAR tech was difficult to get en masse. Radar plus camera does what LiDAR could do at lower computing power.
thefourthchime
It wasn't fog, it was smoke. Lidars can't see through fog well.
Gigachad
A human driver would at least (usually) recognise that there’s poor visibility and slow down. While Tesla autopilot finds it acceptable to go full speed through and obliterate a child.
Mo3
Aside of all the other obvious reasons to not get a Tesla these days this is #1 imo. Camera feeds and a neural network are not enough for self driving, no matter how much they're training. Never ever.
nickthegreek
At the very least, they seem to have downsides that can be easily overcome with a lidar system or a combination of sensors. That alone is enough to help me decide when it's my family that would be the passengers.
crooked-v
And every other modern auto-braking safety system, except for Subaru for some reason, incorporates at least basic proximity radar.
bpodgursky
I am not claiming that Tesla FSD is at this point, but it is obviously possible to use cameras and neural networks to drive a car, because that is literally how humans drive.
seszett
Do Tesla cars have stereo vision, though?
I don't think they build a 3D model of the world around them at all, while humans do (not only based on our stereo vision) and largely rely on that to drive.
sorenjan
They do build a 3D model: https://youtu.be/6x-Xb_uT7ts?t=129
Although I think it's interesting that even in his demo there are cars popping in and out of existence in the 3D visualization of the environment. That doesn't make much sense: if a car is observed and then occluded, the logical conclusion should be that it's probably still there (maybe with increasing uncertainty about its location), rather than assuming it disappeared.
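The "probably still there" behavior described above is standard practice in multi-object tracking: an occluded track is kept alive and its positional uncertainty is inflated each frame, rather than deleting the object the moment it leaves view. A minimal sketch of that idea (the noise constant and frame rate are arbitrary illustration values, not anything from Tesla's stack):

```python
from dataclasses import dataclass

@dataclass
class Track:
    x: float              # last known position along the road (m)
    velocity: float       # last estimated velocity (m/s)
    uncertainty: float    # positional standard deviation (m)
    frames_unseen: int = 0

def coast_track(t: Track, dt: float, process_noise: float = 0.5) -> Track:
    """Propagate an occluded track: predict its position forward and grow
    the uncertainty, instead of dropping the object when it is unobserved."""
    t.x += t.velocity * dt
    t.uncertainty += process_noise * dt  # uncertainty accumulates while unseen
    t.frames_unseen += 1
    return t

# A car last seen at 10 m moving at 15 m/s, occluded for one second at 30 fps:
car = Track(x=10.0, velocity=15.0, uncertainty=0.2)
for _ in range(30):
    coast_track(car, dt=1 / 30)
print(round(car.x, 1), round(car.uncertainty, 2))
```

A real tracker (e.g. a Kalman filter) would also fuse any re-detection back into the track, shrinking the uncertainty again; the point here is only that occlusion widens the estimate, it doesn't erase the object.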
andsoitis
Apparently, Tesla has stated that they do not use paired cameras for ranging or depth perception, which is the standard approach for stereoscopic vision.
Instead of stereoscopic vision, Tesla's neural net processes the data from the cameras to create a 3D view of the surroundings.
https://electrek.co/2021/07/07/hacker-tesla-full-self-drivin...
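For context on what "paired cameras for ranging" would mean: classic stereoscopic ranging recovers depth by triangulating the disparity between two horizontally offset cameras. A minimal sketch of the textbook formula (the focal length and baseline below are made-up illustration values, not Tesla camera parameters):

```python
# Classic stereo triangulation: depth = focal_length * baseline / disparity.
# All numeric values are illustrative only.

def stereo_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth in meters of a feature matched between a left/right camera pair."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: object at infinity or bad correspondence")
    return focal_px * baseline_m / disparity_px

# A feature shifted 50 px between cameras with a 1000 px focal length
# and a 30 cm baseline sits about 6 m away:
print(stereo_depth(50.0, 1000.0, 0.30))  # 6.0
```

Note the trade-off this illustrates: depth error grows with distance (disparity shrinks toward zero), which is one reason monocular, learned depth, as described in the comment above, is attractive despite lacking an explicit geometric constraint.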
iknowstuff
They do have 2-3 front-facing cameras, and their end-to-end network necessarily gets the same understanding of the world around it.
vel0city
Humans have more senses than just vision. Also a few cameras aren't completely covering the range of human vision.
intrasight
Yes, but I don't think you could argue against the point that the visual sense is probably 95% of it. Even so, it could be decades before computers achieve the sensory capabilities of the human visual system. Why not, during those decades, add some lidar to the sensory mix? Just because it didn't evolve in humans doesn't mean it's not a good thing. Bats and dolphins use the biological analogue very effectively.
bpodgursky
? Do you use your sense of smell to drive a car? Are deaf people allowed to drive cars?
No, the cameras we have now, or at least the data processing, are probably not there yet, but it's absurd to claim it's "never" possible. It's obviously possible, whether in a year or in fifteen; all the basic hardware is advancing fast.
aplummer
Also humans suck at driving compared to what we expect of machines
Detrytus
1. The human "neural network" is a few orders of magnitude more powerful than the best neural network available right now.
2. This might be an obvious question, but: humans have two eyes a few centimeters apart so that we can use stereoscopic vision to estimate distance. Does Tesla FSD even attempt to do something similar?
bpodgursky
It is legal and safe to drive with 1 working eye.
Humans have many ways of modeling distances in the real world that do not rely on stereoscopic depth.
ryandvm
The problem with that argument is that a Tesla isn't nearly as clever as a human. I have never thought a white tractor trailer disappeared as it crossed in front of me against a bright sky. I know that I should drive a little bit slower on Halloween evening because there are kids running around. I know that I need to be a little more cautious driving past the playground or dog park. I have object permanence.
As it is, AI just isn't smart enough to rely on only vision and they ought to be taking advantage of every economically viable sensor to make up for the difference. Hell, I thought the goal was to do better than humans?
stnmtn
I get your point, but do you think, over the course of a decade, the average human driver or the average car with Tesla FSD is more likely to have an accident where the fault is their own?
ModernMech
Eyes are not cameras and brains are not neural networks. Until we have artificial techniques that can match the performance and functionality of this amazing biological equipment we have, our cars must rely on other sensors like LiDAR to make up for the deficiencies in our artificial perception networks that are not overcome by current techniques.
manquer
The actual problem statement is whether such a system can do so safely. If the goal were just to drive a car, all you would need are some stepper motors and a Raspberry Pi.
The goal is to reduce accidents and fatalities, not to eliminate jobs.
If LiDAR has 1/10th the fatality rate of a camera setup and costs less than 10x the value of a human life (as used in legal proceedings), then it is still the only viable option.
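That trade-off can be made concrete with a back-of-the-envelope expected-cost comparison. Every number below is an illustrative assumption (fatality risks, fleet size, and sensor cost are invented for the sketch; the value-of-statistical-life figure is roughly what US regulators use):

```python
# Back-of-the-envelope: fleet-wide sensor spend vs. expected lives saved.
# ALL numbers are illustrative assumptions, not real data.

VALUE_OF_STATISTICAL_LIFE = 10_000_000  # USD, roughly the US regulatory figure
FLEET_SIZE = 1_000_000                  # hypothetical number of vehicles

camera_fatality_risk = 1e-4  # assumed lifetime fatality risk per camera-only car
lidar_fatality_risk = 1e-5   # assumed 1/10th that risk with lidar added
lidar_extra_cost = 500       # assumed added sensor cost per vehicle, USD

# Expected value of lives saved across the fleet vs. total sensor spend:
expected_savings = ((camera_fatality_risk - lidar_fatality_risk)
                    * FLEET_SIZE * VALUE_OF_STATISTICAL_LIFE)
total_sensor_cost = lidar_extra_cost * FLEET_SIZE

# Under these assumptions the expected savings (~$900M) exceed the
# sensor spend ($500M), so lidar clears the bar the comment describes.
print(expected_savings > total_sensor_cost)
```

The conclusion flips entirely with the assumed numbers, which is the comment's real point: the question is empirical (relative fatality rates and sensor cost), not philosophical.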
rtkwe
Neural nets generally don't have the same level of interconnection or number of neurons as the human brain, because that's tougher to train. I agree in principle that the human brain should be possible to replicate in a computer, but I'm not sure we can do it with the current design of single-direction connections between nodes. I highly doubt we'll be able to scale it down to a machine that reasonably fits in a car any time soon, though, and that's what Tesla is promising they can do with the hardware package currently installed in cars. (Never mind that they also promised this with the last major hardware revision and had to walk it back; they will have to install upgraded electronics in people's vehicles, unless they're just going to strand all the HW3 owners who paid for FSD on hardware too wimpy to handle it, if that hasn't changed since the robotaxi event.)
nickthegreek
Don’t we want it to be way better than humans at driving? The dream pitch was that we wouldn’t have tens of thousands of preventable deaths every year. So install the damn lidar. At some point it comes down to penny-pinching, and that means the preventable-deaths number will not sink to the promise.
FeloniousHam
Have you actually used FSD for, say, a week?
I'm a fan and everyday user of FSD. It's not perfect, but it's immensely useful, and frankly amazing. It works. It drives me to work and back everyday, with maybe 1 or 2 interventions each way.
I've never met any actual Tesla driver who was confused by the marketing (they're all tech guys). Some like it, some don't think it's worth the money, but everyone understands what it is.
(I'm not arguing against more detection hardware, but engineering is about trade offs.)
bingofuel
But not every Tesla owner is a tech guy. Not every Tesla driver lives in perfect sunny conditions. And in this case, a misinformed (or not informed at all) buyer's "engineering trade off" is death.
thefourthchime
It drives at night and in pretty heavy rain just fine.
hnburnsy
Agreed, but I believe that camera feeds and infrastructure changes could be enough.
lowmagnet
I do wonder if this is provable via information theory.
sMarsIntruder
You started with a bias and ended with another one.
dist-epoch
> Camera feeds and a neural network are not enough for self driving
I guess we should ban humans driving then.
sMarsIntruder
I’m noticing how comments with common sense get easily downvoted.
desixavryn
Have you used latest FSD on HW4 recently? If not, please try it out for a few days and then come back to correct your comment :-)
windward
Honest question: why does the HW4 matter? Hasn't FSD been sold as a service all cars with older hardware can use?
Hamuko
Have they managed to fix the "parking sensors"?
desixavryn
Using vision only on my HW4-Model-Y it seems to work fine.
mordymoop
I don’t live near a lot of Wile E. Coyote fake road walls. I get the sense it’s more of a Midwest thing.
viraptor
But that wall may come to you. There are a few large truck trailers I've seen completely covered on one side with a picture that could be interpreted as a road / sidewalk / smaller car / whatever.
rocauc
In the 2019 fatal Tesla Autopilot crash, the Tesla failed to identify a white tractor trailer crossing the highway: https://www.washingtonpost.com/technology/interactive/2023/t...
ModernMech
And to be clear, this is the second time a Tesla failed to see a tractor trailer crossing the road and it killed someone by way of decapitation, the first being in 2016:
https://www.theguardian.com/technology/2016/jul/01/tesla-dri...
Notably, Tesla's response to the 2016 accident was to remove RADAR sensors from the Model 3. Had they instead taken the incident seriously and added sensors such as LiDAR that could have detected these obstacles (something any engineer would have done), the 2019 accident and this "cartoon road" failure would have been avoided. For this reason I believe the 2019 accident should be considered gross professional negligence on the part of Tesla and Elon Musk.
crooked-v
Also, any wall at a T-intersection with a mural that looks vaguely landscape-y.
taneq
I dunno, I generally try to avoid driving off the road into a landscape.
That said, I very well might drive into a high-res billboard of the exact scenery behind the billboard, if I wasn’t expecting it on a long straight country road. The in-car view in that video looks pretty damn convincing, sure you’d know something was off but I wouldn’t bet on 100% of drivers spotting the deception.
Maybe next video he can show that Autopilot won’t detect a land mine, or an excavation just after the crest of a hill.
gruez
>There's a few large truck trailers I've seen completely covered on a side with a picture which could be interpreted as a road / sidewalk / smaller car / whatever.
Truck trailers are typically 3-4 ft off the ground, and have obvious borders.
thrill
They're obvious to you and me - are they obvious to a camera only Tesla "autopilot"?
viraptor
Are you prepared to bet that the "obvious border" is enough? Would you fully trust that there will be no issue?
mikequinlan
>Midwest thing.
Southwest I think.
timeon
It also failed on the kid mannequin. Do you have many kids around?
eldaisfish
Here is the original Mark Rober video - https://www.youtube.com/watch?v=IQJL3htsDyQ
I dislike the fact that Mark's videos appear to increasingly borrow from the Mr Beast style, which is very distracting. There's also the fact that half the video has nothing to do with cars in the first place.
The main result here is not surprising - Tesla's vehicles are plagued by a litany of poor engineering decisions, many at a fundamental level. Not using Lidar for depth detection is beyond stupid.
nickthegreek
I found the Disney side of the video extremely interesting and entertaining. It would have been nice to pad each topic out individually with enough content to stand on its own while meeting the criteria he needs to appease the algorithm.
jbaber
I know what you mean. But he's trying his best to be the new Mr. Wizard and youtube has a very long list of demands.
2OEH8eoCRo0
The first half explains lidar in layman's terms.
pavel_lishin
> I dislike the fact that Mark's videos appear to increasingly borrow from the Mr Beast style, which is very distracting.
Yeah. My daughter likes him, but this latest thumbnail made me roll my eyes.
jnwatson
Don't hate the player, hate the game. This is what attracts the eyeballs.
pavel_lishin
I think I'm allowed to dislike the game, and to stop watching his videos.
sitkack
We already knew this to be true from the clusters of Tesla fatalities around certain Bay Area off-ramps.
93po
this is factually just not true, at all. like outrageously not true. why would you just completely make something up like this?
sitkack
What are you talking about?
1) Numerous reports of Teslas not seeing tractor trailers or fire trucks.
2) Numerous reports, even on this site, of Teslas under lane assistance repeatedly and predictably behaving erratically on CA off-ramps.
Any optical only system will suffer from optical illusions, this cannot be avoided.
rozap
Still love my truck though.
sMarsIntruder
This statement goes against every report and analysis of basic Autopilot, not even FSD.
Just like the video, which, if you'd take some time to watch, you'd see is testing just basic Autopilot.
The data says otherwise, but if we want to deny gravity, I'm OK with it.
davidcbc
What data?
If you link to Tesla statements that's marketing not data.
sMarsIntruder
https://www.tesla.com/VehicleSafetyReport
Ok, this is marketing.
sitkack
Chill bro, you can have FSD when it comes out.
hnburnsy
Mark posted a video on X showing him getting up to speed, engaging autopilot 4 seconds before the wall, and autopilot disengaging 1 second before hitting the wall.
https://x.com/MarkRober/status/1901449395327094898
>Here is the raw footage of my Tesla going through the wall. Not sure why it disengages 17 frames before hitting the wall but my feet weren’t touching the brake or gas.
The playing field here was significantly slanted: a production Tesla, driven by Mark, using ten-year-old Autopilot technology, was compared against a non-production test vehicle, not driven by Mark, using (I would assume) the latest LiDAR technology from Luminar.
Volvo sells the EX90 with a Luminar LiDAR sensor (not active it looks like). Why wasn't it used with Mark driving?
morgannewman
[dead]
anotherboffin
Oh but no worries, FSD is a “solved problem” and should be done in 18 months or so…
devnullbrain
Oh dear, your timeline casts doubt on the ability of a Tesla to self-drive from LA to New York before the end of 2017.
sMarsIntruder
In fact, he didn’t use FSD.
timeon
He didn't use it on the other car either, but that one didn't fail.
ravenstine
Remember, you gotta break some eggs to make an omelette; every time something crashes, explodes, or kills – that's a good thing! /s
wnevets
Tesla also drives into tractor trailers because they think they're clouds
deedubaya
Where can I buy the alternative lidar based car?
randerson
That looked like a Lexus with its insignia blacked out, so I'm guessing it was a custom build. But if you want a car that comes with LIDAR, look at the Volvo EX90.
hnburnsy
Not in use yet on the EX90, just collecting data...
https://www.volvocars.com/us/cars/ex90-electric/features/
>Lidar equipped on early production vehicles will begin in data collection mode and over time the technological capabilities of the lidar will expand to include additional sensing scenarios as part of our Safe Space Technology, including the ability to detect objects in darkness at highway speeds up to 820 feet ahead.
crooked-v
GM is working on it, but there's no indication yet when the lidar version will be released. Currently their Super Cruise uses radar and cameras, plus pre-scanned lidar maps of roads that it compares everything against in real time.
hnburnsy
Volvo EX90 has the sensor, but it is not active.
jayd16
If you get a 2020(19?) model 3 you get the proximity radar as well.
bspinner
Since it could be misleading without this info: radar has been removed from newer model years.
wmf
Polestar 3 is the only car with lidar in the US that I know of.
hnburnsy
Not yet...
https://www.polestar.com/us/polestar-3/safety/#lidar
>LiDAR upgrade subject to availability based on production cycles. Deliveries to start mid-2025.
chvid
In China.
rocauc
I wonder how long until techniques like Depth Anything (https://depth-anything-v2.github.io/) provide parity with human depth perception. In Mark Rober's tests, I'm not sure even a human would have passed the fog scenario, however.
I am a massive fan of Mark Rober. Unfortunately he completely f**d up this one. He tested using Autopilot, not the latest FSD on HW4, which is worlds apart in capabilities. It is possible that the latest FSD would also crash, but that would be a valid test of FSD capabilities. Testing using Autopilot and calling it "FSD crashes" is a HUGE misrepresentation of facts. I am hoping Mark will post an update to the video.