Tesla Robotaxi Videos Show Speeding, Driving into Wrong Lane
162 comments
June 23, 2025 · TulliusCicero
energywut
Watching one of the videos, the car went 38/39 in a 30, and was the only car nearby. It was not keeping pace with other vehicles in its lane. It seemed like a lot of the roads it was driving on were signed 35 or 40 (it was a lot of stroads), but a few were signed 30, which it just flatly ignored.
mslansn
If it was the only car nearby does it matter that it went 39 in a 30?
tene80i
Yes. People should follow the law as a general rule, and particularly when operating a giant metal object that can kill. Software should do the same. Moreover, if it cannot, or chooses not to, why should we trust it to drive at all? What other rules might it flout? Who decides? You? Elon?
energywut
Yes. The kinetic energy of a vehicle increases with the square of its velocity. A car going 40 has nearly DOUBLE the kinetic energy of a car going 30, so the braking distance needed to stop it is also significantly longer.
That lower speed limit was set for a reason.
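For anyone who wants to check that arithmetic, here is a minimal sketch (Python; the 1500 kg mass is an assumed placeholder and cancels out of the comparison anyway):

    # Rough check of the "nearly double" claim above; illustrative only.
    def kinetic_energy(mass_kg: float, speed_mph: float) -> float:
        speed_ms = speed_mph * 0.44704        # convert mph to m/s
        return 0.5 * mass_kg * speed_ms ** 2  # KE = 1/2 * m * v^2, in joules

    ke_30 = kinetic_energy(1500, 30)
    ke_40 = kinetic_energy(1500, 40)
    print(ke_40 / ke_30)  # (40/30)^2 ~ 1.78, i.e. ~78% more energy to shed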
Additionally, predictable car movement is important for wildlife, pedestrians, and other vehicles. If the cars on that street always go 30, everyone else there learns to predict how cars will move. If some rando car starts speeding along at 40 out of the blue, that disrupts the calculations and decision-making of everyone else.
Finally, are we comfortable saying, "Any company can decide whether it thinks obeying the speed limit is important, and is free to go over the speed limit by 1/3rd if they deem it safe"? Are we sure that precedent is healthy and desirable? I'm not. I don't think automated cars should just decide to add 33% to a speed limit and go that fast in normal operating conditions.
So yes, it does matter.
vel0city
Yes, the car should be traveling the speed limit if it is the only car around and conditions allow it. It should not be speeding by default on its own.
rs186
True, but AFAIK Waymo strictly follows speed limits.
dzhiurgis
Which is why it couldn't use highways for ages, as otherwise they'd have been honked off the road
rs186
Absolutely false.
I have driven many hours at posted speed limits on highways in the rightmost lane in the US, and I have yet to hear a single honk. And I have seen cars driving even slower -- absolutely no problem. Anyone who wants to move faster just changes lanes and moves on.
harmmonica
I've seen a few folks in this thread say this, but where you're from do people genuinely get honked at if they go the actual speed limit? I live in an area that's pretty famous for people driving a) with entitlement b) like shit and c) fast. Unless someone is camping out in the left lane (assuming a multi-lane highway and left-hand drive car) and they're driving the speed limit, I've literally never seen someone honk at a person going the speed limit in my lifetime. Even when folks are camped in the left lane I'm not sure I've seen that, though aggressive tailgating and/or flashing high beams is a regular occurrence. And then more often than not the left-lane blocker doesn't even do anything in response to that (or so those aggressive drivers stuck behind them tell me).
I just don't think there's going to be any issue in most geographies if they go on highways/freeways and follow the speed limit (I guarantee that's true in the US; I'd go out on a limb and guess it's true almost everywhere).
edit: grammar
awongh
I would go so far as to say that it's dangerous to drive at the posted speed limit if everyone else is driving 15-30 miles an hour faster than you.
Which also highlights one of the inherent flaws with laws and rules like speed limits. They don't ever actually mean "drive at exactly this velocity." They always mean "drive at a safe speed, and here is the number we all collectively agree on, unless we don't."
JumpCrisscross
> otherwise they'd have been honked off the road
Slowing down surrounding traffic to the speed limit would be an amazing safety multiplier for AVs.
potato3732842
They could've done it just fine. You only get treated like shit on the roads if your behavior is both selfish/unnecessary and inconveniences others.
Garbage truck leaving a light slowly but as fast as it can -> no problem
Tesla M3 leaving a light slowly -> dude deserves every honk he gets.
BMW M3 weaving through traffic cuts people off but is gone as fast as he came -> nobody cares
Prius doing 55 in the right lane -> no problem
Prius doing 55 in the left lane because entitlement -> lots of middle fingers
I have complete confidence Waymo would have "done the right thing". The real problem is that there still would have been a million rear-endings. Eventually the insurance companies would have sued them, because no amount of idiots online screeching about the "rules" actually makes it ok to habitually create the preconditions for an accident. It would have been a big, expensive legal fight. The state doesn't want to raise speed limits to reflect how fast people actually drive. Waymo doesn't wanna be involved in that. The insurance companies don't wanna be involved in that. It's just a no-brainer not to, given the constraints.
Though I can't help but wonder if insurance rates would've eventually gone down for everyone if the issue got forced and it resulted in some sort of "after the 3rd rear-ending they're on you by default" rule or something.
crazygringo
> If the speeding is only going with the flow of traffic, that's not a huge deal imo.
If it's intentionally programmed to violate speed limits, then sure.
If it's intending to follow speed limits but is failing, then that's terrifying, because where is its wrong information coming from, and how catastrophic could the errors be?
duxup
"We're at the corner"
Um ... kinda...
andrewmcwatters
Nah, I want a system that obeys the law. I don't need software breaking the law on my behalf. Waymo has proven that it's an exceptional ride experience when you have an automated driver that is cautious and obeys the law, which is more than what 99% of drivers do in the Phoenix metro. Go count how many people obey traffic laws for fun.
And guess who's at fault when you're caught?
Edit: No, crossing double-lines to avoid obstructions isn't breaking the law. People in the Comma Discord bring up these weird edge cases that are outlined in law, too. No, you're never forced to break the law when driving.
You can downvote me all you want. You're wrong. The police would find you wrong, a traffic law court would find you wrong. People are animals and don't want to obey the law. Just obey the law. Drive the speed limit, keep right unless passing or turning left.
nradov
There are circumstances where breaking the law is necessary, like crossing a double yellow line to safely get around a lane obstruction.
nash
Generally that's not actually breaking the law. Avoiding accidents, following reasonable directions from road crews and the like explicitly override other laws.
moralestapia
Excuse me, what?
I get that nobody's perfect and you might do it by mistake (going 35 on an unmarked road that was 30 max or something), but the way you write your comment makes it sound like it's something one does deliberately?
Drivers like that shouldn't be on the road.
jstummbillig
I think there are two distinct types of "bad" here, and to me, one is way more interesting than the other.
If we mean that this is dangerous: yes, of course. It's an obvious, somewhat dangerous error (and I say "somewhat" only because I assume everyone currently participating is cognizant of the fact that this is in some kind of testing stage).
But I think the more interesting question is: How quickly can this issue (and others like it) be fixed? If the answer at this point is "we will have to retrain the entire model and just hope for the best" that sounds, like, really bad.
m101
I know that Tesla sometimes feels like a religion, but now just might be a good time to short the stock. Before going out and saying it's bad software, short the stock first and then ethically berate it. I just did.
It's a 200 P/E stock and sales are falling, so it won't have earnings to speak of next quarter. High P/E stocks need growth to justify their multiples. Tesla is not growing.
Also if this robotaxi service isn't pulled off the road soon then it will be limited to a very select set of locations. If someone has to sit in these cars to monitor them all the time then Tesla may be losing money on every journey.
This premature move in releasing the robotaxi is certainly stock pumping.
ink_13
As I write this comment, it's up 8.23% today alone (from ~322 to ~348).
TSLA is a meme stock at this point to which ordinary expectations do not apply. The price is unmoored from what the company does. _By the book_ I agree, it should be shorted, but I wouldn't.
RivieraKid
> TSLA is a meme stock at this point to which ordinary expectations do not apply.
Eventually they do. At this point the bull thesis is still plausible to a lot of investors but this won't last forever.
m101
What better day to short than today? There is a limit to how far the market can whip around a $1tn market cap stock. This is not meme-stock 10x-downside territory.
TheAlchemist
Tesla is going bankrupt or getting bought by a competitor at some point, and Elon is going to prison at some point. I would be shocked if that's not the case. It's all smoke and mirrors, manipulation, deception, and it will eventually come back to earth.
However, there is a little saying - "Markets can remain irrational longer than you can remain solvent" - that you seem to want to learn the hard way!
PS. I'm short the stock - but it's a hugely risky bet that I would not recommend to anyone.
pfisherman
You mean Tesla will merge with Twitter / XAI, and market cap will soar to $50T on Elon’s promises of “AGI next year”.
LZ_Khan
I just shorted it as well.. 3 days before you.
And I got roasted. Invest with caution.
m101
I feel you. I got lucky... for now. Sometimes you just have to pick the right day, and today was better than yesterday.
mikestew
Good luck with that. Net profits down 71% YOY? Stock goes up 10% on the news. You think it's going to go down because Robotaxi sucks? Someone will be along shortly (as they always are) to reuse a cliche about irrationality.
tonymet
profits go down with new strategic investment.
BrandonLive
Tesla’s revenues and profits are down for one reason and one reason only: Musk has personally alienated a large swath of the customer base.
mikestew
Profits go down when fewer people are buying your product, too. Which seems to be the case with TSLA for the most part.
lern_too_spel
> This premature move in releasing the robotaxi is certainly stock pumping.
It is. Inevitably, when the promised rollout ramp up doesn't happen, there will be another pump. This is not his first rodeo. The only thing that can stop another pump is competitors scaling first, not repeated Robotaxi failures (which I fully expect).
misiti3780
A lot of people have shorted TSLA and lost a lot of money -- see Jim Chanos. I actually agree the price is going to go down before it goes up, but I wouldn't waste money trying to time it.
Long term I think they are fine - I think they will have solved FSD and no one will remember Elon's politics.
tsimionescu
It's funny to think that Tesla, the only self-driving company that has no working form of self-driving, is the one that will "solve FSD".
Look at Waymo or Mercedes if you want to see what working FSD looks like. Tesla isn't even in the same ballpark yet.
Workaccount2
I have lost enormous money shorting Tesla over the years. I'm even short right now.
I can tell you though, Tesla is the only company doing two things:
Making a full self-driving car that doesn't need an expensive and expansive sensor suite - just off-the-shelf cameras and a GPU.
Making full self-driving that can drive anywhere, with no pre-mapping needed.
Waymo probably could do without pre-mapping; Mercedes probably not. Right now, though, Tesla is the only player with a car that you can buy that will "self drive" anywhere in the country. The other car companies' cars only work on select pre-mapped roads, and IIRC Mercedes only works on highways and below 40 mph.
The bet against Tesla is that they will not be able to pull this off. Not that competitors will beat them to the punch. If Tesla did get FSD with cameras on par with Waymo, Waymo would likely be unable to compete.
mebizzle
It's really funny that comments like yours are treated as fact these days when I have a Model 3 sitting outside that drives itself lol
I can get into my car, plug in a destination, and not have to touch the car. Nothing but a Tesla does that right now, unless every video on the subject is lying, regardless of the defined SAE levels.
leesec
Amazing that people say this when it gave hundreds of safe self-driving rides yesterday
m101
Tesla was actually growing then. This time is totally different.
misiti3780
If you think Tesla is a car company, then yes, I agree with you, it looks and acts like a meme stock. I think they're an AI company with amazing manufacturing processes that happens to make cars. Of course I might be wrong, but the value can be justified if I'm right.
laidoffamazon
It seems far too early to say they will solve FSD?
m101
I believe they will but that doesn't negate my comments on price.
misiti3780
I'm a daily user and biased, but my anecdotal experience is they will solve it. Also this article is about them taking the first step to solving it.
leesec
So much hate in this thread lol. Headline could have easily been "Tesla launches Robotaxi, gives hundreds of rides without incident".
They're closer than ever to solving driving at scale. In fact they're the only company close to doing it. Waymo still has less than 3 thousand cars. Keep shorting if you want. The pace of improvement in the last year and a half makes success obvious. I haven't had a safety intervention in many months on just random roads. Cheers haters
jesenpaul
9 years of scamming people into paying for FSD (which I've had since Dec '16). This is what they still release to pump stocks and sell more FSD.
sjsdaiuasgdia
And the market seems to love it for some reason. I don't get it. Waymo's been doing the robotaxi thing w/o any employee in the vehicle for several years now. Tesla's launch yesterday is laughable in comparison.
jandrese
Tesla is a meme stock. Any relation to their fundamentals is purely a coincidence.
kiernanmcgowan
The market can stay irrational longer than Tesla can stay solvent.
robjeiter
Would not necessarily call it a meme stock as long as people are trying to value it based on future uncertain cashflows. Nobody knows how high growth stocks will go; that's why they're very volatile. Yet I'd argue that most people buy for the future cashflows, and it's not a meme cult like GME.
golergka
Waymo owns the fleet, Tesla doesn't. Completely different economic model, which would work much better (economically) if it worked (technically).
sjsdaiuasgdia
Speaking to what was launched yesterday, pretty sure those are cars owned by Tesla.
The theoretical future of Tesla owners being able to put their cars into robotaxi mode when they're not using them does not exist yet, and may never.
vel0city
Every Tesla "driverless taxi" is owned by Tesla today.
jsemrau
I still hold the belief that the removal of Radar and LIDAR in 2021 was only detrimental to FSD's performance.
lexandstuff
It surely means these things will never operate safely in rain, snow or even bad fog. That seems like a pretty big issue for a taxi operation.
socalgal2
Yes, because somehow humans can't drive without LIDAR in the day or night or rain.
I'm not saying Tesla FSD is any good, but the idea that a robot could never drive with just cameras seems to be false, given the 1.6 billion human drivers who drive with only eyes.
cameldrv
Your eyes and your visual system are much higher performance than any camera and computer currently available for the task of driving. Tesla also has far from the best available hardware, both in cameras and compute.
Still, even with this high performance human sensor suite, people commonly get into accidents in bad weather.
The Waymo approach of using other sensing modalities in order to compensate for the ways in which the cameras/processing aren't as good as a human makes a lot of sense, and in addition, it gives them the ability to exceed human performance using lidar and radar when cameras are having a hard time.
Once we have mass produced lidars and radars, the cost will come down, and not many people are going to care about an extra $1000-$2000 worth of sensors on a car if it significantly improves the safety.
lexandstuff
And there's a huge increase (apparently around 36%) in road fatalities during bad weather.
People have much higher safety standards for self-driving cars than they do for human drivers. Just look at how one fatality led to the total abandonment of both the Cruise and the Uber self-driving program.
archagon
Humans also have, y'know, a working brain. Not to mention parallax cues that come from sensory input other than pure vision, such as head movement.
RankingMember
Did they ever have LIDAR? I completely agree that the removal of radar was a cost-cut too far and a decision made with uneducated hubris (his deeply-flawed "humans only need vision so cars only need that too" thought process). If they ever want to actually compete in this space, I expect we'll see one of these (or both) quietly re-added.
freerobby
No, they never had LIDAR.
jsemrau
Tesla was/is? Luminar's biggest customer [1]. So they used it for something, likely ground-truth validation [2]
[1] https://www.reuters.com/technology/luminar-says-tesla-is-big... [2] https://www.teslarati.com/tesla-no-longer-needs-lidar-ground...
awongh
They could still add it later, maybe once the solid state ones become more affordable?
comrade1234
I wish people drove the speed limit but now I have to deal with robotaxis speeding?
I have my motorcycle license and rode an SV650 when I lived in CA. Normally on a motorcycle the left lane on the highway is the safest, because people are merging in and out of the right lane when they enter and exit the highway, so the left lane is usually steady, with no turning traffic.
So you drive with traffic in the left lane as much as possible. The problem in CA is that you're driving at times 95mph just to stay with traffic. I had a turn near Ventura on the 101 where I had to lean my bike so far over that it was like I was on a racecourse.
Since moving to Europe it's much more sane. There are speed cameras everywhere, so people don't speed. When you can predict what everyone else is doing on the road, it's so much safer.
os2warpman
Apparently Tesla has three billion miles of driving data.
How many billion miles more are needed for the computer to not drive over a double yellow line?
Is there any evidence that three trillion miles would work?
yall_got_any_more_of_that_driving_data_dave_chappel_meme.gif
BrandonLive
Driving over a double yellow is expected and legal in normal driving, such as when making a left turn or going around an obstruction.
In this example it looks like it oscillated between two different routing choices (turning left and going straight), and by the time it decided the correct route was to go straight, it found itself misaligned with the lane it should have been in. Instead of moving all the way back to the right, it kind of “cheats” its way into the upcoming left turn lane. This isn’t something it should do in this situation, but it’s likely emulating human behavior seen in situations like this which appeared in its training data, where people cut across the center line(s) ahead of the turn lane forming when they can see that it is clear.
The thing a lot of people get wrong is that they think the most valuable data for Tesla to collect is the mistakes or interventions. Really what they need most is a lot of examples of drivers doing a good job of handling complex situations. One challenge, though, is separating out those good examples from the less good or bad ones, as human drivers are notoriously bad at, well, driving.
Zigurd
3 billion miles of two-dimensional 1080p images is what they've got, plus telemetry for speed, acceleration, etc. A lot of the imagery was acquired at night or in foul weather, or taken by a camera with a dirty or broken lens.
If it turns out that lidar is necessary, there goes the value of the installed base of Tesla vehicles. It would be like starting from scratch. Clinging to vision-only AV is some variant of the sunk cost fallacy.
If you're thinking, surely they must've thought of all that and have a plan, I'd point to the design of Starship and ask: is that ever going to be human-rated?
BrandonLive
They collect video, not images, along with other sensor and control data.
It’s not a sunk cost fallacy, it’s a technical strategy that is very logical and showing compelling results (though as of yet unproven for achieving robust L3 or L4 autonomy, demos notwithstanding).
socalgal2
Waymo is working great with far fewer miles (40 million?). I use it all the time. And I know many other friends who use it all the time and love it.
harmmonica
One thing is for certain: this will be one of the most scrutinized product launches in a long while. Assuming they're able to continue rolling this out (that assumes no major incidents), we will know every last misstep these cars make en route to a proper "driverless" service (i.e. no safety driver). Then the question will be whether the lack of trust amongst some members of the public towards Tesla/Elon costs them adoption, or if there are enough Tesla boosters/agnostics to get them the growth they want/need. And then at some point, if those rides with the boosters/agnostics all go well (months or even years from now?), I would assume all but the most fervent anti-Tesla folks will become customers.
Zigurd
Until they are no longer needed in the vehicle, the safety monitor with his thumb on the kill switch is one per vehicle. Waymo has built out remote monitoring infrastructure, and in car intelligence that enables one remote monitor to cover a large number of vehicles. The vehicles ask for help when they are unable to confidently make a decision about how to proceed. The remote monitors never actually steer a Waymo vehicle. They pick from a set of choices, or they decide to take the vehicle out of operation.
In the short to medium term, safety monitoring will be the limiting factor.
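To make the decision-assistance model described above concrete, here is a purely hypothetical sketch (the names and structure are my assumptions, not Waymo's actual interface):

    # Hypothetical sketch of remote assistance as described above: the vehicle
    # proposes options, a remote monitor picks one or pulls the vehicle from
    # service; the monitor never steers directly.
    from dataclasses import dataclass
    from enum import Enum, auto

    class Choice(Enum):
        PROCEED_OPTION_A = auto()
        PROCEED_OPTION_B = auto()
        WAIT = auto()
        REMOVE_FROM_SERVICE = auto()

    @dataclass
    class AssistanceRequest:
        vehicle_id: str
        situation: str          # e.g. "blocked lane, unclear detour"
        options: list[Choice]   # choices proposed by the vehicle itself

    def resolve(request: AssistanceRequest, monitor_pick: Choice) -> Choice:
        # Monitors select among the offered choices or pull the vehicle;
        # anything else is out of scope by design.
        if monitor_pick in request.options or monitor_pick is Choice.REMOVE_FROM_SERVICE:
            return monitor_pick
        raise ValueError("monitor can only pick an offered option or remove the vehicle")

One remote monitor handling many such requests is what lets the ratio of monitors to vehicles drop well below one-to-one.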
TheAlchemist
"product launches" - keep in mind this is Tesla's product launch.
They launched Tesla Semi 8 YEARS ago, and the truck is nowhere to be seen. Not even official specs have been released.
This product launch was for influencers only, paying $4.20 per ride.
I'm not gonna lie - I hate the world where it's possible to become the richest person in the world, while being a snake oil salesman.
sundaeofshock
You are missing a key factor: Waymo. Tesla has to deliver a product that is at least as good as Waymo. If they can’t do that, why would a person switch to Tesla?
Waymo has a multi-year head start; that’s going to be hard to overcome without a superior experience from Tesla.
harmmonica
I don't think I'm missing it. I won't rehash a bunch of my comment history on Waymo, but if Tesla can actually do this, without the costs Waymo has, it'll be quite the coup. And as others have pointed out, Tesla already has all the manufacturing in place and though I'm super doubtful they'll offer a "hire your car out" service anytime soon (or ever!) having the whole vertical offering and using fewer sensors puts them at a distinct advantage from a margin standpoint so they could conceivably offer rides at a significantly lower price than Waymo.
Obviously the "if Tesla can actually do this" is doing a lot of work here! I would not surprise me if they had a crash sooner than later and that scuttles the whole thing. But it's also possible that doesn't happen and this works. Or at least I'm not smart enough to know for sure that it won't.
Zigurd
At current prices, Waymo may actually be generating operating profit with the current vehicle costs and opex. Driver 6 hardware and Ioniq 5 cars will cut the vehicle costs sharply. I don't think Waymo wants to make a big deal out of this because it would be disruptive, but they are either at the point or very near to it where they can undercut human powered ride hailing.
A big cost driver is invisible, at least once Tesla gets their safety monitor out of the vehicle: the ratio of safety monitors to vehicles. You need three shifts of safety monitors seven days a week. It adds up. Google wouldn't be expanding if they had to hire a building full of safety monitors.
alphabettsy
Is there that substantial of a difference in hardware and manufacturing cost?
absurdo
As certain as forgotten about in a week’s time.
I_dream_of_Geni
In today's environment, pretty much hits the nail on the head....
breadwinner
Sounds like the software isn't ready for primetime, but they also don't have the right hardware (no LiDAR).
duxup
Do the Robotaxis use the same software as the regular Tesla "FSD"?
owenwil
I don’t think this is a surprise to anyone who actually owns a Tesla (I own a Model Y). Full self-driving is just _bad_ in comparison to technology from Waymo et al; it slams on the brakes suddenly for shadows, veers into the wrong lane, hesitates in pretty standard intersections, and doesn’t even understand basic concepts like school buses or trains. Here in BC, it completely ignores the 30kph school safety zones, which seems pretty basic.
My experience with FSD is that while it feels "magic" at times, it's like a teenage driver that you have to babysit constantly. It's genuinely impressive how well it works given the really limited hardware, but if you use it routinely you know it will make at least one weird/dangerous choice on every trip.
Generally, I really don’t trust it in most situations except properly delineated highways, but even then it can be a crapshoot. If you’ve experienced FSD then get in a Waymo, they are night and day different—a lot more predictable, and able to navigate uncertainty compared with what Tesla has built. It’s likely down to a combination of both software and their insistence that radar doesn’t matter, but it clearly does.
I would never get in a Tesla that purports to drive itself, there’s no way it’s safe or worth the risk. I won’t even use it with my family in the car.
I know a handful of others who own Teslas and feel the same, despite what the fans spout online. I generally like my Model Y, but I definitely do not trust FSD -- I find it hard to believe that it's even being taken seriously in the media. Not a great endorsement if even your own customers don't trust it after using it.
manjalyc
A fun anecdote - a lot of people may remember Roomba from forever back with their automated little vacuums. Roomba's market share declined significantly because they failed to adopt Lidar technology as quickly as their competitors; instead they depended on the bumper for as long as possible. This put them at a disadvantage in navigation and efficiency as their competitors started using Lidar. Combined with aggressive pricing from rivals, the expiry of its patented roller in 2022, and a weird refusal to combine vacuuming and mopping into one device, Roomba (or iRobot now) is now just a little fish in the sea it made.
moogly
> Roomba (or iRobot now) is a just little fish in the sea it made.
Perhaps more like plankton.
> The [...] company warned in its earnings results [on 12 March 2025] that there’s doubt about whether it can continue as a going concern.
havaloc
https://maticrobots.com/ - Lidar seems like a stopgap; check out this robot vacuum, which works with vision only. I am not conflating a car and a vacuum, but it's an interesting technological exposition.
manjalyc
The reason I brought up Roomba wasn't to talk about Lidar or vision necessarily. It's more a story about how the first mover in a technological space became entrenched in what works and resistant to investing in newer technologies. The result was rival companies taking away market share from a market Roomba once defined. Roomba has since incorporated lidar and other innovations after being stagnant for a decade, but it's too late - their competitors now dominate the market.
To complete the analogy, Tesla is invested in vision-only technologies, while its competitors are making gains with Lidar and other tech that Tesla refuses to acknowledge. It's very reminiscent of Roomba in the mid-2010s.
The Matic is a cool little robot though.
sjsdaiuasgdia
Weather is not a problem inside a house. It kinda is outside.
owenwil
100% -- I enjoy telling life-long Roomba users how far behind the technology is when they try to convince me to buy one! I've been using Roborock for a long time and it's pretty astounding how far ahead they are; full-on item analysis + avoidance (including poop!) being the big one for us, let alone just knowing their exact location within the house. And there are a number of others that have pushed it a whole bunch... the folks at Matic seem to have pushed it even further (not ironically, with just vision, which actually feels appropriate here). It's a shame it's not available in Canada, with no obvious plans to roll out here; I'd love to buy one: https://maticrobots.com/
Meanwhile Roomba seems to have done...pretty much nothing? Reminds me of the death of Skype when everyone transitioned to literally everything else while they floundered around.
shreezus
I mostly agree with you. I use both FSD on my Tesla & Waymo regularly (LA region), and Waymo just feels way safer in comparison. While FSD has improved significantly the last few iterations, I have seen it do "strange" things often enough that I don't feel safe just sitting in the backseat like I would with a Waymo.
Even if it's hypothetically 99% as good as Waymo at the moment, 99% is not good "enough" when it comes to something as critical as driving.
BrandonLive
Today’s “FSD” has its limitations and requires supervision, but your description of it is not anything like my experience even on a HW3 vehicle. In fact, in many years of using Autopilot and various “FSD Beta” and “FSD (Supervised)” versions for several tens of thousands of miles I’ve literally never seen it “slam on the brakes suddenly for shadows” or “veer into the wrong lane”. I’m not a cult member and my next car won’t be a Tesla because I cannot support Musk after the horrible things he has done these last 2-3 years, but “FSD” is phenomenal when used appropriately and with the right expectations about what it is and what it isn’t. And it has improved a ton over the years, too.
The end-to-end solution was a real game changer, and while the previous solution was still useful and impressive in its own right, moving to the new stack was a night and day difference. With V13 finally taking advantage of HW4, and all the work they’ve been doing since then (plus upcoming HW5 introduction), it’s totally within the realm of possibility that they achieve viable L4 autonomy beyond this kind of small scale demo (and I hope some form of L3 maybe on HW4 before long for customer vehicles).
bdamm
It probably also has to do with some magic behind the curtains; Waymo hasn't expanded much beyond their initial regions, so they've been static for years in terms of coverage. They very well may have hand-crafted expected paths, and obviously as the region coverage goes up that kind of hand-crafting doesn't scale economically (due to changes in the environment, at least). So we can't really say how much work Waymo is putting into each mile. That's true of Tesla also, except that kind of work is totally antithetical to their entire approach from the very beginning, so it would be really surprising to find Tesla getting stuck in that specific local minimum, whereas we can almost expect it from Waymo.
As a counter-anecdote, I do use FSD with my family in the car. I've also used it on snowy roads and logging roads, and it does quite well. Not unsupervised-well, but better than I expected given that I'm running FSD on a nearly 6-year-old car. The number of trips around town that have been totally interventionless has definitely been going up lately, and usually interventions have been because I wanted to be more aggressive, not because the car was making a major error or even being rough.
mebizzle
I have a Model 3 and it drives amazingly well, in Florida of all places. I've taken it all over back roads in FL/GA, with a little bit of AL and MS as well. The issues you described were much more prevalent when I got my car in '23, and I have genuinely been watching them become fewer and further between as time passed. I've driven at least 10,000 miles on it in the last two years and I have only had to intervene twice.
I have no motivation to be positive; I own no Tesla stock or position and just like it because it's the best car for me currently. I cannot emphasize enough just how different my lived experience has been from how you describe it.
owenwil
I am confident that two things can be true: a) it can be significantly better in some places than others, especially somewhere like Florida, which has a lot of large, wide roads that are probably better mapped than many places, which creates a more stable experience, and b) their choice of hardware and software approach is obviously less safe given its limitations, and has a number of compromises that introduce unpredictability vs other approaches.
It definitely has come a long way since I first got my car, but it's still _unpredictable_ and even seems to progress, then randomly regress, between releases. The big one is just navigating unpredictable environments, which is where Waymo is clearly far, far ahead.
In the real world, I think their approach has clearly hit a ceiling and I definitely feel a lot safer sitting in a Waymo than a Tesla, I'm not sure the gap is going to narrow unless something drastic changes.
nwienert
Shift some nouns and this is basically exactly what LLMs for coding are like, except the downside risk is “git revert”
misiti3780
I have had the exact opposite experience. I literally don't drive anymore and I never have phantom braking problems. I'm on HW3.
FireBeyond
> doesn’t even understand basic concepts like school buses or trains
Yeah, it would be hilarious if it wasn't so horrifying, I remember watching a level crossing be represented as a weird traffic light that would go from red to off to red erratically, with a similarly erratic convoy of trucks representing the train.
Mind you I remember people claiming FSD was "nearly done" because they'd "tackled all the hard problems, and were now in clean up", and how as a result that meant they could let their FSD take itself through a roundabout, not just straightlining it through. Never underestimate the power of denial.
etchalon
If you watch the videos posted, the cars are behaving in what I would consider eerily human-like ways.
That's explainable by, perhaps, the training data set, but an "Actually Indians" approach seems more likely (and more in line with the bullshit Musk has pulled in the past).
RankingMember
He's essentially admitted as much:
https://www.usnews.com/news/top-news/articles/2025-06-20/exp...
decimalenough
Tesla's AI is trained on actual driving data, so it is behaving in human-like ways, both good and bad.
If the speeding is only going with the flow of traffic, that's not a huge deal imo.
The left turn fuckup is really bad though, as is the instance of the robotaxi dropping someone off in the middle of an intersection after they hit the "drop off earlier" button: https://sh.reddit.com/r/SelfDrivingCars/comments/1liku3o/tes...