I also have a 2024 Tesla with FSD, but I’ve stopped trusting it. Here’s the thing: it works great for 30-40 minutes, until it doesn’t and makes a completely wrong move that would cause an accident without user intervention. And yes, I’m talking about v12.5.1.
In the last month, it’s driven onto grass where there used to be an off ramp that was redone last year, cut across 3 lanes of highway traffic within 200 feet of an off ramp, and almost ran a semi truck off the road (yes, we had the right of way, but he weighs 20,000 lbs and was in no way going to be able to stop in time).
It’s a cool toy to show off when you’re being hyper-vigilant about keeping an eye on it, but there is no way it should be allowed on public roads yet.
I would 1000% advise against purchasing it unless you have the extra cash and just want to try it out. It’s not even close to production-ready.
I suspect this is the same hyper-vigilance a spouse gets when the other is driving. Somehow I can go 20 years without an accident, but every time my spouse is in the car it's not the same as 'their' driving, so they constantly feel the need to backseat drive and press the imaginary brake. Not saying Tesla driving is perfect, but it's better than a lot of drivers I know.
> Here’s the thing: it works great for 30-40 minutes, until it doesn’t and makes a completely wrong move that would cause an accident without user intervention
Interesting. This wasn’t my experience. I did SF <> Lake Tahoe (which can stretch to 5 hours) a number of times when I was in SF and didn’t encounter any major issues. Small issues, sure, but it was definitely better than my driving.
It's an SAE Level 2 system, and they haven't indicated that's ever changing on current cars. They're even calling it "Full Self-Driving (Supervised) (also referred to as Autosteer on City Streets)" now. [1]
A lot of companies have Level 2 systems. That's still a far cry from full automation.
Back in 2016, Tesla CEO Elon Musk stunned the automotive world by announcing that, henceforth, all of his company’s vehicles would be shipped with the hardware necessary for “full self-driving.” You will be able to nap in your car while it drives you to work, he promised. It will even be able to drive cross-country with no one inside the vehicle.
They have one of the best Level 2 self-driving implementations on the market. It’s so good that I don’t even question the FSD tag. That said, I would never let my family ride in a Tesla robotaxi running on the existing suite of sensors and FSD. Does anyone know what needs to happen for Tesla to reach Level 5 autonomy? I get nervous letting the current FSD handle complicated intersections.
A single person's evidence isn't helpful. People testing Tesla's solution at scale (even the latest version) have found a few interventions per hour (or at least per day), especially on busy streets or when the weather isn't favorable.
It cannot truly be an autonomous robotaxi without VERY HIGH reliability. One intervention per hour is one too many.
Cruise had driverless robotaxis on the streets while they had 2.5 to 5 miles per intervention. [1]
I think FSD 12.5 is way beyond that: I drove over 20 miles yesterday with zero interventions. Also, having ridden Waymo in San Francisco many times, I find that FSD is actually slightly smoother and handles stuff like going around obstacles and blockages more naturally, although, as you are no doubt aware, there are still some rough edges in rare cases.
Once Tesla has reasonable remote human-assistance infrastructure in place to help out with the extreme edge cases, and assuming the software keeps improving at the current rate, I don't see why they couldn't roll out a robotaxi service.
As a Tesla owner I can promise you there will never be reasonable human assistance infrastructure.
Have you ever tried to get in touch with a human at Tesla, short of driving to a service center? Almost impossible. It would be easier for me to get the president on the line.
Having just purchased a new Tesla, I tried for 2 weeks to communicate with Tesla prior to purchase. The closest I even got was a phone tree, which after 7 levels sent me to a voicemail box that was full. And I’m talking every day for 14 days. Had my wife not wanted it so badly, I would have cancelled my deposit on the spot.
I requested service on it last week. The earliest service date is Nov. 12. I have yet to hear from an advisor on the app.
Tesla does a lot of things right, but supporting their products with actual humans is not one of them.
Dang, that's not been my experience at all. I've had them come out 3x for tire repair (construction site alongside our commute), and they've always come out to my house to fix the tire in my garage, same day, after a couple of back-and-forths via text. I've never had a better car maintenance experience.
That said, I don't think they want to talk with you pre-purchase outside of one of their showrooms, and the showroom isn't even a large part of their sales model. I imagine that if you're trying to go against the flow, it'd be hard.
Not to defend Cruise too much, but the 2.5-to-5-miles stat is misleading in that those aren't real-time disengagements. They are instances where the vehicle proactively identified a situation where it wasn't confident enough to proceed and then stopped safely while awaiting a response. That is obviously far too often for a courteous and legal road user, but it's completely different from a driver taking over as the vehicle attempts an unsafe maneuver.
I don't have any clue if it assumes that; I was just illustrating the difference in failure modes. In either case, being unsafe in a different way is not mutually exclusive with being orders of magnitude safer overall, which Cruise is compared to current FSD when unsupervised.
> Cruise had driverless robotaxis on the streets while they had 2.5 to 5 miles per intervention. [1]
And they got (rightfully) pulled off the roads.
> there are still some rough edges in rare cases
Waymo has a substantially lower intervention rate.
And they have a huge fleet of humans running around the city attending to the cars. A few weeks ago I saw one stuck: whoever had last used it managed to trap a seatbelt in the door. It had been sitting for about 5 minutes when a guy pulled up, fixed it, and sent it on its way. I'm not saying Tesla can't build that, but they're going to have to.
That is a very important question, to be honest. And it definitely seems like it's getting more and more difficult for all of the self-driving companies to reach "that next step".
> What if it’s one an hour now, one a day next year and one a week in 2026?
That is a MASSIVE "what if".
What if it's 2036? What if it's 2056? Hell, if it's 2030, then Tesla is in _serious_ trouble.
Mine still makes a lot of grave mistakes on local roads, stops far too long at stop signs (long enough to confuse other drivers), and has very poor merging behavior, especially when there are trucks around. It works fine 99% of the time.
It doesn't matter because it's a meaningless statistic. But sure, I'll entertain it. Last figure I heard was 1 disengagement per 200 miles. The distance between SF and LA is 380 miles. If my car can drive me from SF to LA and I only have to intervene once, that's already incredible and leagues ahead of what Waymo can offer. And since you asked, FSD 12.6 will reduce disengagements per mile by 5x.
Waymo doesn't require a driver to supervise the car; they're supposedly running Level 4 cars. FSD requires continual supervision: you must be responsible and attentive at all times. How is this a reasonable comparison?
There's a big difference between "someone at central command can take over when the car signals" and "someone must watch the car constantly and take over when they see it do anything bad".
The disengagement events on recent videos of FSD are still the likes of "oops it almost turned into oncoming traffic" or "oops it almost ran into a pole", that's the sort of thing you have to catch before it happens, not after.
Waymo was doing 0.64 disengagements per 1,000 miles in 2015 and wasn't comfortable launching a taxi service on that. Even after 12.6, Tesla will be behind. The point is Tesla can't launch a driverless taxi service on its current system, not from SF to LA, not within SF, not anywhere.
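Back-of-envelope, using the numbers in this thread (your 1-per-200-miles figure and the claimed 5x improvement from 12.6 are your numbers, not official stats):

    # Miles per disengagement, per the figures quoted in this thread.
    # Commenter-supplied numbers, not official statistics.
    tesla_now = 200               # claim: 1 disengagement per 200 miles on 12.5
    tesla_after_12_6 = 200 * 5    # claim: 12.6 cuts disengagements per mile by 5x
    waymo_2015 = 1000 / 0.64      # 0.64 per 1,000 miles -> ~1,562 miles each

    print(f"Tesla today:   1 per {tesla_now} miles")
    print(f"Tesla post-5x: 1 per {tesla_after_12_6} miles")
    print(f"Waymo in 2015: 1 per {waymo_2015:.0f} miles")

Even granting the full 5x, that still lands short of where Waymo already was nine years ago.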
No, he's telling you that Waymo could do the same thing (and better) in 2015, yet it took them 8 more years to launch a robotaxi service. So, Tesla robotaxi in 2032 maybe?
For the record, you no longer have to keep your hands on the wheel at all times.
I think it’s a mistake to conflate the actual capabilities of the system with the user instructions for how the system is being used. SAE levels are primarily about the latter and about who takes liability for the operation of the vehicle. Conflating the two punishes car manufacturers who are cautious about the current state of their self-driving system.
You do realize you cannot buy a Waymo for your own use?
You do realize Waymo will only operate in geofenced areas in select cities that have been premapped down to the millimeter?
Waymo is not even remotely close, nor attempting to solve the same problem. This is coming from someone who lives in SF and takes Waymo regularly. Waymo is a cool tech demo and that's about it; FSD is a real tool that people everywhere can actually use to take them where they want to go.
I don't know what to say man, Waymo takes me where I want to go several times a week (why would anyone be a repeat customer if it was just a demo?). My Tesla with FSD can't take me anywhere without me monitoring it.
Everywhere they want to go? In an electric car? These taxis would probably only drive in big cities where there are enough charging stations and service centers.
No, remote operators are never in control of the vehicles. They give the computer hints about how to handle situations it's unable to resolve for itself, but the computer is ultimately still responsible for driving and maintaining the safety invariants.
This is fundamentally different from FSD, where the human is always responsible for driving and maintaining the safety invariants.
It entails a completely different division of responsibilities and a completely different safety profile. Specifically, it's one of the critical differences between SAE Levels 2/3 and Level 4: at Level 2, the person in the driver's seat must supervise and remains responsible at all times; at Level 4, the automated system performs the entire driving task within its operational domain, with no human supervision expected.
To give an analogy, let's say you use a credit card. A machine processes the payment most of the time, but occasionally something looks suspicious, so it denies the payment and sends a message to a human (you) asking whether the next payment should be allowed. Do you consider yourself to be a "driver" in this system?
If so, imagine a system where all payments flash by onscreen for a human who is tasked with stopping erroneous approvals in real time. Are humans doing essentially the same job in both systems, such that both roles are "drivers"?
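To make the contrast concrete, here's a toy sketch of the two supervision models (purely illustrative; every name in it is made up, and it's nobody's actual architecture):

    import random

    CONFIDENCE_THRESHOLD = 0.9

    def plan_action():
        """Stand-in planner: returns an action and a confidence score."""
        return "proceed through intersection", random.random()

    # Level 4 model: the computer drives; when unsure, it stops safely and
    # asks a remote human for a hint. The computer keeps responsibility.
    def level4_step():
        action, confidence = plan_action()
        if confidence < CONFIDENCE_THRESHOLD:
            print("stop safely, request a hint from the remote operator, replan")
        else:
            print(f"execute: {action}")

    # Level 2 model: the computer executes continuously; the seated human
    # must catch every mistake in real time, before it happens.
    def level2_step():
        action, _ = plan_action()
        print(f"execute: {action}  <- a human must be watching right now")

    for _ in range(3):
        level4_step()
        level2_step()

In the first loop, the machine decides when a human is needed; in the second, the human is the safety system, all the time.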
Aside from supply issues due to the pandemic and demand, I think there's a strong incentive to be conservative in how you roll out self-driving: Cruise and Uber both ran into serious issues because they rolled out self-driving cars too aggressively. It won't take many publicized incidents to cause significant financial, legal, and reputational problems for Tesla. So selling FSD as a Level 2 system while they train their networks and gather performance data makes a lot of sense.
Sure, but saying on every earnings call 'it will definitely fully work this year' and then not delivering, year after year, is simply wrong. He knows it won't work this year, or next. As you say, it needs more training and testing; either that or a significant AI breakthrough.
Would you trust it without a brake pedal and a steering wheel you can take control of? How confident are you that Tesla can deliver this within one or two years? If your answer to both isn't a resounding yes, this product is as good as nothing at this stage, considering Tesla's track record of delivering on timelines and expectations for this kind of tech.
I frequently drive in the SF Bay Area in a non-Tesla and it's incredible how many Teslas I've seen nearly hit us, randomly swerve or brake hard for literally no reason.
I always encounter a lot of Teslas while walking my dog and it's clear they're no safer than many people who shouldn't be driving regular cars.
It's wild, most Teslas I encounter drive fine, but every once in a while I see one driving completely erratically. Stopping 20 feet too soon at a stop sign, very weird positioning in lane, sudden acceleration or deceleration. I think I can pretty much tell if a Tesla has FSD enabled if I follow it for a minute or so.
I disagree completely. Letting it drive is exactly like teaching my 16-year-old kid to drive. It's nice to let them take the wheel, but you never know when they're going to make a mistake and almost crash into traffic.
Human drivers use the bike lane, shoulder or oncoming traffic lane to drive around cars turning left and to avoid hazards all the time. It’s not unsafe if you check to make sure no one is there and my Tesla detects people around me basically as well as a human can.
On the other hand, FSD won’t try to pass me on winding mountain roads with a double yellow line.