Right, nothing should ever be built unless it can be 100% correct immediately.
I’m stunned that it hasn’t, so far as we know, actually hit anything yet. I’m not sure how, but clearly they’re doing something right between choosing drivers and writing software.
There were several high-publicity collisions in which a Tesla crashed into stationary objects -- a highway barricade [1], a blue truck [2], a parked police car [3], two instances of a parked ambulance [4]...
Teslas driving themselves hit objects all the time.
So, devil's advocate: we know Autopilot ignores stationary objects, so a lane with a parked vehicle (emergency or otherwise) is therefore a "reasonable" place to travel at the set cruise-control speed, etc.
But, I believe this discussion is about the new FSD software which is supposed to be more capable. Have we had reports about the new one doing the old tricks?
"autopilot" versus "full self driving* capable" versus "full self driving" versus "autonomous mode" seems like marketing hype instead of actual improvements. After all, "autopilot" was supposed to drive itself, so what's the new one do differently?