The car probably thought the pedestrian was an emergency vehicle, given the person was wearing a bright red coat and Teslas on FSD have a habit of crashing into them.
To the downvoters: it is actually true [0], and just recently there was another crash involving an emergency vehicle [1].
So Tesla FSD does seem to have a strange habit with red objects in its view. Given those incidents, care to explain why I am wrong?
Because even in cases where cars have hit emergency vehicles, it's because the software didn't see the vehicles at all and just continued driving straight in its lane. Whatever flaws the Tesla vision system may have, the idea that it is programmed to deliberately seek out and crash into emergency vehicles seems pretty far-fetched (much less that it would mistake a person wearing a red coat for an emergency vehicle and therefore attempt to crash into it); I assume this is why people are downvoting you.
The no right turn sign appears to be for the other side of the street, since at the intersection itself, there's a one way sign indicating right turns are possible, and no do not enter signs.
Some of the discourse suggested that previously it had had problems with the pillars for the monorail running down the median, and the owner/driver was trying it again to see if it had improved.
One of the big limits of this kind of AI is that it does not provide human-legible explanations for its actions. You cannot put it in front of a tribunal.
Actually, all kinds of data pertaining to the decision-making process are recorded (at least by some of Tesla's competitors, not sure about Tesla), and in great detail. The data is specifically designed to make the AI driver "debuggable", i.e. it includes all the details, intermediate representations, etc. that an engineer would need to improve a poor decision, and therefore certainly to understand one.
Whether that kind of logging is always on or was specifically on here, I don't know, but I'd expect Tesla can analyze why this happened: the car does have the ability to explain itself; it's just that owners and drivers do not have access to that explanation.
The box isn't as black as you might think; they're not training some monolithic AI model, there are separate systems involved. Also, the models aren't entirely freeform; i.e. engineers embed knowledge of how the world is structured into those networks.
They can use those intermediates to project a kind of thought process others can look at - and you've probably seen videos and images of that kind of thing too; i.e. a rendered version of the intermediate 3D world it has perceived, enhanced with labels, enhanced with motion vectors, cut into objects, classified by type of surface, perhaps even including projections of the likely future intent of the various actors, etc.
Sure, you can't fully understand how each individual perceptron contributes to the whole, but you can understand why the car suddenly veered right, what its planned route was, what it thought other traffic participants were about to do, which obstacles it saw, whether it noticed the pedestrians, which traffic rules it was aware of, whether it noticed the traffic lights (and which ones) and how much time it thought remained, etc.
...at least, sometimes; I don't know anybody working at Tesla specifically.
And while those competitors emphasize their lidar tech, I bet Tesla's team, while using different sensors, also has somewhat similarly complex - and inspectable - intermediate representations.
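To make the "inspectable intermediate representation" idea concrete, here's a minimal sketch (hypothetical names and fields, not any vendor's actual logging format) of what a per-frame decision record might contain, and how an engineer could replay it into a human-readable explanation after the fact:

```python
from dataclasses import dataclass


@dataclass
class TrackedObject:
    """One perceived actor in the car's intermediate world model."""
    object_id: int
    label: str              # e.g. "pedestrian", "bicycle", "car"
    position_m: tuple       # (x, y) in the vehicle frame, metres
    velocity_mps: tuple     # estimated motion vector
    confidence: float       # classifier confidence, 0..1


@dataclass
class DecisionRecord:
    """Everything the planner knew and chose at one timestamp."""
    timestamp_s: float
    objects: list
    planned_action: str     # e.g. "continue", "brake", "veer_right"
    reason: str             # the planner's logged trigger for the action


def explain(record: DecisionRecord) -> str:
    """Render one logged frame as a human-readable explanation."""
    lines = [
        f"t={record.timestamp_s:.2f}s: planned '{record.planned_action}' "
        f"because {record.reason}"
    ]
    for obj in record.objects:
        lines.append(
            f"  saw {obj.label} #{obj.object_id} at {obj.position_m} "
            f"moving {obj.velocity_mps} (conf {obj.confidence:.2f})"
        )
    return "\n".join(lines)


frame = DecisionRecord(
    timestamp_s=12.40,
    objects=[TrackedObject(7, "pedestrian", (8.0, -1.5), (0.0, 1.2), 0.83)],
    planned_action="brake",
    reason="pedestrian predicted to enter lane",
)
print(explain(frame))
```

The point is only that each frame's inputs, intermediate beliefs, and chosen action can be serialized together, so "why did it veer?" becomes a log query rather than guesswork about a black box.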
IIRC, in the incident where the Tesla [Edit: Uber self driving car] collided with a pedestrian pushing a bicycle in Arizona, the car repeatedly switched between classifying the input as a pedestrian and a bicycle. And it took no evasive action while it was trying to decide.
>the incident where the Tesla collided with a pedestrian pushing a bicycle in Arizona
That was Uber's self driving car program. Notably, the SUV they were using had had pedestrian-detecting automatic braking for several years, though I'm sure it's not 100% reliable.
That sign applies only to the lanes to the left of the pillars. It is legal to turn right there from the right lane. I've done it myself. Yes, it is confusing.