That's a false equivalence. Autonomous vehicles need to be significantly safer than human drivers to be allowed on the streets. If a human driver kills or injures somebody or damages property, they are responsible and will face consequences. An autonomous car won't.
The typical tech person will reply to this with some variant of "that shouldn't matter". Well, it does.
You are responding to this thread as if we were arguing for these cars "being allowed on the streets". That is not the discussion here. Instead, we are talking about AI capacity.
Saying that a Tesla can drive autonomously from LA to NYC, except it can't in reality because other cars are on the road and it might kill someone, is an odd way to frame it.
It's like saying Windows 95 doesn't have any security flaws, as long as you don't connect it to the internet.
You mean your "self-driving" Tesla where you're still sitting in the driver's seat with your hands on the wheel (they are on the wheel, unless you want to admit to a crime)?
Even asking the question would seem to indicate that respondents here do in fact believe the technology is good enough to drive people around day after day without hands on the wheel. (It is.)
I don't see how the questions about texting are relevant here.
Apologies, I was unclear. I mean that the law and what people actually do are two separate affairs. When people act as if the law dictates reality, I'm always perplexed.
Here, if you say your Tesla drives you to work hands-free, I have no problem accepting that as part of how the world works.