I think the core problem there is liability. I may be a good driver or a bad one, but either way I'm responsible for my own driving. If I get into one accident a year, that's one accident a year.
A self-driving car might be 5x better than me at driving, but logically I can't be liable for what it does. The company making it has to be. 5x better would be 0.2 accidents a year. But multiply that by the 100,000 cars the manufacturer has sold... they don't want that liability. That's why Tesla's Autopilot is still supervised: they want its mistakes to be your problem.
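To make that arithmetic concrete, here's a quick back-of-the-envelope sketch in Python. All of the numbers are illustrative assumptions carried over from the paragraph above, not real accident or sales figures:

```python
# Fleet-level liability arithmetic from the comment above.
# Every number here is an illustrative assumption, not real data.

my_accidents_per_year = 1.0    # one human driver's accident rate
improvement_factor = 5.0       # self-driving assumed 5x better
cars_sold = 100_000            # fleet size the manufacturer stands behind

per_car_rate = my_accidents_per_year / improvement_factor  # 0.2 per year
fleet_accidents_per_year = per_car_rate * cars_sold        # 20,000 per year

print(f"Per-car accident rate: {per_car_rate:.1f} per year")
print(f"Expected fleet-wide accidents: {fleet_accidents_per_year:,.0f} per year")
```

So even a car that is 5x safer than me individually puts roughly 20,000 accidents a year on the manufacturer's books, instead of spreading them across 100,000 individual drivers.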
It presents a lot of thorny problems. If I am a persistently dangerous driver, I can have my license taken away and be removed from the road. But if a self-driving car is judged to be too dangerous for the road, you'll suddenly have thousands of people who lose access to their car (assuming a future with self-driving-only cars) through no fault of their own. What's their path to getting back on the road?
Your liability is covered by your insurance company. And it costs you on the order of $1000 a year for that privilege.
If the self-driving car company takes on that liability, it'll save you the $1000/year. So assume they're either going to charge you an extra $10K up front or an extra $1000/year. For that kind of cash they should be quite willing to take on the risk, or they can find an insurance company to do so, if their car is actually safer than an average driver.
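A rough sketch of why that trade can pencil out for the manufacturer, again with hypothetical numbers (the $1000 premium and the 5x safety factor from the comments above):

```python
# Why the liability transfer can be profitable for the manufacturer.
# Hypothetical numbers carried over from the comments above.

insurance_cost_per_year = 1000   # roughly what a driver pays today
improvement_factor = 5.0         # car assumed 5x safer than average

# Rough insurer logic: premiums track expected payouts. If the fleet
# causes 1/5th of the accidents, the expected payout per car is about
# 1/5th of what the $1000 premium was pricing in.
expected_payout_per_car = insurance_cost_per_year / improvement_factor  # $200

margin_per_car = insurance_cost_per_year - expected_payout_per_car
print(f"Charge ${insurance_cost_per_year}/yr, expect "
      f"${expected_payout_per_car:.0f}/yr in claims: "
      f"~${margin_per_car:.0f}/car/yr left to cover the risk")
```

Under those assumptions, a genuinely safer car leaves a healthy margin per car per year, which is exactly why an insurer (or the manufacturer itself) should be willing to underwrite it.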
This should work in most countries. Perhaps not in the US, with its pattern of massive punitive damage awards.
Right, but the companies still aren't going to do that if they don't have to. Otherwise Tesla would be doing it today.
OP said:
> self driving will be statistically significantly better than human drivers, but because it isn't perfect we won't allow it anyway.
My contention is that it's not that everyone is a Luddite; it's that as long as companies are legally allowed to sell quasi-self-driving they bear no liability for, they will do exactly that. And that is what will hold us back.