> If you ran comma.ai in your car and had a serious crash, you could possibly be found criminally negligent.
I don't know that it's much different than using cruise control and then having a serious crash. If there's criminal negligence when using cruise control, it's because you weren't aware enough to shut it off when the conditions cruise control is built for no longer applied; the mere fact of using cruise control isn't criminally negligent, as long as you're "just waiting for it to slip up so you can take the wheel."
For that matter, the same is true of teaching a teenager to drive. There's no negligence in having them at the wheel if you're ready to supersede their bad driving at all times.
What's suddenly different, if the "thing that doesn't really know how to drive yet, but which you're ready to supersede the bad driving of" is an AI instead of a teenager?
You can get a learner's permit to teach your teenager completely legally. You couldn't do the same with a 10 year old...
If a serious crash involving comma.ai went to trial, it would look like the driver was using unapproved, unregulated software, unnecessarily, putting others at risk. Recent Autopilot incidents show us that expecting a human to take over within a second is not reasonable. I would be very worried that the court would indeed find the driver negligent.
> I would be very worried that the court would indeed find the driver negligent.
I use openpilot on the highway, and I love it. I had to physically tap into the CAN bus for my car's safety system, which made me very nervous. But the installation is completely reversible, and I'm much more comfortable with its limitations. I just treat it as a really great adaptive cruise control and active lane keep assist.
But that's all it is. I would be completely responsible in most situations if my car caused an accident while openpilot was active. Openpilot's design has it immediately drop all control of the car if you touch the brake or gas, it does not aggressively follow cars, and the driver is still required to control the car. So if I start to become uncomfortable, I need to immediately take over.
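That disengagement behavior can be sketched roughly like this. This is an illustrative simplification, not openpilot's actual code or API; the function name and parameters are hypothetical:

```python
# Hypothetical sketch of the "driver input always wins" behavior
# described above: any brake or gas input immediately drops all
# control. Names here are illustrative, not openpilot's real API.

def controls_allowed(engaged: bool, brake_pressed: bool, gas_pressed: bool) -> bool:
    """Return True only if the system may keep actuating the car."""
    if brake_pressed or gas_pressed:
        return False  # pedal input disengages instantly, no override
    return engaged

# Engaged and no pedal input: system may keep assisting.
assert controls_allowed(True, False, False) is True
# A brake or gas tap hands control straight back to the driver.
assert controls_allowed(True, True, False) is False
assert controls_allowed(True, False, True) is False
```

The key design choice is that disengagement is unconditional: the system never tries to argue with the driver's pedal input.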
Assuming that openpilot will always act within its safety guidelines doesn't mean that it's safe. Toyota has paid tens of millions in fines and settled multiple lawsuits over unintended acceleration. I believe that using openpilot on the highway makes me a safer driver. But I do understand that if I did get in an accident, proving I'm not negligent might be extremely difficult.
I agree with you; liability is a very important part of self-driving functionality. I won't think a car has full self-driving capability until I can get in the back of that car, go to sleep, have it drive me somewhere, and hold no legal or financial responsibility for any accidents that occur. Openpilot is nowhere close to that, but I do think it's really good for what it's meant to do.
What if the crash is caused by the device failing? In the case of your car's built in systems, they've been approved by regulators and you probably wouldn't be held liable. If it was an unregulated device that you installed into your own car that caused the crash, I bet the legal circumstances would be different.