
Tesla has attempted to insulate itself from blame by requiring drivers to take full responsibility for autonomous acts of the vehicle under their supervision. (Perhaps also by having autonomous modes disengage before impending collisions — presumably it helps with PR/legally to say autonomous systems were not active at the time of collision?)

Most don’t think autonomous systems will become safe, or be accepted as safe, until manufacturers are willing to assume liability and indemnify users, as Mercedes has done, by contrast.

https://insideevs.com/news/575160/mercedes-accepts-legal-res...

[Edited to note the attempt, so as not to assert success — I don’t think that’s settled]



> Tesla has insulated itself from blame by requiring drivers to take full responsibility for autonomous acts of the vehicle under their supervision.

Well, they've made drivers feel more exposed by doing that. I don't think you can actually negate product liability law that way, but if you make people feel like they bear all responsibility, it might help marginally even if it isn't legally effective.


>if you make people feel like they bear all responsibility, it might help marginally even if it isn't legally effective.

Since we are talking about a system that needs a human on alert and ready to take over at any time to function safely, I wholeheartedly agree. Human nature is still human nature; people will zone out and look for diversions regardless, but fear will motivate some to be a bit more vigilant despite the boredom of staring intently at a road they aren't personally navigating.



