Hacker News

Can you explain how Tesla's data is cherry-picked? Is it unfair to compare crash data for just one particular area? Wouldn't it make a lot of sense that some roads are inherently more dangerous because of poor design or conditions?


To me it often feels like they are comparing the best conditions to all conditions. Self-driving data covers only dry, overcast, well-marked roads with no construction or surprises, whereas the comparison set includes all conditions, from blizzards to blinding sunlight to road construction on poorly maintained roads...

So not really an apples-to-apples comparison. Unless FSD is as safe in the middle of a blizzard while the positioning system is being jammed, it fails for me...


Why does it "feel" like that? What is known about the sources of the data collected so far?


It's not even especially "cherry picked". If you compare accidents in vehicles driven on "autopilot" against every other vehicle, you'll see that they have far fewer accidents per mile.

However, that's not a fair comparison.

Tesla's "autopilot" only works on clear motorways with nothing "exciting" going on. It can't cope with sudden changes in conditions very well, and hands over to the human driver. It can't cope with two-lane roads with cars coming at it at all, never mind single-track country roads.

So, if you compare Tesla "autopilot" with cars driven in the same conditions as when "autopilot" is in use, you see that it's no safer, or even a little worse.

TL;DR Tesla "autopilot" only works in conditions where cars don't crash anyway, but they are compared against cars driven in all possible conditions.


Is the raw data publicly available for public scrutiny?



