Hacker News

The point I was trying to make, clearly not very well, was that nobody would be surprised by a human making a mistake like this, so why are we expecting perfection from a software-controlled car? They will make mistakes, but I'd also expect those mistakes to become less and less common. I'd only worry if the rate of improvement isn't fairly positive...


I don't know; even by low human standards it's bad.

I would not expect a human to drive two blocks while failing to get back into the correct lane; I would expect them to realize after 50 m that something is wrong, slow down, and pull back in behind the unicycle. In the same way, I do not expect a human driver to stop after hitting someone, then start again and drag the body along.

The problem is that it's easy to pad the statistics by driving miles without issue, and then the system utterly fails once the situation is no longer well-defined.


> They will make mistakes

Machines don't make "mistakes". A mistake requires intent. In the case of autonomous driving, any problem is a defect in the system that the whole fleet shares.

> I'd also expect these mistakes to become less and less common

So people should just tolerate it in the meantime "for the greater good"? Oh please, it's a business.



