Hacker News

Are you sure level 5 autonomous driving is a well-specified goal? What exact objective would decisions in such a system optimize? Leaving aside the trolley problem, would the software optimize for speed or for not harming people, for example? Obviously we would want a combination of both; otherwise people either get harmed or never get anywhere on time. But then, how fast should it take a corner? How much risk of human harm should it accept to get somewhere fast?

Furthermore, what do we mean by human harm? The system would obviously need to know what human harm is in order to avoid it, which requires defining "human" and defining "harm", both of which are incredibly difficult to do precisely. More on this from Rob Miles here: https://www.youtube.com/watch?v=7PKx3kS7f4A

I don't think level 5 autonomous driving is specifically defined; we just don't have systems intelligent enough for this to be a problem yet.
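To make the speed-vs-harm trade-off concrete, here's a toy sketch (entirely hypothetical; the cost function, risk model, and weights are made up for illustration). Picking the harm weight is exactly the specification problem: the same planner code yields very different cornering speeds depending on a number nobody has a principled way to choose.

```python
def corner_cost(speed_kmh, harm_weight):
    # Toy cost: time to cover a fixed 100-unit stretch, plus a made-up
    # harm-risk term that grows quadratically with speed.
    travel_time = 100 / speed_kmh
    harm_risk = (speed_kmh / 50) ** 2
    return travel_time + harm_weight * harm_risk

def best_speed(harm_weight, candidates=range(10, 90, 10)):
    # Pick the candidate cornering speed with the lowest total cost.
    return min(candidates, key=lambda s: corner_cost(s, harm_weight))

print(best_speed(harm_weight=0.1))   # -> 80: low weight on harm favors speed
print(best_speed(harm_weight=10.0))  # -> 20: high weight favors caution
```

The point isn't the model (which is deliberately silly) but that the trade-off has to be encoded as *some* number, and nothing in "level 5" tells you what that number should be.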


