
Look, I'm all for taking self-driving cars seriously. I've written about how they're effectively WMDs if the software update server gets owned. I've also frequently commented on Uber's unethical practices in the past.

That being said, let's not overreact. This is a single death. These are going to happen. Right now self-driving cars are essentially teenagers learning how to drive. En masse there is going to be a death here or there, but, also en masse, we're going to have permanently safer roads once they've learned, if we can secure them from cyber attack.

Try to avoid all-or-nothing thinking here. Definitely advocate for regulations, and maybe Uber wasn't safe enough, but people are either going to die from self-driving cars, or they're going to die because public outrage delayed self-driving cars too long.



This didn't need to happen. It's indefensible that it did.

And there's still no proof that self-driving cars are safer than humans when confronted with novel situations, i.e., the situations most likely to result in injuries and fatalities. If anything, the evidence thus far is that self-driving cars are more dangerous than ones driven by humans.


How can we judge that self-driving cars are more dangerous from a single death?

The proper comparison is the incident rate per mile driven: autonomous-vehicle miles vs. human-driven miles.
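As a back-of-the-envelope illustration (every number below is made up except the rough US baseline, which is on the order of ~1 fatality per 100 million vehicle-miles):

    # Per-mile fatality rate comparison. All figures hypothetical / illustrative.
    human_fatalities = 37_000   # assumed annual US traffic deaths, approximate
    human_miles = 3.2e12        # assumed annual US vehicle-miles traveled

    av_fatalities = 1           # the single death in question
    av_miles = 10e6             # assumed AV fleet total; the real figure matters a lot

    human_rate = human_fatalities / human_miles * 1e8   # per 100M miles
    av_rate = av_fatalities / av_miles * 1e8

    print(f"Human: {human_rate:.2f} fatalities per 100M miles")
    print(f"AV:    {av_rate:.2f} fatalities per 100M miles")

    # Caveat: with a single observed event, the confidence interval on the
    # AV rate is enormous, so the point estimate alone proves very little.

The point isn't the specific output, it's that the denominator (miles driven) dominates the comparison, and AV fleets simply haven't logged enough miles yet for one death to settle the question either way.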


The research I've seen suggests that in areas with good conditions year-round, they're currently safer than a subset of the population (drivers who've had at least one automotive accident in the past 5 years, or something like that).

Still not better than your average driver, but even if we start by replacing only alcoholics and routinely poor drivers, net safety still improves.
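To make that concrete with purely hypothetical numbers (the rates and shares below are assumptions, not data): if the worst cohort of drivers accounts for an outsized share of fatalities, replacing only that cohort with AVs can lower the overall rate even when AVs are worse than the average driver.

    # Substitution argument sketch. Every number is an assumption.
    total_rate = 1.2            # fatalities per 100M miles, whole population
    worst_share_miles = 0.10    # assumed: worst drivers' share of miles driven
    worst_share_deaths = 0.30   # assumed: their share of fatalities

    # Implied rate for the worst cohort: 1.2 * 0.30 / 0.10 = 3.6 per 100M miles
    worst_rate = total_rate * worst_share_deaths / worst_share_miles

    av_rate = 2.0               # assumed: worse than average, better than the worst

    # Population rate after AVs take over only the worst cohort's miles:
    new_rate = total_rate * (1 - worst_share_deaths) + av_rate * worst_share_miles
    print(f"Before: {total_rate:.2f}, after: {new_rate:.2f} per 100M miles")

With these made-up inputs the rate drops from 1.2 to about 1.04 even though the AVs (2.0) are worse than the population average (1.2), because they're displacing a cohort at 3.6. Whether the real numbers line up that way is exactly what needs measuring.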



