
It is a regular OTA update.

Negligent drivers have killed people, not autopilot capabilities. Take the example in the recent Washington Post article where a driver ran through a stop sign at a T intersection, left the road, and killed a bystander.

What the article didn't say is that the system reduced the cruise set point to 45 mph due to the road type, but the driver was _holding down the accelerator pedal_ to maintain 60 mph as it flew through the stop sign. The driver admitted they knew they were responsible for controlling the car.

What is your evidence that drivers not knowing Autopilot's capabilities, rather than intentional misuse by negligent drivers, killed several people?



> It is a regular OTA update

Tesla is mailing a letter to all affected owners about this. Does Tesla normally do that for regular OTA updates?


> What the article didn't say is that the system reduced the cruise set point to 45 mph due to the road type, but the driver was _holding down the accelerator pedal_ to maintain 60 mph as it flew through the stop sign. The driver admitted they knew they were responsible for controlling the car.

Do you have a source for that claim? The earlier news coverage said that the driver set the speed to 44 mph, and that the data was unclear on whether Autopilot or the driver accelerated. If you have something like a court document, that would help clarify.


That wasn't my comment, but I happened to have the source already open: https://twitter.com/Tesla/status/1734374558105293081?s=20


Thanks - I hadn't seen anything more recent than the NYT story, which described the data as unclear a couple of years ago:

https://www.nytimes.com/2021/08/17/business/tesla-autopilot-...


1. Every time one of these articles comes out, the investigation concludes 10 months later that the driver was at fault.

2. The key metric is the rate of crashes _compared to the general population not using Autopilot_.

Tesla makes it abundantly clear that the driver is still responsible for monitoring. This is analogous to a dev deploying copy-pasted code from ChatGPT without checking it. You still supervise/monitor the result.


That's because legally, they are. However, that hasn't stopped Tesla from doing things like holding press conferences (in the case of a fatal collision with a semi) to talk about how the car had warned the driver to hold the steering wheel, while neglecting the trifling detail that it had done so only once, eighteen minutes prior to the accident. Tesla will happily throw you under the bus to protect the reputation of FSD/AP.

> Tesla makes it abundantly clear that the driver is still responsible for monitoring.

Tesla had to be dragged kicking and screaming into doing this. When AP first launched, the steering wheel checks came every fifteen minutes, then the interval was reduced to five. It has only been through repeated cajoling that it is down to where it is now.

Let's not forget that "The driver is only in the seat for legal purposes. The car is driving itself." is still on Tesla's website to this day. (And in a sense, it is accurate - the legal purposes just include shielding Tesla from as much liability as possible.) Oh, and Smart Summon: "Bring your car to you while you deal with a fussy child" (while still maintaining complete awareness of your car...)


It was an update forced by the NHTSA after an investigation and negotiation.



