> The car did eventually detect the obstacle and brake prior to impact
The driver is claiming he braked, not the car: "Huang says he slammed on the brakes once he noticed the truck, however, it was too late to stop the nearly two-ton sedan traveling at reportedly 68 miles per hour."
> The driver is claiming he braked, not the car: "Huang says he slammed on the brakes once he noticed the truck, however, it was too late to stop the nearly two-ton sedan traveling at reportedly 68 miles per hour."
The pedestrian seen in the image had enough reaction time to step aside and avoid a collision with the Model 3. How was the driver not aware of him, only deciding to brake after passing him (as the locked brakes suggest)?
This seems entirely negligent: a combination of user error and Tesla's failure in AP obstacle detection. Not sure how it plays out, but I hope it doesn't deter AP from being refined. I'm sure they will have the onboard footage showing what the driver was focused on before the accident, and it will be settled based on that.
That's kind of the biggest issue I have with Autopilot - it makes drivers feel far more comfortable relying on it than they should. It's gotten to the extent that in Tesla-dense cities there's a good chance you'll pass a Tesla and see the driver not even close to paying attention to the road. Autopilot is in the dangerous Level 3 domain of autonomous driving where it's good enough for drivers to think it's Level 4 while it happily plows into parked objects once in a while.
Personally, on the occasions when I drive a Tesla, I don't even think about turning it on.
> Personally, on the occasions when I drive a Tesla, I don't even think about turning it on.
Same. I only use it when I want to show it to other people. I enjoy the acceleration and the tight steering, especially on the Performance Model 3, way too much not to play with it. I did 790 miles in one night, driving up from OC to Santa Barbara, down to San Diego, and back again, I was having so much fun. I only stopped for charges along the way and a bite to eat in K-town in LA.
> That's kind of the biggest issue I have with Autopilot - it makes drivers feel far more comfortable relying on it than they should.
I would argue that's an external factor in personal decision-making we can't possibly undo, unless we remove free will from humans entirely. I've seen this in other things as well: when I was in university I often drove without insurance to make ends meet, and I took so many precautions before I set off. My girlfriend, on the other hand, had insurance, and her father owned a dealership, so if she got in an accident it was no big deal. Since her dad would replace her car if needed, she'd often text me while driving (this was during the dumb-phone era, mind you!), and since she worked as a physical therapist she was often on the phone while driving to and from worksites.
I wouldn't even reach for food or water while driving, let alone a phone, because I was so paranoid about not seeing everything on the road and having a plan-B reaction. I was still driving my 75% track car on the street (essentially my entire savings account and life's savings rolled into one) during my junior and senior years of university, and I once got pulled over twice in a day before I had enough money to buy a beater to daily. Luckily, by then I had cut my attendance requirements in half, as gas shot up to $5 in SoCal and I got 17 mpg on the highway.
I used to commute 65 miles each way, and then left my car in the student parking lot to take the tram to work.
I live in Boulder now and there are tons of them around here. I never take the time to look anymore since the Model 3 ramp-up. I do know that when I ride on two wheels (push bike or motorcycle), the fact that someone is driving one is no indication they're any more competent than an ICE driver.
I've just arrived at the conclusion that humans behind the wheel are dangerous in general, and anything that removes them from the equation is going to be messy but worth it in the long run.
> however, it was too late to stop the nearly two-ton sedan traveling at reportedly 68 miles per hour.
It's interesting how the brakes get pumped hard near the pedestrian and released as he passes him. Willing to bet somebody looks at the data and finds the driver was overriding. I don't own a Tesla, but I do own two cars with forward collision detection and braking. It's pretty scary when your car makes a lot of racket and you look up to see something like a guy in the road. It kind of startles and distracts the mind in ways where I could see somebody focusing too much on that and missing a box truck in the road.
Driver wasn’t paying attention, not surprising. You have to actively ignore the warnings telling you to pay attention, or hack together a rig to put torque on the wheel.
I don't understand how paying attention can prevent this sort of thing.
Let's say you're paying attention with all your might. You see something up ahead, like an overturned tractor trailer. You expect autopilot will sense it, but you're preparing for it not to. But how do you know it's not going to sense it? Well, obviously, when it hasn't started reacting in a reasonable time. But then it's too late!
In order for a human to have time to figure out that the software isn't working, it would have to routinely operate with (much) slower reflexes than a human. But then it would be useless, since you can't slow down normal driving, and in any case, that's not how they designed it.
The hypothesis that people have unreasonable expectations of something called "Autopilot" seems unnecessary and irrelevant to me.
There's a notable time gap between the time when the autopilot should start braking at a reasonable rate, and the last possible time when a full-strength brake will be sufficient. That time gap should be sufficient for an attention-paying human to apply the brakes.
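To put rough numbers on that gap, here is a sketch using the 68 mph figure from the article and entirely hypothetical deceleration rates (a gentle rate an autopilot might use versus a hard emergency stop near the limit of tire grip). The exact values are assumptions for illustration only:

```python
# Rough stopping-distance sketch. The deceleration rates are hypothetical
# illustrative values, not figures from the article or from Tesla.
# Basic kinematics: d = v^2 / (2*a) is the distance to stop from speed v
# at constant deceleration a.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def stopping_distance(speed_mph, decel_ms2):
    """Distance in meters to brake to a full stop, ignoring reaction time."""
    v = speed_mph * MPH_TO_MS
    return v ** 2 / (2 * decel_ms2)

speed = 68  # mph, the speed reported in the article

# Assumed rates: ~3 m/s^2 for comfortable braking (when an autopilot
# "should" begin), ~9 m/s^2 for a full emergency stop on dry asphalt.
comfortable = stopping_distance(speed, 3.0)
emergency = stopping_distance(speed, 9.0)

# The gap between where gentle braking must begin and the last point
# where an emergency stop still works is the window an attentive driver
# has to notice the system isn't reacting and take over.
window_m = comfortable - emergency
window_s = window_m / (speed * MPH_TO_MS)  # time to cover that gap

print(f"comfortable stop: {comfortable:.0f} m, emergency stop: {emergency:.0f} m")
print(f"takeover window: {window_m:.0f} m, roughly {window_s:.1f} s at {speed} mph")
```

Under these assumed rates the window comes out to a few seconds, which is the crux of the disagreement: whether that is enough margin for a supervising human to act.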
I am, however, doubtful that it's really reasonable to expect 100% attentiveness from humans, given an autopilot that 99.9% of the time doesn't crash into things.
Ideally there'd be obvious visual feedback that reflects the current state of the Autopilot and what it recognizes. Something that can't be ignored (front and center on the windshield maybe) and that can give the driver sufficient information to make a decision about whether to interfere or not. This doesn't seem to be on the radar for anybody though. It's like everybody assumes they'll figure out a 99.99% working driving AI where driver intervention is never necessary.
It's even more egregious with Tesla, who is already shipping a half-baked self-driving feature in its current state where accidents like in the article are inevitable.
I could even imagine the windshield outlining everything that the Autopilot sees. If you don't see an outline around an object, you know the Autopilot isn't aware of it. No idea how technically feasible that is.
This sort of problem can be generalized I think to a lot of software design problems, including the sort of things I'm assigned at work (even though very mundane and boring as I'm not a real software engineer).
People say "make a program that does <thing> for me". And maybe you whip up something that does 50%, or 80% or 99.5%. But that only makes them more unhappy when it fails. A partial solution is no solution.
So, you have to come up with a model for humans to be augmented by the software. Rather than trusting it to make the decisions, the software needs to take a large amount of data and clarify and distill it so humans can more easily make the decisions.
But people always want to avoid making decisions. That's why managers/leaders have so much power even though they tend to be despised and seem incompetent.
> I don't understand how paying attention can prevent this sort of thing.
The simple answer is that you take over when something absurd is happening in front of you, like a trailer flipped on its side blocking your lane.
You don’t wait till the last moment to take over. You would immediately move over to the right lane, manually. AutoPilot helps in this case by keeping you perfectly in lane while you check your blind spot before getting over. That’s how it’s supposed to work anyway, and that’s how I use it.
In this particular case, maybe it was possible to avoid it, but in general, small changes in trajectory in a car can have big consequences. This tractor trailer may have been huge and crosswise, but there are many things you're expected to come within inches of, and so long as you miss them, even by millimeters, there's no problem.
What I tell my mom, who also drives a Tesla with AP, “If the AP isn’t driving better than you could at that very moment, if you think ‘Oh I’d rather be just a bit further over there’ - that’s when you disengage it. Don’t wait to see what will happen next.”
Turns out that driving is 99.9% utter monotony and AP actually drives better than me during all that time.
I'm sorry but if you see a truck on its side and expect autopilot to stop and it doesn't you are 100% at fault.
And if you saw it on its side, why wouldn't you at the very least change lanes?
It's fun to bash on Tesla whenever autopilot fails but they do make it very very obvious and those of you saying otherwise either don't own a Tesla or really don't pay enough attention.
Yes, I would say so, if it’s something you’d avoid. It’s one of those things where, if you’re going to ignore all of the warnings and agreements when you get the car... how is that not negligence on the driver’s behalf? Normal ACC doesn’t always stop either, but no one yells at those cars. My Passat and e-Golf would fail to stop all the time when traffic was at a standstill.
The driver is claiming he braked, not the car: "Huang says he slammed on the brakes once he noticed the truck, however, it was too late to stop the nearly two-ton sedan traveling at reportedly 68 miles per hour."
https://www.thedrive.com/news/33789/autopilot-blamed-for-tes...