
I drive a lot across Europe: as in, really a lot, long trip across several countries, several times a year. I drive enough on the highways to know a few scary situations, like the truck driver in a big curve slightly deviating out of his lane and "pushing" me dangerously close to the median strip for example.

To me, driving requires paying constant attention to the road and being always ready to act swiftly: I just don't understand how you can have a "self-driving car, but you must be ready to put your hands back on the steering wheel and your foot on the pedal(s)".

I have nothing against many "recent" safety features, like the steering wheel shaking a bit if the car detects you're getting out of your lane without having activated your blinker. Or the car beginning to brake if it detects an obstacle. Or the car giving you a warning if there's a risk when you change lane, etc.

But how can you react promptly if you're not ready? I just don't get this.

Unless it's a fully self-driving car, without even a steering wheel, a car should help you focus more, not less.



> But how can you react promptly if you're not ready? I just don't get this.

You cannot, that's the simple truth. You're supposed to focus on the road anyway and should be able to take over once any sort of autopilot or assist system starts working erroneously, yet in practice many people simply assume that those systems being there in the first place means that you can stop focusing on the road altogether.

It feels like the claim of a "fully self-driving vehicle" is at odds with actual safety, or at least will remain so until the technology actually progresses far enough to be on average safer than human drivers, moral issues aside. Whether that will take 15, 50 or 500 years, I cannot say, however.

That said, currently such functionality could be good enough for the driver to take a sip from a drink, fiddle around with a message on their phone, or even mess around with the navigation system or the radio - things that would get done regardless because people are irresponsible, but that could feasibly be made a little bit safer.


It has little (well, certainly not everything) to do with people's assumptions. There's a ton of research showing how people simply stop paying attention when there's no reason for them to pay attention 99% of the time. It doesn't even need to be about them pulling out a book or watching a movie. It can simply be zoning out.

Maybe, as you say, it's feasible today or soon to better handle brief distractions but once you allow that it's probably dangerous to assume that people won't stretch out those distractions.


We have empirical data showing how safe actual level 2 self-driving cars are in practice. So there's no reason to work from base assumptions. Yes, level 2 self-driving cars cause avoidable accidents, but the overall rate is very close to the rate for human drivers. The only way that's happening is if they are causing and preventing roughly similar numbers of accidents.

Which means people are either paying enough attention or these self driving systems are quite good. My suspicion is it’s a mix of both, where people tend to zone out in less hazardous driving conditions and start paying attention when things start looking dangerous. Unfortunately, that’s going to cause an equilibrium where people pay less attention as these systems get better.


> We have empirical data showing how safe actual level 2 self driving cars are in practice.

Do we? Where does that come from? The data Tesla provides is hopelessly non-representative because it makes the assumption that the safety of any given road is independent of whether a driver chooses to switch on the system there.


Only overall numbers actually matter here; if self-driving is off, then that's just the default risk from human driving in those conditions. Talk to your insurance company, they can give you a breakdown by make, model, and trim level.


I am pretty sure that if I call Geico they will not provide me with those data. Am I wrong?


Mine did, but I don’t use Geico. If they don’t give you the underlying data you can at least compare rates to figure out relative risks.


I feel like driver monitoring can keep it safe, and should even be available without autopilot enabled.

Comma.ai makes the monitoring more strict when the system is less certain or when in denser traffic.


These are exactly my arguments to my girlfriend on why she shouldn't use the Autopilot on our Tesla. Your mind will stray; the feature is exactly meant to do that to you. The feedback loop goes the wrong way. Then boom, apparently you don't see the emergency vehicles at a wreck. I do blame Elon: he did the Silicon Valley thing of just promising a lot of untested stuff before the laws have solidified. Uber, Lime scooters, etc. The Tesla is a great car, but self-driving is orders of magnitude harder than he thinks.


Agreed. I'd also add that other car manufacturers have made tradeoffs on safety issues for decades.

So I wonder if it's more about Tesla capitalizing on the hype of self-driving cars (with the expensive self-driving add-on) in the short term and less about him misunderstanding the magnitude of difficulty.

Tesla is using the proceeds from that add-on to make themselves seem more profitable and to fund the actual development. It's smart in some respects, but very risky to consumers and to Tesla.


If you go back a few years, there were clearly expectations being set around L4/5 self-driving that have very clearly not been met.

I still wonder to what degree this was a collective delusion based on spectacular but narrow gains mostly related to supervised learning in machine vision, how much was fake it till you make it, and how much was pure investor/customer fleecing.


I learned this playing Gran Turismo video games way back when. The game has long endurance races (I seem to remember races that ran about 2 hours, but there may have been longer ones). Eventually you get hungry or thirsty or have to use the bathroom, so you pause the game, take care of business, and resume. It's really easy to screw up if the game was paused while your car was doing anything other than stable, straight travel. A turn that I successfully handled 100 times before can suddenly feel foreign and challenging if I resume there with little context.

Obviously that's not exactly the same thing as taking over for a real car when the driver assistance features give up, but seems similarly challenging to take over the controls at the most precarious moment of travel, without being sort of "warmed up" as a driver.


500 laps at Laguna Seca in a manual transmission car let's go!


I see a lot of comments here postulating how autopilot is a terribly designed feature from people who appear not to be speaking from first hand experience and now I feel compelled to comment, exactly following that HN pattern someone posted about how HN discussions go. That said thanks for keeping this discussion focused & framed as a system design one, doesn't feel like a Tesla hate train so I feel comfortable hoppin' in and sharing. This is a little refreshing to see.

Anyway, perhaps I'm in a minority here, but I feel as though my driving has gotten _significantly safer_ since getting a Tesla, particularly on longer road trips.

Instead of burning energy making sure my car stays in the lane I can spend nearly all my time observing drivers around me and paying closer attention farther down the road. My preventative and defensive driving has gone up a level.

> I just don't understand how you can have a "self-driving car, but you must be ready to put your hands back on the steering wheel and your foot on the pedal(s)".

I've avoided hitting animals and dodged random things rolling/blowing into the road at a moment's notice. This isn't letting autopilot drive; it's a hybrid act where it does the rote driving and I constantly take over to quickly pass a semi on a windy day, to avoid passing it on a curve, or to get over some lanes to avoid tire remnants in the road up ahead. I'm able to watch the traffic in front and behind and find pockets on the highway with nobody around me and no clumping bound to occur (<3 those).

To your suspicion, it is a different mode of driving. Recently I did a roadtrip (about half the height of the USA) in a non-Tesla, and I found myself way more exhausted and less alert towards the end of it. Could be I'm out of practice, but eh.

Anyway, so far I've been super lucky. I don't think it's possible to avoid all car crashes no matter how well you drive. But I _for sure_ have avoided avoidable ones and taken myself out of situations where they later occurred thanks to the extra mental cycles afforded to me by auto-pilot. My safety record in the Tesla is currently perfect and I'll try and keep it that way.

I don't think autopilot is perfect either but I do think it's a good tool and I'm a better driver for it. Autopilot has definitely helped me spend better focus on driving.


This expresses the mindset I find myself in when I use Autopilot. It's like enabling cruise control, you're still watching traffic around you but now you don't need to focus on maintaining the correct speed or worry about keeping your car perfectly in a lane. You can more or less let the car handle that (with your hands on the wheel to guard against the occasional jerky maneuver when a lane widens for example) while you focus on the conditions around you.


Exactly. It frees the driver from increasingly advanced levels of mundane driving (cruise control manages just speed, adaptive cruise also deals with following distance, lane keeping deals with most of the steering input, etc.), allowing the driver to focus more on monitoring the situation and the strategic portion of driving rather than the tactical. Of course, this relies on the driver actually doing that. They could just devote that extra attention to their phone.


My 2021 Subaru Forester does all of these things, and I do feel like I am safer with them on while paying attention to the rest of driving.


Exactly this. I treat AP like I'm letting a learner drive. Constantly observing to make sure it's doing the right thing. I've been on long road trips and with AP my mind stays fresh for much longer compared to with other cars.


The problem is, even if your subjective idea of how Autopilot affects your own driving is correct, it appears not to be the case for a significant subset of Tesla drivers, enough that they've been plowing into emergency vehicles at such an elevated rate as to cause NHTSA to open an investigation.

Also, your subjective impressions may be what they are simply because you have not yet encountered the unlucky set of conditions which would radically change your view, as was surely the case for all the drivers involved in these sorts of incidents.


There's zero Tesla hate here and certainly zero EV hate here, on the contrary: I just feel the interior build quality on the Tesla could be a bit better but I'm sure they'll get there.

I wouldn't want my comment (strangely enough, upvoted a lot) to be mistaken for Tesla hate. I like what they're doing. I just think the auto-pilot shouldn't give a false sense of security.

> I've avoided hitting animals and dodged random things rolling/blowing into the road at a moment's notice.

> I don't think it's possible to avoid all car crashes no matter how well you drive.

Same here... And animals are my worst nightmare: there are videos on YouTube that are just terrifying.

In fact, I regularly watch crash videos to remind myself of some of the dangers on the road.


I think you two are talking about different things.

You're talking about Autopilot, which is just driver assistance technologies: lane keep assistance, adaptive cruise control, blind spot monitoring, etc. It's not meant to replace driver attention; it just monitors sections of the road that the driver can't pay attention to full time. The driver remains in control and attentive to the road.

The person you're responding to seems to be talking about the Full Self Driving feature, whose initial marketing implied that the driver need not be mentally engaged at all, or could even be too impaired to drive normally. That was later backpedaled to say that you need to pay attention.


Some people activate cruise control and then rest their right foot on the floor. I activate cruise control whenever possible because while it is activated, I can drive with my foot resting on the brake pedal. I like being marginally more responsive to an event that requires braking since I don’t need to move my foot from the accelerator.


What I always tell people is that together me and my car drive better than either of us on their own (Tesla Model S 70D, 2015, AP1.5).


I drive a Tesla and don't use the self-steering feature exactly because of this. What I do instead is enable the warnings from the same software like the ones you describe. That is actually a large gain. I'm already paying attention as I'm driving the car at all times and the software helps me catch things I haven't noticed for some reason. Those features seem really well done as the false positives are not too frequent and just a nuisance but the warnings are often valuable.


Does it have/use emergency braking in case of danger, if you don't use self-driving?


Yes.


Yes but they are phasing out radar in favor of vision-only. Model 3 and Y have been shipping without radar braking for the past few months.


They still do emergency braking regardless of the sensor technology.


The vision-only system has passed all required tests for certification and Tesla themselves consider it to be a much safer system now.


It can be really jarring too when a car behaves differently than you expect. I regularly use cruise control on my Kia, which makes driving much less stressful. It keeps the car centered in the lane, more or less turns the car with the road, and of course matches the speed of the car in front of it with reasonable stopping distance. I wouldn't call it "self-driving" by any means, but if not for the alert that gets triggered if your hands are off the wheel too long, it'd probably go on its own for quite a long time without an incident.

However, I have also (once, so far) experienced what happens when this system encounters a poorly-marked construction zone. While most construction sites on the interstate system place temporary road lines for lane shifts, this one solely used cones. While I was paying attention and never left the flow of traffic, the car actually fought a little bit against me following the cones into another lane, because it didn't see the cones; it was following the lines.

It doesn't surprise me at all that if someone gets too comfortable trusting the car to do the work, even if they think they're paying attention, they could get driven off the roadway.


I was thinking about this the other day - driving in construction. The town I live in is currently doing water main replacement. So, lots of torn up roads, closed lanes and even single-lane only with a flagger alternating directions. No amount of safety cones will make it obvious what's going on.

How do automated systems deal with flaggers? Visibility of the stop/slow sign isn't sufficient to make a determination on whether it's safe to proceed (not to mention "stop" changes meaning here, entirely, from a typical stop sign). Often, whether or not you can proceed comes down to hand gestures from the flagger proper.

Not that I expect any reasonable driver to be using something like autopilot through such a situation, but we've also seen plenty of evidence that there are unreasonable drivers currently using these systems, as well.


Conceivably, in the somewhat-near future (10+ years), most cars on the road will have some sort of ADAS system, at which point I'd presume it'd start to make sense for construction zones to use some sort of digital signalling - something like a radio broadcast that can send basic slow/stop flagging signals to a lane of traffic.

Of course, the problem is, if we haven't developed it today, the ADAS systems of today won't understand it in ten years when there's enough saturation to be practical to use it. Apart from Tesla, very few car manufacturers are reckless enough to send OTA updates that can impact driving behavior.

Lane-following ADAS systems of today, mind you, can work relatively fine in construction areas... provided lane lines are moved, as opposed to relying solely on traffic cones.
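A minimal sketch of what such a digital flagging broadcast might look like. Everything here is invented for illustration - the message fields, states, and decision logic don't correspond to any real V2X standard:

```python
from dataclasses import dataclass
from enum import Enum

class FlagState(Enum):
    PROCEED = "proceed"
    SLOW = "slow"
    STOP = "stop"

@dataclass
class WorkZoneBeacon:
    # Hypothetical broadcast from a roadside flagging unit
    zone_id: str
    flag_state: FlagState
    speed_limit_kph: int   # advisory limit through the zone
    lane_mask: int         # bitmask of affected lanes, bit 0 = rightmost

def adas_response(beacon: WorkZoneBeacon, current_lane: int) -> str:
    """Decide how an ADAS should react to a work-zone beacon."""
    affected = bool(beacon.lane_mask & (1 << current_lane))
    if not affected:
        return "maintain"
    if beacon.flag_state is FlagState.STOP:
        return "hand over to driver and prepare to stop"
    if beacon.flag_state is FlagState.SLOW:
        return f"reduce speed to {beacon.speed_limit_kph} kph"
    return "proceed with caution"
```

A real system would of course need the beacons cryptographically signed, so that anyone with a radio couldn't stop a lane of traffic.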


It is my belief that the most ideal form of truly self-driving vehicles will not happen until vehicles can talk to each other on the road to make each other aware of position and speed data. I don't think this has to be full GPS coordinates at all; this is about short-range relative position information.

A mesh network of vehicles on the road would add the ability for vehicles to become aware of far more than a human driver can ever know. For example, if cars become aware of a problem a few km/miles ahead, they can all adjust speed way before encountering the constriction in order to optimize for traffic flow (or safety, etc.).

Of course, this does not adequately deal with pedestrians, bikes, pets, fallen trees, debris on the road, etc.

Not saying cars would use the mesh network as the sole method for navigation; they have to be highly capable without it. The mesh network would be an enhancement layer. On highways this would allow for optimization with some potentially nice benefits. For example, I can envision reducing emissions through traffic flow optimization.
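The adjust-speed-kilometers-early idea can be sketched as a simple taper: given a relayed hazard report, shed speed gradually over the remaining distance instead of braking hard at the constriction. This is a toy model; the message fields and the comfort-deceleration constant are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class HazardReport:
    # Hypothetical V2V message relayed through the mesh;
    # distance is relative, updated at each rebroadcast hop
    hazard_id: str
    distance_ahead_m: float
    advised_speed_kph: float

def plan_speed(current_kph: float, report: HazardReport,
               comfort_decel_kph_per_km: float = 20.0) -> float:
    """Target speed now, tapering toward the advised speed so the
    whole platoon slows smoothly instead of shockwave-braking."""
    km_ahead = report.distance_ahead_m / 1000.0
    # How much speed we can still shed comfortably before the hazard
    shed_budget = comfort_decel_kph_per_km * km_ahead
    ceiling = report.advised_speed_kph + shed_budget
    return max(report.advised_speed_kph, min(current_kph, ceiling))
```

With a hazard 5 km out there is no need to slow yet; at 1 km the same car would already be down to a gentle intermediate speed, and at the hazard itself it matches the advised speed.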

Remember that electric cars still produce emissions, just not necessarily directly while driving. The energy has to come from somewhere and, unless we build a massive number of nuclear plants, that somewhere will likely include a significant percentage of coal and natural gas power plants.

The timeline for this utopia is likely in the 20+ year range. I say this because of the simple reality of car and truck ownership. People who are buying cars today are not going to dispose of them in ten years. A car that is new today will likely enter into the used market in 8 to 10 years and be around another 5 to 10. The situation is different with commercial vehicles. Commercial trucks tend to have longer service lives by either design or maintenance. So, yeah, 20 to 30 years seems reasonable.


Yes. This also makes me kind of nervous when just using normal car adaptive cruise control. I feel as though my foot needs to be hovering near the pedal anyway and that's often less comfortable than actually pushing on the pedal and controlling it myself.


I agree with everything you said. I do hope that eventually the tech gets to the point where it can take over full time. We recently took a road trip for our vacation and the amount of road rage we witnessed was ... mind boggling. Don't get me wrong, not everyone is a raging asshole, but there were enough to make me wonder just why so many people are so freaking angry.


Exactly, the only driving assistance feature I use is adaptive cruise control, and I don't have plans to use anything more. If I trust autonomous systems too much, I would not be ready when it matters.


To me they are really aids. Of course you stay focused, but I found that they take away a lot of mental load: keeping the car straight, constantly tweaking the accelerator, etc.

It just makes trips easier on the brain, and thus, for me, safer overall: it's easier to see the overall situation when you've got free mental capacity.


The most useful button on my car is the speed limiter. Everything else can go.


Also, these cars know the speed limits for the road but let you set cruise control/self driving above the speed limit. Seems like for safety purposes that should not be allowed. Not only are people paying significantly less attention but they also are speeding.


> Also, these cars know the speed limits for the road

Does it always get this correct, or does it sometimes read a 30mph sign on a side road and then slow the car on the motorway down to that speed?


Different manufacturers probably use different systems, but no. BMW attempts to read speed limit signs using the frontal camera, mixed with some sort of stored info - it knows the speed limit is about to change (Mobileye?) - but very often it won't catch a sign in a bend or when the weather is bad. Also, it does not recognize time-restricted speed limits, for example 30 kph from 7:00 to 17:00 Monday to Friday, so it would keep driving 30 kph outside of those hours when 50 kph is allowed. In some places in Germany, it does not recognize the city limits and carries on showing 70 kph for a kilometer longer than it should.


I'm not sure how the cars know the speed limit. Maybe someone else knows? My guess is combo of GPS/camera to position correctly on road and the lookup of known speed limit data. Perhaps it reads signs though?


My car shows the speed limit of roads it knows. It uses GPS and stored limits. It also doesn't know the limits of non-major roads and doesn't attempt to show a limit then. My car is a 2013, and I've not paid the $$ to update the maps in that time (seriously, they want $200-$400 to update the maps).

Since I bought my car, Illinois (where I live) has raised the maximum limit on interstates by 10 MPH. My car doesn't know about it. If my car limited me to what it thought the limit was, I'd probably be driving 20 MPH slower than prevailing traffic, a decidedly unsafe situation.


The rental car I am using now certainly a) reads road signs for speed limit information, and b) is definitely fooled by signs on off-ramps etc.

It’s hard to imagine how speed limit systems would work without some sort of vision capabilities — a database of speed limits would never be up to date with roadworks and so on.
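One way a fused system could guard against the off-ramp-sign problem is a plausibility check between the camera reading and the stored map limit. A hypothetical sketch - the thresholds and the `road_class` notion are invented for illustration, and no real manufacturer's logic is implied:

```python
def effective_limit(map_limit_kph, camera_limit_kph, road_class):
    """Fuse a stored map limit with a camera-read sign, with a crude
    plausibility check so an off-ramp sign can't slam a motorway limit.

    Either input may be None (no map coverage / no sign seen)."""
    if camera_limit_kph is None:
        return map_limit_kph
    if map_limit_kph is None:
        return camera_limit_kph
    # On motorways, ignore camera readings that diverge wildly from
    # the map - they are probably signs for a ramp or side road
    if road_class == "motorway" and camera_limit_kph < map_limit_kph * 0.5:
        return map_limit_kph
    # Otherwise trust the fresher camera reading (roadworks, etc.)
    return camera_limit_kph
```

So a 50 kph ramp sign seen while the map says 130 kph motorway gets rejected, while a plausible 110 kph roadworks sign wins over the stale map.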


Nobody should be using autopilot driving through roadworks anyway.


Going slower than traffic is actually unsafe and increases the chances of collisions with other drivers.


Going slower than traffic happens all the time. Over the road trucks often have speed governors set to 60-70 mph for example.


Almost nobody drives at or below the speed limit. It’s dangerous to do so in many places.


That’s a feature accommodating realities of driving on public roads, not a bug.

If you drive on a 60mph speed limit highway, no one is driving 60mph, everyone is going around 70mph. If you decide to use autopilot and it limits you to 60mph, you singlehandedly start disrupting the flow of traffic (that goes 70mph) and end up becoming an increased danger to yourself and others.

Not even mentioning cases when the speed limits change overnight or the map data is outdated or if a physical sign is unreadable.


Over the road trucks often have speed governors, some companies limit their trucks to 60 mph because it saves a lot of fuel and leads to a much (50%) lower risk of collisions.


Apples to oranges. Stopping distance of a 16-wheeler is magnitudes larger than that of a typical sedan, so in their case it makes sense.

For specific numbers (after subtracting reaction distance being the same for both):

55mph: car 165ft, 16-wheeler 225ft. 65mph: car 245ft, 16-wheeler 454ft.

As you can see, the gap between a car's stopping distance and a 16-wheeler's increases as speed increases, and non-linearly at that. Not even mentioning the destructive potential of a car vs. a 16-wheeler.

I would agree with your point if the majority of the roads were occupied by 16-wheelers, but that isn't the case (at least in the metro area that I commute to work in).

Source for numbers used: https://trucksmart.udot.utah.gov/motorist-home/stopping-dist...

Note: I agree that it would be safer if everyone drove the exact speed limit, as opposed to everyone going 10mph above the speed limit. However, in a situation where everyone is driving 10mph above the speed limit, you are creating a more dangerous situation by driving 10mph slower instead of driving 10mph above like everyone else.
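The non-linear growth in those figures is what you'd expect from constant-deceleration kinematics, where braking distance scales with the square of speed (d = v²/2a). A quick sanity check in code; the ~0.61 g deceleration is an assumption chosen to roughly match the quoted 55 mph car figure, not a measured value:

```python
def braking_distance_ft(speed_mph: float, decel_g: float) -> float:
    """Idealized constant-deceleration braking distance: d = v^2 / (2a)."""
    v_fps = speed_mph * 1.46667        # mph -> ft/s
    a = decel_g * 32.174               # g -> ft/s^2
    return v_fps ** 2 / (2 * a)

# Going 65 vs 55 costs (65/55)^2 ~= 1.40x the braking distance,
# independent of whatever deceleration the vehicle can manage.
ratio = braking_distance_ft(65, 0.61) / braking_distance_ft(55, 0.61)
```

`braking_distance_ft(55, 0.61)` comes out near the quoted 165 ft for a car; the quoted truck figures imply a much lower effective deceleration, which is why the truck's distance grows faster still.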


> like the truck driver in a big curve slightly deviating out of his lane and "pushing" me dangerously close to the median strip for example

This is a situation that you simply shouldn’t put yourself in. There is no reason to ever drive right next to a large vehicle, on either side, except for very short periods when overtaking them on a straight road.


This just isn't realistically possible on most highways except in the lightest traffic conditions. You are gonna spend some time beside trucks whether you like it or not.


Spending time right next to a truck is completely optional. You can either speed up or slow down, either of them will put you in a position where you are no longer right next to them.


What if there is a more or less uninterrupted row of trucks in the right lane?


We can play “what if” all day, but I’m not interested. In 99.9% of cases you can and should avoid driving next to a large vehicle.


Fully automated self-driving cars are either a pipe dream or decades away, in which case many more people will be killed on the road in the name of technological progress.


Fully autonomous cars are already a reality with Waymo in AZ, and AutoX and Baidu in China. I don't know how safe the Chinese companies are, but Waymo's safety record [1] is nothing short of stellar.

[1] https://waymo.com/safety


Good for Waymo, and hopefully Google keeps up this science project. But it's a very limited and almost as perfect an environment as you could have outside of a controlled test area. Those who said L4/5 was at least decades away seem to have been on the right track. Kids growing up today are still going to have to learn to drive.


L5 may be decades away. I think we will see L4 in some major metro areas in the US by end of this decade. SF is heating up with Cruise and Waymo's heavy testing. Their progress will be a great indicator for true city driving.


>we will see L4 in some major metro areas in the US by end of this decade

I think you're far more likely to see L4 on limited access highways in good weather. A robotaxi service in a major city seems much more problematic given all the random behavior by other cars, pedestrians, cyclists, etc. and picking up/dropping off people in the fairly random ways that taxis/Ubers do. (And you'll rightly be shut down 6 months for an investigation the first time you run over someone even if they weren't crossing at a crosswalk.)

And for many people, including myself, automated highway driving would actually be a much bigger win than urban taxi rides which I rarely have a need for.


Waymo selected the one state willing to entirely remove any safety reporting requirements for self-driving cars as the place to launch their service. Regardless of what they claim to the contrary, if they had confidence in their safety record, they would've launched it in California, not Arizona.

Waymo has lied about the capabilities of their technology regularly, and for that reason alone, should be assumed unsafe. A former employee expressed disappointment they weren't the first self-driving car company to kill someone, because that meant they were behind.


> Regardless of what they claim to the contrary, if they had confidence in their safety record, they would've launched it in California, not Arizona.

California only months ago opened up permits for paid robotaxi rides. So no, they couldn't have launched it in CA. If you've noticed, they actually are testing in SF with a permit.

> Waymo has lied about the capabilities of their technology regularly, and for that reason alone, should be assumed unsafe.

What lies? Their CA disengagement miles are for everyone to see, their safety report is open, they have had 0 fatalities in their years of operation. Seems like you just made this up.


> California only months ago opened up permits for paid robotaxi rides. So no, they couldn't have launched it in CA.

Well, yeah, that's the logic of an established business. Disruptive startups flout laws rather than following them.


I recall a particular incident where Waymo was marketing their car being able to drive a blind man to a drive-thru, way before the thing could safely drive more than a mile on its own. My understanding is that in 2021, it still can't navigate parking lots (which would preclude using it for drive-thrus).

Later, they were talking about how sophisticated their technology was: It can detect the hand signals of someone directing traffic in the middle of an intersection. Funny that a few months later, a journalist got an admission out of a Waymo engineer that the car wouldn't even stop at a stoplight unless the stoplight was explicitly mapped (with centimeter-level precision) so the car knew to look for it and where to look for the signal.

https://www.technologyreview.com/2014/08/28/171520/hidden-ob...

The article is seven years old at this point, but it's also incredibly humbling in how much bull- Waymo puts out, especially compared with the impression their marketing team creates. (Urmson's son presumably has a driver's license by now.)

In at least one scenario, the former Waymo engineer upset that he had failed to kill anyone yet ("I’m pissed we didn’t have the first death") caused a hit-and-run accident with a Waymo car and didn't report it to authorities, amongst other serious accidents: https://www.salon.com/2018/10/16/googles-self-driving-cars-i... Said star Waymo engineer eventually went to prison for stealing trade secrets and then got pardoned by Donald Trump. Google didn't fire him for trying to kill people; they only really got upset with him because he took their tech to Uber.

I'd say Waymo has a storied history of dishonesty and coverups, behind a technology that's more or less a remote control car that only runs in a narrow group of carefully premapped streets.


> I recall a particular incident where Waymo was marketing their car being able to drive a blind man to a drive-thru, way before the thing could safely drive more than a mile on it's own.

How is a marketing video from 2015 relevant to their safety record? They weren't even operating a public robotaxi service back then.

> My understanding is that in 2021, it still can't navigate parking lots (which would preclude using it for drive-thrus).

Completely false. Here is one navigating a Costco parking lot (it can't get any busier than that) [1]. If you watch any videos on that YouTube channel, it picks you up and drops you off right in the parking lot. Yes, you can't use it for drive-thrus, but that doesn't qualify as "lying about capabilities".

> Later, they were talking about how sophisticated their technology was: It can detect the hand signals of someone directing traffic in the middle of an intersection. Funny that a few months later, a journalist got an admission out of a Waymo engineer that the car wouldn't even stop at a stoplight unless the stoplight was explicitly mapped (with centimeter-level precision) so the car knew to look for it and where to look for the signal.

Here is one recognizing a handheld stop sign from a police officer while it stopped for an emergency vehicle [2].

[1] https://www.youtube.com/watch?v=p5CXcJD3mcU

[2] https://www.youtube.com/watch?v=MpDbX1FViWk&t=75s


The workers doing road repairs in my neighborhood don't even use handheld stop signs. Just vague and confusing gestures.


I think in those cases a Waymo vehicle would probably require remote assistance. It's a really difficult scenario for a computer to make sense of.


> in which many more people will be killed on the road in the name of technological progress

Seeing as car crashes are the leading cause of death for people aged 1-54, it may be an improvement over the status quo.

More than 38,000 people die every year in crashes on U.S. roadways. The U.S. traffic fatality rate is 12.4 deaths per 100,000 inhabitants. An additional 4.4 million are injured seriously enough to require medical attention. Road crashes are the leading cause of death in the U.S. for people aged 1-54.


I'd say it depends on how many of those deaths are caused by the driver doing something unsafe. I'd be more comfortable with higher traffic deaths that primarily affect bad drivers than a lower number of deaths randomly spread across all drivers by a blackbox algorithm.


If you are texting while driving and hit a stopped car or run a red light, you are very likely to kill others. Actually more likely, as a side impact is more dangerous than a frontal one.


But the car doesn’t need to drive itself to avoid those factors; it just needs radar auto-braking.


> Road crashes are the leading cause of death in the U.S. for people aged 1-54

This isn't true according to the CDC. Cancer and heart disease lead for the 44-54 group, and while "accidental injury" does lead from 1-44, if you break down the data, in many cases vehicle-based accidents are not the largest single source. For example:

Drowning is the largest single cause in 1-4

Cancer is the largest single cause in 5-9

Suicide is the largest single cause 10-14

https://wisqars-viz.cdc.gov:8006/lcd/home


Will changes such as machine-readable road markings, car-to-car communications, and traffic management systems make this happen quicker?

For example, couldn't emergency vehicles send out a signal, directly to autonomous vehicles or via a traffic management system, to slow down or require the driver to take over when approaching? An elementary version of this is Waze, which will notify you of road hazards or cars stopped on the side of the road.



