Hacker News

Any anomaly makes the space non-driveable. It’s a road. An autopilot should not be entrusted to decide further, beyond anomaly detection sufficiently far ahead to allow for braking.


If there were anywhere that could have had full autonomy even a decade ago, it would be train networks, and it's not for lack of obstacle detection technology that they don't have it. The problem is contextually dependent enough, even on an entirely linear rail network, that humans need to add their problem-solving finesse regularly.

I just don't think autonomous cars are a problem worth working on if the requirement is that they must work on existing road infrastructure. We are going to get half-baked, not-really-autonomous vehicles, and the manufacturers are going to shift blame to the drivers because they should have known better than to use their vehicle's advertised autonomy in (x) specific circumstances, where x is the infinitely variable reality of the outside world.


The idea originally expressed by jvanderbot presumes dense non-visual (e.g. LIDAR) scanning and simple anomaly detection without AI “black boxes”.

With that in mind, is this really a question of context? The presence of an entity that reflects LIDAR should be enough of a reason for the system to brake.

Yes, this would imply a lower threshold for what counts as a situation that warrants braking, and hence on many ordinary streets the autopilot may become unusable, or car movement could feel too slow.
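The "dumb" rule being described can be sketched in a few lines. Everything below (the deceleration, reaction time, and margin values, and the function names) is an illustrative assumption, not any real system's behavior:

```python
# Hedged sketch of a context-free LIDAR brake rule: no AI, just a
# threshold on raw range returns. All numbers are illustrative.

def braking_distance_m(speed_mps: float, decel_mps2: float = 6.0) -> float:
    """Distance needed to stop from speed_mps at constant deceleration."""
    return speed_mps ** 2 / (2 * decel_mps2)

def should_brake(lidar_ranges_m: list[float], speed_mps: float,
                 reaction_s: float = 0.2, margin_m: float = 2.0) -> bool:
    """Brake if ANY return lies inside the stopping envelope.
    Deliberately context-free: every anomaly is treated as non-driveable."""
    stop_envelope = (speed_mps * reaction_s          # distance covered before braking begins
                     + braking_distance_m(speed_mps)
                     + margin_m)
    return any(r <= stop_envelope for r in lidar_ranges_m)

# At 25 m/s (90 km/h) the envelope is 25*0.2 + 625/12 + 2 ≈ 59 m,
# so a return at 50 m triggers braking while one at 80 m does not.
print(should_brake([80.0, 50.0], 25.0))  # True
print(should_brake([80.0], 25.0))        # False
```

The overly broad trigger condition is exactly why such a system would feel unusable on a busy street: any reflecting entity inside the envelope stops the car.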

To your point about rail network, autonomous rail lines do exist—I’ve used the new light rail system in Macau, and trains are entirely driverless. (From my observation of their behavior, they do not appear to be controlled remotely either.) I imagine some factors that impede innovation in this space do apply to cars (e.g., track security), but others don’t (guaranteeing stopping distance for long trains with heavy cargo, keeping people employed, arguably less flexible infrastructure).


Of course it's still a question of context; there are more factors at play than "Should I stop, Y/N", such as "Will stopping actually work given the conditions and my payload? Is the object moving toward me? Should I swerve instead?"

My point is if the best we can do is a LIDAR detection auto-stop then that's not full autonomy, and I question how close we'll actually get to contextually aware autonomy.

> To your point about rail network, there are autonomous rail lines—I’ve used the new light rail system in Macau, and trains are entirely driverless.

That is actually really cool to hear. I know I come across as a Luddite by having a pessimistic view on car autonomy, but I will be happy if we can crack it.


> Will stopping actually work given the conditions, my payload

If there is rain and/or your payload is heavy, your braking distance increases, so detection should either work further ahead or your speed would have to be reduced to allow for safe braking.
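Back-of-the-envelope numbers for this trade-off, assuming the standard friction model d = v²/(2μg) and textbook-style friction coefficients (roughly 0.7 dry vs 0.4 wet; both values are assumptions):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def braking_distance(v_mps: float, mu: float) -> float:
    """Braking distance from speed v_mps with friction coefficient mu."""
    return v_mps ** 2 / (2 * mu * G)

def max_speed_for_distance(d_m: float, mu: float) -> float:
    """Highest speed that can still stop within d_m at friction mu."""
    return math.sqrt(2 * mu * G * d_m)

v = 27.8  # ~100 km/h
dry, wet = braking_distance(v, 0.7), braking_distance(v, 0.4)
print(f"dry: {dry:.0f} m, wet: {wet:.0f} m")  # wet distance is 75% longer
print(f"speed to still stop within {dry:.0f} m when wet: "
      f"{max_speed_for_distance(dry, 0.4) * 3.6:.0f} km/h")
```

So keeping the same detection range on a wet road means dropping from about 100 km/h to about 76 km/h, which matches the point above: either detect further ahead or slow down.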

> is the object moving toward me

This is a very valid point. If not all vehicles are autonomous, even on a well-protected highway it is possible to have a cascading accident caused by a human mistake. This would be an issue even if all vehicles were autonomous, since some could be hijacked by their owners.

In my view this type of “dumb” object detection for emergency braking purposes should work together with contextually aware AI-based overall autonomy, not instead of it. Upthread I wrote “an autopilot should not be entrusted to decide further”, which was perhaps ambiguous: I meant that in the context of an anomaly ahead.


Without some incredible breakthrough in AI, trying to adapt autonomous cars to the regular road network is like swimming up a cascade. We're far more qualified to adapt the road infrastructure to autonomous driving, perhaps even remove most of the cars and driving in the process, leaving only the bare minimum.


But if the road becomes non-drivable at a point ahead of where you can stop in time, you need to evaluate what to do (hit it, try to evade, etc.), which are all hard decisions.

> An autopilot should not be entrusted to decide further, beyond anomaly detection sufficiently far ahead to allow for braking.

This isn't possible, not just technically but theoretically. Situations on streets can change way too fast, both through external factors and other cars. Even on a straight highway with fences on the side to prevent anything from getting onto it, this would still not work. Once something happens, an immediate decision is needed, which cannot be reached in time by delegating to the driver (as the driver wasn't driving, you can't expect a good reaction speed even if that person had their eyes on the road).


> This isn't possible, not just technically but theoretically. Situations on streets can change way too fast.

Then drive slower. It's a speed limit, not a speed minimum.

When I was training for my driver's license (in Denmark), my instructor told me to never drive so fast that I wouldn't be able to brake to a complete stop within the currently visible distance.


My instructor (in the UK) gave me the same instruction. To be fair, it's pretty intuitive. It's also remarkable how few people follow it (e.g. they don't slow down sufficiently around blind bends, or when there are pedestrians close by).
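The instructors' rule can be solved for speed: given a sight distance d, find the largest v such that reaction distance plus braking distance fits inside it, i.e. v·t + v²/(2a) = d. The reaction time and deceleration below are illustrative assumptions:

```python
import math

def max_safe_speed(sight_m: float, reaction_s: float = 1.0,
                   decel_mps2: float = 6.0) -> float:
    """Largest speed (m/s) satisfying v*t + v^2/(2a) <= sight_m.
    Positive root of the quadratic v^2/(2a) + t*v - sight_m = 0."""
    a, t = decel_mps2, reaction_s
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * sight_m)

# Around a blind bend with 30 m of visibility:
print(f"{max_safe_speed(30.0) * 3.6:.0f} km/h")  # 50 km/h
```

Under these assumptions, 30 m of visibility caps the safe speed at roughly 50 km/h, which illustrates how far off many drivers are when they take blind bends without slowing down.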


Eliminate situations where you can’t stop in time. If impossible, then autopilot should not be marketed as such.

On highways, enough clearance from the sides of the road and wide enough detection beam cone may be able to help. Yes, I imagine in such a case high-speed motorways would warrant security measures similar to those around high-speed train tracks.
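For the "wide enough detection beam cone" point, simple trigonometry gives the half-angle needed for the cone to span the full carriageway at the edge of the stopping envelope. The widths and range below are illustrative assumptions:

```python
import math

def required_half_angle_deg(road_width_m: float, range_m: float) -> float:
    """Cone half-angle so that coverage width equals road_width_m at range_m."""
    return math.degrees(math.atan((road_width_m / 2) / range_m))

# Covering a 15 m wide carriageway (lanes plus shoulders) out to a 100 m envelope:
print(f"{required_half_angle_deg(15.0, 100.0):.1f} degrees")  # 4.3 degrees
```

A cone that narrow covers the road itself but not the clearance zones beside it, which is why side clearance and fencing would still matter.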

On denser city streets where restricting pedestrian access is not feasible nor desirable[0], appropriate (yes, probably pretty low) speed limits could reduce braking distance and hence the chance of a tragic accident to below what currently happens with human drivers.

[0] This concerns existing streets. For new areas, infrastructure that separates car and pedestrian traffic with tunnels/bridges and accessible over-/underpasses could address that naturally.


Love this idea. If autopilot cannot be engaged on an unsafe road, then maybe we will actually complain enough to improve roads and supporting infrastructure.



