Here’s a Tesla failing to detect a train and crashing into a railroad crossing gate: https://www.reddit.com/r/SelfDrivingCars/s/AghLi791rO

This isn’t just “a few iterations away”.

Most of what you said has been repeated for years by people who only watch curated YouTube videos. I've driven on FSD v12, and I've had to intervene multiple times on almost every single drive. It's nowhere near ready, and it likely never will be with that sensor suite.



I'll have to take your word for it. I don't have a Tesla and have no experience with FSD. The only thing I can say is that the videos I have watched are recent with no jump cuts or editing. Were they curated trips among many other trips that weren't posted? I have no way of knowing. The feedback seems to be that FSD 12 is noticeably better than previous versions.

I suspect that Tesla is paying attention to the disengagement events and working hard to minimize them in the future, but I truly have no idea.

Also, I am sure this question has been asked before, but what is good enough for FSD? Perfect in every situation? Better than the average human driver? On par with or better than an expert professional driver? I don't have an answer personally, but I am curious what others think.


Not everyone records themselves using FSD the way YouTubers do, so you're not seeing the drives where it screws up.

What's good enough for FSD is being able to operate without a driver present, like Waymo does. The crowdsourced reliability data at https://www.teslafsdtracker.com/ suggests they need at least 3 orders of magnitude of improvement before the driver can be removed.
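For what it's worth, the "orders of magnitude" framing is just a log-ratio of miles per critical disengagement. A minimal sketch, with entirely hypothetical figures (the real numbers would come from the tracker and from Waymo's published safety reports):

```python
import math

def orders_of_magnitude_gap(current_mpd: float, target_mpd: float) -> float:
    """Log10 gap between current and target miles per critical disengagement."""
    return math.log10(target_mpd / current_mpd)

# Hypothetical illustration only: ~100 miles per critical disengagement today
# vs. ~100,000 miles between incidents as a bar for driverless operation.
print(orders_of_magnitude_gap(100, 100_000))  # 3.0
```

So "3 orders of magnitude" means the system would need roughly 1,000x fewer critical interventions per mile, not a 3x or even 30x improvement.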



