
>pretending you could add self-driving with just cameras in an over-the-air update (Tesla)

I have watched enough recent Tesla self-driving ride along videos on YouTube to suspect you might be mistaken on this point. Tesla intends to launch a cybertaxi fleet and their software looks like it will be good enough to get them there without lidar or additional sensors.



There are no Teslas that have ever taken a trip without an operator behind the wheel. The idea that there will be a near-future discontinuity after which a Tesla will be able to serve as a robotaxi is pretty ridiculous.

I just watched the latest video from AIDRIVR on YouTube. AIDRIVR is a TSLA pumper-and-dumper who has dedicated their channel to uncritical praise of FSD. In the first third of the video, FSD v12 runs two stop signs: once pulling directly into oncoming traffic at a one-way traffic control, and once at a stop where the cross traffic does not stop. This stuff is not even a little bit ready for fully supervised operation. https://youtu.be/fpoXr_z_6a4?t=565


>This stuff is not even a little bit ready for fully supervised operation. https://youtu.be/fpoXr_z_6a4?t=565

Thanks for sharing. I skimmed through the video and watched a fair amount of it. I got a different impression.

I thought it was impressive how FSD 12 navigated narrow winding roads with parked cars, oncoming traffic, and flaggers holding signs that alternate between stop and slow. My impression was that while it's not perfect, it's a few iterations away from having very few situations that require disengagement. And keeping in mind that every disengagement is a learning and improving moment for FSD, the following iterations should continue to get more impressive.


Here’s a Tesla failing to detect a train and crashing into a railroad crossing gate: https://www.reddit.com/r/SelfDrivingCars/s/AghLi791rO

This isn’t just “a few iterations away”.

Most of what you said has been repeated for years from people who just watch curated YouTube videos. I’ve driven on FSD v12 and I’ve intervened multiple times almost every single drive. It’s nowhere near ready and will likely never be with that sensor suite.


I'll have to take your word for it. I don't have a Tesla and have no experience with FSD. The only thing I can say is that the videos I have watched are recent with no jump cuts or editing. Were they curated trips among many other trips that weren't posted? I have no way of knowing. The feedback seems to be that FSD 12 is noticeably better than previous versions.

I suspect that Tesla is paying attention to the disengagement events and working hard to minimize them in the future, but I truly have no idea.

Also, I am sure this question has been asked before, but what is good enough for FSD? Perfect in every situation? Better than the average human driver? On par with or better than an expert professional driver? I don't have an answer personally, but I am curious what others think.


Not everyone records themselves using FSD like YouTubers do. So you’re not seeing drives where it screws up.

What’s good enough for FSD is being able to do it without a driver present like Waymo does. Their crowdsourced reliability data in https://www.teslafsdtracker.com/ suggests they need at least 3 orders of magnitude improvement to remove the driver.


Waymo spent a long time at about that level as well, and it's a common situation with AI: you can get 80% of the way there really quickly, then the next 19% takes orders of magnitude longer, and the next 0.9% is even harder, etc. (Pretty much anyone who's ever tried to apply a neural net to a real problem has encountered this.) Self-driving cars need a level of reliability basically unheard of for AI, or even for a classical software system of similar complexity. There are more complex software systems, and there are systems with higher reliability, but the product of complexity and reliability demanded of self-driving cars is extreme.

A liability-viable self-driving car needs to be reliable enough that you would expect zero significant errors in a typical journey. That's around the point where you will only get a few articles a month about one of your thousands of cars going wrong. Commercial viability requires better still.
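The "zero significant errors in a typical journey" bar can be sketched with toy numbers. This is a back-of-envelope illustration only; the miles-between-errors figures below are hypothetical assumptions, not data from the tracker linked upthread.

```python
# Back-of-envelope reliability gap. All numbers are hypothetical
# placeholders, not measurements.

def trips_per_error(miles_between_errors: float, trip_miles: float = 10.0) -> float:
    """Expected number of error-free trips per significant error,
    assuming errors are spread uniformly over miles driven."""
    return miles_between_errors / trip_miles

today = trips_per_error(500)        # assume one significant error per 500 miles
target = trips_per_error(500_000)   # assume driver removal needs one per 500,000 miles

print(f"improvement needed: {target / today:,.0f}x")  # improvement needed: 1,000x
```

Under these made-up numbers the gap is a factor of 1,000, the same "3 orders of magnitude" ballpark mentioned upthread; the point is that each additional order of magnitude is the hard part, not the easy part.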


Here's a timeline of Musk predicting, every year since 2015, that FSD will be solved within the next year: https://motherfrunker.ca/fsd/

It is one thing to cherry-pick flawless drives on a sunny day and upload them to YouTube, with someone behind the wheel ready to take over the glorified driving-assistance system. It is another to run a commercial driverless service open to the public 24/7 in one of the biggest urban areas, knowing that riders will record everything, while assuming accident liability and keeping a clean safety record without anyone behind the wheel.


> cherry-pick flawless drives on a sunny day

Tesla FSD sucks extra bad on sunny days in fact, due to its basic optical systems.


I think they do great! Except for the occasional stationary emergency vehicle, bridge pile, etc. https://www.theguardian.com/technology/2024/apr/26/tesla-aut...


Tesla 'intends' to do a lot of things, but rarely seems to be able to actually DO them.


Ya, other than becoming the best selling EV company and revolutionizing the entire EV industry...


Doesn't necessarily mean they'll succeed at everything they try.



