Waymo is currently under investigation for multiple incidents, not all of which it had previously disclosed to the NHTSA [0]. The recent light pole incident also doesn't help [1].
If they are doing 50k rides a day, then they would appear to have a remarkable safety record.
It will be interesting to see if these investigations lead to a repeat of the Cruise debacle or if this will become the price of doing business.
Anecdata, but watching the Waymo cars compared to Cruise (pre-ban) was night and day. Before Cruise was banned in SF, I would often see them violate traffic laws and fail to navigate basic intersections. Waymo isn't perfect, but it's better than Cruise and the average SF driver, which is good enough for me.
that's cool. until it's not. it's very easy to release an update that stops less fully at stop signs, then watch the data show profit going up and accidents not going up. same with code updates that make cyclists' lives worse, unless there's an actual change in a KPI they track. you're not really their main concern, especially after they IPO and get acquired by Apollo or the billionaire du jour
In the United States there are few legal repercussions when a human driver kills someone as long as they are sober and utter the phrase "I didn't see them". Therefore, biking on US roads means trusting in the inherent goodness (and attentiveness) of the drivers around you.
Driverless cars run by a company protecting itself from reputational and legal risk seems less dystopian than the status quo.
Yes, I don't understand how anybody who's ever ridden a bike in a major American city isn't super excited about high-quality self driving vehicles. The crazy stuff I see on a daily basis while out biking in Seattle (and statistically we are one of the best places to bike in the US) means I can't wait until these things take over :-)
I apologize if I was unclear. In response to a cyclist saying they prefer being near Waymo vehicles to human drivers you said:
>> that's cool. until it's not...same with code updates that will make cyclist life worse...you're not really their main concern
I agree and expect that the wide safety tolerances driverless cars currently have will become tighter as they gain more experience, and that this will make them more efficient but potentially less pleasant to be around than they used to be.
But even if pedestrian and cyclists lives are not a main concern for self-driving car companies, some concern is better than none. For some human drivers their concerns seem to be things like not getting arrested, getting to their destination as quickly as possible, checking social media to satisfy their boredom, and not scratching the paint on their vehicle. Some drivers consider vulnerable road users like cyclists to be sub-human [1].
My point is that the bar in the US has been set so incredibly low that even if code updates make their products worse for cyclists than they used to be, or even kill some vulnerable road users, they may still be safer and preferable to the incompetence and complete indifference of human drivers.
Having said that, the same calculus may not apply in countries that don't issue drivers a license to kill people, so the bar for driverless cars is likely to be much higher in such places.
If a Waymo hits a cyclist and the cyclist dies, and Waymo is found to be at fault, that's definitely going to make headlines and potentially lead to a pause of the entire operation.
That's just being a cynic for cynicism's sake. They are already owned by a billion-dollar company, so there is no IPO. And they still have at least a couple of decades where the game they need to play is getting riders and legislators to trust them, so they are incentivized to make their cars very safe so they can roll out to more cities and countries. It takes one bad accident for the public to turn against them, and there is no technological edge that can save you if the government decides to make your entire business illegal.
Because once the current safety scrutiny has passed, you might get more trips done by setting the AI to be more aggressive in traffic. Then you are into VW-style software updates with a profit motive and no mechanism to hold them accountable?
>And not sure why you think running stop signs or any anti safety measures would increase profits.
Because these big companies like Google are actually evil. As an example, the mobile YouTube app does not let you use it if you turn off the screen. So Google decided that wasting energy and killing batteries is an acceptable thing to do; this is pure evil. I would accept them adding more advertising or whatever, but shortening the lifespan of a device and wasting energy is truly evil shit.
> The car went onto a freeway, where it travelled past an on-ramp. According to people with knowledge of events that day, the Prius accidentally boxed in another vehicle, a Camry. A human driver could easily have handled the situation by slowing down and letting the Camry merge into traffic, but Google’s software wasn’t prepared for this scenario. The cars continued speeding down the freeway side by side. The Camry’s driver jerked his car onto the right shoulder. Then, apparently trying to avoid a guardrail, he veered to the left; the Camry pinwheeled across the freeway and into the median. Levandowski, who was acting as the safety driver, swerved hard to avoid colliding with the Camry, causing Taylor to injure his spine so severely that he eventually required multiple surgeries.
> Levandowski and Taylor didn’t know how badly damaged the Camry was. They didn’t go back to check on the other driver or to see if anyone else had been hurt. Neither they nor other Google executives made inquiries with the authorities. The police were not informed that a self-driving algorithm had contributed to the accident.
> According to former Google executives, in Project Chauffeur’s early years there were more than a dozen accidents, at least three of which were serious. One of Google’s first test cars, nicknamed kitt, was rear-ended by a pickup truck after it braked suddenly, because it couldn’t distinguish between a yellow and a red traffic light. Two of the Google employees who were in the car later sought medical treatment.
It was a long time ago, but Larry Page was well aware of it; imagine if that incident had received fair coverage and investigation.
I am having trouble imagining this scenario in a way that makes Waymo look as bad as you imply. It sounds like the human-driven vehicle, if it was "boxed in" on an on-ramp, needed to slow down and merge rather than racing to pass on the right, running off the road, and causing a spectacular single-vehicle wreck. As described in that paragraph, it seems like ironclad proof of the need to promptly relieve humans of driving tasks.
> It doesn't make the tech look bad, but to me it makes the safety driver & the other executive look callous and uncaring.
The safety driver was Anthony Levandowski, who left Google for Uber, taking a bunch of stolen IP with him. At Uber he ran a cowboy self-driving car division that got a pedestrian killed. Google sued Levandowski, he was sentenced to prison, and Uber laid off the entire division. Later he was pardoned by Trump.
So good news: the callous and uncaring safety driver has been fired, sued, and sentenced to prison.
Yeah, the worst read about the car here would be "it's not very courteous in merge situations" in which case I implore anyone reading to drive in Maryland one single time.
This is especially true for comments that disagree with Googlers, likely because there are soooo many Googlers now and they (and Waymo) have highly aligned perspectives. Especially on Google-launch-related posts, 5-10 point swings in 24hrs can happen.
A good piece by an ex-Googler on his 2-year journey toward recognizing the scale of institutional thought at Google: https://mtlynch.io/why-i-quit-google/
Levandowski stole Waymo trade secrets, and only escaped the full consequences of his actions because of a Trump pardon. He is not representative of anything about Waymo in 2024.
Larry Page was an ardent supporter of Levandowski, and this illustrates Waymo's core safety culture: that they're above regulation and above the law. It's the same mindset on display in Google's antitrust trials.
[0] https://www.reuters.com/business/autos-transportation/us-saf...
[1] https://www.youtube.com/watch?v=HAZP-RNSr0s