Of course, if the code wasn't available in the first place, the AI wouldn't be able to read it.
It wouldn't qualify as "open source", but I wonder if OP could have some sort of EULA (or maybe it would be considered an NDA). Something to the effect of "by reading this source code, you agree not to use it as training data for any AI system or model."
And then something to make it viral. "You further agree not to allow others to read or redistribute this source code unless they agree to the same terms."
My understanding is that you can have such an agreement (basically a kind of NDA) -- but if courts rule that AI training is fair use, training on the code could never be a copyright violation, only a breach of that contract. Contract violations can only yield economic damages, not the massive statutory penalties that copyright infringement carries.
I could ask this question without AI. How are you going to notice that while you were working on ~/projects/acme3000, you for some reason deleted ~/photos/2003/once-in-a-lifetime-holiday/?
Of course, AI is not a real person, and it does make mistakes that you or I probably would not. However, this class of mistake—deleting completely unrelated directories—does not appear to be a common failure mode. (Something like deleting all of ~ doesn’t count here—that would be immediately noticeable and could be restored from a backup.)
(Disclaimer: I’m not OP and I wouldn’t run Claude with --dangerously-skip-permissions on my own system)
In fact, I’ve noticed that whenever a book is at its most exciting, I start reading especially fast (to the point of skipping words) because I want to know what happens next. So I spend the least time with the best parts of the best books.
Ever since I realized that, I have switched pretty much exclusively to audiobooks. I don't really know if it's faster or slower overall, but it's a predetermined pace, and that works better for me.
For me, moving my lips while reading is a surefire way to significantly slow down the pace. I do this all the time when giving a document a final proofread before publishing.
I do something similar, only I keep my lips shut and move my tongue and throat as if I were speaking. I find it's an intermediate speed between conversation speed and purely reading with my eyes. I started doing it when I wasn't so good at English, to give myself time to understand the text and to practice the mechanics of English speech when I didn't have anyone to talk to, but I find that keeping myself at this pace gives me maximum comprehension. I have a friend who reads much faster than me, and he quite often misses points in whatever he's reading. I think he got into that habit from reading literature, but it's disastrous when reading something more densely packed with information, like technical documentation.
This is definitely a disadvantage of audiobooks, although I’ve found having access to a `skip backwards 15 seconds` button can help a lot.
(There was one point during Riyria Revelations where a character was explaining how Elven succession works. After repeating the sequence a bunch of times, I finally had to get out my laptop and take notes.)
I feel the same way. It goes away if I already know what's going to happen. For this reason I strongly recommend reading the things you love a second time.
Neither a lack of traffic lights nor cell service should cause the Waymos to stop in the middle of the road, that’s really troubling. I can understand the system deciding to pull over at the first safe opportunity, but outright stopping is ridiculous.
Perhaps this is by design. Cruise had a failsafe system that detected a collision and decided to pull over, but in pulling over it dragged a person underneath the car (or something close to that scenario). Maybe this dumb-looking failsafe was designed not to repeat Cruise's mistake?
Certainly a better way to handle this would have been to pull over. I think stopping wherever it happened to be is only acceptable if the majority of sensors fail for some reason.
I was there. I encountered multiple stopped Waymos in the street. It was annoying, but not dangerous. They had their lights on. Any driver following the rules of the road would get around them fine. It was definitely imperfect, but safe. Much safer than the humans blowing through those very same intersections.
When I was a young man, I worked at a restaurant, and the lights went off.
Being the hero I was, I wanted to keep the show running: I bought some candles, the ovens worked fine, the water worked fine (for now), and I wanted to charge cash. But eventually the big boss came and shut us down since the power wasn't coming back.
And he was right: cooking and working under those conditions is dangerous for the staff, but also for the clients. Without light you cannot see the food, cannot inspect its state, whether it's stale, has visible fungi, etc...
Yes, the perfect worker would still operate under those conditions, but we are not perfect. Admitting that we can only provide 2 or 3 nines, and recognizing when we are in that 0.01% moment, is what keeps us from actually failing so catastrophically that we undo all of the progress and benefits that the last bit of availability would have given us.
> but also for the clients. Without light you cannot see the food, cannot inspect its state, whether it's stale, has visible fungi, etc...
...I have to say, I'm pretty skeptical of this one. I've eaten in lots of dark restaurants, sometimes lit pretty much just by candles on the table. Seems to work fine.
AIUI, it was the irregularity of the uncontrolled intersections combined with the “novel” (from the POV of the software) driving style of the humans. In dense areas during outages, signaled intersections don’t actually degrade to four-way stops; drivers act pretty poorly.
The normal order and flow of traffic broke down. The software determined it was now outside its safe parameters and halted.
Certainly not ideal, and there should be a very strong regulatory response (the government should have shut them down) and meaningful financial penalties (at least for repeat incidents).
Waymos rely on remote operators to take over when the vehicle doesn't know what to do. Obviously, if the remote connection is gone, that help is no longer available, and one might speculate that the cars then "fail safe" by not proceeding when they are in a situation where remote help is called for but inaccessible.
Perhaps the traffic lights being out is what caused the cars to stop operating autonomously and try to phone home for help, or perhaps losing the connection home is itself enough to trigger a fail-safe shutdown mode?
It reminds me a bit of the recent TeslaBot video, another of their teleoperated stunts, where we see the bot appear to remove, with both hands, a headset it wasn't wearing (but its remote operator was), then fall over backwards "dead" as the remote operator evidently clocked off his shift or went for a bathroom break.
That’s clearly unacceptable. It needs to gracefully handle not having that fallback. That is an incredibly obvious possible failure.
Things go wrong -> get human help
Human not available -> just block the road???
How is there not a very basic “pull over and wait” final fallback.
I can get staying put if the car thinks it hit someone or ran over something. But in a situation like this where the problem is fully external it should fall back to “park myself” mode.
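Purely as an illustration of the hierarchy being proposed here (not Waymo's actual logic; every state and name below is made up), the fallback ladder might look something like this minimal Python sketch:

  from enum import Enum, auto

  class FallbackAction(Enum):
      CONTINUE = auto()             # keep driving normally
      ASK_REMOTE_OPERATOR = auto()  # phone home for help
      PULL_OVER_AND_WAIT = auto()   # get out of the traffic lane
      STOP_IN_PLACE = auto()        # last resort only

  def choose_fallback(confused: bool, remote_available: bool,
                      possible_collision: bool, sensors_healthy: bool) -> FallbackAction:
      """Hypothetical decision ladder for the commenter's proposal.

      Stopping in place is reserved for cases where moving at all could make
      things worse (suspected collision, major sensor failure). Everything
      else degrades to "pull over and wait" when no human is reachable.
      """
      if not confused:
          return FallbackAction.CONTINUE
      if possible_collision or not sensors_healthy:
          return FallbackAction.STOP_IN_PLACE
      if remote_available:
          return FallbackAction.ASK_REMOTE_OPERATOR
      return FallbackAction.PULL_OVER_AND_WAIT

The point being: losing the remote connection would land in the final branch, not in "stop in place."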
> How is there not a very basic “pull over and wait” final fallback
Barring everything else, the proper failsafe for any vehicle should be to stop moving and tell the humans inside to evacuate. This is true for autonomous vehicles as well as manned ones; if you can't figure out how to pull over during a disaster, ditching is absolutely a valid move.
If the alternative is that the vehicle explodes, sure. And since GP did say "final fallback", I suppose you're right. But if the cars are actually reaching that point, they probably shouldn't be on the road in the first place.
The not-quite-final fallback should be to pull over.
Yeah. I wasn’t considering people, just getting the car out of the way.
I wasn’t considering people, taking it as a given that any time the car gives up, the doors should be unlocked for passengers to leave if they feel it’s safe.
And as a passenger, I’d feel way safer getting out if it pulled over instead of just stopped in the middle of the street and other cars were trying to drive around it.
"Navigating an event of this magnitude presented a unique challenge for autonomous technology. While the Waymo Driver is designed to handle dark traffic signals as four-way stops, it may occasionally request a confirmation check to ensure it makes the safest choice. While we successfully traversed more than 7,000 dark signals on Saturday, the outage created a concentrated spike in these requests. This created a backlog that, in some cases, led to response delays contributing to congestion on already-overwhelmed streets."
I’ve done this before, but you need a relatively long subway ride without any transfers. IMO, 30 minutes is just barely at the edge of being worthwhile, and only if you can get a seat right when you get on, and only if the seat isn’t so cramped that it’s impossible to get your laptop out of your bag. This happens rarely.
But on longer trips from e.g. upper Manhattan to deep Brooklyn, particularly at off-peak hours when I have room to spread out—yeah, I’ve had some very productive sessions.