I also had a great time in the 90s and early 00s doing various software and tech gigs for non-software, mom & pop type outfits that just needed a bit of scripting, a web app, some network setup, etc... often found through mailing lists, usenet, user groups, friends of friends.
I've tried to go back to that a few times, but it's actually pretty hard to do these days. After a few decades of trillions of dollars of investment, pretty much every tiny niche has become a company / app with dozens of developers, or available as an online customizable SaaS, or something that can be vibe coded.
You seem to be in Europe. The problem you'll find is that Europe is composed, to a greater or lesser extent, of US proxy/vassal states. This is for historical reasons, WWII, Marshall Plan, Lend-Lease, NATO, Bretton Woods, IMF, "rules-based order", etc., etc.
There is really no point in worrying about personal independence for smart phones, github hosting and operating systems if the state itself is not independent and self-sufficient.
I wonder if all this shoving of AI down people's throats could trigger a bit of a backlash around vendor software updates / proprietary software in general. There's this huge infrastructure of Windows Update, Chrome auto-updates, app stores and SaaS that predated and enabled all this... and people accepted it when they were getting bugfixes and security updates out of it, but now it's getting used to take away the features they wanted and replace them with worse and worse versions of crapware.
All of a sudden... the free software world of updating when _you_ want the new version, and being able to fork the old version if you want, starts to look pretty great.
In C/C++, this can be done by just not using malloc() or new.
You can get an awfully long way in C with only stack variables (or even no variables, functional style). You can get a little bit further with variable length arrays, and alloca() added to the mix.
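As a minimal sketch of that stack-only style (function and constant names are mine, just for illustration; VLAs and alloca() are left out since they're C / compiler extensions rather than standard C++):

```cpp
#include <cassert>
#include <cstddef>

// Sum the first n squares using only stack storage: a fixed-size
// buffer and locals, no malloc()/new anywhere.
constexpr std::size_t kMax = 64;

int sum_of_squares(std::size_t n) {
    assert(n <= kMax);
    int squares[kMax];  // lives on the stack, reclaimed automatically on return
    for (std::size_t i = 0; i < n; ++i)
        squares[i] = static_cast<int>(i * i);
    int total = 0;
    for (std::size_t i = 0; i < n; ++i)
        total += squares[i];
    return total;
}
```

The obvious trade-off is the fixed upper bound; in C, the `int squares[kMax]` line could be a VLA (`int squares[n]`) sized at runtime instead.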
With C++, you have the choice of stack, or raw new/delete, or unique_ptr, or shared_ptr / reference counting. I think this "multi-paradigm" approach works pretty well, but of course it's complicated, and lots of people mess it up.
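The three non-raw options side by side, in a toy sketch (struct and function names are made up for the example):

```cpp
#include <cassert>
#include <memory>

struct Buffer { int data[16]; };

int demo() {
    Buffer stack_buf{};                       // 1. stack: destroyed automatically at scope exit
    auto owned  = std::make_unique<Buffer>(); // 2. unique_ptr: single owner, freed in its destructor
    auto shared = std::make_shared<Buffer>(); // 3. shared_ptr: reference counted
    auto alias  = shared;                     //    refcount is now 2; freed when both go away

    stack_buf.data[0] = 1;
    owned->data[0]    = 2;
    alias->data[0]    = 3;  // same object as 'shared'
    return stack_buf.data[0] + owned->data[0] + shared->data[0];
}
```

Raw new/delete is the one option deliberately missing here, since the other three cover almost every case without a manual free.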
I think, with well-designed C/C++, 90+% of things can be on the stack, and dynamic allocation can be very much the exception.
I've been switching back and forth across C/C++/Java for the past few months. The more I think about it, the more ridiculous/pathological the Java approach seems: every object is dynamically allocated, and it's impossible to create an object anywhere but the heap.
I think the main problem is kind of a human one, that people see/learn about dynamic allocation/shared_ptr etc. and it becomes a hammer and everything looks like a nail, and they forget the prospect of just using stack variables, or more generally doing the simplest thing that will work.
Maybe some kind of language where doing dumb things is an error would be good. e.g., in C++, if you do new and delete in the same scope, it's an error because it could have been a stack variable, just like unreachable code is an error in Java.
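The pattern such a "same-scope new/delete" rule would flag, next to what it could have been (hypothetical function names, just to make the contrast concrete; today this is roughly the territory of clang-tidy-style static analysis rather than a language error):

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// The pattern the rule would flag: allocated and freed in one scope.
std::size_t len_heap(const char* s) {
    std::string* tmp = new std::string(s);  // heap allocation...
    std::size_t n = tmp->size();
    delete tmp;                             // ...freed in the same scope
    return n;
}

// What it could have been: a plain stack variable.
std::size_t len_stack(const char* s) {
    std::string tmp(s);  // destroyed automatically at scope exit, even on exceptions
    return tmp.size();
}
```

Both return the same answer, but the second can't leak and can't double-free.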
You’re absolutely right — C and C++ give you the primitives to do this manually. If every developer followed the “stack first, heap only when necessary” discipline, and carefully used unique_ptr or avoided new/delete when possible, you could achieve much of the same safety and determinism.
The difference I’m aiming for is that these constraints aren’t optional — they’re baked into the language and compiler. You don’t rely on every developer making the right choice; instead, the structure of the code itself enforces ownership and lifetime rules.
So in your terms, instead of “doing dumb things is an error,” it’s structurally impossible to do dumb things in the first place. The language doesn’t just punish mistakes with foot-guns, it makes the safe path the only path.
This also opens up other possibilities that are really awkward in C/C++, like structured concurrency with deterministic memory cleanup, restartable scopes, and safe parallel allocations, without relying on GC or heavy reference counting.
I’d be curious: if C++ had a compiler that made stack-first allocation the default and forbade escapes unless explicit, would that solve most of the problems you’ve experienced, or are there still edge cases that would require a fundamentally different runtime model?
As far as I'm concerned, stack-first allocation _is_ the default. It's true that the default exists in my head rather than in a compiler, though.
Maybe think about whether what you propose could exist as a compiler warning, or static analysis tool. Or, if you want to create your own language, go for it, that's cool too.
For my purposes... the choice of paradigms, compilers, platforms with C++ and ability to handle and work on decades of existing code outweighs the benefits of "improved" languages, but that's just me.
I think it all comes down to him testing his work, attention to detail, and taking responsibility for making it work, just like anyone else.
I don't really think "typo" is a useful word in this context. It's a sequence of characters that... _does not work_ and _was not tested_. The fact that it is close to a different sequence of characters that would work, or that a human could recognize it for that other sequence, isn't really relevant.
Some of the engineering approaches you mention can help, but at the end of the day he has to be responsible for verifying that things work. Tests, schemas, etc. can help, but you don't want to get into a game of "the test/linter/AI didn't catch this for me, therefore it is broken".
I've worked with plenty of people over the years with linguistic quirks, like insisting on spelling "deliminator", or "pluggin", or whatever, but if the code works, it works.
This is true for so many things - driverless taxis, drone deliveries, even office jobs / AI.
The narrative is that human labor is super expensive, there are "skills shortages", etc., etc... but in actuality, hiring a few people rounds down to 0 in the context of an airliner or an office building in Manhattan, and you get a lot of political sway for employing folks and paying payroll taxes, and the "doorman fallacy" is very real. The "robots taking our jobs" narrative seems hugely exaggerated to me.
Yes, I agree, I don't see cargo plane pilots being replaced here. Or indeed commercial pilots in general.
Pilots are an unusual species because most of their utility and training is in the ability to deal with (very) edge cases. Indeed it is their ability to deal with unanticipated edge cases which is their most valuable attribute.
Sure, most pilots won't need those skills during their careers, but the value when they do is immeasurable. Landing on the Hudson, anyone?
Equally it is the situational awareness and anticipation of problems which avoid things that could have escalated into disaster but instead become near misses.
Sure, 99.99% of their work is routine and could be done by a machine. But that last 0.01% is thousands of lives, and billions in equipment and cargo.
You can store the other details in the notes - e.g. which card / account is used for payments, what are the password reset questions & answers, which address is set for delivery.
Then, when you change credit cards / addresses / emails, etc., it's easy enough to find all the accounts you used the old one for and need to update.