Linking against every distro-supplied glibc to distribute your own software is as unrealistic as getting distributions to distribute your software for you. The model is backwards from what users and developers expect.
But that's not the point I'm making. I'm attacking the idea that they're "working just fine" when the above is a bug that nearly everyone hits in the wild, both as a user and as a developer shipping software on Linux. It's not the only one caused by the model, but it's certainly one of the most common.
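The bug being alluded to is glibc's forward-incompatibility: a binary built on a distro with a newer glibc demands symbol versions that an older distro's libc simply doesn't export. As a rough illustration (using `/bin/ls` only as a stand-in for any dynamically linked binary), you can list the glibc symbol versions a binary requires:

```shell
# List the glibc symbol versions a dynamically linked binary depends on.
# Build on a distro shipping glibc 2.34 and this list will include GLIBC_2.34;
# copy that binary to a distro with glibc 2.31 and the dynamic loader aborts
# with an error along the lines of:
#   ./app: /lib/.../libc.so.6: version `GLIBC_2.34' not found
objdump -T /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -uV
```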
It's hardly unrealistic - most free software has been packaged by each distro. Very handy for the developer: just email the distro maintainers (or post on your mailing list) that the new version is out, and they'll get round to packaging it. Very handy for the user: they just "apt install foo" and ta-da, Foo is installed.
That was very much the point of using a Linux distro (the clue is in the name!) Trying to work in a Windows/macOS way where the "platform" does fuck-all and the developer has to do it all themselves is the opposite of how distros work.
The user now waits for 3rd-party "maintainers" to get around to manipulating the software they just want to use from the 1st-party developer they have a relationship with. If ever.
I understand this is how distros work. What I'm saying is that the distros are wrong, this is a bad design. It leads to actual bugs and crashes for users. There have been significant security mistakes made by distro maintainers. Distros strip bug fixes and package old versions. It's a mess.
And honestly, a lot of software is not free and won't be packaged by distros. Most software I use on my own machines is not packaged by my distro. ALL the software I use professionally is vendored independently of any distribution. And when I've shipped to various distributions in the past, I've gone to great lengths to avoid linking against anything that could come from the distro, because my users do not know how to fix it when it breaks.
distributions started out solving the problem that most developers at the time didn't even bother to build ready-to-run packages. they couldn't, because there were too many different architectures and not everyone had access to all of them. so developers had to rely on users to build the applications themselves. distributions then organized around that to make it easier for users. that's how the ports system in BSD came about. linux distributions went a step further and built distributable binaries.
the problem was failing to predict that developers would want more control over the build of their applications, which, thanks to architectures consolidating, became easier because now a single binary will reach the majority of your userbase. the other miss was the need to support multiple versions of the same library or app in the package manager. that support should have been there from the start, and now it's difficult to retrofit.
so it's unfair to say distros are wrong. yes, it's not an ideal design, but this is more of an accident of history, some lack of foresight, and the desire to keep things simple by having only the newest version of each package.
there is a conflict between the complexity of supporting multiple package versions vs the complexity of getting applications to work with the specific library versions the distro supports. when distros started out it looked like the latter would be better for everyone. distributions tended to have the latest versions of libraries and fixing apps to work with those benefited the apps in most cases.