> Bad code is the result of bad programming, not a consequence of the chosen language.
You've repeated this many times now. What I haven't seen you talk about is how to fix all this programmer badness. Do you think bad programmers should all be fired and blacklisted from the whole industry? Who will replace them? How do we make sure their replacements aren't just as bad as the ones who were fired? How do we even identify the bad ones before they write bad code? What if they want to become better programmers instead of getting fired? How do they figure out whether they're sufficiently good?
The value proposition of a new language is not measured strictly by how fast it lets you write new code. It also needs to be a medium for communication between programmers. Communicating the intent behind your code helps identify the ways it could be improved. If your intent can be codified in a way that even the computer can understand, that process speeds up dramatically.
Proponents of new languages are winning the debate over how to fix systemic problems in the software industry. The reason they are winning is because their opponents in this debate do not have a coherent solution. If you can suggest one, maybe you'll change everything.
> What I haven't seen you talk about is how to fix all this programmer badness.
Well, that wasn't part of the conversation until now.
In a sense, I have hinted at this. There are two elements, education and experience.
I firmly believe good programmers come from having a solid foundation built on low-level code. That means a good progression might be assembler, Forth and then C. If I ask someone to explain how a list is stored and manipulated in memory in a language like Python, I expect at least a plausible explanation rather than a shrug. For me it doesn't even have to be absolutely correct to show me they have gotten their hands dirty with low-level code.
Forth is interesting not only because of the RPN paradigm; one can learn a lot from implementing it from scratch on any microprocessor. From there you go on to actually turning that into a useful console-based computer. For example, implement all the peripheral drivers, a file system, file manager, text/code editor, etc. I would not have anyone touch C until they have completed the prior work.
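The core of that exercise is smaller than it sounds. A minimal sketch of the inner loop of a toy Forth, written here as an integer-only RPN evaluator in C (the function names and the tiny word set are my own, just to show the shape of the thing):

```c
#include <stdlib.h>
#include <string.h>

/* A toy Forth inner loop: an integer RPN evaluator over a data stack. */
static int stack[64];
static int sp; /* stack pointer */

static void push(int v) { stack[sp++] = v; }
static int  pop(void)   { return stack[--sp]; }

/* Evaluate a space-separated RPN expression such as "3 4 + 2 *". */
int rpn_eval(const char *src)
{
    char buf[256];
    strncpy(buf, src, sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';
    sp = 0;
    for (char *tok = strtok(buf, " "); tok; tok = strtok(NULL, " ")) {
        if (strcmp(tok, "+") == 0)        { int b = pop(); push(pop() + b); }
        else if (strcmp(tok, "-") == 0)   { int b = pop(); push(pop() - b); }
        else if (strcmp(tok, "*") == 0)   { int b = pop(); push(pop() * b); }
        else if (strcmp(tok, "dup") == 0) { int a = pop(); push(a); push(a); }
        else push(atoi(tok)); /* anything else is a number literal */
    }
    return pop();
}
```

A real Forth adds a dictionary, a compile mode and user-defined words, but the stack-and-dispatch loop above is the kernel a student would grow into everything else.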
There's a reason companies like Google have seemingly crazy hiring processes for software developers: Our schools seem to be doing a crap job of training them. If that were not the case, there would be no need for such tests. A degree with a decent GPA would be enough.
> Do you think bad programmers should all be fired and blacklisted from the whole industry?
I am assuming that's not a serious question. There are bad doctors, attorneys, cops, teachers and carpenters. There is no such thing as equality of outcomes in anything. Hopefully the natural process in each domain expunges incompetence over time. That's the best we can hope for. Other than that, I can't tell you what we should do.
I have to go back to my premise (flipping it around a bit): A different programming language isn't going to magically turn a bad programmer into a good one; much like a $4000 computerized welder wasn't going to make me a better welder.
At the extremes, if someone doesn't know how to solve problems computationally, there's no language you can throw at them that will turn them into CS problem solvers.
> Proponents of new languages are winning the debate over how to fix systemic problems in the software industry. The reason they are winning is because their opponents in this debate do not have a coherent solution. If you can suggest one, maybe you'll change everything.
No. That's not correct. The reason we keep taking crazy rides up and down a bunch of languages is that developers are coming out of schools with skills that require them to start at that level. As I said in another comment, the first thing every recent grad reaches for is a complex object structure, because that's all they know. They don't actually know we were doing things like sending people to the moon without any of that stuff. They think it's necessary. And so they develop tools and frameworks "in their image", if you will. Which means the entire thing is a self-fulfilling prophecy.
I remember one of the most impactful examples my son (a recent MS CS grad) experienced while working with me on a project. He needed a serial communications library for a robotics system we were building. He reached for a library and used it. The thing did not perform well and was giving us problems. That's when I became involved. The library consisted of, I don't know, two to four pages of classes, methods, etc. After understanding what it was doing I rewrote what we needed in something like ten lines of procedural code.
The unnecessary bloat in various programming-language ecosystems is something that should give anyone pause. I mean, you see things like someone creating an entire object hierarchy with methods and properties for what amounts to managing a few thousand bytes of data in an array in memory. Instead of a raw close-to-the-machine for loop iterating through the data you end up with a dozen objects instantiated, copious properties, layers of methods and...well, you get the point (I hope).
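For concreteness, here is the "raw close-to-the-machine" version of the paragraph above: managing a few thousand bytes with a single loop, no objects, no properties, no layers of methods (the checksum task is just an illustrative stand-in):

```c
#include <stddef.h>
#include <stdint.h>

/* One plain loop over a byte buffer. This is the whole "data manager". */
uint32_t checksum(const uint8_t *data, size_t len)
{
    uint32_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum += data[i];
    return sum;
}
```

The bloated alternative would wrap that buffer in a DataStore class with accessors, an iterator object and a visitor, all to produce the same few machine instructions.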
> Well, that wasn't part of the conversation until now.
Sure it was. You kept saying "this isn't a solution." All I did was point out the converse.
I experienced the kind of "modern" CS education that you consider a failure, and believe it or not, I have about as much scorn for it as you do. I did get some good exposure to serious analysis and low-level programming, but the other half of what I learned was pretty much a waste and I had to unlearn it the hard way after leaving school. So, I'm not here to defend the Java idiom of mile-high towers of superclasses, runtime polymorphism that nobody will ever need, or wild pointer goose chases that accomplish nothing but stalling the pipeline. All that stuff is a waste of everyone's time. I like my compiled code to stay lightweight and close to the metal.
I am here to defend expressive type systems that permit detailed annotations of what should or shouldn't be done with a particular piece of data. I shouldn't have to rely on comments alone to say "the pointer returned by this function must be freed by the caller" or "the pointer returned by this function must NEVER be freed by the caller." I want the compiler to understand me when I say these things, and I want it to enforce my rules.
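In C today, those two contracts can only live in comments; nothing stops a caller from freeing the wrong one. A minimal sketch of the convention (function names are hypothetical):

```c
#include <stdlib.h>
#include <string.h>

/* OWNERSHIP: the caller MUST free() the returned pointer. */
char *copy_name(const char *src)
{
    char *p = malloc(strlen(src) + 1);
    if (p)
        strcpy(p, src);
    return p;
}

/* OWNERSHIP: returns static storage; the caller must NEVER free it. */
const char *default_name(void)
{
    static const char name[] = "anonymous";
    return name;
}
```

To the compiler both return values are identical pointers; the rule exists only in the comment. In a language that puts ownership in the type system (Rust's `Box<T>` versus `&T`, say), confusing the two is a compile error instead of heap corruption discovered at runtime.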
C doesn't have that, obviously, but I'm sure you already know that it was a huge change from its immediate predecessors, which didn't have types. Even early C didn't have structs. It added these features because they made programming less error-prone. Nobody wanted to manually calculate field offsets and risk getting it wrong.
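The step from hand-computed offsets to structs is easy to show side by side. A sketch, assuming the usual layout where two 32-bit fields sit at offsets 0 and 4:

```c
#include <stdint.h>
#include <string.h>

struct point { int32_t x; int32_t y; };

/* With structs: the compiler computes the field offset. */
int32_t get_y_struct(const struct point *p)
{
    return p->y;
}

/* Pre-struct style: the programmer hand-computes the offset
   and hopes it stays correct as the record evolves. */
int32_t get_y_manual(const void *base)
{
    int32_t y;
    memcpy(&y, (const uint8_t *)base + 4, sizeof y); /* offset 4, by hand */
    return y;
}
```

Add a field before y, or change x to a 64-bit type, and the struct version silently stays correct while the manual version silently reads garbage. That is exactly the class of error the feature removed.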
That was 50 years ago. There have been missteps since then (I think every PL theorist counts OOP among these) but that doesn't mean C has to be the absolute last systems language ever in the history of computing.
It isn't a failure. It simply isn't optimal. Just my opinion, of course. Which means I could be very wrong.
> "the pointer returned by this function must be freed by the caller" or "the pointer returned by this function must NEVER be freed by the caller."
Those two are examples of bad programming. Both constraints are almost guaranteed to create, at a minimum, an unmanageable mess and, at worst, very dangerous software (think embedded controller for a robot or a rocket).
While I have not looked, I would be very surprised if something like that existed in the Linux codebase, which is C.
> that doesn't mean C has to be the absolute last systems language ever in the history of computing.
I don't think I have suggested this at all in this conversation.
My position is very simple: Don't blame the language for bad programming. This is rarely the problem.
The fact that someone can cause a mess using pointers does not mean pointers are the problem. They simply don't know what they are doing. The Linux codebase uses pointers everywhere, right? Is it a mess? No. Maybe that's because they are using the language correctly.
I am also not saying that all work in evolving programming languages should stop because C is perfect. Not the case I have made at all. What I will say is that --again, my opinion-- quite a few of the modern paradigms are complete nonsense in support of lazy programmers.
The question to ask might be something like:
Would you have been able to create the software without bugs using C?
If the answer is "yes", then it is likely the newfangled language was not needed or justified.
As an observation, pretty much all of these languages look like C. Outside of assembler, the only three languages I have used in my career that explored other paradigms were Forth, LISP and APL. I used these languages professionally for about 10 years. Not one of the modern languages people rave about has done anything to truly push the mark forward. In many ways APL was, in my opinion, the pinnacle. Its problem was that it surfaced way ahead of hardware being able to embrace it. For example, this code would take half a page of crud to create with any of the C-derivative modern languages:
+/⍉10 10 ⍴ ⍳100
And we were able to do this FORTY years ago.
What does it do?
⍳100 creates a vector of 100 consecutive numbers, 1 to 100. Then 10 10 ⍴ reshapes that vector into a 10x10 matrix, ⍉ transposes it, and +/ sums each row of the transpose, which gives you the ten column sums of the original matrix: 460 470 480 ... 550.
Of course, this is a super-simple example of something that isn't necessarily ground-breaking on first inspection. Anyone should be able to reproduce this result with reasonable efficiency using C or derivatives. Not one line, but, who cares, that's not the important metric.
One of the most interesting demonstrations of APL for those who have never seen it in action is this implementation of Conway's Game of Life. Very much worth watching:
Once you internalize APL (which does not happen through casual use) it changes the way you think about how to solve problems computationally. You, quite literally, think at an entirely different level. Your brain visualizes data structures and implementation in a very different form. The closest equivalent I can reach for is using musical notation to describe music, which, of course, requires someone to have internalized the connection between notation and music.
This is where I was going when I said that we likely need a better programming paradigm for AI. That, in my opinion, can easily justify a new language. And, yes, I am biased --ten years of APL use will do that to you-- I think it has to be symbolic. Quite literally, a new language --just like musical notation is a language for music.
Going back to my core premise and the title of this thread: Throwing Rust at software development isn't going to fix bad programming. I don't see the point. And, yes, I could be wrong. If all hope of having capable programmers is gone, then, yes, of course, we need to make sure they don't make a mess just 'cause they have no clue.