It isn't a failure. It simply isn't optimal. Just my opinion, of course. Which means I could be very wrong.
> "the pointer returned by this function must be freed by the caller" or "the pointer returned by this function must NEVER be freed by the caller."
Those two are examples of bad programming. Both constraints are almost guaranteed to create, at a minimum, an unmanageable mess and, at worst, very dangerous software (think embedded controller for a robot or a rocket).
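To make the hazard concrete, here is a minimal hypothetical sketch (the function names are mine, not from any real codebase) of the two ownership contracts quoted above. Note that nothing in the type system records which rule applies; only a comment does, and the compiler cannot check it:

```c
#include <stdlib.h>
#include <string.h>

/* Contract 1: the pointer returned MUST be freed by the caller. */
char *make_copy(const char *s) {
    char *p = malloc(strlen(s) + 1);  /* heap allocation: caller owns it */
    if (p)
        strcpy(p, s);
    return p;
}

/* Contract 2: the pointer returned must NEVER be freed by the caller. */
const char *default_name(void) {
    static const char name[] = "default";  /* static storage: not heap */
    return name;
}
```

A caller who mixes the two contracts up either leaks memory (never freeing `make_copy`'s result) or corrupts the allocator (freeing `default_name`'s result), which is exactly the kind of silent, hard-to-audit failure described above.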
While I have not looked, I would be very surprised if something like that existed in the Linux codebase, which is C.
> that doesn't mean C has to be the absolute last systems language ever in the history of computing.
I don't think I have suggested this at all in this conversation.
My position is very simple: Don't blame the language for bad programming. This is rarely the problem.
The fact that someone can cause a mess using pointers does not mean pointers are the problem. They simply don't know what they are doing. The Linux codebase uses pointers everywhere, right? Is it a mess? No. Maybe that's because they are using the language correctly.
I am also not saying that all work in evolving programming languages should stop because C is perfect. Not the case I have made at all. What I will say is that --again, my opinion-- quite a few of the modern paradigms are complete nonsense in support of lazy programmers.
The question to ask might be something like:
Would you have been able to create the software without bugs using C?
If the answer is "yes", then it is likely the newfangled language was not needed or justified.
As an observation, pretty much all of these languages look like C. Outside of assembler, the only three languages I have used in my career that explored other paradigms were Forth, LISP and APL. I used these languages professionally for about 10 years. Not one of the modern languages people rave about has done anything to truly push the mark forward. In many ways APL was, in my opinion, the pinnacle. Its problem was that it surfaced way ahead of hardware being able to embrace it. For example, this code would take half a page of crud to create with any of the C-derivative modern languages:
+/⍉10 10 ⍴ ⍳100
And we were able to do this FORTY years ago.
What does it do? Reading right to left: ⍳100 creates a vector of 100 consecutive numbers, 1 to 100. 10 10 ⍴ reshapes that vector into a 10 x 10 matrix. ⍉ rotates the matrix along its diagonal (a transpose). Finally, +/ delivers the sum of each row as a vector.
Of course, this is a super-simple example of something that isn't necessarily ground-breaking on first inspection. Anyone should be able to reproduce this result with reasonable efficiency using C or derivatives. Not one line, but, who cares, that's not the important metric.
One of the most interesting demonstrations of APL for those who have never seen it in action is this implementation of Conway's Game of Life in APL. Very much worth watching: https://www.youtube.com/watch?v=a9xAKttWgP4
Once you internalize APL (which does not happen through casual use), it changes the way you think about how to solve problems computationally. You, quite literally, think at an entirely different level. Your brain visualizes data structures and implementation in a very different form. The closest equivalent I can reach for is using musical notation to describe music, which, of course, requires someone to have internalized the connection between notation and music.
This is where I was going when I said that we likely need a better programming paradigm for AI. That, in my opinion, can easily justify a new language. And, yes, I am biased --ten years of APL use will do that to you-- I think it has to be symbolic. Quite literally, a new language --just like musical notation is a language for music.
Going back to my core premise and the title of this thread: Throwing Rust at software development isn't going to fix bad programming. I don't see the point. And, yes, I could be wrong. If all hope of having capable programmers is gone, then, yes, of course, we need to make sure they don't make a mess just 'cause they have no clue.