I think it's a case of "forgetting" again: Hindley-Milner type inference dates back to 1969, and still very few languages let you use it. Some have wisely added a much weaker form of type inference (var/auto).
Let's not forget the massive handicap that there is one and only one programming language that the browser allows: Javascript.
I think HM is simply not practical. You don't want your types to be a multidimensional logic puzzle solved by a computer, because you want to reason about them yourself, and you are much weaker than a computer. You want the clear contracts and rigidity that types give the whole code structure, and only then for inference to fill in the gaps so you don't have to spell out the obvious. You also rarely want Turing completeness in your types (although some people are still in that phase, judging by some d.ts files).
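For the record, TypeScript's type language really is expressive enough for that kind of puzzle. A toy sketch (all names here are made up for illustration, not from any real library):

```typescript
// Toy type-level arithmetic: the compiler "computes" 2 + 3 entirely in
// the type system by building tuples and reading off their lengths.
type BuildTuple<N extends number, Acc extends unknown[] = []> =
  Acc["length"] extends N ? Acc : BuildTuple<N, [...Acc, unknown]>;

type Add<A extends number, B extends number> =
  [...BuildTuple<A>, ...BuildTuple<B>]["length"] & number;

// This line only type-checks because Add<2, 3> evaluates to the literal type 5.
const five: Add<2, 3> = 5;
console.log(five);
```

Whether you *want* your build to spend time on this is exactly the question.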
Weak var/auto is practical. An average program and an average programmer have average problems, and those do not include "sound type constraints" or whatever. All they want is to write "let" in place of a full-blown declaration whose type must be named, exported, imported, and then wrapped in modifiers like Optional<T> and Array<T>. This friction is one of the biggest reasons people turn to the untyped mess: it doesn't demand this tedious maintenance every few lines and just does the job.
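Concretely, here is the kind of local inference being asked for, in TypeScript since the thread is about the browser (the identifiers are mine):

```typescript
// Explicit style: every binding repeats type names that are already obvious
// from the right-hand side.
const explicitNames: Array<string> = ["ada", "grace"];
const explicitUpper: Array<string> =
  explicitNames.map((n: string): string => n.toUpperCase());

// var/auto style: `const`/`let` infer the same types automatically.
// The explicit contract stays at the function boundary, where it matters.
function shout(names: string[]): string[] {
  const upper = names.map(n => n.toUpperCase()); // inferred: string[]
  return upper;
}

console.log(shout(["ada", "grace"]));
```

Both versions are equally type-safe; only the second one keeps the annotations where a reader actually needs them.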
Very few people use HM type systems even today, though.
I think it really is worth considering that Java effectively didn't have sum types until (I think) version 17, while nowadays many modern, popular statically typed languages have them.
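In Java that's sealed interfaces plus records plus pattern-matching switch; the same idea sketched as a TypeScript discriminated union, to keep one language in the thread (the shape names are invented):

```typescript
// A sum type as a discriminated union: a Shape is exactly one of these
// variants, and the compiler narrows on `kind` and checks exhaustiveness.
type Shape =
  | { kind: "circle"; radius: number }
  | { kind: "square"; side: number };

function area(s: Shape): number {
  switch (s.kind) {
    case "circle":
      return Math.PI * s.radius * s.radius;
    case "square":
      return s.side * s.side;
    // No default branch: if a third variant is added to Shape, this
    // function stops compiling until the new case is handled.
  }
}

console.log(area({ kind: "square", side: 3 }));
```

That "add a variant, get compile errors at every match site" workflow is the main thing Java was missing before sealed types.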