Readability trumps just about all else in software dev, so I am 100% with you there on limiting variable-name characters and on the lack of syntax extension.
`[a-zA-Z_][a-zA-Z0-9_]*` is such a universal and useful convention that I find myself using it even when the domain allows other characters (bash, *nix filenames). Obviously this is English-centric, so I would include other scripts if I were in a country with another dominant language.
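To make that concrete, here's a quick sketch (Common Lisp with the cl-ppcre library, assuming Quicklisp is available; the helper name is mine) of checking names against the convention:

```
;; Check a name against the [a-zA-Z_][a-zA-Z0-9_]* convention.
;; Hypothetical helper; any regex engine would do the same job.
(ql:quickload :cl-ppcre)

(defun conventional-name-p (name)
  ;; Anchored so the whole string must match, not just a prefix.
  (and (cl-ppcre:scan "^[a-zA-Z_][a-zA-Z0-9_]*$" name) t))

(conventional-name-p "my_var2") ; => T
(conventional-name-p "2fast")   ; => NIL
(conventional-name-p "naïve")   ; => NIL (the English-centric part)
```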
In fact, even though I have the ability to use symbols (languages supporting Unicode and a QMK keyboard), e.g. Greek letters for math, I don't, because I know it's a hassle for anyone else.
> You can use virtually any character as part of your variable name.
> invent new syntax
Both are a hassle and an anti-pattern for the same reason.
I think this all sounds bad because you're imagining having this power in C or JavaScript, where these features would indeed be awful.
However, if these features are exposed through clean abstractions in a language that is syntactically bare to begin with, you might feel differently.
The original (implicit) question was what makes one language better than another. Responding to the answer with "things shouldn't be different" is probably not the best counter. If you value familiarity, stick to the familiar.
I do not care about downvotes on here.
> You can use virtually any character as part of your variable name. Lispers
This is not a good thing.
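To make concrete what's being objected to, here's a Common Lisp sketch (the names are mine, and the Unicode identifier assumes an implementation with Unicode support, such as SBCL):

```
;; Symbols written with |...| escapes can contain any characters,
;; and Unicode names work in implementations such as SBCL.
(defvar |my variable, with spaces!| 42)
(defun ∑ (xs) (reduce #'+ xs))

|my variable, with spaces!| ; => 42
(∑ '(1 2 3))                ; => 6
```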
> However, it is nice that for a domain specific problem, you can just invent new syntax and use it.
This is not a good thing either.
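For concreteness again, this is the classic kind of syntax invention at issue: a Common Lisp reader macro (my sketch, not the parent's example) that makes square brackets read as a list:

```
;; Teach the reader that [a b c] means (list a b c).
(set-macro-character #\[
  (lambda (stream char)
    (declare (ignore char))
    (cons 'list (read-delimited-list #\] stream t))))
;; Make ] a terminating delimiter, like ).
(set-macro-character #\] (get-macro-character #\)))

[1 2 (+ 1 2)] ; => (1 2 3)
```

Anyone reading the file later has to know the readtable was modified before those brackets parse at all, which is exactly the maintenance cost being described.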