It's wrong to frame this as resistance to change for no reason. See my other comment. I see some of this stuff as repeating mistakes that were made in the design of Perl. ...but there are quite few people around these days who know Perl well enough to recognize the way in which history is repeating itself, and that has at least something to do with age.
This is possibly the best example of the ambiguity of language I've ever seen. Two contradictory meanings expressed in the exact same phrase, and both of them are valid in the broader context.
Jeez. How many people read the same phrase with one of those two meanings and then go on to form opinions, and even make decisions, based on the resulting meaning?
Me, I am old enough to know Perl, and I've got plenty of one-line skeletons in my own closet. And it more-or-less entered the world already vastly more TMTOWTDI-y than Python is after 3 decades.
FWIW, I tend to think of comparisons to Perl as being a lot like Nazi comparisons, only for programming languages. And I do think there's some wisdom to the Godwin's Law idea that the first person to make a Nazi comparison is understood to have automatically lost the argument.
It's just that, at this point, Perl is both so near-universally reviled, and so well-understood to be way too optimized for code golf, that any comparison involving it is kind of a conversation-killer. As soon as it shows up, your best options are to either quietly ignore the statement in which the comparison was made, or join in escalating things into a flamewar.
I wouldn't call it reviled. Perl makes for a poor general-purpose programming language; it always has. You can write an HTTP server in Perl, but you probably shouldn't. It's very good for what it was always intended for: those situations where you need to process some data, but, like, just once, not every week for the rest of eternity.
I've never regretted a Perl program that I wrote, used and discarded. And I've never been content with a Perl program I found I was still using a week after I wrote it.
Point #1 is expanded on in Feral by George Monbiot. Basically, we have a tendency to see the outside world we grew up with as the way things naturally should be, ignoring that previous generations may have changed it to be that way. That sheep-grazed pastoral landscape is easy to view as a thing worth preserving, but to an ecologist it might be a barren waste where there used to be a beautiful forest.
Forewarned is forearmed. I headed into adulthood watching out for such mirages. For example: Making sure to listen to pop music enough that it does exactly what pop music is supposed to do (worm its way into your subconscious) so I don't wake up one morning unaccountably believing that Kylie Minogue was good but Taylor Swift isn't.
My understanding of Python will probably never be quite as good as my understanding of C, but I can live with that.
How do you know to listen to Taylor Swift or whatever? In the last century it was easy to be in sync: you could just watch MTV. Is there something keeping the notion of pop coherent these days?
Not exactly pop, but there are some great weekly music podcasts that I listen to to hear new music, which tend to be a little more indie pop/rock/${genre} than pop :)
Apple/Google Music or Spotify or Pandora all have pop playlists that play the current top 100 songs on rotation. The Billboard Hot 100 also lists popular western music if you just want a list to review on your own.
I’d argue it’s easier now to stay in sync than even when MTV was popular. With MTV you needed a cable subscription and had to be sitting at a TV; now SiriusXM or Apple/Google/Spotify can stream it right to your phone, laptop, or tablet, and regular FM radio will play it on the local Top 40 station.
I'm 34 and I don't like this, so it's definitely not only those above 35. Jokes aside, I would say I'm a minimalist and this is where my resistance comes from. One of the things that I dislike the most in programming is feature creep. I prefer smaller languages. I like the idea of having a more minimal feature set that doesn't change very much. In a language with fewer features, you might have to write slightly more code, but the code you write will be more readable to everyone else. Said language should also be easier to learn.
IMO, the more complex a system is, the more fragile it tends to become. The same is true for programming languages. Features will clash with each other. You'll end up having 10 different ways to achieve the same thing, and it won't be obvious which direction to go.
Furthermore, I did my grad studies in compilers. I've thought about writing an optimizing JIT for Python. I really feel like CPython is needlessly slow, and it's kind of embarrassing, in an age where single-core performance is reaching a plateau, to waste so many CPU cycles interpreting a language. We have the technology to do much better. However, the fact that Python is a fast-moving target makes it very difficult to catch up. If Python were a smaller, more stable language, this wouldn't be so difficult.
> In a language with fewer features, you might have to write slightly more code, but the code you write will be more readable to everyone else.
I disagree with this, which is precisely why I prefer feature-rich languages like Java or, better yet, Kotlin. It doesn't get much more readable than something like:
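(Presumably something along these lines, judging from the replies below, which mention asSequence, the implicit it parameter, sortedBy, and take(3):)

users.asSequence()
    .filter { it.lastName.startsWith("S") }
    .sortedBy { it.lastName }
    .take(3)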
Python is a little more readable, but both Python and Kotlin are perfectly clear in this case:
sorted((u for u in users
        if u.last_name.startswith("S")),
       key=lambda u: u.last_name
       )[:3]
If last_name is a function, which it often would be in Python, it gets better:
sorted((u for u in users
        if last_name(u).startswith("S")),
       key=last_name
       )[:3]
However, I think you probably got the sort key wrong if you're taking the first three items of the result. Maybe you meant key=abuse_score, reverse=True, or something.
I disagree that this Python version is as readable, and here's why. It's about as many characters, but more complex. The Kotlin version performs several distinct actions, each one clear in its purpose. These actions all have the same syntax (e.g., they require less parsing effort). The Python version mixes at least four different language syntaxes/features: a list comprehension, the if special form inside the list comprehension, keywords, and a lambda function.
On top of the lessened readability, the Kotlin version makes it very easy to add, subtract, or comment out lines/actions which really helps when debugging. The Kotlin version is almost identical in structure to how you’d do it in Rust, Elixir, etc.
I agree. I don't know Kotlin and am reasonably well versed in Python, yet I immediately grasp the Kotlin example as more readable, while having to squint at the Python one for a few seconds. (this is anecdotal of course, and does not account for the example possibly being contrived)
One thing that I like more in the Python version is that it contains fewer names: .asSequence and .take are replaced by operators of much greater generality, while the ugly implicitly declared identifier "it" is replaced by explicitly deciding that sequence elements are u.
It should also be noted that Python would allow a more functional style, possibly leaving out the list comprehension.
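For example, that more functional style might look something like this (assuming the same users list and last_name attribute as in the snippets above):

# filter/sorted/slice instead of a generator expression
sorted(filter(lambda u: u.last_name.startswith("S"), users),
       key=lambda u: u.last_name)[:3]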
It's surprising to me that there are people who disagree with my opinion about this, but it suggests that my familiarity with Python has damaged my perspective. You're clearly much less familiar with Python (this code doesn't contain any list comprehensions, for example), so I think your opinion about readability is probably a lot more objective than mine.
FWIW most of the programming I've ever done has been in Python, and while I have no trouble understanding either snippet, I think that the Kotlin snippet is much clearer in intent and structure.
I certainly didn't mean to imply that only someone unfamiliar with Python could prefer the Kotlin version! Perhaps you thought I meant that, but I didn't.
> this code doesn't contain any list comprehensions, for example
It does contain a generator expression though, which is the same as a list comprehension in general structure, but slightly more confusing because it doesn't have the relationship to lists that square brackets in a list comprehension would have given it.
Yes, it shares the structure of a list comprehension, but has different semantics. In this case a listcomp would have worked just as well.
My point, though, was that not being able to tell the difference was a key "tell" that the comment author was not very familiar with Python — in some contexts, that would tend to undermine credibility in their comment (and then it would be rude to point it out), but in this context, it probably makes their opinion more objective.
Good point, though it's less my familiarity with Python and more that I tend to simplify and call generator expressions list comprehensions unless the laziness is important to call out (meta laziness there? ;) ). Mainly since list comprehensions came first and describing the differences is tedious.
This isn't very readable at all and certainly not any more readable than a chain of method calls, being that you've spread the operations out in different places. It's not even syntactically obvious what the `key` argument is passed to if one doesn't know that `sorted` takes it. None of those problems exist when piping through normal functions or chaining method calls.
Python is for the most part overrated when it comes to these things, IMO. It's a nice enough language but it's aged badly and has an undeserved reputation for concision, readability and being "simple".
C# supports both conventions (in LINQ): the Kotlin-style one from the grandparent comment, and the Python-style one from the parent's.
The method-chaining syntax and the query syntax are alternatives. I think most devs lean towards the former, which is considered cleaner... whereas the latter is probably easier to learn at the beginning for those unfamiliar with the piping/functional style, owing to how much it feels like SQL.
ReSharper would offer converting the latter to the former, and that's how I learned method-chaining LINQ back in the day.
A little off-topic but how does that work? Is 'it' a magic variable referring to the first argument? Never seen magic variables that blend into lambdas like that before... would've expected $1 or something like that.
The idea of anaphoric macros[1] is first found in Paul Graham's "On Lisp"[2] and is based on the linguistic concept of anaphora, an expression whose meaning depends on the meaning of another expression in its context. An anaphor (like "it") is such a referring term.
I think if you like this idea, you will really like the book. Better still, you can download the pdf for free.
Inside of any lambda that takes a single parameter you can refer to the parameter as 'it'. If you prefer to name your parameters you can do so as well, it's just slightly more verbose:
users.asSequence()
    .filter { user ->
        user.lastName.startsWith("S")
    }
    .sortedBy { user ->
        user.lastName
    }
    .take(3)
Groovy is from 2003. PG keynoted PyCon in 2003 talking about his progress on Arc: http://www.paulgraham.com/hundred.html. He had been talking about Arc online for a couple of years at that point, including in particular the convenience of "anaphoric macros" that defined the identifier "it" as an implicit argument.
(He'd also written about that more at length in the 1990s in On Lisp, but many more people became acquainted with his language-design ideas in the 2001–2003 period, thanks to Lightweight Languages and his increasingly popular series of essays.)
But surely Perl's $_ was way more influential than an obscure PG talk. I was reading PG way back in 2004, and I had never heard of anaphoric macros until now.
Wait, you think that, in the context of programming language design, a PyCon keynote is an obscure talk? I don't know what to say about that. It might be possible for you to be more wrong, but it would be very challenging.
Anyway, I'm talking specifically about the use of the identifier "it" in Kotlin, not implicitly or contextually defined identifiers in general, which are indeed a much more widespread concept, embracing Perl's $_ and @_, awk's $0 (and for that matter $1 and $fieldnumber and so on), Dyalog APL's α and ω, Smalltalk's "self", C++'s "this", dynamically-scoped variables in general, and for that matter de Bruijn numbering.
Compared to the existence of Perl, yes. Anyone who does any amount of Perl learns that $_ is the implicit argument ("like 'it'") to most functions. It's pretty much one of Perl's main deals. The talk has about 100K views on YouTube, which is pretty good, but Perl is in another league.
Too bad Apache Groovy itself didn't remain popular after popularizing the name "it" for the much older idea of contextually-defined pronouns in programming languages. Using the names of pronouns in English (like "this" and "it") is easier for an English-speaking programmer to understand than symbols like "$1" or "_". But because of Groovy's bad project management, another programming language (Kotlin) is becoming widely known for introducing the "it" name.
Pretty sure the Go community will be fine with not being feature rich, since simplicity, maintainability and getting new people up to speed matter more for them.
> Furthermore, I did my grad studies in compilers. I've thought about writing an optimizing JIT for Python. I really feel like CPython is needlessly slow, and it's kind of embarrassing,
Many have tried and failed (Google and Dropbox, to name a couple), along with countless other attempts.
Yes, PyPy is fantastic for long-running processes that aren't primarily wrappers around C code. In my experience, the speedups you see in its benchmarks translate to the real world very well.
It's not the new features of Python that make it hard to optimize; it's the fundamental dynamic nature of the language that was there from day one. Syntactic sugar doesn't have an impact one way or the other on optimizing Python.
The new features aren't just syntactic, they're also new libraries that come standard with CPython, etc. If you want to implement a Python JIT that people will use, you have to match everything CPython supports. Furthermore, since the people behind CPython don't care about JIT, you also can't count on them not adding language features that will break optimizations present in your JIT. You can't count on these being just "syntactic sugar". Even if you could though, in order to keep up it means you have to use CPython's front-end, or constantly implement every syntactic tweak CPython does.
Lastly, AFAIK, CPython's FFI API is actually more of a problem than the dynamic semantics of the language. You can expose Python objects directly to C code. That makes it very hard for a JIT to represent said Python objects in an efficient way internally.
> In a language with fewer features, you might have to write slightly more code, but the code you write will be more readable to everyone else.
That's not universally true. C# has more features than Java, but it's generally easier to read, and the intent of the code is easier to follow. The lack of features like properties or unsigned integers leads Java coders to create much more convoluted solutions.
If languages with less features were universally better we would all be using C and BASIC for everything.
I think what matters is the orthogonality of the features.
E.g., having so many ways to do string formatting, or now multiple ways of doing assignments, is not orthogonal and thus can be seen as clutter.
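For example, in Python the same string can now be formatted in at least three mainstream ways (a trivial, made-up example):

name = "world"
print("Hello, %s!" % name)          # printf-style formatting, the original way
print("Hello, {}!".format(name))    # str.format, added in 2.6
print(f"Hello, {name}!")            # f-strings, added in 3.6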
I'm 38, and I'm fine with these changes, and I've been using Python for 15+ years.
I can plainly see how these changes will actually make my code cleaner and more obvious while saving me keystrokes.
I also don't think these changes are very drastic. They're opt-in, don't break anything, and look likely to lead to cleaner code. I love the walrus operator (not so sure about the name, but hey, C++ is getting the spaceship operator... as has been said, naming things is hard). To me, the change of print from a statement to a function has been the hardest Python change over the years. Just too much mental momentum. Even though I've been on Python 3 for years, I still make the mistake of trying to use it as a statement. That said, I think it was the right (if painful) move.
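For what it's worth, here's the sort of thing the walrus operator tidies up (a hypothetical snippet, not from the discussion):

import re

pattern = re.compile(r"user=(\w+)")   # made-up pattern and input line
line = "user=alice action=login"

# Before 3.8: assign, then test on a separate line
match = pattern.search(line)
if match:
    print(match.group(1))

# 3.8+: assign and test in one expression with :=
if (match := pattern.search(line)) is not None:
    print(match.group(1))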
Theory: age itself has nothing to do with how old you act (with regard to computing); the time you've spent doing a specific thing is what grows that 'characteristic'.
Anecdote: I'm fairly young, but I've been involved with Python long enough, and traveled to enough PyCons, to be a bit jaded with regard to change within the language.
I'm fairly certain it's only due to the additional cognitive load that's thrust upon me when I must learn a new nuance to a skill that I already considered myself proficient at.
In other words: I'm resistant to change because I'm lazy, and because it (the language, and the way I did things previously) works for me. Both reasons are selfish and invalid, to a degree.
No, those aren't really the reasons for my reaction. And if I told you my age, you would probably switch your argument and say that I'm far too young to criticize ;)
I am an example which supports this notion. I've done some Python programming about 10 years ago but then took a break from programming altogether for the last 9 years. Last year I got back into it and have been using Python 3.7, and I personally love all the most recent stuff. I hate having to go back to 3.5 or even 3.6, and I end up pulling in stuff from futures.
This 'resistance to change' catchall argument puts everything beyond criticism, and it can be used/abused in every case of criticism. It seeks to reframe 'change' from a neutral word - change can be good or bad - to a positive instead of focusing on the specifics.
Anyone making this argument should be prepared to accept that every single criticism they make in their life moving forward can be framed as 'their resistance to change'.
This kind of personalization of specific criticism is disingenuous and political, and has usually been used as a PR strategy to push through unpopular decisions. Better to respond to specific criticisms than reach for a generic emotional argument that seeks to delegitimize scrutiny and criticism.
True, but this was not “specific criticism”. It was a general, dismissive criticism without details, and so it can be refuted with a similarly detail-free answer. A detailed criticism deserves a reasoned and detailed answer, but vague criticism gets a generic rebuttal.
I am both a Python programmer and a C++ programmer. I have programmed professionally full time in one or the other for years at a time. I think C++ is now a much better language than when I first learnt it (cfront). In particular, C++11 really fixed a lot of the memory issues, with shared_ptr and the std:: algorithms. It is a better language now if you are doing anything that takes more than a few weeks to write. On the other hand, I love Python for everything else, and some of the new stuff is great, but making a new way to print strings over and over tells me some people have too much spare time or not enough real work to do.
In my opinion formatting a string to print a debug statement should be as concise as possible whereas a lot of these fancier formatting systems are better suited to stuff that ends up staying for use by other people. Luckily there are ways to use ye olde printf style formatters in both for those times.
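E.g. in Python, for a quick throwaway debug print (made-up variable names):

x, y = 3, 4.5678
print("x=%d y=%.2f" % (x, y))   # ye olde printf-style formatting
print(f"{x=} {y=:.2f}")         # the 3.8+ f-string debug form, arguably even terser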
C++ might be "better" now (I doubt it, to be honest; it just has more features that try to fix the issue at hand, which is that you're using C++), but it will never, ever get simpler, or simple enough. They'd have to remove something like 75% of the language to end up with something that approaches simplicity, and even then there are languages that would undoubtedly do the remaining 25% much better.
I stopped writing C++ at some point in 2008/2009, but I still keep track of it to some extent, and I'm continually surprised by the nonsense that is introduced into the language. The whole RAII movement, for example, is just one massive band-aid on top of the previous mistake of allowing exceptions, etc.
It'd be mostly fine in the long run, but you have all these people using like 15% of C++ and complaining about it all day long, making their libraries not usable from things that understand C (most of which have drastically improved on the whole paradigm). There's a solution here, and it's not using whichever arbitrary percentage of C++ you've decided on; it's realizing that there are far better languages, designed with real interoperability in mind, for talking about lower-level things.
Good point. I think it should be rephrased on the basis of personal familiarity: people who learned C++ before they were 15 indeed think that it's simple and elegant.
*Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.*
Yep, I entered the Python world with v2. I eventually reconciled myself to 2.7, and have only recently and begrudgingly embraced 3. Being over 35, I must be incredibly open minded on these things.
1. “Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
3. Anything invented after you're thirty-five is against the natural order of things.”
― Douglas Adams, The Salmon of Doubt