> In a strict language, you can say (roughly at least) that replacing [] with [1..n] will add at minimum a certain number of extra CPU cycles and a certain amount of additional allocation.
It's not "at minimum": in a strict language you always pay the worst-case cost of N, whereas the lazy setting gives you amortized complexity.
I think you might be misunderstanding what I meant by "at minimum". I'm talking about replacing a [] constant in some arbitrary piece of code with something like [1..10000]. Given strict semantics, you'll of course incur at least the time and space cost of constructing the list (that's the "at minimum" bit), but you might also incur additional costs depending on what the rest of the code does with it. For example, it might run some arbitrarily expensive computation whenever the list has more than 20 elements, or whatever.
I think you might have read me as saying that, given strict semantics, it was somehow still possible not to incur the full time and space penalty for constructing n elements (which of course is not the case).
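The distinction can be sketched with a Python analogy (a generator standing in for a lazy list; this is an illustrative sketch, not code from the discussion). An instrumented generator counts how many elements are actually forced: a strict consumer that builds the whole list pays for all n elements, while a lazy consumer that only asks "are there more than 20 elements?" forces just 21 of them.

```python
import itertools

produced = 0  # counts how many elements are actually forced

def elements(n):
    """Yield 1..n, recording how many elements get demanded."""
    global produced
    for i in range(1, n + 1):
        produced += 1
        yield i

# Strict semantics: constructing the list forces every element,
# so the full O(n) cost is paid regardless of what the consumer needs.
produced = 0
strict_list = list(elements(10000))
strict_forced = produced  # 10000

# Lazy semantics: a consumer that only checks for >20 elements
# forces 21 of them and stops; the rest are never built.
produced = 0
has_more_than_20 = len(list(itertools.islice(elements(10000), 21))) > 20
lazy_forced = produced  # 21
```

This is the "at minimum" point from above: under strict evaluation the construction cost is a guaranteed floor, while under lazy evaluation the cost depends on how much of the list the consumer actually demands.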