I'm guessing this person hasn't used (ImageMagick) "convert"; AFAIK it is the program that does the most to turn a sequence of command-line arguments into an imperative scripting "language"
You can build pretty darn complex things with just the ImageMagick command line, between the maximum command-line size and combining several intermediate pictures. I used to do complex dashboards and system architecture diagrams with just bash functions calling into ImageMagick. Now if you don't have ImageMagick, rsvg-convert will do, just remember your SVG :-).
The thing that weirds me out the most with ffmpeg is that they have input and output flipped. Input needs a flag (-i), while output is a flag-less argument.
Every other tool I know has either an output flag (-o) or takes the first argument as input and the second as output.
These kinds of complexities make accurately modeling (and instrumenting) build systems incredibly annoying.
To make things even worse: linker groups can include source inputs, not just arguments passed to the linker. Both clang and GCC seem to be aware of this and will compile the inputs, treating them as if they aren't inside the linker group[1]. Real build systems rely on this!
But hang on, this is Linux, where file extensions are basically just decoration. So if GCC now has special behaviour (and a different linker invocation) depending on whether the file is a source or an object file, does that mean GCC has to do content sniffing to figure out what the command is supposed to do?
I don't think it's content sniffing per se -- GCC is aware of the "standard" extensions for C and C++ sources, and will complain if an input doesn't match one of them (absent other flags to override the behavior, like `-x c++`).
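To make the modelling pain concrete, here is a toy sketch of a build-system wrapper classifying a gcc-style command line the way these comments describe: by file extension, with `-x` as an override, and with sources compiled even when they appear inside a linker group. The function name and extension list are mine, and real GCC handles vastly more cases (flags, more suffixes, `-x` resetting, etc.) -- this is a simplified guess, not the driver's actual logic.

```python
# Toy model only: split a gcc-style argv into things the driver would
# compile and things it would hand to the linker. Extensions are detected
# by suffix (no content sniffing), `-x <lang>` forces subsequent inputs
# to be treated as sources, and a source inside -Wl,--start-group /
# -Wl,--end-group still gets compiled, as described above.
C_LIKE_EXTS = (".c", ".cc", ".cpp", ".cxx")

def classify(argv):
    sources, linker = [], []
    forced_lang = None
    args = iter(argv)
    for arg in args:
        if arg == "-x":                 # override: next inputs use this language
            lang = next(args)
            forced_lang = None if lang == "none" else lang
        elif forced_lang is not None or arg.endswith(C_LIKE_EXTS):
            sources.append(arg)         # compiled even inside a linker group
        else:
            linker.append(arg)          # objects, archives, group markers, flags
    return sources, linker
```

Even this toy immediately shows why instrumenting real builds is annoying: the "compile" set depends on position-sensitive state (`-x`), not just on the tokens themselves.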
Also, it's pretty well known that the order of command-line arguments passed to the linker is important. Sometimes you even need to pass a library to the linker multiple times.
Well known to people who are knowledgeable in all the intricacies of the build tools used for systems programming, which I'm pretty sure is a much smaller group than you're assuming here.
Also depends on the linker. This is the behavior of bfd/gold, usually the default linkers on Linux, but lld allows "back references", as it targets more platforms and many of those don't care about the ancient UNIX convention.
Yes, sort of, though the "imperative" linker flags -( and -) may help you, if you can figure out how to quote them properly (or use --start-group / --end-group).
I believe it would be that way, if not for performance. The linker keeps track of unresolved symbols, and resolves those symbols while parsing subsequent object files. So if you list a dependency before it is used, that dependency won’t be linked at all, resulting in unresolved symbols — thus the need for object files to be listed in a particular order.
Edit: and the need for listing a dependency twice is to resolve mutually dependent object files. If A depends on symbols from B and B depends on symbols from A, you can link A,B,A or B,A,B (per the aforementioned reasoning).
> I thought find was a strong contender for Unix command with the weirdest argument handling, but I guess gcc takes the cake.
Depending on how you count, surely dd should take the cake? Otherwise, you're just including the group of commands that uses arguments as a command language, and I don't think that's actually such a small group; ffmpeg does the same thing, too.
Apropos that (though a bit OT): what exactly is the deal with unescaped parentheses in bash?
They just seem to produce a syntax error no matter where I use them. But if they have no purpose, then why not simply treat them as nonspecial characters?
E.g.
echo hello :-) ;
is invalid syntax, even though it doesn't seem to cause any ambiguities with any other bash syntax features. Why not just allow this and treat the parenthesis like an ordinary character?
ps can also get a prize with its incompatible mixture of BSD (non-hyphenated) and UNIX (hyphenated) options. The letters chosen appear pretty random, too.
Pedant: That’s not imperative. You can have a purely functional program that uses a flag in a list to treat subsequent arguments in a list differently. Just have a conditional on the flag, and recursively call “do_things_this_way” on one branch and “do_things_that_way” on the other.
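For what it's worth, that purely functional reading can be sketched directly: no mutation, just recursion with a different handler after each flag. The flag names and handlers here are invented for illustration.

```python
# Sketch of the pedant's point: order-dependent flag handling with no
# mutable state. A flag doesn't mutate anything; it just recurses over
# the remaining arguments with a different handler function.
def process(args, handler=str.lower):
    if not args:
        return []
    head, rest = args[0], args[1:]
    if head == "--upper":
        return process(rest, str.upper)   # "do_things_that_way" branch
    if head == "--lower":
        return process(rest, str.lower)   # "do_things_this_way" branch
    return [handler(head)] + process(rest, handler)
```

Same observable behaviour as a loop that flips a mode variable, which is rather the point of the replies below.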
I think there is no clear boundary where "order-dependent" ends and "imperative" begins, but it's at least pretty unusual to use the same flag - or even mutually exclusive flags! - multiple times in the same invocation.
I think as a rule of thumb, if you have to mentally keep track of some internal data structure to make sense of your invocation, you're usually pretty deep in "imperative" territory.
> and recursively call “do_things_this_way” on one branch and “do_things_that_way” on the other.
That's the usual algorithm by which imperative logic is translated into equivalent functional logic though.
We have known for about as long as computers have existed that the two are equivalent in terms of computational power - in the sense that any imperative program can be translated into a functional program that does the same thing and vice versa. That doesn't mean the two programs are identical though.
Pedant: You can have a purely functional program which is an interpreter for BASIC or COBOL or FORTRAN (at least for batch/non-interactive programs, although it may depend on exactly what you mean by “pure”.) Does that mean, since those languages can be correctly interpreted by a purely functional program, that they aren’t “imperative”?
Wow, I needed precisely -Wl,-whole-archive about a month ago to solve a missing-symbol issue in a plug-in. I didn't find the solution then, and moved on to a different strategy. This is why I scroll Hacker News! The random tidbits are essential when you need them.
Unless a "goto" feature is implemented, you can't tell whether the command line options work by scoping or by mutation. Without evidence for mutation, you have to give it the benefit of the doubt and call it scoping, which isn't imperative.
So that is to say, each option which specifies some setting affecting files to the right of it on the command line is introducing a new version of the previous environment, in which that setting is altered, and the scope of that environment is the remainder of the command line.
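That scoping reading can be modelled with immutable environments: each option builds a new environment whose scope is the rest of the command line, and each file is tagged with whatever environment is in scope where it appears. The `-O` flag here stands in for any such setting; this is a sketch of the interpretation above, not of any real driver.

```python
# Scoping model: no mutation anywhere. A flag like -O2 produces a *new*
# read-only environment that scopes over the remainder of the arguments;
# each file is paired with the environment in scope at its position.
from types import MappingProxyType

def interpret(argv, env=MappingProxyType({"opt": "O0"})):
    if not argv:
        return []
    head, rest = argv[0], argv[1:]
    if head.startswith("-O"):
        new_env = MappingProxyType({**env, "opt": head[1:]})
        return interpret(rest, new_env)   # new env scopes over the remainder
    return [(head, env["opt"])] + interpret(rest, env)
```

Observationally this is identical to a loop mutating a current-settings variable, which is exactly why, absent something like a goto, you can't tell the two implementations apart from the outside.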