
I just started digging into C a few years ago and was struck by the same. It's amazingly simple, and the only "flaw" that leaps out at me is the precedence of & and | being lower than the comparison operators. Other than that, and maybe the macro system, everything frustrating about learning it came from dealing with the machine rather than anything C itself imposed on me.


The modern aggressive undefined-behavior-based optimizations ruin any remaining appeal of the "simplicity" of C for me. The extremely broad definition of undefined behavior may allow the compiler, for example, to silently delete explicit checks for signed integer overflow [1], among other things, and still claim standards compliance, but I don't think that programming model can reasonably be described as simple.

In many ways Go feels like an updated C. It retains most of the basic abilities of C, while cleaning up some things (headers, etc.), including more basic libraries, and choosing some defaults around things like out-of-bounds array access that are probably more appropriate for the majority of most programs (outside of particular hot loops, etc.) in an era where software is often connected to the network and security is a bigger concern.

[1] https://gcc.gnu.org/bugzilla/show_bug.cgi?id=30475, http://blog.llvm.org/2011/05/what-every-c-programmer-should-...


Lots of programs don't need efficiency only in "hot loops". You want efficient code size everywhere (smaller is usually faster), and realtime programs like games are CPU-bound everywhere, not just in one loop.

Undefined behavior is a great way to let the language help you; for instance, most loops would look like potentially infinite loops without it. All you need to do is test with UBSan to see whether you're dynamically undefined anywhere.


>realtime programs like games are CPU-bound everywhere not just in one loop

Isn't this a simplification? Even for games, I'd expect them most often to be GPU-bound, then memory-bandwidth-bound, and only then CPU-bound. Although I get that the hot-loop thing/10% rule is not universally true for every CPU-bound program (especially after code has already been heavily optimized).

But I'm not sure what you are disagreeing with. I didn't mean to totally deny the usefulness of UB in C++ (although I think the core rendering code of AAA games is more performance-sensitive than "the majority of most programs"). I was just pointing out that C is certainly not as simple as it first appears, and many people new to it are not familiar with the rules around UB.

Go is able to design around some of the performance aspects without UB (using the native word size for ints by default helps with some of the signed integer stuff, range loops can elide bounds checks, etc.), but it probably can't generate code as optimal as C's without dropping to assembly language.


I love C, but its biggest mistake is clearly the unstoppable decay of arrays to pointers:

https://www.digitalmars.com/articles/b44.html

A mistake that C++ missed an opportunity to fix.


I read the article you linked. It's too bad they didn't decide to pass a fat pointer with the array dimensions.

But when the article implies that null terminated strings and the problems associated with them are caused by arrays decaying to pointers, I get more skeptical.

I mean it's not as if arrays in C are bounds checked anyway, so how would that have helped implement strings any better?

I guess you could have bounds checked them in the stdlib code, but you certainly wouldn't have gotten it for free just by knowing the array dimensions.


> I mean it's not as if arrays in C are bounds checked anyway, so how would that have helped implement strings any better?

I've written an awful lot of C code, and I have a lot of experience with the D way, which uses arrays instead of null termination.

When I review a section of code consisting of strlen/strcat/strcpy/etc. I routinely find bugs in it, always centered around a mistake with the 0 termination.

When the array bounds are available, the compiler can optionally insert array bounds checks.

As for getting it for free, avoiding the strlen() calls is a big time saver.


"Arrays are for FORTRAN" is, I think, the reason.


C can be quite complex once you get into all the abstractions it's using. And even for newbies I think the whole syntax around pointers is unnecessarily confusing (or at least people are often confused by it in a way they usually aren't when learning assembly). Not to mention how many keywords and symbols are overloaded and depend on their context for meaning.


>> C can be quite complex once you get into all the abstractions it's using.

Perhaps you were referring to C++, or some newer C dialect?

What I mean is, K&R C had very few (if any) abstractions, other than those defined by the preprocessor.

Like, you could look at the C code and pretty much "see" what the assembly code would look like after the compiler did its work.


Yeah, on reflection you're right about that. Back when the language matched the hardware, the abstraction was much more literal. It was only when they diverged that optimizing compilers were needed, and all the complexity was introduced to maintain the fiction that the hardware was still as simple as a PDP-11.


To be fair, I never learned it in its heyday and even now, I've just worked on some toy utilities and an 8088 emulator.

I've never used it for work or open source or anything "real".




