electroly 6 hours ago

The way to understand Arthur Whitney's C code is to first learn APL (or, more appropriately, one of his languages in the family). If you skip that part, it'll just look like a weirdo C convention, when really he's trying to write C as if it were APL. The most obvious of the typographic stylings--the lack of spaces, single-character names, and functions on a single line--are how he writes APL too. This is perhaps like being a Pascal programmer coming to C and indignantly starting with "#define begin {" and so forth, except that atw is not a mere mortal like us.

  • mlochbaum 5 hours ago

    It looks like a weirdo C convention to APLers too though. Whitney writes K that way, but single-line functions in particular aren't used a lot in production APL, and weren't even possible before dfns were introduced (the classic "tradfn" always starts with a header line). All the stuff like macros with implicit variable names, type punning, and ternary operators just doesn't exist in APL. And what APL's actually about, arithmetic and other primitives that act on whole immutable arrays, is not part of the style at all!

    • electroly 5 hours ago

      "the typographic stylings ... are how he writes" is what I said, isn't it? :) Well said.

  • maximilianburke 6 hours ago

    >This is perhaps like being a Pascal programmer coming to C and indignantly starting with "#define begin {" and so forth

    Ah, like Stephen Bourne

  • raddan 6 hours ago

    My first thought was "oh, this just looks like a functional language" but my next thought was "with the added benefit of relying on the horrors of the C preprocessor."

  • brudgers 4 hours ago

    Would learning J work instead?

    It’s probably more accessible than APL since its symbols can be found on conventional keyboards.

    • thechao 2 hours ago

      Every time I read about APL, I'm reminded of Lev Grossman's "The Magicians" — I'm always imagining some keyboard with just a little bit more than two dimensions; and, with sufficient capabilities, I could stretch to hit the meta-keys that let me type APL directly on my modified split MTGAP keyboard.

  • arboles 5 hours ago

    We know, the beginning of the article tells us his C code is APL-inspired. So many comments that just summarize the article on a surface level.

    • jacquesm 2 hours ago

      Yes, but... even if you know that it is APL inspired, that does not change the fact that this is not how you want to write C.

      The C pre-processor is probably one of the most abused pieces of the C toolchain and I've had to clean up more than once after a 'clever' programmer left the premises and their colleagues had no idea of what they were looking at. Just don't. Keep it simple, and comment your intent, not what the code does. Use descriptive names. Avoid globally scoped data and functions with side effects.

      That doesn't look smart and it won't make you look smart, but it is smart because the stuff you build will be reliable, predictable and maintainable.

    • electroly 5 hours ago

      The beginning of the article talks about not learning APL--specifically mentions that he's not here to talk about APL--and proceeds into a wide-eyed dissection of the C without mentioning APL syntax again. It also doesn't, literally, say that the C is like APL; it says Arthur is an APL guy who writes weird C code. Another comment disagrees that this is APL style at all--which is it?? I think you could have given me more credit than this. I read the article and participated as best I could. I'm always happy to bump APL related articles so they get more visibility.

      • arboles 5 hours ago

        It's irrelevant that someone doesn't think the code is APL-inspired. Their disagreement is as much with the article as with your comment. I felt like what is written in the article already implied what I then read in your comment. Credit where due: the disagreement with the article probably wouldn't have been posted if the implications in that part hadn't been re-stated plainly. Now that I think about it, comments like these can be useful as pointers to specific aspects of an article that conversations can be organized under.

        • mlochbaum 4 hours ago

          Dunno why electroly is dragging me into this but I believe you've misread the article. When it says "His languages take significantly after APL" it means the languages themselves and not their implementations.

          • arboles 4 hours ago

            The article: "Let's make sense of the C code by the APL guy"

            Do you think the article meant to say it was more likely that the code wasn't inspired by APL?

            • mlochbaum 3 hours ago

              I think the article expresses no position. Most source code for array languages is not, in fact, inspired by APL. I encourage you to check a few random entries at [0]; Kap and April are some particularly wordy implementations, and even A+ mostly consists of code by programmers other than Whitney, with a variety of styles.

              I do agree that Whitney was inspired to some extent by APL conventions (not exclusively; he was quite a Lisp fan and that's the source of his indentation style when he writes multi-line functions, e.g. in [1]). The original comment was not just a summary of this claim but more like an elaboration, and began with the much stronger statement "The way to understand Arthur Whitney's C code is to first learn APL", which I moderately disagree with.

              [0] https://aplwiki.com/wiki/List_of_open-source_array_languages

              [1] https://code.jsoftware.com/wiki/Essays/Incunabulum

              • arboles 2 hours ago

                I unfortunately glossed over the part of the original comment that gives it substance: "The most obvious of the typographic stylings--the lack of spaces, single-character names, and functions on a single line--are how he writes APL too."

                That's backing for a claim.

                Also, I haven't once written APL. I think this might've been borderline trolling, just because of how little investment I have in the topic in reality. Sorry.

sebstefan 5 hours ago

```
#define _(e...) ({e;})
#define x(a,e...) _(s x=a;e)
#define $(a,b) if(a)b;else
#define i(n,e) {int $n=n;int i=0;for(;i<$n;++i){e;}}
```

>These are all pretty straight forward, with one subtle caveat I only realized from the annotated code. They're all macros to make common operations more compact: wrapping an expression in a block, defining a variable x and using it, conditional statements, and running an expression n times.

This is war crime territory

  • maldev 2 hours ago

    Some of these are wrong to. You can encounter issues with #define

    #define $(a,b) if(a)b;else

    due to not having brackets. So it's just extremely lazy to.

    • jacquesm an hour ago

      This should not be downvoted, this sort of error is indeed a very easy one to make when dealing with the C pre-processor.

      > Some of these are wrong to[o] <- that needs an extra 'o'

      > due to not having brackets. <- that one is fine

      > So it's just extremely lazy to[o]. <- that needs an extra 'o' too

      'to' comes in two versions, 'too' and 'to', and they have different meanings.

      • throwaway150 an hour ago

        Good grief! Are we really so insufferable as software developers that we can't just appreciate a brilliant article about the work of a remarkable computer scientist without nitpicking every supposed "bad practice"?

        The whole point of the piece seems completely lost on some readers. Yes, we all know that #define $(a,b) if(a)b;else is questionable. I don't need a crash course on C macros in the comments, thank you. The author already acknowledges that Whitney's style is controversial. Do we really need to keep rehashing that point in every comment, or can we finally focus on how all this unconventional code fits together beautifully to form a working interpreter?

        • josephg an hour ago

          > I don't need a crash course on C macros in the comments, thank you.

          This is an enduring great & terrible thing about sites like HN and reddit: As people become more senior & experienced, junior engineers come in to fill the ranks. You and I don't need a crash course on C macros in the comments. But I promise you, a lot of people here have no idea why #define $(a,b) if(a)b;else is a weird C macro.

          How much should HN cater to junior engineers?

          • throwaway150 an hour ago

            > How much should HN cater to junior engineers?

            The assumption that HN should cater to junior engineers is curious. It implies a purpose the site has never claimed to have.

        • jacquesm an hour ago

          I wouldn't have a problem with it, if the implication wasn't that the author became smarter as a result of reading this code. That's my whole beef with it.

          'Hey, look at this interesting way of using the CPP to create a DSL'

          I'm fine with that. But this is precisely what aspiring C programmers should avoid at all costs. It's not controversial. It's bad.

epolanski 6 hours ago

There are best or accepted practices in every field.

And in every field they work well for the average case, but are rarely the best fit for a given specific scenario. And in some rare scenarios, doing the opposite is the solution that best fits the individual/team/project.

The interesting takeaway here is that crowd wisdom should be given weight and probably defaulted if we want to turn off our brains. But if you turn on your brain you will unavoidably see the many cracks that those solutions bring for your specific problem.

  • Pannoniae 5 hours ago

    That's why I hate them being called "best" practices. No, they aren't the best practices, they are the mediocre practices. Sometimes, that's a good thing (you don't want to have the really bad results!), but if you aim for the very best practices, all of them will hold you back. It's basically a tradeoff, sacrificing efficiency / good performance in exchange for maintainability, consistency and reliability.

  • WhitneyLand 6 hours ago

    Having a solid product that solves a problem well can be orthogonal to how well a codebase lends itself to readability, learning curve, and efficiently ramping up new developers on a project.

    Just because you succeed at one says nothing about other practical and important metrics.

    • epolanski 6 hours ago

      I don't think you're reading this correctly.

      The proper way to read it is to understand the problem and its pros and cons.

      Without speculating at length, the situation was likely this: there's only one guy who can really deliver this because of his knowledge, CV and experience, and we need it.

      And at that point your choice is having a solution or not.

      • WhitneyLand 4 hours ago

        As the old saying goes the graveyards are full of irreplaceable men.

        But even if we grant that only one person could deliver a solution, it wouldn’t change the fact that you’re giving up on certain things to get it.

romperstomper 3 hours ago

Is this supposed to be a specific coding style or paradigm?

I’ve never seen code written like this in real-world projects — maybe except for things like the "business card ray tracer". When I checked out Arthur Whitney’s Wikipedia page I noticed he also made the J programming language (which is open source) and the code there has that same super-dense style https://github.com/jsoftware/jsource/blob/master/jsrc/j.c

  • susam 17 minutes ago

    > Is this supposed to be a specific coding style or paradigm?

    This is indeed Whitney's distinctive coding style, well known for its use in his various array programming language interpreters. His coding style is famously minimalist and idiosyncratic. The code tends to be highly concise and dense, often implementing entire interpreters in just a few pages.

    This has been discussed a number of times on HN. I have collected some of the interesting comments on this topic from previous threads here in this meta comment: https://news.ycombinator.com/item?id=45800777#45805346

  • jacquesm 2 hours ago

    > I’ve never seen code written like this in real-world projects

    Lucky you. I've seen far worse (at least this is somewhat consistent). But this isn't C anymore, it is a new language built on top of C and then a program written in that language. C is merely the first stage compilation target.

  • rcxdude 3 hours ago

    It's similar to J and that family of languages (K is another). Those are inspired by APL, which also has this super compact nature but in addition it largely uses non-ascii symbols. Apparently it is something you can get used to and notionally has some advantages (extreme density means you can see 'more' of the program on a given page, for example, and you need fewer layers of abstraction).

  • leoc 2 hours ago

    I believe it’s usually referred to as ‘OCC’. ;)

uvaursi 5 hours ago

This is a good use of macros. I understand people are frightened by how it looks but it’s just C in a terse, declarative style. It’s mostly straightforward, just dense and yes - will challenge you because of various obscure macro styles used.

I believe “oo” is probably an infinity error condition or some such, not 100% sure. I didn't see the author discuss it since they said it's not used. It was probably used during development as a debug printout.

  • saulpw 3 hours ago

    I agree, some of the macros are very useful, and I've found myself wanting DO(n, code) as a simpler for-loop construct. In my own code, when I have some dozens of small things (like opcodes or forth words or APL operators), I specifically do want a "one-liner" syntax for most of them. The individual elements are usually so small that it's distasteful to spend 10 lines of code on them, and especially because the real understanding lies in the 'space between', so I want to see a large subset of the elements at once, and not put code-blinders on to focus on one element at a time.

    • uvaursi an hour ago

      In reading many C code bases, including the Linux kernel, you find a use case for macros of this nature in every one.

  • procaryote 3 hours ago

    > This is a good use of macros.

    no.

taeric 6 hours ago

Kudos on not just taking a combative stance on the code!

This was a very fun read that I'm fairly convinced I will have to come back to.

shawn_w 6 hours ago

Much as a Real Programmer can write FORTRAN programs in any language, Whitney can write APL programs in any language.

piazz 6 hours ago

I can’t explain why but “He’s assigning 128 to a string called Q” made me absolutely lose it.

  • arlyle 4 hours ago

    ksimple is eight bit. 128 is the unsigned middle, or one plus the signed max, usually used for null or as an error signal. On sixty-four-bit k implementations it would be two to the sixty-three.

holografix an hour ago

“would you rather spend 10 days reading 100,000 lines of code, or 4 days reading 1000?"

More like 10 days understanding 100K loc or 30 days stabbing yourself in the eye over 4K loc

m463 2 hours ago

This reminds me of when I was learning perl.

At first, I thought it looked like line noise. $var on the left of the = sign? Constructs like $_ and @_? More obscure constructs were worse.

But I had to keep going and then one day something happened. It was like one of those 3d stereograms where your eyes have to cross or uncross. The line noise became idioms and I just started becoming fluent in perl.

I liked some of it too - stuff like "unless foo" being a more readable/human way of saying if not foo.

perl became beautiful to me - it was the language I thought in, and at the highest level. I could take an idea in my mind and express it in perl.

But I had some limits. I would restrain myself on putting entire loops or nested expression on one line just to "save space".

I used regular expressions, but sometimes would match multiple times instead of all in one giant unreadable "efficient" expression.

and then, I looked at other people's perl. GAH! I guess other people can "express themselves in perl", but rarely was it beautiful or kind, it was statistically worse and closer to vomit.

I like python now. more sanity, (somewhat) more likely that different people will solve a problem with similar and/or readable code.

by the way, very powerful article (even if I intensely dislike the code)

jacquesm 2 hours ago

As a very long time C programmer: don't try to be smart. The more you rely on fancy preprocessor tricks the harder it will be to understand and debug your code.

The C preprocessor gives you enough power to shoot yourself in the foot, repeatedly, with anything from small caliber handguns to nuclear weapons. You may well end up losing control over your project entirely.

One nice example: glusterfs. There are a couple of macros in use there that, when they work are magic. But when they don't you lose days, sometimes weeks. This is not the way to solve coding problems, you only appear smart as long as you remember what you've built. Your other self, three years down the road is going to want to kill the present one, and the same goes for your colleagues a few weeks from now.

  • 5- 2 hours ago

    > as long as you remember what you've built

    yes! like any craft, this works only if you keep practising it.

    various implementations of k, written in this style (with iterative improvements), have been in constant development for decades getting very good use out of these macros.

  • gitonthescene an hour ago

    Losing control of a project is likely more due to the programmers on it than the tools they use. IMHO _anything_ done consistently can be reasoned about and if necessary undone.

    • jacquesm an hour ago

      Not necessarily. Sometimes the rot goes so deep that there is really no way out.

      And the C pre-processor has figured prominently in more than one such case in my career. And it was precisely in the kind of way that is described in TFA.

      For something to be doable it needs to make economic sense as well and that's the problem with nightmare trickery like this. Initially it seems like a shortcut, but in the long run the price tag keeps going up.

  • switchbak 2 hours ago

    Seems to me that this is now exponentially true with AI coding assistants. If you don't understand what you're adding, and you're being clever - you can quickly end up in a situation where you can't reason effectively about your system.

    I'm seeing this on multiple fronts, and it's quickly becoming an unsustainable situation in some areas. I expect I'm not alone in this regard.

    • gitonthescene an hour ago

      I’d bet that a lot of the work done with AI assistants is decidedly _not_ clever.

cozzyd 6 hours ago

The C preprocessor allows you to define a limited DSL on top of C. This is... sometimes a good thing, and often convenient, even if it makes it hard to understand.

  • jacquesm an hour ago

    For extremely small values of 'sometimes' where sometimes is constrained by the following expressions evaluating to 'true':

    - you have no interest in maintaining your code

    - your code will never be maintained by someone else

    - you know your C preprocessor better than you know your C compiler

    - your favorite language isn't available for this particular target

    - you don't mind object level debugging

    - your idea of a fun time is to spend a few hours per day memorizing code

    - you really are smarter than everybody else

  • gitonthescene an hour ago

    I think _all_ programming is about finding an appropriate DSL for the problem at hand. First you need to understand the “language” of the problem then you develop a “lingo”.

richhhh 2 hours ago

Kernighan’s law seems to apply:

Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?

  • pragma_x 2 hours ago

    Agreed. Although it's also a bit worse than that for coding exclusively with macros. You have to add an extra degree of complexity for any additional code generator you add to your toolchain, when that path comes into play for debugging. Since we whole-buffalo'ed this situation, that's 100% of the code you could possibly need to debug.

    • jacquesm an hour ago

      Yes, precisely, that's when all that cleverness will come back to bite you hard.

      "Which line was that again? Oh... "

      Picks up the phone, dials.

      "Honey, I won't be home in time for dinner."

munchler 4 hours ago

The person who wrote this code might be a genius, but learning to read it isn’t going to make anyone smart. It’s basically obfuscated assembly code.

igleria 7 hours ago

Holy moly, this must be the equivalent of reading the necronomicon and getting cosmic madness disease as a result.

What a flex of patience!

kazinator 5 hours ago

You will not become smart, only crazy and unemployable. :)

  • Keyframe 4 hours ago

    Or an unrealized IOCCC champion, which Whitney seems to aspire to be.

    • IncreasePosts 3 hours ago

      Whitney would never submit his code because it is trivially understandable and not obfuscated?

  • gitonthescene an hour ago

    Are you saying most employers are smart by default??

jandrese 6 hours ago

> His languages take significantly after APL, which was a very popular language for similar applications before the invention of (qwerty) keyboards.

Ok, so this article is tongue in cheek. Good to know that up front.

russellbeattie 4 hours ago

> "Opinions on his coding style are divided, though general consensus seems to be that it's incomprehensible."

I wholeheartedly concur with popular opinion. It's like writing a program in obfuscated code.

Hmmm... his way of basically making C work like APL made me wonder: Is there a programming language out there that defines its own syntax in some sort of header and then uses that syntax for the actual code?

  • IncreasePosts 3 hours ago

    In racket, you can say something like "#lang X", which can modify the reader and let you create your own arbitrary syntax

voidhorse an hour ago

Nice write up!

When I see stuff like this, personally, I don't try to understand it, as code like this emerges from basically three motivations:

- The other person wanted to write in some other more (functional|object oriented|stack) language but couldn't, so they did this.

- The person couldn't be bothered to learn idioms for the target language and didn't care about others being able to read the program.

- The person intentionally wanted to obfuscate the program.

And none of these are good reasons to write code in a particular way. Code is about communication. Code like this is the equivalent to saying "I know the grammatical convention in English is subject-verb-object but I feel like speaking verb-object-subject and other people will have to just deal with it"—which, obviously, is a horrible way to communicate if you actually want to share ideas/get your point across.

That all said, the desire to have logic expressed more compactly and declaratively definitely resonates. Unfortunately C style verbosity and impurity remains dominant.

realo 6 hours ago

Ah yes... very tempting to ask an AI to refactor some large Java program (pick your language) "in the style of Arthur Whitney".

  • wvbdmp 5 hours ago

    I asked ChatGPT to explain the code from the OP (without the header file), and it seems to have given a really good breakdown. Although I know nothing about interpreters, C, or this fucked style, so who really knows if it makes any sense at all…

susam 39 minutes ago

HN stories about Whitney's code tend to predictably attract a lot of comments about the coding style, so I thought I'd share a couple of positive discussions from previous related posts.

Here's one from one of my favourite HN commenters posted at https://news.ycombinator.com/item?id=25902615#25903452 (Jan 2021):

"Its density is many times higher than most C programs, but that's no big obstacle to understanding if you don't attempt to "skim" it; you need to read it character-by-character from top to bottom. It starts off defining some basic types, C for Character and I for Integer, and then the all-important Array. This is followed by some more shorthand for printf, return, and functions of one and two arguments, all of the array type. The DO macro is used to make iteration more concise. Then the function definitions begin. ma allocates an array of n integers (this code is hardcoded for a 32-bit system), mv is basically memcpy, tr (Total Rank?) is used to compute the total number of elements, and ga (Get/Generate Array) allocates an array. This is followed by the definitions of all the primitive operations (interestingly, find is empty), a few more globals, and then the main evaluator body. Lastly, main contains the REPL. While I don't think this style is suitable for most programmers, it's unfortunate that the industry seems to have gone towards the other extreme." -- userbinator

Here's another from the same commenter on a different story at https://news.ycombinator.com/item?id=39026551#39038364 (Jan 2024):

"There's something very satisfying about how this style seems to "climb the abstraction ladder" very quickly, but all of those abstractions he creates are not wasted and immediately put to use. I think much of the amazement and beauty is that there isn't much code at all, and yet it does so much. It's the complete opposite of the bloated, lazy, lowest-common-denominator trend that's been spreading in many other languages's communities." -- userbinator

Another from the story at https://news.ycombinator.com/item?id=40544283#40544491 (Jun 2024):

"For people not accustomed to the style of Whitney, you can read various HN threads from the past to learn more about why he writes programs the way he does. It's deliberate and powerful." -- hakanderyal

One more from the same story at https://news.ycombinator.com/item?id=40544283#40545004 (Jun 2024):

"Whitney is famous for writing code like this, it's been his coding style for decades. For example, he wrote an early J interpreter this way in 1989. There's also a buddy allocator he wrote at Morgan Stanley that's only about 10 lines of C code." -- papercrane

bluedino 5 hours ago

Reminds me of a Python codebase I used to work with

The company was originally a bunch of Access/VB6 programmers.

Then they wrote their VB code in PHP.

And then they wrote their PHP code in Python. It was disgusting.

netbioserror 6 hours ago

This code style is psychotic. I had to reverse-engineer and verify a C codebase that was machine-obfuscated and it was still clearer to follow than this. Increasing clarity through naming is great, but balancing information density is, dare I say, also a desirable goal. Compacting code rapidly diminishes returns once you're relying on a language having insignificant whitespace.

readthenotes1 7 hours ago

During code reviews I would always ask for clear code because it's much harder to tell whether it's correct if it's unclear.

I got too much other stuff to do than decode the voynich manuscript...