Learning to read Arthur Whitney's C to become smart (2024)
60 comments
November 3, 2025
jacquesm
> as long as you remember what you've built
yes! like any craft, this works only if you keep practising it.
various implementations of k, written in this style (with iterative improvements), have been in constant development for decades, getting very good use out of these macros.
switchbak
Seems to me that this is now exponentially true with AI coding assistants. If you don't understand what you're adding, and you're being clever, you can quickly end up in a situation where you can't reason effectively about your system.
I'm seeing this on multiple fronts, and it's quickly becoming an unsustainable situation in some areas. I expect I'm not alone in this regard.
sebstefan
```
#define _(e...) ({e;})
#define x(a,e...) _(s x=a;e)
#define $(a,b) if(a)b;else
#define i(n,e) {int $n=n;int i=0;for(;i<$n;++i){e;}}
```
>These are all pretty straight forward, with one subtle caveat I only realized from the annotated code. They're all macros to make common operations more compact: wrapping an expression in a block, defining a variable x and using it, conditional statements, and running an expression n times.
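For a sense of how they read in use, here's a minimal sketch of my own (not from the article, and it needs GCC or Clang: the `({...})` statement expression, named variadic macro arguments, and `$` in identifiers are all extensions; `s` is just stubbed as int here):
```
#include <stdio.h>

typedef int s; /* stand-in for ksimple's actual typedef, purely illustrative */

#define _(e...) ({e;})
#define x(a,e...) _(s x=a;e)
#define $(a,b) if(a)b;else
#define i(n,e) {int $n=n;int i=0;for(;i<$n;++i){e;}}

int main(void){
  i(3, printf("%d ", i))              /* expands to a counted for-loop: prints 0 1 2 */
  $(1, puts("\nthen")) puts("else");  /* reads as: if(1) puts("\nthen"); else puts("else"); */
  int y = x(40, x+2);                 /* ({s x=40;x+2;}) binds x, evaluates to 42 */
  printf("y=%d\n", y);
  return 0;
}
```
The `x(a, ...)` macro giving you an implicit local named x presumably mirrors K's implicit x/y/z function arguments.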
This is war crime territory
electroly
The way to understand Arthur Whitney's C code is to first learn APL (or, more appropriately, one of his languages in the family). If you skip that part, it'll just look like a weirdo C convention, when really he's trying to write C as if it were APL. The most obvious of the typographic stylings--the lack of spaces, single-character names, and functions on a single line--are how he writes APL too. This is perhaps like being a Pascal programmer coming to C and indignantly starting with "#define begin {" and so forth, except that atw is not a mere mortal like us.
mlochbaum
It looks like a weirdo C convention to APLers too, though. Whitney writes K that way, but single-line functions in particular aren't used a lot in production APL, and weren't even possible before dfns were introduced (the classic "tradfn" always starts with a header line). All the stuff like macros with implicit variable names, type punning, and ternary operators just doesn't exist in APL. And what APL's actually about, arithmetic and other primitives that act on whole immutable arrays, is not part of the style at all!
electroly
"the typographic stylings ... are how he writes" is what I said, isn't it? :) Well said.
maximilianburke
>This is perhaps like being a Pascal programmer coming to C and indignantly starting with "#define begin {" and so forth
Ah, like Stephen Bourne
raddan
My first thought was "oh, this just looks like a functional language" but my next thought was "with the added benefit of relying on the horrors of the C preprocessor."
brudgers
Would learning J work instead?
It’s probably more accessible than APL since its symbols can be found on conventional keyboards.
thechao
Every time I read about APL, I'm reminded of Lev Grossman's "The Magicians" — I'm always imagining some keyboard with just a little bit more than two dimensions; and, with sufficient capabilities, I could stretch to hit the meta-keys that let me type APL directly on my modified split MTGAP keyboard.
arboles
We know, the beginning of the article tells us his C code is APL-inspired. So many comments that just summarize the article on a surface level.
jacquesm
Yes, but... even if you know that it is APL inspired, that does not change the fact that this is not how you want to write C.
The C pre-processor is probably one of the most abused pieces of the C toolchain and I've had to clean up more than once after a 'clever' programmer left the premises and their colleagues had no idea of what they were looking at. Just don't. Keep it simple, and comment your intent, not what the code does. Use descriptive names. Avoid globally scoped data and functions with side effects.
That doesn't look smart and it won't make you look smart, but it is smart because the stuff you build will be reliable, predictable and maintainable.
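To make that concrete, here's a purely illustrative sketch (not from any of the codebases mentioned) of the "run e n times" macro from the snippet above, rewritten the boring way:
```
#include <stdio.h>

/* Intent: show the first `count` squares, one per line.
   No preprocessor tricks, no hidden locals, no globals. */
static void print_first_n_squares(int count) {
    for (int index = 0; index < count; ++index) {
        printf("%d\n", index * index);
    }
}

int main(void) {
    print_first_n_squares(3);  /* prints 0, 1, 4 */
    return 0;
}
```
Nobody will call it clever, but nobody will need to ask what it does either.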
electroly
The beginning of the article talks about not learning APL--specifically mentions that he's not here to talk about APL--and proceeds into a wide-eyed dissection of the C without mentioning APL syntax again. It also doesn't, literally, say that the C is like APL; it says Arthur is an APL guy who writes weird C code. Another comment disagrees that this is APL style at all--which is it?? I think you could have given me more credit than this. I read the article and participated as best I could. I'm always happy to bump APL related articles so they get more visibility.
arboles
It's irrelevant that someone doesn't think the code is APL-inspired; their disagreement is as much with the article as with your comment. I felt that what's written in the article already implied what I then read in your comment. Credit where due: the disagreement with the article probably wouldn't have been posted if the implications in that part hadn't been restated plainly. Now that I think about it, comments like these can be useful as pointers to specific aspects of an article that conversations can be organized under.
svat
IMO this is a really good blog post, whatever you think of the coding style. Great effort by the author, impressive for eight hours' work (as mentioned), and some illuminating conclusions: https://needleful.net/blog/2024/01/arthur_whitney.html#:~:te...
richhhh
Kernighan's law seems to apply:
> Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?
romperstomper
Is this supposed to be a specific coding style or paradigm?
I've never seen code written like this in real-world projects — maybe except for things like the "business card ray tracer". When I checked out Arthur Whitney's Wikipedia page I noticed he also wrote the original prototype of the J interpreter (which is open source), and the code there has that same super-dense style: https://github.com/jsoftware/jsource/blob/master/jsrc/j.c
jacquesm
> I’ve never seen code written like this in real-world projects
Lucky you. I've seen far worse (at least this is somewhat consistent). But this isn't C anymore; it's a new language built on top of C, and then a program written in that language. C is merely the first-stage compilation target.
tom_
Possibly related(ish): video about co-dfns, prompted by a previous HN thread (links in video summary), not written in C but put together in a similarly dense style: https://www.youtube.com/watch?v=gcUWTa16Jc0
rcxdude
It's similar to J and that family of languages (K is another). Those are inspired by APL, which has the same super-compact nature but largely uses non-ASCII symbols. Apparently it is something you can get used to, and it notionally has some advantages (extreme density means you can see 'more' of the program on a given page, for example, and you need fewer layers of abstraction).
leoc
I believe it’s usually referred to as ‘OCC’. ;)
MisterTea
Reminds me of Bourne's attempt at beating C into Algol: https://www.tuhs.org/cgi-bin/utree.pl?file=V7/usr/src/cmd/sh...
Example: https://www.tuhs.org/cgi-bin/utree.pl?file=V7/usr/src/cmd/sh...
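For flavor, the macros look roughly like this (paraphrased from memory, not copied verbatim; the exact definitions are in the mac.h linked above):
```
#include <stdio.h>

/* Approximate reconstruction of Bourne's Algol-flavored macros. */
#define IF      if(
#define THEN    ){
#define ELSE    } else {
#define FI      ;}
#define LOOP    for(;;){
#define POOL    }

int main(void) {
    int n = 1;
    IF n == 1
    THEN
        puts("looks like Algol");
    ELSE
        puts("plain C underneath");
    FI
    return 0;
}
```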
epolanski
There are best or accepted practices in every field.
And in every field they work well for the average case, but are rarely the best fit for a specific scenario. In some rare scenarios, doing the opposite is the solution that best fits the individual/team/project.
The interesting takeaway here is that crowd wisdom should be given weight, and probably be the default if we want to turn off our brains. But if you turn your brain on, you will unavoidably see the many cracks those solutions bring to your specific problem.
Pannoniae
That's why I hate them being called "best" practices. They aren't the best practices, they're the mediocre practices. Sometimes that's a good thing (you don't want the really bad results!), but if you're aiming for the very best, they will hold you back. It's basically a tradeoff: sacrificing efficiency and peak performance in exchange for maintainability, consistency and reliability.
WhitneyLand
Having a solid product that solves a problem well can be orthogonal to how well a codebase lends itself to readability, learning curve, and efficiently ramping up new developers on a project.
Succeeding at one says nothing about the other practical and important metrics.
epolanski
I don't think you're reading this correctly.
The proper way to read it is to understand the problem and its pros and cons.
Without speculating too much, the situation likely was: there's only one guy who can really deliver this, because of his knowledge, CV and experience, and we need it.
And at that point your choice is having a solution or not.
WhitneyLand
As the old saying goes, the graveyards are full of irreplaceable men.
But even if we grant that only one person could deliver a solution, it wouldn't change the fact that you're giving up certain things to get it.
taeric
Kudos on not just taking a combative stance on the code!
This was a very fun read that I'm fairly convinced I will have to come back to.
shawn_w
Much as a Real Programmer can write FORTRAN programs in any language, Whitney can write APL programs in any language.
piazz
I can’t explain why but “He’s assigning 128 to a string called Q” made me absolutely lose it.
arlyle
ksimple is eight-bit. 128 is the unsigned middle, or one plus the signed max; it's usually used as a null or error signal. On sixty-four-bit k implementations it would be two to the sixty-third.
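A small sketch of that sentinel idea (my own illustration, not ksimple's actual code):
```
#include <stdio.h>
#include <stdint.h>

int main(void){
    /* Hypothetical: in an 8-bit k, 128 (0x80) reinterpreted as signed is -128,
       the one value set aside to mean null/error. */
    unsigned char Q = 128;                                       /* the sentinel */
    printf("Q = %u, as signed: %d\n", (unsigned)Q, (int8_t)Q);   /* 128 -> -128 */
    printf("64-bit analogue: %llu\n", 1ULL << 63);               /* 2^63 */
    return 0;
}
```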
jandrese
> His languages take significantly after APL, which was a very popular language for similar applications before the invention of (qwerty) keyboards.
Ok, so this article is tongue in cheek. Good to know that up front.
As a very long-time C programmer: don't try to be smart. The more you rely on fancy preprocessor tricks, the harder it will be to understand and debug your code.
The C preprocessor gives you enough power to shoot yourself in the foot, repeatedly, with anything from small-caliber handguns to nuclear weapons. You may well end up losing control over your project entirely.
One nice example: glusterfs. There are a couple of macros in use there that, when they work, are magic. But when they don't, you lose days, sometimes weeks. This is not the way to solve coding problems; you only appear smart as long as you remember what you've built. Your other self, three years down the road, is going to want to kill the present one, and the same goes for your colleagues a few weeks from now.