They're all macros to make common operations more compact
I read the J Incunabulum before encountering this, and the point that stands out is that you don't start by jumping into the middle of it like many programmers who are familiar with C will do; the macros defined at the beginning will confuse you otherwise. They also build upon previous ones, so the code ends up climbing the "abstraction ladder" very quickly. I personally like the Iterate macro (i), for how it compresses a relatively verbose loop into a single character; and of course in an array language, the iteration is entirely implicit.
In other words, I believe the reason this code is hard to read for many who are used to more "normal" C styles is because of its density; in just a few dozen lines, it creates many abstractions and uses them immediately, something which would otherwise be many many pages long in a more normal style. Thus if you try to "skim read" it, you are taking in a significantly higher amount of complexity than usual. It needs to be read one character at a time.
As someone who has spent considerable time working with huge codebases composed of hundreds of tiny files that have barely any substance to them, and trying to find where things happen becomes an exercise in search, this extreme compactness feels very refreshing.
When you zoom out on Google Maps, the street names and business pins usually disappear. This code style is like having a fully informationally dense zoomed-out map. Being too zoomed in is definitely frustrating, and reminds me of early attempts at "mobile" web pages. But I'm not sure that this condensed code style is a good general solution either; it certainly seems overwhelming, like a Where's Waldo puzzle.
I prefer having a consistent information density regardless of zoom level. When I would like to see more details, I would like to achieve that by zooming in. When I want the overview, I'd prefer to zoom out and have certain details omitted from the map.
Having a clear and by-convention code organization is one way to achieve this. Then to drill into the details, just navigate the project directory to the right file.
For rabbit-hole-style hunting, following references/symbol usages/definitions is ideal, which is what modern IDEs enable.
K and family are made for data analysis. Data analyses are relatively simple software projects that don't have a very wide scope. I think this dense style of programming falls apart when you consider the breadth of requirements of typical modern application software.
I have definitely seen many 'typical modern applications' where the business logic can be summarized into 100 lines of code. The rest is just shoveling things around.
>In other words, I believe the reason this code is hard to read for many who are used to more "normal" C styles is because of its density; in just a few dozen lines, it creates many abstractions and uses them immediately, something which would otherwise be many many pages long in a more normal style.
I also spent some time with the Incunabulum and came away with a slightly different conclusion. I only really grokked it after going through and renaming the variables to colorful emojis (https://imgur.com/F27ZNfk). That made me think that, in addition to informational density, a big part of the initial difficulty is orthographic. IMO two features of our current programming culture make this coding style hard to read: (1) Most modern languages discourage or forbid symbol/emoji characters in identifiers, even though their highly distinctive shapes would make this kind of code much more readable, just as they do in mathematical notation (there's a reason APL looked the way it did!). (2) When it comes to color, most editors default to "syntax highlighting" (each different syntactic category gets a different color), whereas what's often most helpful (esp. here) is token-based highlighting, where each distinct identifier (generally) gets its own color (This was pioneered afaik by Sublime Text which calls it "hashed syntax highlighting" and is sometimes called "semantic highlighting" though that term was later co-opted by VSCode to mean something quite different.) Once I renamed the identifiers so it becomes easier to recognize them at a glance by shape and/or color the whole thing became much easier to follow.
I've experimented a few times with coloring my variables explicitly (using a prefix like R for red, hiding the letters, etc) after playing with colorforth. I agree getting color helps with small shapes, but I think the colors shouldn't be arbitrary: every character Arthur types is a choice about how the code should look, what he is going to need, and what he needs to see at the same time, and it seems like a missed opportunity to turn an important decision about what something is named (or colored) over to a random number generator.
> (1) Most modern languages discourage or forbid symbol/emoji characters in identifiers
> (2) When it comes to color,
Call me boomer if you wish, but if you can't grasp the value of having your code readable on a 24 rows by 80 columns, black and white screen, you are not a software developer. You are not even a programmer: at most, you are a prompt typist for ChatGPT.
While I agree that, if the function at hand can’t fit in a 25x80 window it most likely should be broken in smaller functions, there are kinder ways to say that.
I also joke God made the VT100 with 80 columns for a reason.
... For the reason that IBM made their 1928 card with 80 columns, in an attempt to increase the storage efficiency of Hollerith’s 45-column card without increasing its size?
That said, ~60 characters per printed line has been the typographer’s recommendation for much longer. Which is why typographers dislike Times and derivatives when used on normal-sized single-column pages, as that typeface was made to squeeze more characters into narrow newspaper columns (it’s in the name).
Adding some comments would have been advisable, but I guess he doesn’t need comments.
APL taught me the importance of comments - if I didn’t comment my code thoroughly I would forget how it works and what it did as soon as I moved away from the keyboard. It is a cruel language.
"... trying to find where things happen becomes an exercise in search."
It seems like developers actually prefer large codebases and having to use recursive search through multiple layers of sub directories
I prefer no sub directories and being able to just grep against *.[ch] with no recursion
I think projects in Java languages are the worst when it comes to having to search through numerous small verbose files having barely any substance to them. IDEs probably make this easier but I don't use one
Years ago I read that Whitney's "IDE" is something like the Windows console and Notepad. He has said in interviews that he wants all the code to fit on a single page
The way to understand Arthur Whitney's C code is to first learn APL (or, more appropriately, one of his languages in the family). If you skip that part, it'll just look like a weirdo C convention, when really he's trying to write C as if it were APL. The most obvious of the typographic stylings--the lack of spaces, single-character names, and functions on a single line--are how he writes APL too. This is perhaps like being a Pascal programmer coming to C and indignantly starting with "#define begin {" and so forth, except that atw is not a mere mortal like us.
When I first encountered it years ago, the thing was impenetrable, but after learning APL to a high level, it now reads like a simple, direct expression of intent. The code even clearly communicates design tradeoffs and the intended focus of experimentation. Or more on the nose, to me the code ends up feeling primarily like extremely readable communication of ideas between like-minded humans. This is a very rare thing in software development in my experience.
IMHO, ideas around "readable code" and "good practices" in software development these days optimize for large, high-turnover teams working on large codebases. Statistically speaking, network effects mean that these are the codebases and developer experiences we are most likely to hear about. However, as an industry, I think we are relatively blind to alternatives. We don't have sufficient shared language and cognitive tooling to understand how to optimize software dev for small, expert teams.
It looks like a weirdo C convention to APLers too though. Whitney writes K that way, but single-line functions in particular aren't used a lot in production APL, and weren't even possible before dfns were introduced (the classic "tradfn" always starts with a header line). All the stuff like macros with implicit variable names, type punning, and ternary operators just doesn't exist in APL. And what APL's actually about, arithmetic and other primitives that act on whole immutable arrays, is not part of the style at all!
My first thought was "oh, this just looks like a functional language" but my next thought was "with the added benefit of relying on the horrors of the C preprocessor."
Every time I read about APL, I'm reminded of Lev Grossman's "The Magicians" — I'm always imagining some keyboard with just a little bit more than two dimensions; and, with sufficient capabilities, I could stretch to hit the meta-keys that let me type APL directly on my modified split MTGAP keyboard.
Yes, but... even if you know that it is APL inspired, that does not change the fact that this is not how you want to write C.
The C pre-processor is probably one of the most abused pieces of the C toolchain and I've had to clean up more than once after a 'clever' programmer left the premises and their colleagues had no idea of what they were looking at. Just don't. Keep it simple, and comment your intent, not what the code does. Use descriptive names. Avoid globally scoped data and functions with side effects.
That doesn't look smart and it won't make you look smart, but it is smart because the stuff you build will be reliable, predictable and maintainable.
Layman question: say you have a C codebase with a bunch of preprocessor macros and you want to get rid of a particular one that's too clever, and assume no other macros depend on it.
Is it possible to command the preprocessor to take the source files as input and print them out with that one particular macro expanded and no other changes?
Intuitively, it sounds like it should be possible, and then you'd end up with a code base with a bunch of repetition but one fewer too-clever abstraction - and refactoring to deal with repetition (if necessary!) is a far more approachable and well-understood problem.
(Kind of like how some fancy compiles-to-javascript languages have a literal 'mytool --escape' command that will turn the entire code base into a plain, non-minified javascript in case you ever want to stop using them.)
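One low-tech way to approximate this (a sketch, not a turnkey tool): make a copy of the source, delete every #define except the one you want to flatten, neutralize the #includes, and run the copy through the preprocessor with `gcc -E -P`. The preprocessor only expands macros it has actually seen defined, so everything else passes through untouched. The names below (DO, PRINT_RESULT) are purely illustrative.

    /* onemacro.c -- only the macro we want to flatten is defined here;
       other macros' definitions have been stripped, so their invocations
       pass through the preprocessor unchanged. */
    #define DO(n, e) for (int i = 0; i < (n); ++i) { e; }

    int sum(int *v, int n) {
        int s = 0;
        DO(n, s += v[i])   /* will be expanded */
        PRINT_RESULT(s);   /* not defined here, so left exactly as written */
        return s;
    }

    /* `gcc -E -P onemacro.c` prints roughly:
     *
     *   int sum(int *v, int n) {
     *       int s = 0;
     *       for (int i = 0; i < (n); ++i) { s += v[i]; }
     *       PRINT_RESULT(s);
     *       return s;
     *   }
     */

Real codebases are messier (includes, conditional compilation, macros that expand into other macros), which is presumably why this usually ends up as a semi-manual refactoring step rather than a push-button one.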
The beginning of the article talks about not learning APL--specifically mentions that he's not here to talk about APL--and proceeds into a wide-eyed dissection of the C without mentioning APL syntax again. It also doesn't, literally, say that the C is like APL; it says Arthur is an APL guy who writes weird C code. Another comment disagrees that this is APL style at all--which is it?? I think you could have given me more credit than this. I read the article and participated as best I could. I'm always happy to bump APL related articles so they get more visibility.
It's irrelevant that someone doesn't think the code is APL-inspired. Their disagreement is as much with the article as your comment. I felt like what is written in the article already implied what I then read in your comment. Credit where due, the disagreement with the article probably would've not been posted if the implications in that part hadn't been re-stated plainly. Comments like these can be useful as pointers to specific aspects of an article, where conversations can be organized under, now that I think about it.
Dunno why electroly is dragging me into this but I believe you've misread the article. When it says "His languages take significantly after APL" it means the languages themselves and not their implementations.
I think the article expresses no position. Most source code for array languages is not, in fact, inspired by APL. I encourage you to check a few random entries at [0]; Kap and April are some particularly wordy implementations, and even A+ mostly consists of code by programmers other than Whitney, with a variety of styles.
I do agree that Whitney was inspired to some extent by APL conventions (not exclusively; he was quite a Lisp fan and that's the source of his indentation style when he writes multi-line functions, e.g. in [1]). The original comment was not just a summary of this claim but more like an elaboration, and began with the much stronger statement "The way to understand Arthur Whitney's C code is to first learn APL", which I moderately disagree with.
I unfortunately glossed over the part of the original comment that gives it substance: "The most obvious of the typographic stylings--the lack of spaces, single-character names, and functions on a single line--are how he writes APL too."
That's backing for a claim.
Also, I haven't once written APL. I think this might've been borderline trolling, just because of how little investment I have in the topic in reality. Sorry.
I was curious about Shakti after reading this and the comments, so followed the link to shakti.com on Wikipedia. It seems it now redirects to the k.nyc domain, which displays a single letter 'k'.
I wondered if I was missing something, so looked at the source, to find the following:
<div style='font-family:monospace'>k
Nothing but that. Which is, surely, the HTML equivalent of the Whitney C style: relying on the compiler/interpreter to add anything implicit, and shaving off every element that isn't required, such as a closing tag (which, yes, only matters if you're going to want something else afterwards, I guess...). Bravo.
IMO this is a really good blog post, whatever you think of the coding style. Great effort by the author, really good for eight hours' work (as mentioned), and some illuminating conclusions: https://needleful.net/blog/2024/01/arthur_whitney.html#:~:te...
TIL `a ?: b`, that's actually pretty nice, a bit like Haskell's `fromMaybe b a` (or `a <|> b` if b can also be "empty")
and I do like `#define _(e...) ({e;})` – that's one where I feel the short macro name is OK. But I'd like it better if that were just how C worked from the get-go.
Very nice discussion at the end of the article. There are good things to be learnt from this code and its discussions even if you disagree with some or even most of the style.
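For anyone who hasn't met the two GNU extensions mentioned above, here's a minimal sketch of what they do (GCC/Clang only, not standard C; the names and values are illustrative):

    #include <stdio.h>

    /* GNU statement expression: ({ ... }) is an expression whose value is
       that of its last statement, which is what makes a macro like
       #define _(e...) ({e;}) usable anywhere an expression is expected. */
    #define _(e...) ({ e; })

    int fallback(void) { puts("fallback() called"); return 42; }

    int main(void) {
        int a = 0;

        /* Elvis operator: a ?: b means a ? a : b, except a is evaluated once.
           Here a is 0 (falsy), so fallback() supplies the value. */
        int x = a ?: fallback();

        /* The statement-expression macro: declare, compute, and yield a value. */
        int y = _(int t = x * 2; t + 1);

        printf("x=%d y=%d\n", x, y);   /* x=42 y=85 */
        return 0;
    }

Both are GCC/Clang extensions rather than ISO C, which is exactly the caveat raised in the next comment.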
Yes, '?:' is also known as the Elvis operator [1][2]. I sometimes use it in other languages such as Groovy. But I don't use it in C because this happens to be a GCC extension [3][4] and I've often had to compile my C projects with compilers that do not support GCC extensions. The C standard [5] defines the conditional operator as:
conditional-expression:
logical-OR-expression
logical-OR-expression ? expression : conditional-expression
So per the C standard there must be an expression between '?' and ':' and an expression cannot be empty text. To confirm this we need to check the grammar for expression, which unfortunately is a little tedious to verify manually due to its deeply nested nature. Here it is:
expression:
assignment-expression
expression , assignment-expression
assignment-expression:
conditional-expression
unary-expression assignment-operator assignment-expression
unary-expression:
postfix-expression
++ unary-expression
-- unary-expression
unary-operator cast-expression
sizeof unary-expression
sizeof ( type-name )
alignof ( type-name )
assignment-operator: one of
= *= /= %= += -= <<= >>= &= ^= |=
... and so on ...
The recursion goes many more levels deep, but the gist is that no matter which branch the parser takes, it expects the expression to have at least one symbol per the grammar. Perhaps an easier way to confirm this is to just have the compiler warn us about it. For example:
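A minimal sketch of that kind of check (hypothetical file name; exact diagnostic wording varies by compiler and version):

    /* elvis.c */
    int main(void) {
        int a = 0, b = 7;
        int c = a ?: b;   /* GNU extension; not valid ISO C */
        return c;
    }

Compiling with `gcc -std=c99 -pedantic-errors elvis.c` produces an error along the lines of "ISO C forbids omitting the middle term of a '?:' expression", confirming that the grammar requires an expression between '?' and ':'.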
There are best or accepted practices in every field.
And in every field they work well for the average case, but are rarely the best fit for that specific scenario. And in some rare scenarios, doing the opposite is the solution that fits best the individual/team/project.
The interesting takeaway here is that crowd wisdom should be given weight and probably defaulted to if we want to turn off our brains. But if you turn on your brain you will unavoidably see the many cracks that those solutions bring for your specific problem.
That's why I hate them being called "best" practices. No, they aren't the best practices, they are the mediocre practices. Sometimes, that's a good thing (you don't want to have the really bad results!), but if you aim for the very best practices, all of them will hold you back. It's basically a tradeoff, sacrificing efficiency / good performance in exchange for maintainability, consistency and reliability.
Having a solid product that solves a problem well can be orthogonal to how well a codebase lends itself to readability, learning curve, and efficiently ramping up new developers on a project.
Just because you succeed at one says nothing about other practical and important metrics.
The proper way to read it is to understand the problem and its pros and cons.
Without speculating at length, the situation likely was: there's only one guy who can really deliver this because of his knowledge, CV and experience, and we need it.
And at that point your choice is having a solution or not.
>These are all pretty straight forward, with one subtle caveat I only realized from the annotated code. They're all macros to make common operations more compact: wrapping an expression in a block, defining a variable x and using it, conditional statements, and running an expression n times.
This should not be downvoted, this sort of error is indeed a very easy one to make when dealing with the C pre-processor.
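For readers who haven't been bitten by it, the classic shape of that bracket error looks something like this (illustrative macro, not one of Whitney's):

    #include <stdio.h>

    /* Expands to an unbraced if-statement -- the hazard discussed above. */
    #define WARN_IF_NEG(x, msg) if ((x) < 0) puts(msg)

    void check(int balance, int open) {
        if (open)
            WARN_IF_NEG(balance, "overdrawn");
        else
            puts("account closed");   /* intended to pair with if (open) */
    }

    int main(void) {
        /* After expansion the else binds to the macro's hidden if:
           if (open) if ((balance) < 0) puts("overdrawn"); else puts("account closed");
           so a closed account (open == 0) prints nothing at all. */
        check(-5, 0);
        return 0;
    }

The usual fixes are wrapping the body in do { ... } while (0), or, as with the $(a,b) macro mentioned elsewhere in the thread, deliberately ending the expansion with else so the caller supplies the other branch.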
> Some of these are wrong to[o] <- that needs an extra 'o'
> due to not having brackets. <- that one is fine
> So it's just extremely lazy to[o]. <- that needs an extra 'o' too
'to' comes in two versions, 'too' and 'to', and they have different meanings.
Good grief! Are we really so insufferable as software developers that we can't just appreciate a brilliant article about the work of a remarkable computer scientist without nitpicking every supposed "bad practice"?
The whole point of the piece seems completely lost on some readers. Yes, we all know that #define $(a,b) if(a)b;else is questionable. I don't need a crash course on C macros in the comments, thank you. The author already acknowledges that Whitney's style is controversial. Do we really need to keep rehashing that point in every comment, or can we finally focus on how all this unconventional code fits together beautifully to form a working interpreter?
> I don't need a crash course on C macros in the comments, thank you.
This is an enduring great & terrible thing about sites like HN and reddit: As people become more senior & experienced, junior engineers come in to fill the ranks. You and I don't need a crash course on C macros in the comments. But I promise you, a lot of people here have no idea why #define $(a,b) if(a)b;else is a weird C macro.
It has nothing to do with seniors vs. juniors but merely a lack of understanding of the intent behind somebody's work. When an acknowledged expert does something out of the ordinary you ask why and try to grasp his POV rather than pointing out obvious trivialities.
Yeah, I feel the same way you do, but then console myself with this quote - “Mediocrity knows nothing higher than itself; but Talent instantly recognizes Genius.” (from The Valley of Fear by Arthur Conan Doyle).
People have a silly need to point out the obvious as a crutch to their ego.
I wouldn't have a problem with it, if the implication wasn't that the author became smarter as a result of reading this code. That's my whole beef with it.
'Hey, look at this interesting way of using the CPP to create a DSL'
I'm fine with that. But this is precisely what aspiring C programmers should avoid at all costs. It's not controversial. It's bad.
Still, since the article already contains this warning, some people might argue that it's unnecessary for us to add it as a response to every comment here.
This is a good use of macros. I understand people are frightened by how it looks but it’s just C in a terse, declarative style. It’s mostly straightforward, just dense and yes - will challenge you because of various obscure macro styles used.
I believe “oo” is probably an infinity or error condition or some such; not 100% sure. I didn’t see the author discuss it since they said it’s not used. It was probably used during development as a debug printout.
I agree, some of the macros are very useful, and I've found myself wanting DO(n, code) as a simpler for-loop construct. In my own code, when I have some dozens of small things (like opcodes or forth words or APL operators), I specifically do want a "one-liner" syntax for most of them. The individual elements are usually so small that it's distasteful to spend 10 lines of code on them, and especially because the real understanding lies in the 'space between', so I want to see a large subset of the elements at once, and not put code-blinders on to focus on one element at a time.
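A macro in that spirit, as a rough sketch (my own definition, not necessarily the one in any of Whitney's sources; the implicit index name i is part of the contract):

    #include <stdio.h>

    /* DO(n, e): evaluate e with a loop index i running from 0 to n-1. */
    #define DO(n, e) { int _n = (n); for (int i = 0; i < _n; ++i) { e; } }

    int main(void) {
        int sum = 0;
        DO(5, sum += i)                 /* 0 + 1 + 2 + 3 + 4 */
        printf("sum=%d\n", sum);        /* sum=10 */
        DO(3, printf("i=%d\n", i))      /* i=0, i=1, i=2 */
        return 0;
    }

The single evaluation of n and the deliberately leaked index i are both trade-offs this style accepts in exchange for brevity at the call site.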
>These are all pretty straight forward, [...] wrapping an expression in a block, defining a variable x and using it, conditional statements, and running an expression n times.
Making your reader learn some ad-hoc shorthands you wrote to avoid declaring blocks, defining variables or writing conditionals is, in my book, very impolite.
Is this supposed to be a specific coding style or paradigm?
I’ve never seen code written like this in real-world projects — maybe except for things like the "business card ray tracer". When I checked out Arthur Whitney’s Wikipedia page I noticed he also made the J programming language (which is open source) and the code there has that same super-dense style https://github.com/jsoftware/jsource/blob/master/jsrc/j.c
> Is this supposed to be a specific coding style or paradigm?
This is indeed Whitney's distinctive coding style, well known for its use in his various array programming language interpreters. His coding style is famously minimalist and idiosyncratic often fitting entire implementations of interpreters in a few pages.
This has been discussed a number of times on HN. I have collected some of the interesting comments on this topic from previous threads here in this meta comment: https://news.ycombinator.com/item?id=45800777#45805346
> I’ve never seen code written like this in real-world projects
Lucky you. I've seen far worse (at least this is somewhat consistent). But this isn't C anymore, it is a new language built on top of C and then a program written in that language. C is merely the first stage compilation target.
It's similar to J and that family of languages (K is another). Those are inspired by APL, which also has this super compact nature but in addition it largely uses non-ascii symbols. Apparently it is something you can get used to and notionally has some advantages (extreme density means you can see 'more' of the program on a given page, for example, and you need fewer layers of abstraction).
Possibly related(ish): video about co-dfns, prompted by a previous HN thread (links in video summary), not written in C but put together in a similarly dense style: https://www.youtube.com/watch?v=gcUWTa16Jc0
ksimple is eight-bit. 128 is the unsigned middle, or one plus the signed max; it's usually used for null or an error signal. On sixty-four-bit k implementations it would be two to the sixty-three.
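To make the arithmetic concrete (typical two's-complement behaviour; the conversion of an out-of-range value to a signed type is implementation-defined before C23):

    #include <stdio.h>

    int main(void) {
        unsigned char u = 128;              /* one past the signed max of 127 */
        signed char   s = (signed char)u;   /* reinterpreted: the most negative value */
        printf("%u %d\n", u, s);            /* on common platforms: 128 -128 */
        /* The 64-bit analogue of this sentinel would be 2^63,
           one past the signed max of a 64-bit integer. */
        return 0;
    }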
I don't think writing code like that will make the average programming team any faster. Unless you are really deep into the code and have a good mental model of how the symbols are structured, I think it's going to take longer, with the constant need to refer back and re-work out what a symbol means.
I'd rather have the descriptive variable names. What he writes looks akin to minified JS to me.
The macros are fine as a concept; I've used something similar before for reducing code size, e.g. defining hundreds of similar functions and the like.
What is incomprehensible and puts the entire thing into "Obfuscated C" territory is the one-letter variables. You'll need to memorize all of them and can't reuse them in normal code. If at least the variables were self-descriptive I'd support such a coding style, but it clearly needs comments.
It's interesting to compare that version from 2017 with the current version from 2025: https://github.com/KxSystems/javakdb/blob/9a94dc5af9288fe845... — the current one is over ten times as long in terms of number of lines, and has copious comments, but still has the short names and dense code.
The C preprocessor allows you to define a limited DSL on top of C. This is... sometimes a good thing, and often convenient, even if it makes it hard to understand.
I think _all_ programming is about finding an appropriate DSL for the problem at hand. First you need to understand the “language” of the problem then you develop a “lingo”.
That might be an excellent reason not to use some of these capabilities. And maybe in a different situation it would make sense to use the mechanisms provided. Programmer’s responsibility to decide what’s appropriate in each case, that’s all I’m saying.
This style is inherently worse because there's no spaces. My brain has been wired since 4 years old to read words, not letters. Words are separated by spaces. Havingnospacesbetweenwordsmakesthemexponentiallyhardertoreadandcomprehend.
At first, I thought it looked like line noise. $var on the left of the = sign? Constructs like $_ and @_? more obscure constructs were worse.
But I had to keep going and then one day something happened. It was like one of those 3d stereograms where your eyes have to cross or uncross. The line noise became idioms and I just started becoming fluent in perl.
I liked some of it too - stuff like "unless foo" being a more readable/human way of saying "if not foo".
perl became beautiful to me - it was the language I thought in, and at the highest level. I could take an idea in my mind and express it in perl.
But I had some limits. I would restrain myself on putting entire loops or nested expression on one line just to "save space".
I used regular expressions, but sometimes would match multiple times instead of all in one giant unreadable "efficient" expression.
and then, I looked at other people's perl. GAH! I guess other people can "express themselves in perl", but rarely was it beautiful or kind, it was statistically worse and closer to vomit.
I like python now. more sanity, (somewhat) more likely that different people will solve a problem with similar and/or readable code.
by the way, very powerful article (even if I intensely dislike the code)
I asked ChatGPT to explain the code from the OP (without the header file), and it seems to have given a really good breakdown. Although I know nothing about interpreters, C, or this fucked style, so who really knows if it makes any sense at all…
Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?
Agreed. Although it's also a bit worse than that for coding exclusively with macros. You have to add an extra degree of complexity for any additional code generator you add to your toolchain, when that path comes into play for debugging. Since we whole-buffalo'ed this situation, that's 100% of the code you could possibly need to debug.
As a very long time C programmer: don't try to be smart. The more you rely on fancy preprocessor tricks the harder it will be to understand and debug your code.
The C preprocessor gives you enough power to shoot yourself in the foot, repeatedly, with anything from small caliber handguns to nuclear weapons. You may well end up losing control over your project entirely.
One nice example: glusterfs. There are a couple of macros in use there that, when they work are magic. But when they don't you lose days, sometimes weeks. This is not the way to solve coding problems, you only appear smart as long as you remember what you've built. Your other self, three years down the road is going to want to kill the present one, and the same goes for your colleagues a few weeks from now.
yes! like any craft, this works only if you keep practising it.
various implementations of k, written in this style (with iterative improvements), have been in constant development for decades getting very good use out of these macros.
Losing control of a project is likely more due to the programmers on it than the tools they use. IMHO _anything_ done consistently can be reasoned about and if necessary undone.
Not necessarily. Sometimes the rot goes so deep that there is really no way out.
And the C pre-processor has figured prominently in more than one such case in my career. And it was precisely in the kind of way that is described in TFA.
For something to be doable it needs to make economic sense as well and that's the problem with nightmare trickery like this. Initially it seems like a shortcut, but in the long run the price tag keeps going up.
Best guess is that your analysis is missing some detail. People not tools write programs. Also any serious discussion here ends up in politics. If you design your software so that the programmers are fungible then the software suffers regardless of your choices.
> Best guess is that your analysis is missing some detail.
What do you base that guess on?
I'm not saying it couldn't be done, I'm just saying that it sometimes just isn't worth it.
> People not tools write programs.
Yes. And just like people sometimes write crazy manifestos there isn't much point in fixing them, the purpose that they might have served is most likely better addressed by replacing them entirely.
> Also any serious discussion here ends up in politics.
Everybody knows HN doesn't do politics ;)
> If you design your software so that the programmers are fungible then the software suffers regardless of your choices.
Programmers are fungible not because they are cast from the same mold, but because - assuming they are responsible people - they can look past their present day horizon to the future, where after a lot of context switching they have to revisit that which they have made before, or, where they have to take over someone else's project, either because that person moved to a different role or because they've moved on entirely.
It is with such a future in mind that you can, if you want, make the life of that person a little bit easier by focusing on clarity of thought rather than terseness in expression. Nobody ever died for want of a few keystrokes more, but I'd have been a lot happier if some people made a habit of writing down first what the pile of executable spaghetti they wrought was supposed to be doing in the first place.
If you have not seen how bad it can get then more power to you.
Just one anecdote, which I may have posted on HN before, but a long time ago I worked for a game studio where there was a programmer who got into a fight with management. He left and I had to take over his project. All of the variable names were fruits and all of the functions were vegetables (or the reverse, as I said, it is long ago). There were no comments. But there were some bugs that needed fixing.
Clearly in his mind programmers were not fungible, and in my view the software suffered from his choices. So the one isn't necessarily a guarantee of the other (ok, n=1), though you might find them together every now and then.
I've seen some absolutely brilliant code that was clever and clear. That's the kind of thing that I aspire to, not to see how far I can push the CPP to do stuff it was never intended to do in the first place. We have contests for that sort of thing but it isn't the kind of construct that you should foist off on others in your line of work. Not if we're ever going to get serious about that engineering thing.
Recreational programming, that's a different story. Go wild, and I really hope you enjoy it. But if you submit your preprocessor based magical DSL as a pull request I'll nix it.
All good decisions are a product of the particular circumstances in which they arise. This post seemed to be about generalizing that process which I would guess comes out of a supposition of fungiblity.
As much as one can use a given style for a personal project so can one for a professional one so long as it fills the given need. Too often (in my view) fungibility is seen as a preeminent requirement and layers and layers of self justifying processes are built on top of that. I’m only saying that’s a choice and the costs and benefits are not as obvious as most suppose.
Also you can minimize risks with redundancy but most presume those costs to be too high. But again this quickly becomes about politics.
I think the main question is whether or not you want to reach your goal and see programming as a means to an end or as the end itself. Usually, even when working on my own projects I have a goal, and the software is just a means to an end. So I tend to work on my own as though I am a team of one rather than that I am working 'just for myself'. This means I set up a whole pile of superstructure that isn't strictly a requirement, force myself to try to abstract as cleanly as I can think of and even refactor my code (when there is no project lead telling me to do so), write tests and abstain from trying to be too clever because it gives me better and faster results.
I'd imagine a chef or a competent musician would still use their hard won skill when cooking for themselves or making music for their own enjoyment.
Seems to me that this is now exponentially true with AI coding assistants. If you don't understand what you're adding, and you're being clever - you can quickly end up in a situation where you can't reason effectively about your system.
I'm seeing this on multiple fronts, and it's quickly becoming an unsustainable situation in some areas. I expect I'm not alone in this regard.
All great industrial apps are DSLs for specific domains, because oftentimes end users are much smarter & craftier than developers. Some great examples:
- AutoCAD (vector drawing DSL on top of Lisp)
- Mathematica (symbolic algebra DSL - Lisp & C)
- Aspen One (Thermodynamics/Chemistry DSL on FORTRAN)
- COMSOL (Multiphysics DSL C++)
- Verilog (FPGA design DSL C)
and also general purpose tools like Regex, XLA, CERN/Root, SQL, HTML/CSS,...
HN stories about Whitney's code tend to predictably attract a lot of comments about the coding style, so I thought I'd share a couple of positive discussions from previous related posts.
"Its density is many times higher than most C programs, but that's no big obstacle to understanding if you don't attempt to "skim" it; you need to read it character-by-character from top to bottom. It starts off defining some basic types, C for Character and I for Integer, and then the all-important Array. This is followed by some more shorthand for printf, return, and functions of one and two arguments, all of the array type. The DO macro is used to make iteration more concise. Then the function definitions begin. ma allocates an array of n integers (this code is hardcoded for a 32-bit system), mv is basically memcpy, tr (Total Rank?) is used to compute the total number of elements, and ga (Get/Generate Array) allocates an array. This is followed by the definitions of all the primitive operations (interestingly, find is empty), a few more globals, and then the main evaluator body. Lastly, main contains the REPL. While I don't think this style is suitable for most programmers, it's unfortunate that the industry seems to have gone towards the other extreme." -- userbinator
"There's something very satisfying about how this style seems to "climb the abstraction ladder" very quickly, but all of those abstractions he creates are not wasted and immediately put to use. I think much of the amazement and beauty is that there isn't much code at all, and yet it does so much. It's the complete opposite of the bloated, lazy, lowest-common-denominator trend that's been spreading in many other languages's communities." -- userbinator
"For people not accustomed to the style of Whitney, you can read various HN threads from the past to learn more about why he writes programs the way he does. It's deliberate and powerful." -- hakanderyal
"Whitney is famous for writing code like this, it's been his coding style for decades. For example, he wrote an early J interpreter this way in 1989. There's also a buddy allocator he wrote at Morgan Stanley that's only about 10 lines of C code." -- papercrane
When I see stuff like this, personally, I don't try to understand it, as code like this emerges from basically three motivations:
- The other person wanted to write in some other more (functional|object oriented|stack) language but couldn't, so they did this.
- The person couldn't be bothered to learn idioms for the target language and didn't care about others being able to read the program.
- The person intentionally wanted to obfuscate the program.
And none of these are good reasons to write code in a particular way. Code is about communication. Code like this is the equivalent to saying "I know the grammatical convention in English is subject-verb-object but I feel like speaking verb-object-subject and other people will have to just deal with it"—which, obviously, is a horrible way to communicate if you actually want to share ideas/get your point across.
That all said, the desire to have logic expressed more compactly and declaratively definitely resonates. Unfortunately C style verbosity and impurity remains dominant.
> "Opinions on his coding style are divided, though general consensus seems to be that it's incomprehensible."
I wholeheartedly concur with popular opinion. It's like writing a program in obfuscated code.
Hmmm... his way of basically making C work like APL made me wonder: Is there a programming language out there that defines its own syntax in some sort of header and then uses that syntax for the actual code?
Obfuscation is usually just a lack of accountability, and naive job security through avoiding peer-review.
Practically speaking, if people can't understand you, then why are you even on the team? Some problems can't be solved alone even if you live to be 116 years old.
Also, folks could start dropping code in single instruction obfuscated C for the lols =3
Whitney has valid reasons to write code this way. If you look at his career, you'll understand how this is not a problem - he literally spent decades working on "one-page" programs written that way. It's not "for the lols", it's simply what he's been comfortable with for 50+ years.
He's a software developer from a different era, when individual programmers wrote tiny (by today's standard) programs that powered entire industries. So for what he's been doing his entire career, neither lack of accountability, job security, or working with teams are really applicable.
Ivory tower politics is never an excuse, and failure to adapt to the shop standards usually means your position ends. Inflicting a goofy meta-circular interpreter on people is a liability.
Anyone competent would normally revert that nonsense in about 30 seconds, as it looks like a compressed/generated underhanded payload. "Trust me bro" is also not a valid excuse. =3
This isn't about Ivory tower politics or gate keeping. It's just a fact. Software development changed and Whitney started his career 45 years ago.
If you need help understanding what I mean, look at the credits of computer games released in the 80s and early 90s. You'll usually find a single programmer, with maybe one or two others, who contributed specialised parts like sound/music processing or special effects. No one cared about your particular programming style, because there were no big teams, no code reviews, no PRs. If you had questions, your fellow programmer would simply sit down with you and go over the details until you got familiar with their style and -code.
> failure to adapt to the shop standards usually means your position ends
Well, he runs his own company and has been his own boss for the past 32 years so again - this simply doesn't apply to him.
It does if any of his customers ever care about maintaining the kind of code after his death.
Code is read more than it is written, and most of us don’t and wouldn’t write in this style. This could mean he’s much smarter than the rest of us, or he could just be a jerk doing his own thing. In either case I’ve never had a good experience working with coders who are this “clever”. Real brilliance is writing code anyone can understand that remains performant and well tested. This is more like the obfuscated Perl contest entries. I guess it’s cool that you can do it, but good sense dictates that you shouldn’t.
As to OP's endeavor to understand this style, it is an interesting learning approach, but I think reading a lot of code in many styles that are actually used by more than one guy is more likely to make you "smarter".
> It does if any of his customers ever care about maintaining the kind of code after his death.
Which is why there's annotated and reformatted versions of the code. There's basically a "clean" version for those who care about such things and his "development"-version, which looks like executable line noise to the uninitiated.
> This could mean he’s much smarter than the rest of us, or he could just be a jerk doing his own thing.
Or - and I know this is difficult to comprehend these days - he cultivated this style over decades and it's just easier for HIM to work with code like this. No teams, no code reviews, no systems upon systems that need to interact. Just a single-page program that does one thing and that he (the only contributor and his own boss) is able to understand and work with because that's what he did for the past 50 years.
> In either case I’ve never had a good experience working with coders who are this “clever”.
Neither have I and I wouldn't write code like that either. I also don't think that reading and understanding such code makes you "smarter".
It's more of a peek into a different era of software development and one particular person's preferences.
Still it's amusing how Whitney's style seems to personally offend people. It's just a different way of programming that works for this one guy due to very specific circumstances. Neither the OP nor Whitney himself advocate for emulating this style.
Conway's Law tends to manifest in both directions...
It may be profitable having systems only a few people in the world could understand, but the scope of development is constrained.
I respect your opinion, but also recognize languages like Forth/Fortran actually killed people with 1-character syntax errors. People need to be as unsurprising as possible on large team projects. Sure, all our arrays today could be written in only l's, I's, and 1's like lIl1Il1lI[I1lI11l]... and being a CEO is also still not a valid excuse. =3
I'm wondering now, with LLMs in the loop, how the languages for solving complex problems will evolve in the long run.
Perhaps I will start playing with this macro-style ladder of abstraction with the help of an LLM, such as literate programming with an AI agent. Computers are much better at parsing than we are. We can stand on the highest rung of the ladder.
This code style is psychotic. I had to reverse-engineer and verify a C codebase that was machine-obfuscated and it was still clearer to follow than this. Increasing clarity through naming is great, but balancing information density is, dare I say, also a desirable goal. Compacting code yields rapidly diminishing returns once you're relying on the language treating whitespace as insignificant.
23x75 to allow for a status bar and the possibility that the code may be quoted in an email. Also, it’s green on black. Or possibly amber.
And yet I still have a utility named "~/bin/\uE43E"
\uExxx is in the private use area. What is it?
What about `grep -R .`?
> The way to understand Arthur Whitney's C code is to first learn APL
This is the main insight in my breakdown of the J Incunabulum:
https://blog.wilsonb.com/posts/2025-06-06-readable-code-is-u...
Thanks for that breakdown, that does make it a lot more understandable.
> DO defines our basic loop operation, so iterations will probably all naïvely be O(1);
Shouldn't that be "naïvely be O(n)"?
"the typographic stylings ... are how he writes" is what I said, isn't it? :) Well said.
>This is perhaps like being a Pascal programmer coming to C and indignantly starting with "#define begin {" and so forth
Ah, like Stephen Bourne
Would learning J work instead?
It’s probably more accessible than APL since its symbols can be found on conventional keyboards.
We know, the beginning of the article tells us his C code is APL-inspired. So many comments that just summarize the article on a surface level.
That's a great question.
I found this on SO:
https://stackoverflow.com/questions/59553295/selective-macro...
Maybe that would work for your use case?
What I like about your question is that I always assumed the answer was a hard 'no' but that appears not to be the case.
The beginning of the article talks about not learning APL--specifically mentions that he's not here to talk about APL--and proceeds into a wide-eyed dissection of the C without mentioning APL syntax again. It also doesn't, literally, say that the C is like APL; it says Arthur is an APL guy who writes weird C code. Another comment disagrees that this is APL style at all--which is it?? I think you could have given me more credit than this. I read the article and participated as best I could. I'm always happy to bump APL related articles so they get more visibility.
It's irrelevant that someone doesn't think the code is APL-inspired. Their disagreement is as much with the article as with your comment. I felt like what is written in the article already implied what I then read in your comment. Credit where due, the disagreement with the article probably wouldn't have been posted if the implications in that part hadn't been re-stated plainly. Comments like these can be useful as pointers to specific aspects of an article, under which conversations can be organized, now that I think about it.
Dunno why electroly is dragging me into this but I believe you've misread the article. When it says "His languages take significantly after APL" it means the languages themselves and not their implementations.
The article: "Let's make sense of the C code by the APL guy"
Do you think the article meant to say it was more likely that the code wasn't inspired by APL?
I think the article expresses no position. Most source code for array languages is not, in fact, inspired by APL. I encourage you to check a few random entries at [0]; Kap and April are some particularly wordy implementations, and even A+ mostly consists of code by programmers other than Whitney, with a variety of styles.
I do agree that Whitney was inspired to some extent by APL conventions (not exclusively; he was quite a Lisp fan and that's the source of his indentation style when he writes multi-line functions, e.g. in [1]). The original comment was not just a summary of this claim but more like an elaboration, and began with the much stronger statement "The way to understand Arthur Whitney's C code is to first learn APL", which I moderately disagree with.
[0] https://aplwiki.com/wiki/List_of_open-source_array_languages
[1] https://code.jsoftware.com/wiki/Essays/Incunabulum
I unfortunately glossed over the part of the original comment that gives it substance: "The most obvious of the typographic stylings--the lack of spaces, single-character names, and functions on a single line--are how he writes APL too."
That's backing for a claim.
Also, I haven't once written APL. I think this might've been borderline trolling, just because of how little investment I have in the topic in reality. Sorry.
I was curious about Shakti after reading this and the comments, so followed the link to shakti.com on Wikipedia. It seems it now redirects to the k.nyc domain, which displays a single letter 'k'.
I wondered if I was missing something, so looked at the source, to find the following:
Nothing but that. Which is, surely, the HTML equivalent of the Whitney C style: relying on the compiler/interpreter to add anything implicit, and shaving off every element that isn't required, such as a closing tag (which, yes, only matters if you're going to want something else afterwards, I guess...). Bravo.
You shouldn’t have seen that. By now the cleaners must have gotten to you and erased your memory of these events.
could have been `<pre>k`
<tt>k
k
IMO this is a really good blog post, whatever you think of the coding style. Great effort by the author, really good for eight hours' work (as mentioned), and some illuminating conclusions: https://needleful.net/blog/2024/01/arthur_whitney.html#:~:te...
Reminds me of Bourne's attempt at beating C into Algol: https://www.tuhs.org/cgi-bin/utree.pl?file=V7/usr/src/cmd/sh...
Example: https://www.tuhs.org/cgi-bin/utree.pl?file=V7/usr/src/cmd/sh...
TIL `a ?: b`, that's actually pretty nice, a bit like Haskell's `fromMaybe b a` (or `a <|> b` if b can also be "empty")
and I do like `#define _(e...) ({e;})` – that's one where I feel the short macro name is OK. But I'd like it better if that were just how C worked from the get-go.
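For readers who haven't met that extension: `({ ... })` is a GNU "statement expression", a block whose last statement supplies the value of the whole expression, which is what lets that macro wrap arbitrary statements where an expression is expected. A minimal sketch (gcc/clang only):
```
#include <stdio.h>

#define _(e...) ({e;})   /* GNU extensions: named variadic args + statement expressions */

int main(void) {
    int y = _(int t = 3; t * t);   /* the block evaluates to 9 */
    printf("%d\n", y);
    return 0;
}
```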
Very nice discussion at the end of the article. There are good things to be learnt from this code and its discussions even if you disagree with some or even most of the style.
Yes, '?:' is also known as the Elvis operator [1][2]. I sometimes use it in other languages such as Groovy. But I don't use it in C because it happens to be a GCC extension [3][4] and I've often had to compile my C projects with compilers that do not support GCC extensions. The C standard [5] defines the conditional operator as: conditional-expression: logical-OR-expression, or logical-OR-expression ? expression : conditional-expression.
So per the C standard there must be an expression between '?' and ':', and an expression cannot be empty text. To confirm this we need to check the grammar for expression, which unfortunately is a little tedious to verify manually due to its deeply nested nature (expression -> assignment-expression -> conditional-expression -> logical-OR-expression, and so on). The recursion goes many more levels deep, but the gist is that no matter which branch the parser takes, it expects the expression to have at least one symbol per the grammar. Perhaps an easier way to confirm this is to just have the compiler warn us about it: gcc with -pedantic warns that ISO C forbids omitting the middle term of a '?:' expression, and -pedantic-errors turns that into a hard error.
[1] https://kotlinlang.org/docs/null-safety.html#elvis-operator
[2] https://groovy-lang.org/operators.html#_elvis_operator
[3] https://gcc.gnu.org/onlinedocs/gcc/Syntax-Extensions.html
[4] https://gcc.gnu.org/onlinedocs/gcc/Conditionals.html
[5] https://www.open-std.org/jtc1/sc22/wg14/www/docs/n3299.pdf
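A small illustration of the portability point (a sketch only; `fallback` is a made-up helper):
```
#include <stdio.h>

static int fallback(void) { return 42; }

int main(void) {
    int a = 0;
#ifdef __GNUC__
    int x = a ?: fallback();     /* GNU extension: `a` is evaluated only once */
#else
    int x = a ? a : fallback();  /* portable, but `a` is written (and evaluated) twice */
#endif
    printf("%d\n", x);           /* prints 42 */
    return 0;
}
```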
There are best or accepted practices in every field.
And in every field they work well for the average case, but are rarely the best fit for that specific scenario. And in some rare scenarios, doing the opposite is the solution that fits best the individual/team/project.
The interesting takeaway here is that crowd wisdom should be given weight and probably defaulted if we want to turn off our brains. But if you turn on your brain you will unavoidably see the many cracks that those solutions bring for your specific problem.
That's why I hate them being called "best" practices. No, they aren't the best practices, they are the mediocre practices. Sometimes, that's a good thing (you don't want to have the really bad results!), but if you aim for the very best practices, all of them will hold you back. It's basically a tradeoff, sacrificing efficiency / good performance in exchange for maintainability, consistency and reliability.
Having a solid product that solves a problem well can be orthogonal to how well a codebase lends itself to readability, learning curve, and efficiently ramping up new developers on a project.
Just because you succeed at one says nothing about other practical and important metrics.
I don't think you're reading this correctly.
The proper way to read it is to understand the problem and its pros and cons.
Without going too deep into speculation, the situation likely was: there's only one guy who can really deliver this, because of his knowledge, CV, and experience, and we need it.
And at that point your choice is having a solution or not.
As the old saying goes the graveyards are full of irreplaceable men.
But even if we grant that only one person could deliver a solution, it wouldn’t change the fact that you’re giving up on certain things to get it.
Kudos on not just taking a combative stance on the code!
This was a very fun read that I'm fairly convinced I will have to come back to.
```
#define _(e...) ({e;})
#define x(a,e...) _(s x=a;e)
#define $(a,b) if(a)b;else
#define i(n,e) {int $n=n;int i=0;for(;i<$n;++i){e;}}
```
>These are all pretty straight forward, with one subtle caveat I only realized from the annotated code. They're all macros to make common operations more compact: wrapping an expression in a block, defining a variable x and using it, conditional statements, and running an expression n times.
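To make that concrete, here is a tiny standalone usage sketch of the four quoted macros (with `s` faked as `int` purely so the snippet compiles on its own; this is a sketch, not taken from ksimple):
```
#include <stdio.h>
typedef int s;                 /* stand-in for the interpreter's value type */

#define _(e...) ({e;})
#define x(a,e...) _(s x=a;e)
#define $(a,b) if(a)b;else
#define i(n,e) {int $n=n;int i=0;for(;i<$n;++i){e;}}

int main(void) {
    /* i(): run the body 3 times, with i as the index;
       $(): if/else, with the "else" branch left dangling for the caller */
    i(3, $(i%2, printf("%d odd\n", i)) printf("%d even\n", i))
    /* x(): bind x to 40, then evaluate the rest of the expression */
    printf("%d\n", x(40, x+2));    /* prints 42 */
    return 0;
}
```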
This is war crime territory
Some of these are wrong to. You can encounter issues with #define
#define $(a,b) if(a)b;else
due to not having brackets. So it's just extremely lazy to.
This should not be downvoted, this sort of error is indeed a very easy one to make when dealing with the C pre-processor.
> Some of these are wrong to[o] <- that needs an extra 'o'
> due to not having brackets. <- that one is fine
> So it's just extremely lazy to[o]. <- that needs an extra 'o' too
'to' comes in two versions, 'too' and 'to', both have different meanings.
Good grief! Are we really so insufferable as software developers that we can't just appreciate a brilliant article about the work of a remarkable computer scientist without nitpicking every supposed "bad practice"?
The whole point of the piece seems completely lost on some readers. Yes, we all know that #define $(a,b) if(a)b;else is questionable. I don't need a crash course on C macros in the comments, thank you. The author already acknowledges that Whitney's style is controversial. Do we really need to keep rehashing that point in every comment, or can we finally focus on how all this unconventional code fits together beautifully to form a working interpreter?
> I don't need a crash course on C macros in the comments, thank you.
This is an enduring great & terrible thing about sites like HN and reddit: As people become more senior & experienced, junior engineers come in to fill the ranks. You and I don't need a crash course on C macros in the comments. But I promise you, a lot of people here have no idea why #define $(a,b) if(a)b;else is a weird C macro.
How much should HN cater to junior engineers?
It has nothing to do with seniors vs. juniors but merely a lack of understanding of the intent behind somebody's work. When an acknowledged expert does something out of the ordinary, you ask why and try to grasp his POV rather than pointing out obvious trivialities.
> How much should HN cater to junior engineers?
The assumption that HN should cater to junior engineers is curious. It implies a purpose the site has never claimed to have.
Yeah, I feel the same way you do, but then console myself with this quote - “Mediocrity knows nothing higher than itself; but Talent instantly recognizes Genius.” (from The Valley of Fear by Arthur Conan Doyle).
People have a silly need to point out the obvious as a crutch to their ego.
I wouldn't have a problem with it, if the implication wasn't that the author became smarter as a result of reading this code. That's my whole beef with it.
'Hey, look at this interesting way of using the CPP to create a DSL'
I'm fine with that. But this is precisely what aspiring C programmers should avoid at all costs. It's not controversial. It's bad.
Still, since the article already contains this warning, some people might argue that it's unnecessary for us to add it as a response to every comment here.
This is a good use of macros. I understand people are frightened by how it looks but it’s just C in a terse, declarative style. It’s mostly straightforward, just dense and yes - will challenge you because of various obscure macro styles used.
I believe “oo” is probably an infinity/error condition or some such; not 100% sure. I didn’t see the author discuss it since they said it’s not used. It was probably used during development as a debug printout.
I agree, some of the macros are very useful, and I've found myself wanting DO(n, code) as a simpler for-loop construct. In my own code, when I have some dozens of small things (like opcodes or forth words or APL operators), I specifically do want a "one-liner" syntax for most of them. The individual elements are usually so small that it's distasteful to spend 10 lines of code on them, and especially because the real understanding lies in the 'space between', so I want to see a large subset of the elements at once, and not put code-blinders on to focus on one element at a time.
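A sketch of the kind of DO(n, code) construct described above (my own spelling, close to but not identical to the incunabulum's definition):
```
#include <stdio.h>

/* run e n times, with i visible inside the body as the loop index */
#define DO(n, e) { int i, n_ = (n); for (i = 0; i < n_; ++i) { e; } }

int main(void) {
    DO(3, printf("element %d\n", i))   /* prints element 0, 1, 2 */
    return 0;
}
```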
In reading many C code bases, including the Linux kernel, you find a use case for macros of this nature in every one of them.
From the article
>These are all pretty straight forward, [...] wrapping an expression in a block, defining a variable x and using it, conditional statements, and running an expression n times.
Making your reader learn some ad-hoc shorthands you wrote to avoid declaring blocks, defining variables, or writing conditions is, in my book, very impolite.
Style doesn't need to be innovative.
> This is a good use of macros.
no.
Is this supposed to be a specific coding style or paradigm?
I’ve never seen code written like this in real-world projects — maybe except for things like the "business card ray tracer". When I checked out Arthur Whitney’s Wikipedia page I noticed he also made the J programming language (which is open source) and the code there has that same super-dense style https://github.com/jsoftware/jsource/blob/master/jsrc/j.c
> Is this supposed to be a specific coding style or paradigm?
This is indeed Whitney's distinctive coding style, well known for its use in his various array programming language interpreters. His coding style is famously minimalist and idiosyncratic, often fitting entire implementations of interpreters in a few pages.
This has been discussed a number of times on HN. I have collected some of the interesting comments on this topic from previous threads here in this meta comment: https://news.ycombinator.com/item?id=45800777#45805346
Thanks! I can't imagine how to code in this style everyday, tbh :)
I imagine that if you do it enough it starts to become second nature.
> I’ve never seen code written like this in real-world projects
Lucky you. I've seen far worse (at least this is somewhat consistent). But this isn't C anymore, it is a new language built on top of C and then a program written in that language. C is merely the first stage compilation target.
And everyone says you can't do DSLs in boring old languages. :P
You can certainly do DSLs on top of C-family languages. That’s essentially what SystemC is (though it’s built as a C++ library).
It's similar to J and that family of languages (K is another). Those are inspired by APL, which also has this super compact nature but in addition it largely uses non-ascii symbols. Apparently it is something you can get used to and notionally has some advantages (extreme density means you can see 'more' of the program on a given page, for example, and you need fewer layers of abstraction).
Possibly related(ish): video about co-dfns, prompted by a previous HN thread (links in video summary), not written in C but put together in a similarly dense style: https://www.youtube.com/watch?v=gcUWTa16Jc0
I believe it’s usually referred to as ‘OCC’. ;)
Could you elaborate? :) I found OrangeC Compiler but I'm not sure this is the OCC you've mentioned.
I’m sorry, I was just making a silly https://en.wikipedia.org/wiki/International_Obfuscated_C_Cod... joke.
Ah, that's cool! Thanks!
There's C which complies with the MISRA C standards, and there's C which complies with the IOCCC C standards. (There may also be some overlap.)
Much as a Real Programmer can write FORTRAN programs in any language, Whitney can write APL programs in any language.
For the APL fans (or haters) Unicomp makes keycaps with APL symbols for their (excellent) Model M mechanical keyboards.
I can’t explain why but “He’s assigning 128 to a string called Q” made me absolutely lose it.
ksimple is eight bit. 128 is the unsigned middle, or one plus signed max. usually using it for null or error signal. on sixty-four-bit k implementations it would be two to the sixty-three.
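A quick standalone sketch of that arithmetic (typical two's-complement behaviour assumed; this is just an illustration, not ksimple's code):
```
#include <limits.h>
#include <stdio.h>

int main(void) {
    /* 128 = 0x80 = 1 + SCHAR_MAX: the "unsigned middle" of an 8-bit byte and,
       reinterpreted as signed, the single most-negative value -- which is why
       it makes a convenient null/error sentinel that ordinary small integers
       never collide with. */
    unsigned char q  = 128;
    signed char   sq = (signed char)q;   /* implementation-defined; -128 in practice */
    printf("%d %d %d\n", q, SCHAR_MAX + 1, sq);  /* typically prints: 128 128 -128 */
    return 0;
}
```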
I don't think writing code like that will make the average programmer team any faster. Unless you are really deep into the code and have a good mental model of how the symbols are structured, I think it's going to take longer, with the constant need to refer back and re-work out what a symbol means. I'd rather have the descriptive variable names. What he writes looks akin to minified JS to me.
The same article is available under “I read Arthur Whiteney's code and all I got was Mental Illness”, which is apt.
This parades all the reasons why you may want to avoid C like the plague, and then some. This stuff gives me nightmares.
The macros are fine as a concept; I've used something similar before for reducing code size, e.g. defining hundreds of similar functions and such. What is incomprehensible and puts the entire thing into "Obfuscated C" territory is the one-letter variables. You'll need to memorize all of them and can't reuse them in normal code. If at least the variables were self-descriptive I'd support such a coding style, but as it stands it clearly needs comments.
“would you rather spend 10 days reading 100,000 lines of code, or 4 days reading 1000?"
More like 10 days understanding 100K loc or 30 days stabbing yourself in the eye over 4K loc
there's the java version too
https://github.com/KxSystems/javakdb/blob/8a263abee29de582cd...
People here might not notice - your link is the official client interface for talking to KDB+ processes from Java.
There's a decent chance your broker (or their dealers) are using stuff built on this.
It's interesting to compare that version from 2017 with the current version from 2025: https://github.com/KxSystems/javakdb/blob/9a94dc5af9288fe845... — the current one is over ten times as long in terms of number of lines, and has copious comments, but still has the short names and dense code.
The C preprocessor allows you to define a limited DSL on top of C. This is... sometimes a good thing, and often convenient, even if it makes it hard to understand.
I think _all_ programming is about finding an appropriate DSL for the problem at hand. First you need to understand the “language” of the problem then you develop a “lingo”.
Exactly. You build the language up to the problem. When you do that, the program almost writes itself.
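One tame, widely used instance of "building the language up to the problem" in plain C is the X-macro table; here is a sketch with made-up opcode names:
```
#include <stdio.h>

/* one list of facts ... */
#define OPCODES(X) \
    X(ADD, '+')    \
    X(SUB, '-')    \
    X(MUL, '*')

/* ... expanded twice: once into an enum, once into a lookup table */
enum {
#define AS_ENUM(name, ch) OP_##name,
    OPCODES(AS_ENUM)
#undef AS_ENUM
    OP_COUNT
};

static const char op_char[OP_COUNT] = {
#define AS_CHAR(name, ch) [OP_##name] = ch,
    OPCODES(AS_CHAR)
#undef AS_CHAR
};

int main(void) {
    for (int i = 0; i < OP_COUNT; ++i)
        printf("op %d prints as %c\n", i, op_char[i]);
    return 0;
}
```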
For extremely small values of 'sometimes' where sometimes is constrained by the following expressions evaluating to 'true':
- you have no interest in maintaining your code
- your code will never be maintained by someone else
- you know your C preprocessor better than you know your C compiler
- your favorite language isn't available for this particular target
- you don't mind object level debugging
- your idea of a fun time is to spend a few hours per day memorizing code
- you really are smarter than everybody else
It depends how far down the rabbit hole you go.
Something like JCON (originally from mongodb, see https://blogs.gnome.org/chergert/2016/10/21/jcon/) is actually pretty nice IMO.
It’s cool that you can do this in C! And it’s cool that this article explores that.
As developers we have to decide where and when this makes sense, just like with other language features, libraries, architectural patterns, etc.
It uses multiple non-standard extensions to C. A strictly standards conformant compiler would refuse to compile it.
That might be an excellent reason not to use some of these capabilities. And maybe in a different situation it would make sense to use the mechanisms provided. Programmer’s responsibility to decide what’s appropriate in each case, that’s all I’m saying.
This style is inherently worse because there are no spaces. My brain has been wired since the age of four to read words, not letters. Words are separated by spaces. Havingnospacesbetweenwordsmakesthemexponentiallyhardertoreadandcomprehend.
This reminds me of when I was learning perl.
At first, I thought it looked like line noise. $var on the left of the = sign? Constructs like $_ and @_? more obscure constructs were worse.
But I had to keep going and then one day something happened. It was like one of those 3d stereograms where your eyes have to cross or uncross. The line noise became idioms and I just started becoming fluent in perl.
I liked some of it too - stuff like "unless foo" being a more readable/human way of saying if not foo.
perl became beautiful to me - it was the language I thought in, and at the highest level. I could take an idea in my mind and express it in perl.
But I had some limits. I would restrain myself on putting entire loops or nested expression on one line just to "save space".
I used regular expressions, but sometimes would match multiple times instead of all in one giant unreadable "efficient" expression.
and then, I looked at other people's perl. GAH! I guess other people can "express themselves in perl", but rarely was it beautiful or kind, it was statistically worse and closer to vomit.
I like python now. more sanity, (somewhat) more likely that different people will solve a problem with similar and/or readable code.
by the way, very powerful article (even if I intensely dislike the code)
You will not become smart, only crazy and unemployable. :)
Or an unrealized IOCCC champion Whitney seems to aspire to.
Whitney would never submit his code because it is trivially understandable and not obfuscated?
Are you saying most employers are smart by default??
The person who wrote this code might be a genius, but learning to read it isn’t going to make anyone smart. It’s basically obfuscated assembly code.
Ah yes... very tempting to ask an AI to refactor some large Java program (pick your language) "in the style of Arthur Whitney".
I asked ChatGPT to explain the code from the OP (without the header file), and it seems to have given a really good breakdown. Although I know nothing about interpreters, C, or this fucked style, so who really knows if it makes any sense at all…
The header file does most of the work. I submitted the output of gcc -E (preprocessor only) to ChatGPT: https://chatgpt.com/share/69093ba2-ae74-8006-abbb-5c7f24be23... -- and I found out about "tagged pointers".
https://en.wikipedia.org/wiki/Tagged_pointer
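A minimal sketch of the idea (made-up names, not ksimple's actual scheme): heap allocations are aligned, so the low pointer bits are free to carry a small type tag.
```
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

enum { TAG_INT = 1, TAG_MASK = 7 };  /* assumes allocations are >= 8-byte aligned */

static uintptr_t tag_int(int *p)      { return (uintptr_t)p | TAG_INT; }
static int       is_int(uintptr_t v)  { return (v & TAG_MASK) == TAG_INT; }
static int      *untag(uintptr_t v)   { return (int *)(v & ~(uintptr_t)TAG_MASK); }

int main(void) {
    int *p = malloc(sizeof *p);
    *p = 7;
    uintptr_t v = tag_int(p);                   /* value + type squeezed into one word */
    if (is_int(v)) printf("%d\n", *untag(v));   /* prints 7 */
    free(p);
    return 0;
}
```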
> His languages take significantly after APL, which was a very popular language for similar applications before the invention of (qwerty) keyboards.
Ok, so this article is tongue in cheek. Good to know that up front.
https://en.wikipedia.org/wiki/APL_(programming_language)#Har...
you would add a special "typeball" into your IBM Selectric Typewriter. Some pics:
https://www.duxburysystems.org/downloads/library/texas/apple...
https://pierce.smugmug.com/Misc/APL-Typeball/i-pjq6hWC
Holy moly, this must be the equivalent of reading the Necronomicon and getting cosmic madness disease as a result.
What a flex of patience!
This man casually codes up IOCCC entries.
The code registers a bit like FORTH in concept.
Kernighan’s law seems to apply:
Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?
Agreed. Although it's also a bit worse than that for coding exclusively with macros. You have to add an extra degree of complexity for any additional code generator you add to your toolchain, when that path comes into play for debugging. Since we whole-buffalo'ed this situation, that's 100% of the code you could possibly need to debug.
Yes, precisely, that's when all that cleverness will come back to bite you hard.
"Which line was that again? Oh... "
Picks up the phone, dials.
"Honey, I won't be home in time for dinner."
I can guarantee that if you start looking at random, you will find the right line within the 43 guesses
Reminds me of a Python codebase I used to work with
The company was originally a bunch of Access/VB6 programmers.
Then they wrote their VB code in PHP.
And then they wrote their PHP code in Python. It was disgusting.
As a very long time C programmer: don't try to be smart. The more you rely on fancy preprocessor tricks the harder it will be to understand and debug your code.
The C preprocessor gives you enough power to shoot yourself in the foot, repeatedly, with anything from small caliber handguns to nuclear weapons. You may well end up losing control over your project entirely.
One nice example: glusterfs. There are a couple of macros in use there that, when they work, are magic. But when they don't, you lose days, sometimes weeks. This is not the way to solve coding problems; you only appear smart as long as you remember what you've built. Your other self, three years down the road, is going to want to kill the present one, and the same goes for your colleagues a few weeks from now.
> as long as you remember what you've built
yes! like any craft, this works only if you keep practising it.
various implementations of k, written in this style (with iterative improvements), have been in constant development for decades getting very good use out of these macros.
Losing control of a project is likely more due to the programmers on it than the tools they use. IMHO _anything_ done consistently can be reasoned about and if necessary undone.
Not necessarily. Sometimes the rot goes so deep that there is really no way out.
And the C pre-processor has figured prominently in more than one such case in my career. And it was precisely in the kind of way that is described in TFA.
For something to be doable it needs to make economic sense as well and that's the problem with nightmare trickery like this. Initially it seems like a shortcut, but in the long run the price tag keeps going up.
Best guess is that your analysis is missing some detail. People not tools write programs. Also any serious discussion here ends up in politics. If you design your software so that the programmers are fungible then the software suffers regardless of your choices.
> Best guess is that your analysis is missing some detail.
What do you base that guess on?
I'm not saying it couldn't be done, I'm just saying that it sometimes just isn't worth it.
> People not tools write programs.
Yes. And just like people sometimes write crazy manifestos, there isn't much point in fixing them; the purpose that they might have served is most likely better addressed by replacing them entirely.
> Also any serious discussion here ends up in politics.
Everybody knows HN doesn't do politics ;)
> If you design your software so that the programmers are fungible then the software suffers regardless of your choices.
Programmers are fungible not because they are cast from the same mold, but because - assuming they are responsible people - they can look past their present day horizon to the future, where after a lot of context switching they have to revisit that which they have made before, or, where they have to take over someone else's project, either because that person moved to a different role or because they've moved on entirely.
It is with such a future in mind that you can, if you want, make the life of that person a little bit easier by focusing on clarity of thought rather than terseness in expression. Nobody ever died for want of a few keystrokes more, but I'd have been a lot happier if some people made a habit of writing down first what the pile of executable spaghetti they wrought was supposed to be doing in the first place.
If you have not seen how bad it can get then more power to you.
Just one anecdote, which I may have posted on HN before, but a long time ago I worked for a game studio where there was a programmer who got into a fight with management. He left and I had to take over his project. All of the variable names were fruits and all of the functions were vegetables (or the reverse, as I said, it is long ago). There were no comments. But there were some bugs that needed fixing.
Clearly in his mind programmers were not fungible, and in my view the software suffered from his choices. So the one isn't necessarily a guarantee of the other (ok, n=1), though you might find them together every now and then.
I've seen some absolutely brilliant code that was clever and clear. That's the kind of thing that I aspire to, not to see how far I can push the CPP to do stuff it was never intended to do in the first place. We have contests for that sort of thing but it isn't the kind of construct that you should foist off on others in your line of work. Not if we're ever going to get serious about that engineering thing.
Recreational programming, that's a different story. Go wild, and I really hope you enjoy it. But if you submit your preprocessor based magical DSL as a pull request I'll nix it.
All good decisions are a product of the particular circumstances in which they arise. This post seemed to be about generalizing that process, which I would guess comes out of a supposition of fungibility.
As much as one can use a given style for a personal project so can one for a professional one so long as it fills the given need. Too often (in my view) fungibility is seen as a preeminent requirement and layers and layers of self justifying processes are built on top of that. I’m only saying that’s a choice and the costs and benefits are not as obvious as most suppose.
Also you can minimize risks with redundancy but most presume those costs to be too high. But again this quickly becomes about politics.
I think the main question is whether or not you want to reach your goal and see programming as a means to an end or as the end itself. Usually, even when working on my own projects I have a goal, and the software is just a means to an end. So I tend to work on my own as though I am a team of one rather than that I am working 'just for myself'. This means I set up a whole pile of superstructure that isn't strictly a requirement, force myself to try to abstract as cleanly as I can think of and even refactor my code (when there is no project lead telling me to do so), write tests and abstain from trying to be too clever because it gives me better and faster results.
I'd imagine a chef or a competent musician would still use their hard won skill when cooking for themselves or making music for their own enjoyment.
ZFS has a very nice set of macros that work very well:
https://github.com/openzfs/zfs/blob/master/include/os/freebs...
See P2PHASE() and friends. They were inherited from OpenSolaris.
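Not the ZFS source, just a sketch of the flavor of those power-of-two helpers (the macro names here are my own; align must be a power of two):
```
#include <stdio.h>

#define P2ALIGN_SKETCH(x, align)   ((x) & -(align))                  /* round down */
#define P2PHASE_SKETCH(x, align)   ((x) & ((align) - 1))             /* remainder  */
#define P2ROUNDUP_SKETCH(x, align) (((x) + (align) - 1) & -(align))  /* round up   */

int main(void) {
    printf("%d %d %d\n",
           P2ALIGN_SKETCH(13, 8),     /* 8  */
           P2PHASE_SKETCH(13, 8),     /* 5  */
           P2ROUNDUP_SKETCH(13, 8));  /* 16 */
    return 0;
}
```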
Seems to me that this is now exponentially true with AI coding assistants. If you don't understand what you're adding, and you're being clever - you can quickly end up in a situation where you can't reason effectively about your system.
I'm seeing this on multiple fronts, and it's quickly becoming an unsustainable situation in some areas. I expect I'm not alone in this regard.
I’d bet that a lot of the work done with AI assistants is decidedly _not_ clever.
All great industrial apps are DSLs for specific domains, because oftentimes end users are much smarter & craftier than developers. Some great examples:
- AutoCad (vector drawing DSL on top of Lisp)
- Mathematica (symbolic algebra DSL - Lisp & C)
- Aspen One (Thermodynamics/Chemistry DSL on FORTRAN)
- COMSOL (Multiphysics DSL C++)
- Verilog (FPGA design DSL C)
and also general purpose tools like Regex, XLA, CERN/Root, SQL, HTML/CSS,...
HN stories about Whitney's code tend to predictably attract a lot of comments about the coding style, so I thought I'd share a couple of positive discussions from previous related posts.
Here's one from one of my favourite HN commenters posted at https://news.ycombinator.com/item?id=25902615#25903452 (Jan 2021):
"Its density is many times higher than most C programs, but that's no big obstacle to understanding if you don't attempt to "skim" it; you need to read it character-by-character from top to bottom. It starts off defining some basic types, C for Character and I for Integer, and then the all-important Array. This is followed by some more shorthand for printf, return, and functions of one and two arguments, all of the array type. The DO macro is used to make iteration more concise. Then the function definitions begin. ma allocates an array of n integers (this code is hardcoded for a 32-bit system), mv is basically memcpy, tr (Total Rank?) is used to compute the total number of elements, and ga (Get/Generate Array) allocates an array. This is followed by the definitions of all the primitive operations (interestingly, find is empty), a few more globals, and then the main evaluator body. Lastly, main contains the REPL. While I don't think this style is suitable for most programmers, it's unfortunate that the industry seems to have gone towards the other extreme." -- userbinator
Here's another from the same commenter on a different story at https://news.ycombinator.com/item?id=39026551#39038364 (Jan 2024):
"There's something very satisfying about how this style seems to "climb the abstraction ladder" very quickly, but all of those abstractions he creates are not wasted and immediately put to use. I think much of the amazement and beauty is that there isn't much code at all, and yet it does so much. It's the complete opposite of the bloated, lazy, lowest-common-denominator trend that's been spreading in many other languages's communities." -- userbinator
Another from the story at https://news.ycombinator.com/item?id=40544283#40544491 (Jun 2024):
"For people not accustomed to the style of Whitney, you can read various HN threads from the past to learn more about why he writes programs the way he does. It's deliberate and powerful." -- hakanderyal
One more from the same story at https://news.ycombinator.com/item?id=40544283#40545004 (Jun 2024):
"Whitney is famous for writing code like this, it's been his coding style for decades. For example, he wrote an early J interpreter this way in 1989. There's also a buddy allocator he wrote at Morgan Stanley that's only about 10 lines of C code." -- papercrane
Nice. Previous attempts by other users to decode Whitney's style of C programming can be found here - https://news.ycombinator.com/item?id=38889148
The stated reason Whitney does this is - https://news.ycombinator.com/item?id=32202742
Nice write up!
When I see stuff like this, personally, I don't try to understand it, as code like this emerges from basically three motivations:
- The other person wanted to write in some other more (functional|object oriented|stack) language but couldn't, so they did this.
- The person couldn't be bothered to learn idioms for the target language and didn't care about others being able to read the program.
- The person intentionally wanted to obfuscate the program.
And none of these are good reasons to write code in a particular way. Code is about communication. Code like this is the equivalent to saying "I know the grammatical convention in English is subject-verb-object but I feel like speaking verb-object-subject and other people will have to just deal with it"—which, obviously, is a horrible way to communicate if you actually want to share ideas/get your point across.
That all said, the desire to have logic expressed more compactly and declaratively definitely resonates. Unfortunately C style verbosity and impurity remains dominant.
> "Opinions on his coding style are divided, though general consensus seems to be that it's incomprehensible."
I wholeheartedly concur with popular opinion. It's like writing a program in obfuscated code.
Hmmm... his way of basically making C work like APL made me wonder: Is there a programming language out there that defines its own syntax in some sort of header and then uses that syntax for the actual code?
forth and lisp?
In racket, you can say something like "#lang X", which can modify the reader and let you create your own arbitrary syntax
During code reviews I would always ask for clear code because it's much harder to tell whether it's correct if it's unclear.
I got too much other stuff to do than decode the voynich manuscript...
Obfuscation is usually just a lack of accountability, and naive job security through avoiding peer-review.
Practically speaking, if people can't understand you, then why are you even on the team? Some problems can't be solved alone even if you live to 116 years old.
Also, folks could start dropping code in single instruction obfuscated C for the lols =3
https://github.com/xoreaxeaxeax/movfuscator
Whitney has valid reasons to write code this way. If you look at his career, you'll understand how this is not a problem - he literally spent decades working on "one-page" programs written that way. It's not "for the lols", it's simply what he's been comfortable with for 50+ years.
He's a software developer from a different era, when individual programmers wrote tiny (by today's standard) programs that powered entire industries. So for what he's been doing his entire career, neither lack of accountability, job security, or working with teams are really applicable.
> He's a software developer from a different era
Ivory tower politics is never an excuse, and failure to adapt to the shop standards usually means your position ends. Inflicting a goofy meta-circular interpreter on people is a liability.
Anyone competent would normally revert that nonsense in about 30 seconds, as it looks like a compressed/generated underhanded payload. "Trust me bro" is also not a valid excuse. =3
https://en.wikipedia.org/wiki/Conways_Law
This isn't about Ivory tower politics or gate keeping. It's just a fact. Software development changed and Whitney started his career 45 years ago.
If you need help understanding what I mean, look at the credits of computer games released in the 80s and early 90s. You'll usually find a single programmer, with maybe one or two others, who contributed specialised parts like sound/music processing or special effects. No one cared about your particular programming style, because there were no big teams, no code reviews, no PRs. If you had questions, your fellow programmer would simply sit down with you and go over the details until you got familiar with their style and -code.
> failure to adapt to the shop standards usually means your position ends
Well, he runs his own company and has been his own boss for the past 32 years so again - this simply doesn't apply to him.
It does if any of his customers ever care about maintaining the kind of code after his death.
Code is read more than it is written, and most of us don’t and wouldn’t write in this style. This could mean he’s much smarter than the rest of us, or he could just be a jerk doing his own thing. In either case I’ve never had a good experience working with coders who are this “clever”. Real brilliance is writing code anyone can understand that remains performant and well tested. This is more like the obfuscated Perl contest entries. I guess it’s cool that you can do it, but good sense dictates that you shouldn’t.
As to OP's endeavor to understand this style, it is an interesting learning approach, but I think reading a lot of code in many styles that are actually used by more than one guy is more likely to make you “smarter”.
> It does if any of his customers ever care about maintaining the kind of code after his death.
Which is why there's annotated and reformatted versions of the code. There's basically a "clean" version for those who care about such things and his "development"-version, which looks like executable line noise to the uninitiated.
> This could mean he’s much smarter than the rest of us, or he could just be a jerk doing his own thing.
Or - and I know this is difficult to comprehend these days - he cultivated this style over decades and it's just easier for HIM to work with code like this. No teams, no code reviews, no systems upon systems that need to interact. Just a single page program that does one thing and that he (the only contributor and his own boss) is able to understand and work with because that's what he did for past 50 years.
> In either case I’ve never had a good experience working with coders who are this “clever”.
Neither have I and I wouldn't write code like that either. I also don't think that reading and understanding such code makes you "smarter".
It's more of a peek into a different era of software development and one particular person's preferences.
Still it's amusing how Whitney's style seems to personally offend people. It's just a different way of programming that works for this one guy due to very specific circumstances. Neither the OP nor Whitney himself advocate for emulating this style.
Conways Law tends to manifest in both directions...
It may be profitable having systems only a few people in the world could understand, but the scope of development is constrained.
I respect your opinion, but also recognize that languages like Forth/Fortran have actually killed people through one-character syntax errors. People need to be as unsurprising as possible on large team projects. Sure, all our arrays today could be written in only l's, I's, and 1's like lIl1Il1lI[I1lI11l]... and being a CEO is also still not a valid excuse. =3
[dead]
I’m wondering now, with LLMs in the loop, how the languages for solving complex problems will evolve in the long run.
Perhaps I will start playing with this macro-style ladder of abstraction with the help of an LLM. Something like literate programming with an AI agent. Computers are much better at parsing than we are. We can stand on the highest rung of the ladder.
[dead]
[dead]
[flagged]
"Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something."
https://news.ycombinator.com/newsguidelines.html
This code style is psychotic. I had to reverse-engineer and verify a C codebase that was machine-obfuscated, and it was still clearer to follow than this. Increasing clarity through naming is great, but balancing information density is, dare I say, also a desirable goal. Compacting code runs into rapidly diminishing returns once you're relying on a language having insignificant whitespace.