Anatomy of Lisp

John Allen. Anatomy of Lisp. McGraw-Hill. ISBN 0-07-001115-X. 1978.

This is a hard book to characterise. It’s simultaneously an introduction, a collection of advanced (for the time) programming techniques, and a guide to some very low-level implementation details. It contains a lot of the usual introductory material, but then delves into different representational choices for the basic data structures – lists, but also arrays and hash tables, as well as Lisp code itself – and for the frames needed to represent function calls. It then tackles interpretation versus compilation, and even topics such as cross-compilation and just-in-time or memoised compilation.

It’s impossible to read this book without thinking of SICP, and indeed I think in many ways it’s a direct precursor. (It’s referenced a couple of times in SICP.) It has that same combination of high-level, semantics-driven programming and descriptions of underlying mechanisms. Where I think it mainly differs is that it mixes representations, using a more semantics-driven notation and explicit assembler instructions rather than sticking with Lisp variants for everything. It’s only when you see the two approaches side by side that you realise how clear Lisp can be in radically different contexts.

Another important book was published in 1978: Kernighan and Ritchie’s The C programming language. The two books share a certain low-level focus, but come to very different conclusions about the correct approach to dynamic storage, as the following footnote by Allen (slightly edited for clarity) illustrates:

Experiments have been performed in which Lisp programmers were allowed to return “garbage” to the free-space list themselves. The results were disastrous: list structure thought to be garbage was returned to the free-space list even though the structure was still being used by other computations.

C, of course, accepts this “disaster” as a natural consequence of manual allocation management. The episode encapsulates the different attitudes of the two contemporary language communities: one wanting a machine in which to think (and willing to use computing power to simplify tasks), the other wanting raw performance from limited hardware (and willing to complicate the programming task to get it).

Overall, Allen’s book is a great attempt at an end-to-end discussion of Lisp all the way down to the metal. Many of the techniques are still relevant, at least as exemplars against which later, more efficient techniques can be judged.

(Thanks to Paul McJones for pointing me to the free PDF of the book, which he was responsible for developing. See this post for a history of how this came about.)

TIL: Web site carbon ratings

Today I learned about a site that rates web pages according to their carbon footprints.

My web site does well:

(Image: the carbon rating for this site.)

I’m not all that surprised by this, since I use a static site generator and minimal (almost no) JavaScript: basically as low-power as one could get.

The full report includes an approximate carbon cost per page visit (60mg), and even tries to work out the underlying energy provision, dirty or renewable – dirty in my case at the moment, unfortunately. (I’m currently hosted on GitHub.) I should perhaps take the hint and move to an alternative provider, or even host it myself with a solar-powered Raspberry Pi. Low-tech Magazine does this, but they are based in Barcelona rather than Scotland…

You can even include a badge that reports your rating “live”:

(It caches so as to only make one request per day.) On the other hand this increases the JavaScript footprint slightly, so I don’t think I’ll be using it anywhere apart from just here.

Diary of an MP’s Wife: Inside and Outside Power

Sasha Swire

An insight into the British politics of the 2010s. And not a pretty sight.

The pitch for this book is that, as a political wife, the author had a unique ringside seat from which to observe the goings-on of the (mostly male) politicians. And it’s true that she has plenty of insight into them and into her own position in the circle.

But she also exposes herself as almost a caricature of a Tory. She accepts the misogyny and crudeness of the male MPs as just how it is, and utterly lacks any understanding of why anyone would disagree with her own positions. She can’t accept, for example, that Remainers were sincere and their misgivings might be valid, or that the EU is anything other than fascinated with the idea of humiliating Britain. She constantly refers to the Labour party as “Marxists” and trots out the usual tropes of how they would bankrupt the country, despite the fact that their “extreme left-wing” positions would be entirely mainstream in many European countries. She’s clear-eyed and unenthusiastic about Boris Johnson as Prime Minister, but then fawns over him when he wins his election landslide and glides over the time-bomb he planted on the Irish border issue, even though this was evident at the time.

The overwhelming theme, returned to again and again, is how self-interested the political class is, focused entirely on who’s up, who’s down, and how it affects their own careers. Lots of dinner parties, gentlemen’s clubs, and holidays: one is left amazed by how shallow everyone is.

It’s not a bad book, and an enjoyably light read. But politically insightful it isn’t, other than to highlight the trivial nature of politicians.

3/5. Finished Friday 23 February, 2024.

(Originally published on Goodreads.)

The Death of Grass

John Christopher (1956)

Post-apocalyptic fiction of a determinedly British kind. It’s not a bad book, and has a certain complexity to it in exploring how people’s attitudes might change when faced with the destruction of normal civilisation.

A deadly virus destroys all grass-like plants, thereby eliminating almost all food crops and the cattle that they feed. Worried the government might atom-bomb the major population centres, a small group leave London to make their way to a Lake District valley. On the way they encounter looters, towns gone feral to protect themselves – and become feral themselves, willing to kill to survive. In some ways it reads like Lord of the Flies with adults and a more pervading sense of long-term doom.

But it’s also very much a novel of its time, full of racism, sexism, classism, deference, stiff upper lips, and a sense of self-justification wrapped up as duty. That makes it a hard read, and it doesn’t really have enough force to balance that out. Many similar points are made elsewhere, for example in Earth Abides, without the 1950s baggage.

2/5. Finished Sunday 11 February, 2024.

(Originally published on Goodreads.)

Trying to refute some criticisms of Lisp

I recently had a discussion with someone on Mastodon about Lisp and its perceived (by them) deficiencies as a language. There were some interesting points, but I felt I had to try to refute them, at least partially.

I should say from the start that I’m not blind to Lisp’s many inadequacies and anachronisms, merely pointing out that it has a context like everything else.

There seemed to be two main issues:

  • Poor design decisions throughout, and especially a lack of static typing
  • The shadows of really early machines in car and cadr

These points are tied together, but let’s try to unpack them.

Design

Let’s start with design. Lisp is over half a century old. I’d argue it was exceptionally well-designed – when it was designed. It lacks most modern advances in types because … well, they didn’t exist, many of them arose as solutions to perceived problems in Lisp (and Fortran), and many of those “solutions” still aren’t universally accepted, such as static typing itself.

What we’ve actually learned is that many aspects of programming lack any really universal solutions. If static typing were such an obvious and unarguable route to efficiency and quality, all new software would be being written in Haskell.

Typing and features

And the lack of modern types isn’t really as clear-cut as it appears. The argument about the lack of features in Lisp also ignores the presence of other features that are absent from almost all other languages.

Lisp’s numeric types are surprisingly flexible. Indeed, Common Lisp is still, in the 21st century, one of very few languages in which one can write modern crypto algorithms like Diffie-Hellman key exchange without recourse to additional libraries, because it has arbitrary-precision integer arithmetic built into the standard operators. It also has rational numbers, so no loss of precision on division either.
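As a quick sketch at the REPL (any conforming Common Lisp behaves the same way):

```lisp
;; Arbitrary-precision integers: the standard operators never overflow.
(expt 2 256)
;; => 115792089237316195423570985008687907853269984665640564039457584007913129639936

;; Rational numbers: division loses no precision.
(/ 10 4)     ; => 5/2
(+ 1/3 1/6)  ; => 1/2
```

No special types or library calls are involved; these are the ordinary arithmetic operators doing what the standard requires.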

The Common Lisp Object System (CLOS) is vastly more flexible than any modern object-oriented language. Sub-class methods can specify their relationship with the methods they override, such as being called after or just filtering the return values. Methods themselves are multiple-dispatch and so can be selected based on the types of their arguments as well as their target. The basic mechanisms can be overridden or extended using a meta-object protocol.
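A minimal sketch of two of those features, using hypothetical ship and asteroid classes of my own invention: methods are selected on the types of all their arguments, and an :after method augments the primary method rather than replacing it.

```lisp
(defclass ship () ())
(defclass asteroid () ())

(defgeneric collide (a b))

;; Multiple dispatch: the method is chosen on the types of both arguments,
;; not just a single receiver.
(defmethod collide ((a ship) (b asteroid))
  "ship hits asteroid")

(defmethod collide ((a asteroid) (b ship))
  "asteroid hits ship")

;; An :after method runs after the primary method completes; it augments
;; the behaviour rather than overriding it, and its return value is discarded.
(defmethod collide :after ((a ship) b)
  (format t "log: a ship was involved~%"))
```

Calling (collide (make-instance 'ship) (make-instance 'asteroid)) runs the first primary method and then the :after logging method, with no explicit wiring between them.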

Then there are macros. It’s easy to underestimate these: after all, C has macros, doesn’t it? Well, yes – and no. A C macro is a function from strings to strings that can do literal string substitution of its arguments. A Lisp macro is a function from code to code that can perform arbitrary computation. They’re really not the same things at all, and it’s misleading that the same word is used for both. (C++ templates are a closer analogy, but still limited in comparison.)
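To make the contrast concrete, here is a sketch of a small macro – named unless* to avoid clashing with the standard unless, which it imitates. It receives its arguments as s-expressions and returns new code for the compiler to compile.

```lisp
(defmacro unless* (condition &body body)
  ;; Build and return a new piece of code. The backquote template is just
  ;; ordinary list construction, so arbitrary computation could happen here.
  `(if ,condition
       nil
       (progn ,@body)))

;; The expansion is itself code, inspectable at compile time:
(macroexpand-1 '(unless* (> x 0) (print "non-positive")))
;; => (IF (> X 0) NIL (PROGN (PRINT "non-positive")))
```

Nothing here is string manipulation: the macro's input and output are the same list structures the rest of the language works on.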

The persistence of hardware 1: Stupid operation names

The complaints about car and cdr are long established: they were originally derived from machine-language instructions on the IBM 704 that was used for the first Lisp implementations. They’re a terrible hold-over from that terrible decision … aren’t they?

Well, yes – and no. Of course they’re terrible in one sense. But car and cdr are basically nouns as far as Lisp programmers are concerned. One could replace them with more modern usages like head and tail (and indeed many Lisps define these using macros).
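Defining the modern names is a one-liner each – sketched here as plain functions, though inlined or macro versions are just as easy:

```lisp
;; head and tail as simple aliases for car and cdr.
(defun head (lst) (car lst))
(defun tail (lst) (cdr lst))

(head '(1 2 3))  ; => 1
(tail '(1 2 3))  ; => (2 3)
```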

But it’s important to remember that even “head” and “tail” are analogies, sanctified by familiarity in the computer science literature but still inexplicable to anyone outside. (If you doubt that, try explaining to someone who isn’t a programmer that a shopping list has a “head” consisting of the first entry, and a “tail” consisting of another, shorter, shopping list, is “in fact” a recursive type, and you have to acquire each item of shopping sequentially by working your way down the list from the head.) car and cdr are artificial nouns, and cons is an artificial verb – but really no more artificial that head, tail, and append, their rough equivalents in other languages.

One can argue that the persistence of car and cdr drives the persistence of compounds like caaddr. But those are unnecessary and seldom used: barely anyone would mind if they were removed.

The persistence of hardware 2: It happens a lot

The suggestion that Lisp has hardware holdovers that should be removed also neglects these holdovers in other languages.

As an example, check the definition of std::memcpy in C++. It doesn’t work with overlapping memory areas. Why is that? – why is it so fast, but so dangerous? Does it relate to underlying machine features, such as machine code move instructions on particular machines with particular restrictions? Doesn’t this introduce the risk of security flaws like buffer overruns?

Languages with more abstracted machine models don’t have these issues. I struggle to think of how one could even introduce the concept of a buffer overrun into Lisp, other than by using some external raw-memory-access library: the language itself is immune, as far as I know.

The different choices

Let’s turn the argument around and ask: given that early Lisps had proper macros, arbitrary-precision integers, and so on, why did these features disappear from what we now consider to be “the mainstream” of programming language design?

Lisp’s designers had a goal of building a powerful machine in which to think: indeed, they intended it to eventually have its own hardware designed specifically for it to run on. They therefore didn’t buy into the necessity of immediate performance, and as their applications were largely symbolic AI they didn’t need numerical performance at all. They chose instead to create high-level constructs even if these couldn’t be compiled efficiently, and explored using these to create more code as they identified more and more abstract patterns whose details could be automated away. (Paul Graham has a great essay on this.)

Other language designers had other priorities. Often they needed to do numerical simulation, and needed both performance and scale. So they chose a different design pathway, emphasising efficient compilation to the hardware they had available, and made the compromises needed to get it. These have persisted, and that’s why we have languages with fixed-width integers scaled to fit into a single machine register, and compilers that generate – but don’t directly execute – the code of programs, which limits our ability to abstract and automate code generation without recourse to complicated external tools.

It’s interesting to explore these choices. They’re at one level “just” historical: accidents that shaped the present. But at another level they’re still very much present in the hardware and software landscape we inhabit. I think it’s important that we remind ourselves, continuously, that much of that landscape is a choice, not a given, and one we can question and change as we wish.