The art of the metaobject protocol

Gregor Kiczales, Jim des Rivières, and Daniel Bobrow. The Art of the Metaobject Protocol. MIT Press. 1991.

What is a meta-object protocol? – or indeed a meta-object? This book is perhaps the clearest exposition of these ideas.

In most modern object-oriented languages an object is an instance of a class. In keeping with using objects throughout, classes are often also objects (or can be thought of as such), but are more informatively thought of as meta-objects that facilitate the construction of “real” objects. The methods on classes can also be thought of as meta-objects, defining the code that objects execute when they are invoked.
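
To make that concrete, here’s a tiny Common Lisp illustration (the account class is just a made-up example): an ordinary object’s class is itself an object, and that class object in turn has a class of its own, its metaclass.

  (defclass account ()
    ((balance :initarg :balance :accessor balance)))

  (defvar *acct* (make-instance 'account :balance 100))

  (class-of *acct*)                          ; => #<STANDARD-CLASS ACCOUNT>
  (class-of (class-of *acct*))               ; => #<STANDARD-CLASS STANDARD-CLASS>
  (class-name (class-of (class-of *acct*)))  ; => STANDARD-CLASS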

The defining feature of CLOS is that these meta-objects are all just Lisp objects, but objects that exist “off-stage” (to use this book’s very intuitive metaphor) and largely invisible to a basic user. But they’re as available to a power user as any other objects: the “meta”-ness is a matter of design, not of implementation. The interactions between objects and meta-objects, for example which methods are called when invoked on a particular object, are defined by the meta-object protocol (MOP), which is itself defined in terms of methods on the meta-objects that shadow the objects themselves.
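
A corner of this is already visible in standard Common Lisp: compute-applicable-methods is an ordinary function that asks a generic function which of its methods would run for given arguments. The names below are invented for the example, and the printed result will vary by implementation.

  (defgeneric area (shape))

  (defclass rect () ())

  (defmethod area ((r rect)) 42)

  (compute-applicable-methods #'area (list (make-instance 'rect)))
  ;; => (#<STANDARD-METHOD AREA (RECT)>)   (printed form varies by implementation)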

(“Protocol” here is used in the sense common in much of the earlier object-oriented literature, meaning a collection of related functions: “meta-object API” would be a more modern rendering, although the protocol also includes the sequencing of the calls and the relationships between them.)

The goal of MOP programming is to let the programmer extend the programming language towards the application domain, by automating a lot of boilerplate code and providing the structures needed to re-structure or analyse the code the programmer actually needs to write. In this sense it’s a continuation of the idea of macros as powerful and potentially very domain-specific language and compiler extensions. It’s also a continuation of the idea of reifying underlying language mechanisms in the language itself, where they can be re-specified and re-mixed.

The first part of the book explains MOPs by defining a slightly simplified version of CLOS (“Closette”). It assumes the reader knows some CLOS, for example from Object-oriented programming in Common Lisp: A programmer’s guide to CLOS (or there’s a stand-alone introduction in Appendix A), but it only assumes the knowledge level of a relative newcomer – and the features then defined in Closette are just those parts of CLOS that such a user would actually know and be comfortable with, which is a brilliant piece of pedagogy that simplifies without trivialising. It’s really noticeable that Closette doesn’t need any extensions to Common Lisp: it’s defined directly in the language itself, which shows how powerful the underlying language is. (Full CLOS does require a bit of language support, at least for efficiency.)

Next come several examples of MOP usage, for example to re-define how classes store their slots, or how to add attributes to slots that can store metadata about their use or could be used to provide higher-level operations. There’s also a long discussion about protocol design and how this has a massive impact on how easy a system is to use for the programmer.

The second part is a manual for the CLOS MOP, which is thorough and useful, but perhaps less exciting than the first part. The Common Lisp package closer-mop provides this API as a portable compatibility layer for use in real programs.
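
As a taste of what programming against this API feels like, here’s a rough sketch using the closer-mop layer of a metaclass that counts how many instances of its classes get made. The names (counted-class, widget) are invented for the example, and the details may need adjusting per implementation.

  ;; Assumes the closer-mop system has been loaded, e.g. via Quicklisp:
  ;; (ql:quickload "closer-mop")

  (defclass counted-class (standard-class)
    ((instance-count :initform 0 :accessor instance-count)))

  ;; Allow classes of this metaclass to inherit from ordinary classes.
  (defmethod closer-mop:validate-superclass ((class counted-class)
                                             (super standard-class))
    t)

  ;; Every time an instance is made, bump the counter held in the class
  ;; meta-object itself.
  (defmethod make-instance :after ((class counted-class) &rest initargs)
    (declare (ignore initargs))
    (incf (instance-count class)))

  (defclass widget ()
    ()
    (:metaclass counted-class))

  (make-instance 'widget)
  (make-instance 'widget)
  (instance-count (find-class 'widget))   ; => 2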

There’s also a discussion of practicalities like where awkward circularities occur and how to break them, which is actually a great example of how to do good protocol/API design. In an example of Paul Graham’s dictum that modern languages evolve by mixing Lisp concepts into a different base, MOP ideas appear in lots of other languages, either for real (Smalltalk, and to a lesser extent Python) or just for introspection (Java). Even someone not planning on writing Lisp would benefit from reading this book just to see the ideas in their full generality.

Object-oriented programming in Common Lisp: A programmer’s guide to CLOS

Sonja Keene. Object-Oriented Programming in Common Lisp: A Programmer’s Guide to CLOS. Addison-Wesley. ISBN 0-201-17589-4. 1989.

The definitive practical guide to using the Common Lisp Object System (CLOS). It’s written from a similar perspective to other object-oriented tutorials, which makes it very accessible for those who’ve had experience with something like Java or Python. However, CLOS isn’t just objects in Lisp, and isn’t in any sense just an object-oriented extension. It can take some time to change mindset enough to use it properly, and this book is a great guide to the core differences.

Firstly, it follows a completely different model of how to associate functions with data. Instead of attaching methods to a single class and dispatching on a single receiver, CLOS uses “generic” functions, where the exact code called is dispatched dynamically on the types of any or all of the parameters: so it’s perfectly possible to have several definitions of the same generic function operating on objects of the same class but taking other arguments of different types. This multiple dispatch is a lot more flexible.
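
Something like this, with invented names; which method runs depends on the classes of both arguments, not just the first:

  (defgeneric collide (a b))

  (defclass asteroid () ())
  (defclass ship     () ())

  (defmethod collide ((a asteroid) (b asteroid)) :rocks-bounce)
  (defmethod collide ((a ship)     (b asteroid)) :ship-damaged)
  (defmethod collide ((a ship)     (b ship))     :both-explode)

  (collide (make-instance 'ship) (make-instance 'asteroid))  ; => :SHIP-DAMAGED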

The second point actually follows from this. CLOS’ generic functions can be defined over any Lisp types: they’re not statically associated with classes at all, and can operate on any types (classes or not) across the type hierarchy. This makes them closer to Haskell’s type classes than to Smalltalk’s (or Java’s) virtual methods, which are strongly bound to classes.
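
For example (again with invented names), methods can specialise on built-in classes such as integer and string, or even on individual values via eql specialisers, without defining any classes of your own:

  (defgeneric describe-thing (x))

  (defmethod describe-thing ((x integer))     "an integer")
  (defmethod describe-thing ((x string))      "a string")
  (defmethod describe-thing ((x (eql :none))) "the keyword :none")

  (describe-thing 42)     ; => "an integer"
  (describe-thing "hi")   ; => "a string"
  (describe-thing :none)  ; => "the keyword :none"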

Thirdly, CLOS methods can be combined in a range of interesting ways, not simply by overriding previous definitions – and indeed you can define your own method combinations if you need to. And like Smalltalk (but unlike Java) CLOS classes have “metaclasses” that can re-define their basic functions. The art of the metaobject protocol is a better source for this level of detail.
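
A rough sketch of the standard method combination (animal and dog are placeholder classes): :before and :after methods wrap the primary method, and call-next-method hands control to the next most specific primary method rather than discarding it.

  (defclass animal () ())
  (defclass dog (animal) ())

  (defgeneric speak (x))

  (defmethod speak ((x animal))
    (format t "<generic animal noise>~%"))

  (defmethod speak :before ((x dog))
    (format t "wags tail~%"))

  (defmethod speak ((x dog))
    (format t "woof~%")
    (call-next-method))   ; also run the ANIMAL primary method

  (speak (make-instance 'dog))
  ;; prints: wags tail / woof / <generic animal noise>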

The examples in the book delve into these features by means of sensibly-sized challenges that can be used to illustrate both basic design and implementation, and more advanced ideas like re-defining classes on the fly.

The roots of Lisp

http://www.paulgraham.com/rootsoflisp.html

(Only has a PostScript version, but a PDF is available here.)

Re-visits McCarthy’s discoveries (or inventions, depending on your point of view), translating the earliest work into modern Lisp notation.

It’s worth understanding what McCarthy discovered, not just as a landmark in the history of computers, but as a model for what programming is tending to become in our own time. It seems to me that there have been two really clean, consistent models of programming so far: the C model and the Lisp model. These two seem points of high ground, with swampy lowlands between them. As computers have grown more powerful, the new languages being developed have been moving steadily toward the Lisp model. A popular recipe for new programming languages in the past 20 years has been to take the C model of computing and add to it, piecemeal, parts taken from the Lisp model, like runtime typing and garbage collection.

Does a great job of making the central insights accessible, including re-phrasing the meta-circular Lisp interpreter so as to be executable in modern Common Lisp.

TIL: The most powerful one-line program in the world

Well, the most powerful I’ve found so far, anyway.

Given my current obsession with Lisp you might reasonably expect it to be in that language. But it isn’t: it’s in APL, and it performs one complete generation of Conway’s Game of Life in one line of code:

  Life←{↑↑1 ⍵∨.∧3 4=+/,¯1 0 1∘.⊖¯1 0 1∘.⌽⊂⍵}

…and does so inexplicably to anyone who doesn’t know APL, obviously, but the basic algorithm is simple:

  1. Take an array with 1 in each occupied cell and 0 elsewhere
  2. Build nine new arrays by rotating the original by −1, 0, and 1 places vertically and horizontally in every combination, so that each cell lines up with its eight neighbours and with itself
  3. Sum these arrays, which places into each cell its number of live neighbours plus its own value
  4. Set a cell to 1 if this total is 3, or if it is 4 and the cell was already live, and to 0 otherwise
  5. Disclose the result back into the shape of the starting configuration

I checked it out using GNU APL and it works fine.
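
For anyone who doesn’t read APL, here’s a rough equivalent of one generation in Common Lisp (the function name is mine); it follows the steps above literally rather than efficiently, treating the grid as a torus just as the APL rotations do.

  (defun life-step (grid)
    "One Game of Life generation over a 2D array GRID of 0s and 1s,
  wrapping around at the edges."
    (let* ((rows   (array-dimension grid 0))
           (cols   (array-dimension grid 1))
           (counts (make-array (list rows cols) :initial-element 0))
           (next   (make-array (list rows cols) :initial-element 0)))
      ;; Sum the nine wrap-around shifts of the grid: each cell ends up
      ;; holding its neighbour count plus its own value.
      (dolist (dy '(-1 0 1))
        (dolist (dx '(-1 0 1))
          (dotimes (r rows)
            (dotimes (c cols)
              (incf (aref counts r c)
                    (aref grid (mod (+ r dy) rows) (mod (+ c dx) cols)))))))
      ;; A cell is live next generation when the total is 3, or 4 and the
      ;; cell itself was already live.
      (dotimes (r rows next)
        (dotimes (c cols)
          (setf (aref next r c)
                (if (or (= 3 (aref counts r c))
                        (and (= 4 (aref counts r c))
                             (= 1 (aref grid r c))))
                    1
                    0))))))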

I discovered this gem by accident, actually implemented in APL in Forth, where someone has developed APL as an embedded DSL within Forth (another language with which I have history). After a bit of digging I found a similar APL in Lisp, April, which clearly needs exploring.

In many ways APL and Lisp are parallel tracks within programming language evolution, each taking a single data structure (lists and arrays respectively) and providing powerful ways to manipulate it. Lisp of course has been extended with other data structures, including arrays, which makes the fusion of array- and list-based programming rather attractive.

I can’t help asking myself what would have happened if APL hadn’t fallen by the wayside. (I think this was inevitable, incidentally, once the syntax became fixed: any language that requires its own character set was always going to struggle.) We now have huge applications for array processing, from graphics to machine learning, and GPUs are from one perspective just APL accelerator co-processors. The ideas are still massively relevant.

A New Kind of Science

Stephen Wolfram (2002)

I have very mixed feelings about this book. On the one hand it’s a triumph of computational experimental technique, taking an idea that was well-known (cellular automata) and subjecting it to rigorous exploration. This uncovers a lot of new science, not least showing that complex systems can arise from even the simplest set of rules, but also that this complexity falls into distinct classes of graded behaviour depending purely on the fine structure of the initial conditions. That’s a massively important discovery.
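
The simplest systems the book studies are elementary cellular automata, where a cell’s next state depends only on itself and its two neighbours, and an entire rule fits in eight bits. A quick sketch in Common Lisp (the function name is mine):

  (defun eca-step (rule cells)
    "Apply elementary cellular automaton RULE (0-255) to the bit vector
  CELLS, wrapping around at the edges."
    (let* ((n    (length cells))
           (next (make-array n :element-type 'bit)))
      (dotimes (i n next)
        (let ((pattern (+ (* 4 (bit cells (mod (1- i) n)))
                          (* 2 (bit cells i))
                          (bit cells (mod (1+ i) n)))))
          ;; Bit PATTERN of RULE gives the new state for that neighbourhood.
          (setf (bit next i) (ldb (byte 1 pattern) rule))))))

  ;; Rule 30 from a single live cell: a very simple rule, complicated behaviour.
  (let ((row (make-array 31 :element-type 'bit :initial-element 0)))
    (setf (bit row 15) 1)
    (dotimes (g 12)
      (format t "~{~[.~;#~]~}~%" (coerce row 'list))
      (setf row (eca-step 30 row))))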

On the other hand… It’s hard to describe this as a “new kind” of science. “New science,” yes, but not indicative of a new paradigm or way of doing science, other than by emphasising the structured use of simulation. Wolfram completely over-sells what he’s achieved, making claims he can’t substantiate to aggrandise his own contributions. And that’s a shame, not least because it’s so unnecessary: Wolfram has made some important discoveries, both here and in several other branches of physics and computer science, as well as popularising computational methods and tools. It’s tragic that he doesn’t seem able to see those contributions as being sufficient in themselves.

The book is also quite terribly written, being almost half-composed of footnotes, meaning a reader is constantly skipping forwards and back: difficult in print, and I suspect impossible in a digital edition unless it’s been very carefully hyperlinked. And the footnotes are often important! – indeed, they often provide the evidence to back up a claim in the main text that would otherwise be entirely unsupported by what’s been presented. So it can be seen as both excessively long and insufficiently detailed, which is quite an achievement.

3/5. Finished Sunday 7 July, 2024.

(Originally published on Goodreads.)