Two Lisp compilers written in Lisp


A Lisp compiler to ARM written in Lisp

A Lisp compiler to RISC-V written in Lisp

Two native-code compilers written in uLisp, the Lisp dialect that targets microcontroller-class machines. Both use a combination of stack and register allocation, and they're very efficient: the compiled versions are sometimes 100x faster than the interpreted code.

These are not complete compilers, and indeed they don't sit on a complete or standard underlying Lisp implementation. But it's still fascinating to see how simple they are, each built as a recursive-descent tree-walker that emits assembler directly. With careful initial design, even a compiler with no optimisation passes can still deliver a huge speed-up over an interpreter.
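
To give a flavour of the approach (this is my own minimal sketch in Scheme, not code from either compiler, and the instruction names are invented), a tree-walking compiler for simple arithmetic expressions just recurses over the operands and prints stack-machine instructions as it goes:

    ;; Minimal sketch of a recursive-descent tree-walker emitting
    ;; assembler-like text. Hypothetical instruction names; a real
    ;; compiler would also allocate registers.

    (define (emit . args)
      (for-each display args)
      (newline))

    (define (compile-expr e)
      (cond ((number? e)                 ; literal: push a constant
             (emit "  push #" e))
            ((symbol? e)                 ; variable: push its value
             (emit "  push [" e "]"))
            ((pair? e)                   ; (+ a b), (- a b), (* a b)
             (compile-expr (cadr e))     ; compile left operand
             (compile-expr (caddr e))    ; compile right operand
             (case (car e)
               ((+) (emit "  add"))
               ((-) (emit "  sub"))
               ((*) (emit "  mul"))
               (else (error "unknown operator" (car e)))))
            (else (error "cannot compile" e))))

    ;; (compile-expr '(+ (* x 3) 1)) prints five instructions.

Everything else (register allocation, control flow, calling conventions) layers onto the same recursive shape.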

The different energy footprints of different programming languages


I’ve recently been thinking about low-power computing, from AI and data centres down to sensors, as part of a new initiative the University is considering. It’s easy to forget the resource footprint of our computing systems – especially those that are “out of sight, out of mind” in the cloud – but there’s growing evidence that their growth threatens the renewable energy transition. Some of the figures for AI electricity and water usage are astonishing.

One aspect of this is the effect of the choice of programming language. I came across some work from 2017 on this:

Rui Pereira, Marco Couto, Francisco Ribeiro, Rui Rua, Jácome Cunha, João Paulo Fernandes, and João Saraiva. Energy Efficiency across Programming Languages: How Do Energy, Time, and Memory Relate? In Proceedings of the 10th ACM SIGPLAN International Conference on Software Language Engineering. 2017.

The authors compare 13 benchmarks run in 27 different languages, with the benchmarks chosen to cover a wide range of tasks so that the results aren't biased towards numeric performance. I was expecting some patterns: compiled languages doing better on performance, memory, and energy usage, for example. But I wasn't expecting quite how widely the figures diverged, or some of the details.

The following table is from the paper, annotated by me. The figures are normalised against the best result in each category (so the top language has value 1, and so on).

[Table from the paper: normalised energy, time, and memory figures per language (pl-energy.png)]

The two most-used languages for web applications, Python and JavaScript, both perform pretty badly: 75 times C's energy usage, in Python's case. But although JavaScript does substantially better on energy (only a factor of 4), TypeScript – which is usually thought of as JavaScript with type pre-processing – requires 21 times C's energy, or 5 times JavaScript's. Why is that? I can't think of a reason.

But the real surprise was that “research” languages like Haskell and Lisp both hold up well: twice C’s energy, in Lisp’s case. I don’t think that would surprise modern Lisp programmers, who are used to their compilers’ efficiencies – but it would surprise someone used only to the “hearsay” about Lisp. The same for Haskell, actually, whose modern compilers really leverage the extra structure. When you consider that both those languages are pretty much dependent on garbage collection and so are doing substantially more work than the equivalent C program, it’s impressive.

(Also look in the table for Racket, which does consistently worse than Lisp despite their close similarities. I suspect this is a compiler optimisation issue more than anything else.)

This work clearly isn't complete or definitive. Clojure is entirely missing, as is Scala, and there will have been compiler improvements since 2017 for the languages with the most active developer communities. But it's still quite sobering that the differences are so wide, and that we've chosen to push languages that exacerbate energy usage rather than manage it.

Purely functional data structures


Chris Okasaki. Purely Functional Data Structures. Cambridge University Press. ISBN 978-0-511-53010-4. 1998.

Not a Lisp book per se, but a treatment of data structures from a functional programming perspective. The code examples are in Standard ML, but the ideas apply strongly to Lisp and Scheme. Definitely a useful source for an alternative take on data structuring that doesn’t start from assumptions of imperative programming and mutability.
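
As a taste of the style (my own sketch in Scheme rather than Okasaki's Standard ML, with ad hoc names), here is the classic persistent FIFO queue built from two lists, where every operation returns a new queue instead of mutating the old one:

    ;; A persistent queue as a pair (front . rear), with the rear list
    ;; kept reversed. Sketch only; names do not come from the book.

    (define empty-queue '(() . ()))

    (define (queue-empty? q)
      (and (null? (car q)) (null? (cdr q))))

    (define (enqueue q x)                ; cons onto the reversed rear
      (cons (car q) (cons x (cdr q))))

    (define (dequeue q)                  ; returns (value . new-queue)
      (cond ((pair? (car q))             ; front has elements
             (cons (caar q) (cons (cdar q) (cdr q))))
            ((pair? (cdr q))             ; front empty: reverse the rear once
             (let ((front (reverse (cdr q))))
               (cons (car front) (cons (cdr front) '()))))
            (else (error "dequeue of empty queue"))))

    ;; Example: dequeuing (enqueue (enqueue empty-queue 1) 2)
    ;; yields 1 together with a queue still holding 2.

The occasional reverse when the front list runs dry is exactly the kind of cost that Okasaki's amortised analysis makes precise.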

Lisp as the Maxwell’s equations of software


Lisp as the Maxwell’s equations of software – DDI

A take on Lisp as the computational version of fundamental equations in physics. The claim is that learning Lisp is a foundational skill, and the page goes on to develop a “TiddlyLisp” interpreter in Python. As you’d expect this isn’t an especially practical Lisp, but it’s remarkably functional, and I suspect it will demystify Lisp for programmers familiar with interpreters for other languages.

See also a conversation with Alan Kay where he uses the “Maxwell’s equations” comparison:

That was the big revelation to me when I was in graduate school – when I finally understood that the half page of code on the bottom of page 13 of the Lisp 1.5 manual was Lisp in itself. These were “Maxwell’s Equations of Software!” This is the whole world of programming in a few lines that I can put my hand over.

And in the second half of this article, the Lisp interpreter in Python is translated into a Lisp interpreter in Lisp, which is a very concrete way of showing how metacircularity can work in McCarthy’s original style.
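
To make that concreteness visible (this is my own sketch in Scheme rather than the article's TiddlyLisp, and it deliberately handles only quotation, conditionals, lambda, and application), the eval/apply core is only a couple of dozen lines:

    ;; A minimal metacircular-style evaluator: eval and apply calling
    ;; each other over an association-list environment. Sketch only.

    (define (lookup var env)
      (cond ((assq var env) => cdr)
            (else (error "unbound variable" var))))

    (define (my-eval expr env)
      (cond ((symbol? expr) (lookup expr env))
            ((not (pair? expr)) expr)                    ; self-evaluating
            ((eq? (car expr) 'quote) (cadr expr))
            ((eq? (car expr) 'if)
             (if (my-eval (cadr expr) env)
                 (my-eval (caddr expr) env)
                 (my-eval (cadddr expr) env)))
            ((eq? (car expr) 'lambda)                    ; (lambda (params) body)
             (list 'closure (cadr expr) (caddr expr) env))
            (else                                        ; application
             (my-apply (my-eval (car expr) env)
                       (map (lambda (e) (my-eval e env)) (cdr expr))))))

    (define (my-apply f args)
      (if (and (pair? f) (eq? (car f) 'closure))
          (let ((params (cadr f)) (body (caddr f)) (outer (cadddr f)))
            (my-eval body (append (map cons params args) outer)))
          (apply f args)))                               ; host primitive

    ;; (my-eval '((lambda (x) (cons x '(b))) 'a) (list (cons 'cons cons)))
    ;; => (a b)

Borrowing the primitives from the host language is what keeps it so short, which is essentially the trick the Lisp 1.5 half-page plays too.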