Practical Common Lisp


Peter Seibel. Practical Common Lisp. Apress. ISBN 978-1-4302-0017-8. 2005.

The classic, very thorough and hands-on tutorial introduction that doesn’t skip the hard parts, like the condition system and non-local blocks and exits (and the relationship between the two). It also has good chapters on CLOS.

The text is complemented by a set of modern examples – web services, databases, and binary file parsers – quite a long way removed from the examples in many introductory texts. It doesn’t make much use of macro programming in these examples, which is a shame, so follow with On Lisp or Let over Lambda once the structure of the language is clear.

TIL: An RSS-focused search engine

Today I learned about feedle, a search engine focused on searching blogs and podcasts – web sites that export an RSS feed, in other words. And the search results are themselves RSS feeds that can be subscribed to live.

This feels like quite a big thing for accessing content without resorting to the internet giants, and for the IndieWeb in general. It means that search can prefer syndicated and typically small-scale content, rather than having its link rankings influenced by search engine optimisation (SEO) or sponsorship.

Of course this also needs management, and feedle is a curated source: you have to submit your RSS feed to it for review and (hopefully) inclusion. I’ve done that for this site’s feed.

Locally overriding a function throughout a dynamic extent

A horribly dangerous but occasionally useful Lisp technique.

My use case is as follows. ebib has a command to copy a formatted reference to the kill ring, using citar-citeproc-format-reference to actually do the formatting. This means it’s easy to change the style of the formatted reference. However, citar-citeproc-format-reference itself uses citeproc-render-bib with a plain-text formatter. This is a sensible default, but since I’m almost always copying references into org-mode documents it loses a lot of information: it’d be better to use the org formatter, but there’s no argument to specify it.

Clearly the correct solution is to change citar-citeproc-format-reference to take a keyword or optional argument to specify the formatter, but that involves changing someone else’s code. The hacker solution is to change the call (citeproc-render-bib proc 'plain) to (citeproc-render-bib proc 'org), but without rewriting the entire surrounding function, to keep the change just to the case where I need it.

One way to do this would be to define a variant citeproc-render-bib that ignores its second argument (the formatter) and always uses 'org instead, and then substitute this variant for the original – but only in the dynamic extent of a particular call to citar-citeproc-format-reference. In most languages this would be impossible – but not in Emacs Lisp.

The solution is to use cl-letf, which overrides the values of general places for the duration of its body forms and restores the original value on exit (normal or otherwise). The important point is that the change occurs across the extent of the body – the body and all the code called from the body – and not merely in the scope of the body, which would only affect calls made there directly.

For example, consider the following:

(defun f (a b)
  (+ a b))

(defun first (a)
  (f a 10))

Which when called gives:

(first 26)    ;; ⇒ 36

If we want to override the default value (10) that’s passed to f and instead use 25, we can define a new version that ignores the second argument and uses our preferred default, and then temporarily override the definition of f in the calling environment. If we want to use the original in the overriding definition we need to grab it first. This gives:

(let ((origf (symbol-function 'f)))
  (cl-letf (((symbol-function 'f) (lambda (a b)
				    (funcall origf a 25))))
    (first 26)))    ;; ⇒ 51

What’s going on? The cl-letf macro is like let but works with general places (as in setf). It sets the places in its argument list for the duration of its body, and then restores them on exit, regardless of whether that exit is normal or via a condition.

The (symbol-function 'f) form returns the place that stores the function associated with the symbol f. We use it twice: once to capture this function so we can use it later, and once to identify the place where we store our new variant function. This new binding is then used for all calls made from the body of the cl-letf, regardless of depth, so the call to first makes use of our variant definition of f rather than the original – with the variant itself, in our case, then calling the captured original!

If we’d used let or cl-flet instead of cl-letf we wouldn’t have got the behaviour we’re looking for:

(let ((origf (symbol-function 'f)))
  (cl-flet ((f (a b)
	      (funcall origf a 25)))
    (first 26)))    ;; ⇒ 36 – the original f is still called

Why? Because let and cl-flet work over the scope of the body, so only calls to f made directly from the body of the binding form are affected – not calls made from calls. This is a great illustration of the difference between the closely-related concepts of (static, lexical) scope and (dynamic, run-time) extent, incidentally.

I did say it was horrible :-). It’s basically like adding temporary :around advice, and could probably benefit from a macro to wrap it up. It’s also inconceivable that it’s thread- or co-routine-safe, although I haven’t checked.
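Such a macro might look like this. This is only a sketch, and the name with-overridden-function is my own invention, not part of cl-lib or any other library:

```elisp
(require 'cl-lib)

;; A sketch of a wrapping macro -- the name is hypothetical.
(defmacro with-overridden-function (fn replacement &rest body)
  "Bind the function cell of FN to REPLACEMENT across BODY's dynamic extent.
FN is a quoted symbol; REPLACEMENT is any function object.
The original definition is restored on exit, normal or otherwise."
  (declare (indent 2))
  `(cl-letf (((symbol-function ,fn) ,replacement))
     ,@body))

;; Used with the earlier example:
(let ((origf (symbol-function 'f)))
  (with-overridden-function 'f
      (lambda (a b) (funcall origf a 25))
    (first 26)))    ; same behaviour as the explicit cl-letf version
```

The macro simply expands into the cl-letf form shown above, so all the caveats about dynamic extent still apply – it just hides the (symbol-function …) plumbing.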

Part of the horribleness comes from the fact that the redefinition is made for the entire dynamic extent of the body forms, which means all calls to the overridden function will use the overriding definition. There might be more of them than you think! But for well-understood code it’s sometimes useful, avoiding duplicating code to make tiny changes.
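Applied to the motivating use case, the override might look something like this. This is a sketch only: the command name my/ebib-copy-org-reference and the stand-in ebib-copy-reference-as-kill are hypothetical, and I’m assuming citeproc-render-bib takes the processor and format symbol as its first two arguments, as in the call (citeproc-render-bib proc 'plain) mentioned earlier:

```elisp
(require 'cl-lib)

;; Sketch: force the org formatter for one command invocation.
;; Both command names here are stand-ins, not real ebib identifiers.
(defun my/ebib-copy-org-reference ()
  "Copy the current reference formatted for org-mode, not plain text."
  (interactive)
  (let ((orig (symbol-function 'citeproc-render-bib)))
    (cl-letf (((symbol-function 'citeproc-render-bib)
               (lambda (proc _format &rest args)
                 ;; Ignore the requested format and force 'org.
                 (apply orig proc 'org args))))
      ;; ebib's copy command, called within the override's extent.
      (ebib-copy-reference-as-kill))))
```

The override lives only for the duration of the command, so every other use of citeproc-render-bib is untouched.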

This Is How They Tell Me the World Ends: The Cyberweapons Arms Race

Nicole Perlroth (2021)

A hugely detailed and deeply researched history of the market for “zero-day” exploits: the faults and technologies underlying computer viruses and ransomware. It’s a complicated and technical field which Perlroth does an amazing job of making accessible to a non-technical audience. (I should probably say here that I teach computer security.)

Most of the book is a real page-turner, deeply embedded with the government agencies, companies, and hackers who compose the zero-day market. It’s scathing of the US’s attempts to play both sides of the street, developing and buying zero-days in order to collect intelligence while weakening the security of ordinary users in the process by not informing the software developers of the problems they’ve found. They clearly knew this was dangerous, and even developed a doctrine for it: “NOBUS”, bugs that “no-one but us” would be smart enough to find or develop. This idea goes spectacularly wrong as other nations realise how cheaply they too can have cyberweapons programmes: ironically they’re encouraged by the deployment of the Stuxnet virus to damage the Iranian nuclear programme. The leaks of the NSA’s zero-day stockpile by the Shadow Brokers – an event that’s somewhat under-explored – and their later use in hacks against US elections, are payback for hubris.

Perlroth is scathing of the Trump presidency’s neglect of cybersecurity and unwillingness to sanction Russia for known attacks – in part because it might cast doubt on Trump’s legitimacy as an elected president, but also seemingly from willful blindness and a mistrust of the professionals (including the military) tasked with protecting US networks. She was writing during the pandemic and before Trump conceded the 2020 election (to the extent that he ever did), and so if anything she understated the impacts of disinformation spreading.

The conclusions are a little breathless, but well-intentioned and technically appropriate, if a little US-centric – and in fairness the US has at least attempted to set up a more transparent approach to managing cyberweapons, even though the approach is drastically compromised by the desire to keep intelligence-gathering capabilities. Cybersecurity is an area where offence and defence are closely intertwined, and there’s a strong argument that the costs to society of the former mandate a focus on the latter. We need to accept that many cyberweapons that are used (or leaked) can be reverse-engineered and re-used against their original developers with little real up-front financial investment.

There’s some sloppy editing, including a repeated misuse of “affect” rather than “effect”, and a really disastrous throw-away reference to the book Dune, the description of which is almost entirely wrong: surely an editor should have picked that up?

4/5. Finished Sunday 21 January, 2024.

(Originally published on Goodreads.)

Edible Economics: A Hungry Economist Explains the World

Ha-Joon Chang (2022)

A book that combines food with economics? Not really.

I’m torn by this book. I enjoyed the food parts, especially the author’s anecdotes about his move to the UK from Korea, and how he’s observed the UK’s food scene change from incredibly insular and conservative to amazingly open and dynamic over the course of a couple of decades. It’s a change I also lived through and remember well.

I also enjoyed the economics. Chang is an eclectic collector of economic theories – all the more surprising because he’s an academic. He has an appropriate degree of scepticism for ideology and single explanations of complex questions, which is refreshing. He skewers some of the common myths, such as the “explanation” that poor countries stay poor because their people don’t work hard enough, ignoring the massive structural factors in play. He’s equally scathing about the opposite “explanation” that credits the successes of the US and UK economies to free trade, given that they were actually massively protectionist during their main periods of growth. And he makes several policy suggestions for modern economies.

But… as a book, I don’t think it works at all. The conceit of explaining economics through food remains just that: a conceit that’s not really threaded through the narrative in a meaningful way. The links are often just too tenuous. To give one example, a chapter that leads with anchovies ends up talking about natural-resource extraction economics using the example of bird guano – well, birds eat anchovies, so… Most of the chapters are basically divided between food and economics, with an often desperate attempt to tie them together. The economics is accessible, and a writer who can do that probably doesn’t need a gimmick to structure his work.

3/5. Finished Saturday 20 January, 2024.

(Originally published on Goodreads.)