Distinguished lecture on artificial life

Our most recent distinguished lecture was the best I’ve heard so far, and on a fascinating topic I’d like to know more about. We run two distinguished lectures each academic year, inviting an academic to teach a four-hour course on a topic in which we often don’t have much expertise in St Andrews. It exposes undergraduates, grad students and staff to new topics and new ways of thinking about research. The series goes back to 1969 and includes some of the best-known names in computer science.

Last semester’s speaker was Larry Yaeger from Indiana University (on sabbatical at the University of Hertfordshire), who talked about artificial life: using computers to study the processes of evolution, speciation and adaptation. It’s a topic that sits on the boundary between novel computing and theoretical biology. Artificial life sounds too much like science fiction to be “real” computer science: do we understand enough about life to be able to study it in the abstract? And if not, can artificial life have any scientific content? If you confine yourself enough, of course, the answer to both questions is a qualified “yes”, but the question then becomes: does it tell you anything meaningful about anything? Larry’s talk showed that even quite abstracted artificial life scenarios still give some high-level information about the potential for systems design, especially for very dynamic adaptive systems in changing environments.

Larry’s work has focused on building multi-agent simulations of processes, seeing how simple rule sets can give rise to complex behaviours. This has culminated in a system called Polyworld, which lets users set up “genetically” based behaviours for agents. (There are some very cool movies of it all working.) The genetic basis, completely synthetic and at a higher level than real genetics, means that agents can evolve through mutation and cross-over.

The part I found most interesting was the way that these systems, like the natural systems they’re abstractions of, tend not to do optimisation per se. Instead they find, and stick with, solutions that are “good enough”: you reach a balance at which the evolutionary pressure is not strong enough, and the benefits are not great enough, for further changes to take place. The difference from traditional engineering is quite profound, both in this satisfaction with the sub-optimal and in the fact that the selection is dynamic, so if the chosen approach ceases to be “good enough” as the environmental pressures change, the system will shift to another approach as a matter of course. You get this dynamism all over chemistry, too, where chemical equilibrium remains a dynamic process with lots of reactions going on all the time without changing the macroscopic concentrations of the reagents involved. It’s easy to mistake this for a static system, which it most definitely isn’t. I think this is a mistake a lot of scientists and engineers make, though, and it’s something we probably need to address when designing adaptive systems or sensor networks that have to operate against, or within, a complex environment. To do this we’d need to give up a lot of our intuitions about design, and about the possibility of a single “correct” solution to a problem, and think instead of a design space that the system is (to some extent) free to explore, making this design space, and the exploration of it, concepts that are propagated to run-time.
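None of this behaviour depends on Polyworld itself; satisficing and re-adaptation fall out of even a toy evolutionary simulation. As a rough illustration (not Larry’s model, and with made-up parameters such as the population size, mutation rate and “good enough” threshold), the Python sketch below evolves bit-string genomes against a target defined by the environment, then flips that target halfway through to show the population drifting to a new, merely adequate, solution rather than being redesigned.

    import random

    GENOME_LEN = 20        # bits per genome (made-up size)
    POP_SIZE = 60          # agents per generation (made-up size)
    MUTATION_RATE = 0.02   # chance of flipping each bit
    GOOD_ENOUGH = 0.9      # fitness we treat as "good enough"

    def fitness(genome, target):
        # Fraction of positions that match the environment's current target.
        return sum(g == t for g, t in zip(genome, target)) / GENOME_LEN

    def crossover(a, b):
        # Single-point crossover of two parent genomes.
        point = random.randrange(1, GENOME_LEN)
        return a[:point] + b[point:]

    def mutate(genome):
        # Flip each bit with a small probability.
        return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

    def step(population, target):
        # One generation: rank by fitness and breed the next generation
        # from the better half, with crossover and mutation.
        ranked = sorted(population, key=lambda g: fitness(g, target), reverse=True)
        parents = ranked[:POP_SIZE // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP_SIZE)]
        return children, fitness(ranked[0], target)

    if __name__ == "__main__":
        random.seed(1)
        target = [random.randint(0, 1) for _ in range(GENOME_LEN)]
        population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                      for _ in range(POP_SIZE)]

        for generation in range(200):
            if generation == 100:
                # The environment shifts: yesterday's adequate solution degrades,
                # and selection pulls the population towards a new one.
                target = [1 - t for t in target]
                print("-- environment changed --")
            population, best = step(population, target)
            if generation % 20 == 0:
                note = "good enough" if best >= GOOD_ENOUGH else "still adapting"
                print("generation %3d: best fitness %.2f (%s)" % (generation, best, note))

Run for long enough, the best fitness hovers around a plateau rather than at a guaranteed optimum, and after the environmental change the population re-converges on a new plateau: selection here is a continuing process rather than a one-off design step, which is exactly the dynamism described above.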
I think this kind of approach makes sense even if you don’t embrace the genetic-algorithms view of the world (which in the main I don’t). In some ways this is a measure of the success of artificial life research: it’s illuminating concepts that have general utility outside the context from which they’re drawn, and that can influence other approaches to systems design without our having to junk everything we already know (which we’re clearly not going to do). These sorts of incremental changes are in many ways far more useful than revolutions, but they come about from thinking that’s more or less divorced from the mainstream. It’s a good illustration of why blue-skies research is important, and of the fact that knowledge really is all one piece, with interconnections and interactions we can’t predict.

Forth, 2^5 years ago

2^5 years ago this month, Byte magazine devoted an issue to the Forth language. Byte (“the small systems journal”) volume 5 number 8, August 1980, was largely given over to Forth. I only discovered this by accident, researching some background for a paper I’m writing on extensible virtual machines. What’s even more remarkable is that you can download the issue, along with a lot of others, as a PDF.

That the premier hobbyist/hacker magazine of its day would give over most of an entire issue to one language tells you something about the way people thought about programming their machines back then. There was a premium on compactness: one of the first adverts in this issue of Byte is for a Cromemco Z-2H with 64KB of RAM and an 11MB hard disc, proudly claiming that it “is under $10K”. One article is a history of Forth, one is a tutorial, and two are deeply technical programming pieces aimed at people comfortable with the idea of writing their own software pretty much from scratch, and indeed keen to get on with it. What’s more, they could write software as good as, or better than, anything they could buy (to the extent that there was any hobbyist software to buy). That’s not something we’ve been able to say for at least the last 2^4 years: hobbyist software hasn’t competed with commercial offerings in most domains for a long time.

I think there were a number of things going on. The simplicity of the machines was obviously a bonus: one could understand the hardware and software of a personal computer in its entirety, and contemplate re-writing it from the ground up as an individual or small group. Expectations were lower, but that works both ways: low expectations coupled with low-performance hardware can still lead to some impressive software. But it’s certainly the case that one of the main barriers to software development from the ground up these days is the need to interface with so many devices and processes in order to do anything of interest: any new system would need to talk to flash drives and the web, which probably means writing device drivers and a filing system. You can get round this using hardware packages, of course: Zigbee radios have simple programmer interfaces and encapsulate the software stack inside them.

Another factor, though, was a difference in ambition. A hobbyist in the 1980s only had herself and her friends to impress (and be impressed by): the horizon was closer. I’m sure that led to a lot of re-definition and duplication that the internet would now let one avoid somewhat, but in some senses it provided a better learning environment, one in which a sequence of problems had to be solved from a programmer’s own creativity and resources. That’s a significantly different skill set from what’s required today, where we place a value on compliance, compatibility and re-use at least as high as the value we place on creativity and innovation.

I’m not advocating a return to the past, although programming in Forth for sensor networks does give me a tremendous sense of pleasure that I haven’t felt in programming for a long time, at least partially derived from the old-school nature of it all. However, I would say that there’s also value in this old-school style even today. The hackers who read Byte wouldn’t settle for sub-standard tools: they wouldn’t think twice about re-coding their compilers and operating systems (as well as their applications) if they were sub-standard. That capability brings with it a sense of power, the ability to change whatever is perceived to be wrong in a system, that’s to be celebrated and encouraged even today, amongst programmers who sometimes seem to be constrained by their toolchains rather than freed by them.

MSc/PhD positions available in wireless location systems

UCD Dublin is looking for potential PhD and MSc students interested in wireless location systems.

Ph.D. and M.Sc. Studentships in WiFi Location

UCD Dublin is currently seeking potential Ph.D. and M.Sc. students in the Complex and Adaptive Systems Laboratory and the School of Computer Science and Informatics, University College Dublin (UCD), Ireland. The studentships are part of a collaborative international project to research and develop novel algorithms, solutions and applications for indoor localization based on WiFi.

The successful candidates will have obtained, or will expect to obtain, a 1st class or 2.1 Honours degree in Computer Science, Electronic Engineering, or a related discipline. For the Ph.D. position, a Masters degree or commercial experience in a relevant area is an advantage. Preference will be given to applicants with expertise in one or more of the following areas: Digital Signal Processing, Location Estimation, and Wireless Communications.

We expect that the Ph.D. positions will be funded for 4 years while the M.Sc. position will be funded for 2 years. All positions are full-time. The positions will include payment of a tax-free student stipend and fees. The anticipated starting date for the positions is 1st October 2012, or as soon as possible thereafter.

Enquiries should be sent to Dr Chris Bleakley, Complex and Adaptive Systems Laboratory (CASL), School of Computer Science and Informatics (CSI), University College Dublin (UCD), Belfield, Dublin 4, Ireland; email: chris.bleakley@ucd.ie; tel: +353 1 716 5353. http://www.csi.ucd.ie/vacancy/phd-and-msc-studentships-wifi-location

[Updated 13Aug2012 with final confirmation of the grant.]

Blackout (All Clear, #1)

Connie Willis (2010)

While I enjoyed the descriptions of the scenes in this book, and its invocation of an Oxford in forty years’ time that’s almost identical to the present, I found it quite slow and lacking in resolution. I realise (now) that it’s the first in a series, but it leaves far too much hanging to work as a stand-alone novel (as many series manage to do).

2/5. Finished Wednesday 1 August, 2012.

(Originally published on Goodreads.)

Scholar positions for Erasmus Mundus MSc in Dependable Software Systems

We have some short-term Scholarship positions available in dependable systems. The University of St Andrews and NUI Maynooth are running the first edition of the new Erasmus Mundus MSc in Dependable Software Systems (DESEM), starting in September, and there are EMMC Scholar positions available on the programme. These scholars will engage in teaching and help with the evaluation of student project work, among other duties. Positions have to be held for a minimum of 2 weeks. The Scholarship value is €2,400 for a two-week stay, up to a maximum of €14,400 for a three-month stay. We have the equivalent of 17 two-week scholarships available for 2012/13. Applicants have to be associated with a third-country (non-European) HEI. Details are available from: http://erasmusmundus.nuim.ie/courses/desem/node/53