In the past I’ve worked on a range of areas. Some are still of interest; some not so much; and some are now much more in the mainstream than they were back in the day.
Programming languages
OK, not so much a past interest as a current one in which I don’t have an active research programme. In another sense, though, everything I do is about programming languages, or at least about ways to capture computational processes in a way that’s simple and accessible to people. At their broadest, programming languages are about allowing people to express their computational ideas easily, and that’s something that underpins all computer science research.
I’ve developed a number of software systems in the past, including a system for prototyping programming language interpreters (Vanilla) and a modern Forth compiler that we used for experiments in sensor network programming (Attila). Both are still available, and may even work on modern systems: your mileage may vary.
High-performance computing
My PhD concerned programming language approaches to scalable parallel computing on Transputers, a 1980s processor family that combined a processor core with four communication links for talking to neighbouring Transputers. They were intended to be assembled into 2D arrays, computing in parallel and exchanging data with their neighbours. A great idea, but hard to program effectively. We explored data structures that expanded naturally onto neighbouring machines as they grew, while being processed in parallel. It was sort of a generalisation of what later became map-reduce.
(Despite being very innovative at the time, the Transputer proved unable to keep up with the advancing speeds of more traditional processor architectures – not least because its interconnect also had to speed up in step to keep the balance right. I can still remember when we acquired an IBM RS/6000 workstation and realised the extent to which it blew the performance of our 32-Transputer array out of the water, without needing any special programming techniques.)
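The flavour of the data structures we explored can be sketched in a few lines of Python, simulated within a single process; the class and function names here are mine for illustration, not taken from the original work. A structure spills onto neighbouring “nodes” as it grows, and a computation is expressed as a per-node map followed by a reduction of the partial results.

```python
from concurrent.futures import ThreadPoolExecutor
from functools import reduce

class Node:
    """One simulated processor with bounded local storage and a neighbour."""
    def __init__(self, capacity, neighbour=None):
        self.capacity = capacity
        self.neighbour = neighbour
        self.items = []

    def append(self, x):
        # Grow locally while there's room, then spill onto the neighbour
        if len(self.items) < self.capacity:
            self.items.append(x)
        elif self.neighbour is not None:
            self.neighbour.append(x)
        else:
            raise MemoryError("structure has outgrown the array")

def distributed_map_reduce(nodes, f, op, initial):
    """Map f over each node's partition in parallel, then combine the partials."""
    with ThreadPoolExecutor(max_workers=len(nodes)) as pool:
        partials = pool.map(lambda n: reduce(op, map(f, n.items), initial), nodes)
    return reduce(op, partials, initial)

# Build a small chain of "processors" and scatter data across them
nodes = [Node(capacity=4) for _ in range(4)]
for a, b in zip(nodes, nodes[1:]):
    a.neighbour = b

ds = nodes[0]
for x in range(12):
    ds.append(x)    # overflows naturally onto neighbouring nodes

# Sum of squares of the scattered data, computed node-by-node
print(distributed_map_reduce(nodes, lambda x: x * x, lambda a, b: a + b, 0))
```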
I did a postdoc that similarly looked at programming language solutions for high-performance computing, re-visiting the idea of parallel computing based around data types as part of a project called TallShiP. We managed to get these structures (“shared abstract data types” or SADTs) to perform efficiently on workstations and supercomputers by weakening the memory consistency model in type-specific ways.
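As a rough illustration of the kind of type-specific weakening involved (a toy example of my own, not the actual SADT implementation): a shared counter whose increments commute, so each thread can buffer updates locally and merge them lazily, letting readers see a slightly stale total in exchange for far less synchronisation.

```python
import threading

class WeakCounter:
    """A shared counter with type-specific weak consistency.

    Increments commute, so each thread buffers its updates locally and only
    merges them into the shared total every few operations. Readers may see
    a slightly stale value; in exchange, writers rarely contend for the lock.
    """

    def __init__(self, flush_every=100):
        self._total = 0
        self._lock = threading.Lock()
        self._local = threading.local()
        self._flush_every = flush_every

    def increment(self, n=1):
        pending = getattr(self._local, "pending", 0) + n
        if pending >= self._flush_every:
            with self._lock:
                self._total += pending
            pending = 0
        self._local.pending = pending

    def flush(self):
        # Merge this thread's outstanding updates into the shared total.
        with self._lock:
            self._total += getattr(self._local, "pending", 0)
        self._local.pending = 0

    def read(self):
        # May lag the true count: other threads' buffered increments are
        # invisible until they flush.
        with self._lock:
            return self._total
```

A strongly consistent counter would take the lock on every increment; the point of the SADT work was to make this kind of trade-off deliberately, per data type, rather than paying for full consistency everywhere.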
Autonomic computing
Making large computer systems more manageable involves improving the ways in which they model and control their own behaviour in response to changing conditions. I worked a lot on whole-system models of adaptive behaviour, which eventually narrowed into an interest in managing sensor systems.
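One common way to frame this is as a control loop running over a model the system keeps of itself: monitor, analyse, plan, execute. The sketch below is a deliberately minimal Python version of that loop; the names, the load signal, and the “worker pool” adaptation are all invented for illustration.

```python
import random

def monitor():
    """Observe the managed system (here, a fake load reading)."""
    return {"load": random.uniform(0.0, 1.0)}

def analyse(observation, model):
    """Compare the observation against the system's model of itself."""
    return observation["load"] > model["load_threshold"]

def plan(overloaded, model):
    """Choose an adaptation; here, just resize a notional worker pool."""
    return model["workers"] + 1 if overloaded else max(1, model["workers"] - 1)

def execute(workers, model):
    """Apply the adaptation and update the self-model."""
    model["workers"] = workers

model = {"load_threshold": 0.7, "workers": 2}
for _ in range(5):    # in a real system this loop would run indefinitely
    obs = monitor()
    execute(plan(analyse(obs, model), model), model)
    print(obs, model)
```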
Pervasive computing
Another topic that’s still of interest, although perhaps not in a mainstream way. I used to work on assisted living and other applications of pervasive systems; now I focus on situation recognition, which — while it’s a technique with deep applications in pervasive systems — is somewhat broader and can be applied to all aspects of data science and sensing. Essentially we don’t care where the data comes from when interpreting it, so we can look at the deeper algorithmic issues rather than just the applications.
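To make that source-independence concrete, here is an overly simple rule-based sketch of my own (real situation recognition uses richer, often probabilistic, models): the recogniser works over abstract observations and never knows, or cares, what produced them.

```python
from typing import Callable, Dict, Iterable, List

Observation = Dict[str, float]
Predicate = Callable[[Observation], bool]

def recognise(stream: Iterable[Observation],
              situations: Dict[str, Predicate]) -> Iterable[List[str]]:
    """Label each observation with every situation whose predicate holds.

    The recogniser sees only abstract feature dictionaries; whether they came
    from wearables, a smart building, or a synthetic trace is irrelevant.
    """
    for obs in stream:
        yield [name for name, holds in situations.items() if holds(obs)]

# Purely illustrative situations over made-up features
situations = {
    "occupied": lambda o: o.get("motion", 0.0) > 0.5,
    "night":    lambda o: o.get("light", 1.0) < 0.2,
    "asleep":   lambda o: o.get("motion", 0.0) < 0.1 and o.get("light", 1.0) < 0.2,
}

stream = [{"motion": 0.8, "light": 0.9},
          {"motion": 0.05, "light": 0.1}]

for labels in recognise(stream, situations):
    print(labels)
```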
Distributed and web systems
I got into the web early, including putting up one of the first really large-scale web servers backed by a database and a content manager (before “content manager” was a thing). Many of the techniques we looked at back then (in the mid-1990s) have since morphed into the mainstream, in radically different (and mainly better) forms from the ways we envisioned them.