The (new) idea of a (21st century) university

What should the university of the 21st century look like? What are we preparing our students for, and how? And how should we decide what is the appropriate vision for modern universities?

There’s a tendency to think of universities as static organisations whose actions and traditions remain fixed — and looking at some ceremonial events it’s easy to see where that idea might come from. Looking in from the outside, one might imagine that some of the teaching of “old material” (such as my teaching various kinds of sorting algorithms) is just a lack of agility in responding to the real world: who needs to know? Why not just focus on the modern stuff? This view is largely mistaken. The point of teaching a core of material is to show how subjects evolve, and to get students used to thinking in terms of the core concepts rather than in terms of ephemera that’ll soon pass on. Modern stuff doesn’t stay modern, and that is a feature of the study of history or geography as much as of computer science or physics.

Universities by their nature have a ringside view of the changes that will affect the world in the future, and also contain more than their fair share of maverick “young guns” who want to mix things up. It’s natural for academics to be thinking about what future work and social spaces will be like, and to reflect on how best to tweak the student experience with these changes in mind.

What brought this to my mind is Prof Colm Kenny’s analysis piece in the Irish Independent this weekend, a response to the recently-published Hunt report (“National strategy for higher education: draft report of the strategy group.” 9 August 2010) that tries to set out a vision for Irish 3rd-level (undergraduate degree) education. Although specific to Ireland, the report raises questions for other countries’ relationships with their universities too, and so is worth considering broadly. A casual read of even the executive summary reveals a managerial tone.
There’s a lot of talk of productivity, broadening access, and governance that ensures that institutions meet performance targets aligned with national priorities. There’s very little on encouraging free inquiry, fostering creativity, or equipping students for the 21st century. The report — and similar noises that have emerged from other quarters, in Ireland and the UK — feels very … well … 20th century.

Life and higher education used to be very easy: you learned your trade, either as an apprentice or at university; you spent forty years practising it, using essentially the techniques you’d been taught plus minor modifications; you retired, had a few years off, and then died. But that’s the past: future, and indeed current, working lives aren’t going to be like that. For a start, it’s not clear when, if ever, we’ll actually get to retire. Most people won’t stay in the same job for their entire careers: indeed, a large percentage of jobs that one could do at the start of a career won’t even exist forty years later, just as many of the jobs people will move to haven’t been thought of yet. When I did my PhD 20 years ago there was no such thing as a web designer, and music videos were huge projects that no-one without access to a fully-equipped multi-million-pound studio could take on. Many people change career because they want to rather than through the influence of outside forces, such as leaving healthcare to take up professional photography.

What implications does this have for higher education? Kenny rightly points out that, while distance education and on-line courses are important, they’re examples of mechanism, not of vision. What they have in common, and what drives their attractiveness, is that they lower the barriers to participation in learning. They actually do this in several ways. They allow people to take programmes without re-locating, potentially concurrently with their existing lives and jobs.
They also potentially allow people to “dip in” to programmes rather than take them to their full extent, to mash up elements from different areas, institutions and providers, and to democratise the generation and consumption of learning materials.

Some students, on first coming to university, are culture-shocked by the sudden freedom they encounter. It can take time to work out that universities aren’t schools, and academics aren’t teachers. In fact they’re dual concepts: a school is an institution of teaching, where knowledge is pushed at students in a structured manner; a university is an institution of learning, which exists to help students to find and interact with knowledge. The latter requires one to learn skills that aren’t all that important in the former.

The new world of education will require a further set of skills. Lifelong learning is now a reality as people re-train as a matter of course. Even if they stay in the same career, the elements, techniques and technologies they apply will change constantly. It’s this fact of constant change and constant learning that’s core to the skills people will need in the future.

(Ten years or so ago, an eminent though still only middle-aged academic came up to me in the senior common room of the university I taught in at the time and asked me when this “internet thing” was going to finish, so that we could start to understand what it had meant. I tried to explain that the past ten years were only the start of the prologue to what the internet would do to the world, but I remember his acute discomfort at the idea that things would never settle down.)

How does one prepare someone for lifelong learning? Actually, many of the skills needed are already being acquired by people who engage with the web intensively. Anyone who reads a wide variety of material needs to be able to sift the wheat from the chaff, to recognise hidden agendas, and to be conscious of the context in which material is being presented.
Similarly, people wanting to learn a new field need to be able to determine what they need to learn, to place it in a sensible order, to locate it, and to have the self-discipline to stick with the necessary background.

It’s probably true, though, that most people can’t be successful autodidacts. There’s a tendency to skip the hard parts, or the background material that (however essential) might be perceived as old and unnecessary. Universities can provide the road maps to avoid this: the curricula for programmes, the skills training, the support, examination, quality assurance and access to the world’s foremost experts in the fields, while being only one possible provider of the material being explored. In other words, they can separate the learning material from the learning process — two aspects that are currently conflated.

I disagree with Colm Kenny on one point. He believes that only government can provide the necessary vision for the future of higher education. I don’t think that’s necessary at all. A system of autonomous universities can set their own visions of the future, and can design processes, execute them, assess them, measure their success and refine their offerings — all without centralised direction. I would actually go further, and argue that the time spent planning a centralised national strategy would be better spent decentralising control of the university system and fostering a more experimental approach to learning. That’s what the world’s like now, and academia’s no different.

Why I don’t sign NDAs

Non-disclosure agreements (NDAs) are something I try not to sign, for various reasons. For one thing, they give a false sense of security; for another, they interfere with me doing my job.

An increasing amount of academic research is supported in one way or another by companies, either directly or through co-funding agreements. This trend is only likely to increase as State funding becomes less common. As well as this, academics are sometimes approached to do consultancy or development work for companies on a more close-to-market basis. This can be great for all concerned: the companies get access to (hopefully) unbiased expert advice that can perhaps take a longer-term view, while the academics get real-world problems to work on and a reality check on some of their ideas.

Despite all this, there are still some problems. Chief among them, I’ve found, are non-disclosure agreements, whereby one or both sides agree not to disclose proprietary information to third parties. Some NDAs are one-way, so (for example) the university agrees not to disclose company information; many are symmetrical and protect both sides, which is obviously more desirable. (Interestingly, it’s often university-led NDAs that are asymmetric, implying that the companies have no intellectual input…) Although they sound like they’re used for competitive reasons — and sometimes they are — it’s more likely that they’re used to protect information for use in later patent applications, where discussion with a third party might be regarded as “publication” and so endanger the patent. Anyone who works in commercial research has to be sensitive to this, and since I used to run a company myself I’m conscious of protecting others’ intellectual property.

So why does this cause a problem? Mainly because of the bluntness of the instrument by which the protection happens. In my job, I have a lot of discussions that involve half-formed ideas.
Many of these seem pretty much to condense out of the ether into several people’s heads simultaneously: it’s in no way uncommon to have two or three people discuss the same “novel” idea within days of each other. I suppose it’s just that there’s nothing really new under the sun, and people who share a common technical milieu will often see the same problems and arrive at similar solutions.

Often the people involved are students: either undergraduates looking for projects, or PhD students with ideas for theses or papers. These people are my “core constituency”, in the sense that my main job is to teach and supervise them in a research-led way.

You can probably see where this is going. Suppose a student comes to me with an idea, and that this idea is related in some way to an idea presented by a company with whom I’ve signed an NDA. What do I do? Refuse to discuss the matter with the student, even though they’re fired up about it? Try to re-focus them onto another area, even though it might be a great idea, because I can’t discuss it? Send them to someone else, even though I might be the right person to supervise the work?

What I can’t do is get involved, because however hard I try, I’ll never be able to prove that information covered by the NDA had no effect on what I said or did — or didn’t say or do, for that matter. That leaves both me and the university open to legal action, especially if by some chance the student’s work got a high profile and damaged the company, for example by developing an open-source solution to something they were working on.

This is something of a dilemma. I like working with companies; I love working with students; and I don’t like the feeling that my freedom to discuss technology and ideas is being legally constrained. I therefore minimise my exposure to NDAs and confidentiality agreements. It’s sometimes unavoidable, for example as part of EU project consortium agreements.
But as a general rule I don’t think NDAs sit well with academics, and there’s too much danger of damaging the general openness of research within a university: too much of a sacrifice just to get a single funded project. I’ll happily agree to keep information confidential, but the risks of signing a blunt and broad agreement to that effect are just too great.

Modern postcodes

Ireland doesn’t have a postcode system — a state of affairs that causes endless problems with badly-designed web sites that expect them, as well as with courier deliveries. But of course in the internet age there’s no reason to wait for the State to act…

It always surprises people that Ireland doesn’t have postcodes, when pretty much everywhere else does. Dublin has postal districts — I used to work in Dublin 4, which covers about 50 square kilometres and so doesn’t really function as an aid to delivery or navigation — but there’s nothing similar in the rest of the country. Add to this the fact that many country villages don’t have street names either, and you start to understand why getting a package delivered by a courier usually involves a phone call to talk them along the route.

In actual fact the problem is less severe than you might expect, because the postal system works rather well. This is because the postmen and women get very good at learning where each person lives by name, so a name, village and county will normally get through (outside a major town). The villages are small and people tend not to move too frequently, so human-based routing works well for frequent contacts.

For couriers and infrequent service providers, though, it’s a different story. I usually have to take phone calls from the people who deliver heating oil, for example, because a twice-a-year delivery isn’t enough for them to remember where we live.

One might think that this situation would be easily remedied: choose a postcode system and implement it. But this leads to the next thing that people often find surprising: many people in the countryside, despite the obvious benefits in terms of convenience and efficiency, are implacably opposed to postcodes. The fear is that such a system would do away with the quaint townland names: each village will often have several smaller townlands surrounding it, which get mentioned in the postal addresses.
These often have a lot of history attached to them, and in some parts of the country are written in Irish even when nothing else is. Every time the idea of national postcodes is raised in the national press, a whole host of letters is published opposing it and predicting the death of the rural Irish lifestyle, and it seems that this has been enough to stymie the implementation of the system on a national basis. There is a disproportionate number of Irish parliamentary seats in rural areas, so political parties are loath to do anything that alienates the rural vote.

In past times, that would have been it: the State doesn’t act, end of story. But we’re now in the internet age, and one doesn’t have to wait for the State. I just came across Loc8, a company that has established an all-Ireland postcode system. They’ve devised a system of short codes that can be used to locate houses to within about 6m. So — to take an example directly from the web site — the Burlington Hotel in Dublin has Loc8 code NN5-39-YD7.

The codes are structured according to the expected hierarchy of a zone, a locality and then a specific location, with the latter (as far as I can tell) being sparse and so not predictable from neighbouring codes: you can’t derive the code of one property by knowing one nearby. (I might be wrong about that, though.)

So far so good. If the service is useful, people will use it: and apparently they are. You can enter a Loc8 code into some GPS systems already, which is a major step forward. The courier companies — and even, apparently, the national postal service, An Post — will take Loc8 codes too. There’s also a plug-in for Firefox that will look up a Loc8 code from the context menu: try it on the code above. It’s a bit clunky — why does it need to pop up a confirmation dialogue? — and integration with something like Hyperwords would make it even more useable, but it’s a start.
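One nice property of a structured code like this is that it’s trivial for software to pick apart. A minimal sketch in Python of splitting a Loc8-style code into its zone, locality and location parts — note that the exact format rules here (three hyphen-separated alphanumeric groups, as in the example above) are my guess from the published example, not Loc8’s actual specification:

```python
import re

# Assumed shape of a Loc8-style code: three hyphen-separated
# alphanumeric groups, e.g. "NN5-39-YD7" (zone, locality, location).
# This pattern is inferred from the example, not from Loc8's spec.
LOC8_PATTERN = re.compile(r'^([A-Z0-9]{2,3})-([A-Z0-9]{2})-([A-Z0-9]{3})$')

def parse_loc8(code):
    """Split a Loc8-style code into (zone, locality, location),
    or return None if it doesn't match the assumed format."""
    m = LOC8_PATTERN.match(code.strip().upper())
    return m.groups() if m else None

print(parse_loc8("NN5-39-YD7"))   # → ('NN5', '39', 'YD7')
```

That’s all a navigation device or web form would need in order to validate a code before looking it up.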
What I like about Loc8 is that it’s free and (fairly) open: you can look a code up on their web site and it’ll be displayed using Google Maps. The integration with commercial GPS systems is a great move: I don’t know if it’s integrated with the Google navigation on Android, but if it isn’t it’d be easy enough for Loc8 to do — or indeed for anyone else, and that’s the great bonus over (for example) the closed pay-world of UK postcode geolocation.

The real story here is that it’s possible to mash up a national-scale system using extremely simple web tools, make it available over the web — and then, if there’s value enough, cut commercial deals with other providers to exploit the added value. That sort of national reach is a real novelty, and something we’re bound to see more of: I’d like something similar for phone numbers, Skype names and the like, all mashed up and intermixed.

How to publish an unpopular book?

I’ve been thinking about writing a book. It won’t be a popular success — trust me — but that raises the question of how I should publish it.

I’ve never written a book, although I’ve written a lot of papers and edited a couple of conference proceedings and other collections: writing is one of the things academics do for a living. But I’ve been thinking for a while about writing a book on how to build a programming language. This isn’t something that JK Rowling (my neighbour in Morningside) needs to worry will eat into her royalties, obviously, but it’s something that’d be of interest to a certain group of people.

I’ve been teaching programming, compilers, language principles and the like for several years, and have sometimes shown classes how to build interpreters and compilers from the ground up. I’d like to share this with a wider audience, and show how the tools and techniques of languages can be used to make a whole class of problems easier and more fun to solve. There’s something very liberating and exciting (to me, anyway) about understanding the tools of programming in their most raw form, and about being able to change them to suit particular circumstances. It also brings a new perspective to things one might encounter in particular languages, one that can encourage experimentation and invention and the re-purposing of tools to new domains.

It’s not the writing that’s the problem: the problem is the publishing. Clearly, no matter how excited I get about these things, it’s going to be a pretty minority interest: not even most computer scientists write compilers. So it’s hardly going to be a best-seller. But that then raises an interesting question. Traditional book publishing is about getting visibility and distribution for your work, with a view to maximising circulation, impact and royalties.
If there’s no money to be had, and the target audience is by definition computer- and internet-aware, are there better ways of getting the same (or better) visibility, distribution and impact, and reaching the audience more effectively than one can by traditional means? What, in the 21st century, is the most effective way to publish an unpopular book?

The internet answers this question in one obvious way: put a file on a web server. But that still leaves the visibility and impact parts to be solved — and there are half-a-dozen ways to make the text available on a web server, too. We can split the problem between these two parts, though: how to write and represent the text, and how to let people know it’s there.
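To give a flavour of the sort of material the book would cover: even a tiny language is compact enough to build from the ground up in a page of code. The following Python fragment — purely illustrative, and my own sketch rather than an excerpt from any planned text — is a recursive-descent interpreter for arithmetic expressions, the classic first step towards a full compiler:

```python
import re

# A tiny interpreter built "from the ground up": a tokeniser plus a
# recursive-descent evaluator for the grammar
#   expr   := term (('+'|'-') term)*
#   term   := factor (('*'|'/') factor)*
#   factor := NUMBER | '(' expr ')'

def tokenise(src):
    """Break the source string into numbers, operators and parentheses."""
    return re.findall(r'\d+\.?\d*|[-+*/()]', src)

def evaluate(src):
    tokens = tokenise(src)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def take():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def factor():
        if peek() == '(':
            take()               # consume '('
            value = expr()
            take()               # consume ')'
            return value
        return float(take())

    def term():
        value = factor()
        while peek() in ('*', '/'):
            if take() == '*':
                value *= factor()
            else:
                value /= factor()
        return value

    def expr():
        value = term()
        while peek() in ('+', '-'):
            if take() == '+':
                value += term()
            else:
                value -= term()
        return value

    return expr()

print(evaluate("2 * (3 + 4) - 5"))   # → 9.0
```

Each grammar rule becomes a function, and operator precedence falls naturally out of the call structure — which is exactly the kind of insight that’s hard to get from using a language and easy to get from building one.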

Distribution and format

Web site. A lot of books have associated web sites, for errata and additional material, sometimes freely available and sometimes behind a paywall. Clearly one could put a whole book up on a site, as well as any associated software, with a suitable licence to allow for downloading and whatever copying seems permissible. I use this approach for this site: all the content is under a Creative Commons licence that allows non-commercial sharing with attribution, and allows re-mixing as long as the derived work is also shared under the same or similar terms. Most web sites require that you be on-line to read them, although that’s not necessarily the case for systems like TiddlyWiki that download in one file. And one can get all the benefits of non-linear browsing and re-purposing by using proper hypertext as opposed to PDF.

E-book. E-books have a lot to recommend them, especially their portability and download-ability. PDF is a popular format, but EPUB is probably a better choice: you get reflowing, hyperlinking and portability to small devices with no effort, in a way that’s hard for a PDF. Of course these formats aren’t mutually exclusive, and one could easily come up with a writing system that can generate PDF, EPUB and indeed HTML from the same sources.

Blog. The above are still fairly traditional approaches, varying in terms of delivery medium. What about blogging a book, allowing it to evolve over time? I can think of one immediate disadvantage, which would be the danger of a lack of flow and a disjointedness that comes from not wrapping-up a work as a single entity. But of course there are some significant advantages: there’s no reason one couldn’t write large chunks of text and then blog them over time, and refine the text using comments before generating an e-book or re-linking into a more conventional web site.

Wiki/group blog. If we accept the no money/lots of copying philosophy, then perhaps there’s no reason to be precious about authorship either.
A group blog or a wiki that encourages participation and updating might make sense: a sort of Wikipedia for programming languages, in which chapters can be commented on and edited by a community (if one forms). This might generate a work that’s more complete than one I could write myself, if it attracted contributions from the appropriate, knowledgeable people. It could also degenerate into a farce without clear editing guidelines and proper curation: essentially the problems of a normal blog, writ large.

Wikis, and especially Wikipedia, often get trashed by academics. This isn’t an opinion I completely share. At their best, wikis harness the best people with an interest in a subject. Their content needs protection from the stupid, vain, deluded, vicious and malicious, but none of that outweighs the benefits of having potentially every expert in the world contributing. A traditional encyclopaedia is not necessarily more reliable — look up “anthropology” in an early Encyclopaedia Britannica to see how fallible expert opinion is to modern eyes — and with care there’s no reason why a wiki need be less exacting than a more traditional medium. (Encyclopaediae aren’t a good way to learn a subject, for which you really need a structured and knowledgeable guide — but that’s another story.)
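The e-book option is less mysterious than it might appear: an EPUB is essentially a zip container with a declared mimetype, a manifest, and XHTML content, so generating one from the same sources as a web site needs nothing exotic. A minimal sketch in Python — the file names, title and single-chapter structure are all illustrative, and a real book would also want a proper table of contents and richer metadata:

```python
import zipfile

def make_epub(path, title, xhtml_body):
    """Write a minimal single-chapter EPUB 2 file. Illustrative only:
    a production EPUB would also include an NCX/nav table of contents."""
    chapter = ('<?xml version="1.0" encoding="utf-8"?>\n'
               '<html xmlns="http://www.w3.org/1999/xhtml">\n'
               f'<head><title>{title}</title></head>\n'
               f'<body>{xhtml_body}</body>\n</html>')
    container = ('<?xml version="1.0"?>\n'
                 '<container version="1.0" '
                 'xmlns="urn:oasis:names:tc:opendocument:xmlns:container">\n'
                 '  <rootfiles>\n'
                 '    <rootfile full-path="content.opf" '
                 'media-type="application/oebps-package+xml"/>\n'
                 '  </rootfiles>\n</container>')
    opf = ('<?xml version="1.0"?>\n'
           '<package xmlns="http://www.idpf.org/2007/opf" version="2.0" '
           'unique-identifier="uid">\n'
           '  <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">\n'
           f'    <dc:title>{title}</dc:title>\n'
           '    <dc:language>en</dc:language>\n'
           '    <dc:identifier id="uid">urn:example:book</dc:identifier>\n'
           '  </metadata>\n'
           '  <manifest>\n'
           '    <item id="ch1" href="chapter1.xhtml" '
           'media-type="application/xhtml+xml"/>\n'
           '  </manifest>\n'
           '  <spine>\n    <itemref idref="ch1"/>\n  </spine>\n'
           '</package>')
    with zipfile.ZipFile(path, 'w') as z:
        # The mimetype entry must come first and be stored uncompressed.
        z.writestr('mimetype', 'application/epub+zip', zipfile.ZIP_STORED)
        z.writestr('META-INF/container.xml', container, zipfile.ZIP_DEFLATED)
        z.writestr('content.opf', opf, zipfile.ZIP_DEFLATED)
        z.writestr('chapter1.xhtml', chapter, zipfile.ZIP_DEFLATED)

make_epub('book.epub', 'Building a Language', '<h1>Chapter 1</h1><p>Hello.</p>')
```

The point isn’t the details, but that the whole pipeline from source text to distributable e-book can live in a few lines of script, making the “all formats from one source” approach entirely practical for a lone author.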


Visibility

Visibility subsumes impact, in a sense: if something is very visible and easily found, then its popularity is a direct measure of its significance in its community of interest. And if something is highly visible and still unpopular: well, that’s just something to live with. We can split visibility between pull and push: people finding what they’re looking for versus being told that there’s something they might be interested in.

SEO. Search engine optimisation has evolved from being a valuable skill, through a commodity, to an arms race in which sites try to get search engines to notice them and search engines try not to be manipulated away from whatever they regard as their core metric for importance (PageRank, in the case of Google). Most content managers have SEO packages built-in or available that can help.

Blogs. There are a number of great programming language blogs out there, through which one could solicit help and readers. If the internet does anything, it’s demonstrate that any small community of interest is globally large — or at least large enough to keep most people happy. Even language hackers.

Software. For a book about writing languages, I suspect the most effective advertisement is the software that one can develop with the techniques described, or the tools one could use to follow them. The sincerest recommendation is for the software to be used, found useful, and improved by someone else, who’s then willing to share their experiences back with the net.

Having written all of the above, I’m still not sure where it leaves me. I’d welcome any comments or descriptions of experiences before I start putting hand to keyboard in the wrong way. Who’d have thought it’d be so complicated? — although I must say that having these sorts of choices is in itself a major draw, and a great indication of how the web’s changing the world.

Call for papers: Programming methods for mobile and pervasive systems

We are looking for papers on programming models, methods and tools for pervasive and mobile systems, for a workshop at the PERVASIVE conference in San Francisco.

2nd International Workshop on Programming Methods for Mobile and Pervasive Systems (PMMPS)
San Francisco, California, USA, June 12, 2011. Co-located with PERVASIVE 2011.


Pervasive mobile computing is here, but how these devices and services should be programmed is still something of a mystery. Programming mobile and pervasive applications is more than building client-server or peer-to-peer systems with mobility, and it is more than providing usable interfaces for mobile devices that may interact with the surrounding context. It includes aspects such as disconnected and low-attention working, spontaneous collaboration, evolving and uncertain security regimes, and integration into services and workflows hosted on the internet.

In the past, efforts have focused on the form of human-device interfaces that can be built using mobile and distributed computing tools, or on human-computer interface design based on, for example, the limited screen resolution and real estate provided by a smartphone. Much of the challenge in building pervasive systems is in reconciling users’ expectations of their interactions with the system with the model of the physical and virtual environment with which they interact in the context of the pervasive application.

The aim of this workshop is to bring together researchers in programming languages, software architecture and design, and pervasive systems to present and discuss results and approaches to the development of mobile and pervasive systems. The goal is to begin the process of developing the software design and development tools necessary for the next generation of services in dynamic environments, including mobile and pervasive computing, wireless sensor networks, and adaptive devices.


Potential workshop participants are asked to submit a paper on topics relevant to programming models for mobile and pervasive systems. We are primarily seeking short position papers (2–4 pages), although full papers that have not been published and are not under consideration elsewhere will also be considered (to a maximum of 10 pages). Position papers that lay out some of the challenges to programming mobile and pervasive systems, including past failures, are welcome. Papers longer than 10 pages may be automatically rejected by the chairs or the programme committee.

From the submissions, the programme committee will strive to balance participation between academia and industry and across topics. Selected papers will appear on the workshop web site; PMMPS has no formal published proceedings. Authors of selected papers will be invited to submit extended versions for publication in an appropriate journal (under negotiation). Submissions will be accepted through the workshop web site.

Important dates

  • Submission: 4 February 2011
  • Notifications: 11 March 2011
  • Camera-ready: 2 May 2011
  • Workshop date: 12 June 2011


  • Dominic Duggan, Stevens Institute of Technology NJ
  • Simon Dobson, University of St Andrews UK