What should the university of the 21st century look like? What are we preparing our students for, and how? And how should we decide what the appropriate vision for modern universities is?

There’s a tendency to think of universities as static organisations whose practices and traditions remain fixed — and looking at some ceremonial events it’s easy to see where that idea might come from. Looking in from the outside, one might imagine that some of the teaching of “old material” (such as my teaching various kinds of sorting algorithms) is just a lack of agility in responding to the real world: who needs to know? Why not just focus on the modern stuff? This view is largely mistaken. The point of teaching a core of material is to show how subjects evolve, and to get students used to thinking in terms of the core concepts rather than in terms of ephemera that’ll soon pass on. Modern stuff doesn’t stay modern, and that is a feature of the study of history or geography as much as of computer science or physics.

Universities by their nature have a ringside view of the changes that will affect the world in the future, and also contain more than their fair share of maverick “young guns” who want to mix things up. It’s natural for academics to be thinking about what future work and social spaces will be like, and to reflect on how best to tweak the student experience with these changes in mind.

What brought this to my mind is Prof Colm Kenny’s analysis piece in the Irish Independent this weekend, a response to the recently-published Hunt report (“National strategy for higher education: draft report of the strategy group,” 9 August 2010) that tries to set out a vision for Irish 3rd-level (undergraduate degree) education. Although specific to Ireland, the report raises questions about other countries’ relationships with their universities too, and so is worth considering broadly.

A casual read of even the executive summary reveals a managerial tone. There’s a lot of talk of productivity, broadening access, and governance that ensures institutions meet performance targets aligned with national priorities. There’s very little on encouraging free inquiry, fostering creativity, or equipping students for the 21st century. The report — and similar noises that have emerged from other quarters, in Ireland and the UK — feels very … well … 20th century.

Life and higher education used to be very simple: you learned your trade, either as an apprentice or at university; you spent forty years practising it, using essentially the techniques you’d been taught plus minor modifications; you retired, had a few years off, and then died. But that’s past life: future, and indeed current, life isn’t going to be like that. For a start, it’s not clear when, if ever, we’ll actually get to retire. Most people won’t stay in the same job for their entire careers: indeed, a large percentage of the jobs one could do at the start of a career won’t even exist forty years later, just as many of the jobs people will be doing then haven’t even been thought of yet. When I did my PhD twenty years ago there was no such thing as a web designer, and music videos were huge projects that no-one without access to a fully-equipped multi-million-pound studio could take on. And many people change career because they want to rather than through the influence of outside forces, such as leaving healthcare to take up professional photography.

What implications does this have for higher education?
Kenny rightly points out that, while distance education and on-line courses are important, they’re examples of mechanism, not of vision. What they have in common, and what drives their attractiveness, is that they lower the barriers to participation in learning. They do this in several ways. They allow people to take programmes without re-locating, and potentially concurrently with their existing lives and jobs. They also potentially allow people to “dip in” to programmes rather than take them to their full extent, to mash up elements from different areas, institutions and providers, and to democratise the generation and consumption of learning materials.

Some students, on first coming to university, are culture-shocked by the sudden freedom they encounter. It can take time to work out that universities aren’t schools, and academics aren’t teachers. In fact they’re dual concepts: a school is an institution of teaching, where knowledge is pushed at students in a structured manner; a university is an institution of learning, which exists to help students find and interact with knowledge. The latter requires skills that aren’t all that important in the former.

The new world of education will require a further set of skills. Lifelong learning is now a reality, as people re-train as a matter of course. Even for those who stay in the same career, the elements, techniques and technologies applied will change constantly. It’s this fact of constant change and constant learning that’s core to the skills people will need in the future.

(Ten years or so ago, an eminent though still only middle-aged academic came up to me in the senior common room of the university I taught in at the time and asked me when this “internet thing” was going to finish, so that we could start to understand what it had meant. I tried to explain that the past ten years were only the start of the prologue to what the internet would do to the world, but I remember his acute discomfort at the idea that things would never settle down.)

How does one prepare someone for lifelong learning? Many of the skills needed are already being acquired by people who engage intensively with the web. Anyone who reads a wide variety of material needs to be able to sift the wheat from the chaff, to recognise hidden agendas, and to be conscious of the context in which material is being presented. Similarly, people wanting to learn a new field need to be able to determine what they need to learn, place it in a sensible order, locate it, and have the self-discipline to stick with the necessary background.

It’s probably true, though, that most people can’t be successful autodidacts. There’s a tendency to skip the hard parts, or the background material that (however essential) might be perceived as old and unnecessary. Universities can provide the road maps to avoid this: the curricula for programmes, the skills training, the support, examination, quality assurance and access to the world’s foremost experts in the fields, while being only one possible provider of the material being explored. In other words, they can separate the learning material from the learning process — two aspects that are currently conflated.

I disagree with Colm Kenny on one point, though. He believes that only government can provide the necessary vision for the future of higher education. I don’t think that’s necessary at all.
A system of autonomous universities can set its own visions of the future, and can design processes, execute them, assess them, measure their success and refine their offerings — all without centralised direction. I would actually go further, and argue that the time spent planning a centralised national strategy would be better spent decentralising control of the university system and fostering a more experimental approach to learning. That’s what the world’s like now, and academia’s no different.