There's something very 21st century about the Wikileaks/Julian Assange affair. And not in a good way.
The Wikileaks saga dominated the airwaves for the last months of 2010: the revelation of a huge mass of diplomatic communications between the US and the world, drip-fed to newspapers and searched incessantly for data to support each and every pet theory in the world.
The story undoubtedly has enormous colour, and superficially is a story for our time. The data is released on the web, and the authorities strain to close the offending site down. Pressure is put on service providers to withdraw support. This only leads to extensive mirroring of the site, frustrating any attempt to close off access to the data, while the service providers are the subject of distributed denial-of-service attacks by outraged groups of hacktivists. The site's founder has on-again/off-again troubles with the law, being threatened with everything from espionage to treason in the US (despite not being in the US, and despite not being a US citizen, which would seem to make treason rather a long shot), but is arrested and bailed in the UK on foot of a warrant from Sweden for a seemingly unrelated matter. It sounds like a soap opera, and I'm rather afraid that that's all it really is.
Let's start by recognising the legitimate tension between a desire for transparent government and a desire for anonymity and even secrecy in government communications. On the one hand it's clearly in the public interest to have the whistle blown on unsavoury or illegal State activities, and a blanket claim that national security trumps this interest is absurd. On the other hand, some of the Wikileaks data identifies individuals who may be endangered by having their names publicised. Anonymity is often essential for people to reveal information; similarly, it's in the public interest to have diplomats be able and willing to express their opinions openly without fear of public censure or ridicule, since the alternative would be to distort the open exchange of ideas. This is the principle that underlies scientific and other meetings held under what in the UK are referred to as Chatham House rules: any comments may be used in any context, but with no individual or institutional attribution.
The Wikileaks disclosures are still ongoing, but what strikes me about what's been seen so far is the almost complete absence of anything that would justify disclosure -- or indeed secrecy. It's simply a recitation of diplomatic chit-chat that sometimes supports information already in the public domain but certainly provides nothing of any additional significance. It's surely not a revelation, for example, that Arab governments are concerned about the Iranian nuclear programme, or that UK diplomats suspected that Sinn Féin knew about the Northern Bank robbery by the IRA (to take two stories at random): we either knew or could surmise this already.
Many have drawn comparisons between Wikileaks and the release of the Pentagon Papers in 1971, classified information leaked to the New York Times which fuelled the growing sentiment against the Vietnam war. But in that case there actually was information revealed -- the extent of the military's prior analysis and its subsequent dilution -- that could be argued to be of importance. Wikileaks lacks this sense of solidity.
To me, the whole affair feels like a piece of reality television that happens to have happened over the web, happens to have a frisson of illegality, and happens to have a link to the diplomatic and intelligence communities. The chatter that we're seeing is just that: chatter. It's surprising only to the extent that it's unsurprising. There are no smoking guns, no support for conspiracy theories, no examples of significant ineptitude or corruption -- nothing. It is to a real journalistic coup what "Big Brother" is to a television documentary.
Perhaps we can draw three conclusions from this. The first is the fascination that the 21st century has with real-world data, regardless of its information content. People watch "Big Brother" despite the fact that most of the time nothing happens, and when something does happen it's probably been contrived by the participants or the producers. But even when nothing is happening it can be obscurely compelling. Wikileaks is similar: no real content, but a compulsion to keep looking just in case. And of course if (for the sake of argument) some interesting revelation does surface, how will we know whether it's real or contrived? How will we tell information from dis-information?
The second conclusion is that data is no substitute for interpretation in context. The Pentagon Papers' significance came from the fact that the journalists involved could see there was a story there, and link it to the rest of the news happening alongside, to the people and organisations involved. This journalistic addition is conspicuous by its absence in relation to Wikileaks. This of course goes against Tim Berners-Lee's recently-asserted position that the future of journalism is basically data analysis. I have the greatest respect for Tim, but on this point I think he's badly mistaken. The problem is that real journalism isn't exclusively, or even primarily, about the data: it's about the people, their motivations and behaviours, which often aren't represented in the raw data with which web science concerns itself.
Thirdly, even though the Wikileaks affair is really just an old-fashioned leak to the press, we can see that the web has changed things somewhat. The US authorities moved quickly to attack the site initially hosting the data, but only succeeded in triggering its replication to enough geographically distributed sites to defy further suppression. Once data is out there and judged to be significant, it'll stay out there through the independent actions of concerned individuals. This makes further analysis and contextualisation possible, but doesn't guarantee that it'll happen -- and that's where the real value lies.
A couple of weeks ago I attended my professorial inauguration.
In most universities (in the UK and Ireland, anyway), a professorship is just a job that doesn't involve any particular ceremonial: it doesn't come with a degree, and so doesn't require a graduation. In St Andrews, of course, things are rather more formal. All new professors have to attend a St Andrew's Day graduation in order to swear an oath to the university. I should have been inaugurated last year but couldn't make it because of a prior engagement, so this year I attended along with my colleague Aaron Quigley and ten other new profs.
(Photo courtesy of Brad Herbert.)
The day took place in sub-zero temperatures with swirling snow at times, despite St Andrews overall having the mildest climate in the UK: I had to stay up overnight just to make sure I'd make it in, since commuting in from Edinburgh as usual would have been a bit risky.
We started in chapel for a thanksgiving for graduation, where I discovered that the "school song" (or at least the hymn to St Andrew) is sung to the tune of Deutschland über alles, for reasons that remain unclear. Immediately afterwards was the graduation ceremony, held next door in the university's Younger Hall.
Like most "ancient" universities, St Andrews' graduations are full of symbolism. Most of the important parts of the ceremony are conducted in Latin, although not to the extent of excluding all English as happens in Cambridge and in Trinity College Dublin. The academic procession happens juniores priores, with junior staff preceding senior -- and new professors coming first of all, since they're regarded as nobodies until inaugurated. We sat at the front of the hall while the other academics processed onto the stage, and waited while the undergraduate and postgraduate degrees were conferred -- which involves being hit on the head with the birretum, a piece of cloth allegedly taken from John Knox's britches.
Academics always play spot-the-gown at these sorts of events. There are basically two classes of gowns. University officers performing their function wear gowns related to their office, so the Principal, while officiating, wears a black-and-white Principal's gown rather than the one for her Harvard doctorate. Regular academics wear the gown related to their highest degree, with the colours, hoods and patterns being determined by the awarding university. A lot of doctoral degrees involve red and gold; St Andrews doctorates are sky blue. (Mine is a fawn/grey gown and hat with a red hood, being a DPhil from the University of York. It's quite understated in comparison to a lot of others, which suits me fine.) St Andrews is unusual in also having an undergraduate gown, different to the gowns for graduates, which is bright red and made of fleece rather than cotton. Useful for keeping warm on the pier walk. A surprising number of undergraduates wear their gowns around town, which just goes to show the bond that exists between the university and its students, and isn't something one encounters in many other universities.
The inauguration of new professors involved us all coming onto the stage and being asked to swear the oath to uphold the rules and traditions of the university (in Latin, of course -- the important word was polliceor, "I so promise"). At this point I discovered the term for computer science in Latin: computandi ope machinali promovendae. After that we were each presented with a book as a token of our new office (which they took back immediately after we left the stage, although they did buy me a nice copy of Hugh Trevor-Roper's "History and the Enlightenment" later, with a commemorative bookplate in it). The Principal then addressed us, in Latin again: Quod felix fortunatumque sit, spartam nacti estis: hanc exornate ("I wish you all happiness and good fortune. You have been allotted Sparta: do it credit."). And it was done. We processed out of the hall (in reverse order, seniores priores, this time) and back to the main quad of St Salvator's college for photos in the snow and lunch. There was a garden party in the afternoon (thankfully in a heated marquee), and a formal graduation dinner in the evening.
Several years ago, after I got married, someone asked me whether it had made a difference to how I felt. I replied that I'd expected the answer to be "no", in that the important thing was the decision to stay together and not the piece of paper and the public promise -- but that in fact it did make a huge difference. The public act had a significance in and of itself (for me, at any rate), and did make the whole thing feel more real and more certain. Slightly surprisingly, inauguration feels the same way. In practice it makes no difference to my holding or doing the job, but the public act gives a significance to it that I wasn't expecting (again). The common theme between the two situations may be that the commitment is two-way, between myself and my university colleagues and students in this case. A number came up to me afterwards to welcome me -- despite my having been here for over a year -- which makes one feel rather wanted.
There's a lot to be said for ceremonial like this. It's important for the students, of course, to mark their achievement: but it's perhaps equally important for the academic staff, for holding an institution together, especially at times like these when there's something of a feeling of being under siege in a world that doesn't necessarily understand or appreciate what universities are for or what we're trying to accomplish. The coming-together to mark key events, and to affirm that a university is a community with shared goals, is something that can only be good for morale, and is something we should do more of.
- Case studies of development, deployment and execution of representative 3D applications
- Programming systems, abstractions, and models for 3D applications
- What are the common, minimally complete, characteristics of 3D applications?
- What are major barriers to the development, deployment, and execution of 3D applications? What are the primary challenges of 3D applications at scale?
- What patterns exist within 3D applications, and are there commonalities in the way such patterns are used?
- How can programming models, abstractions and systems for data-intensive applications be extended to support dynamic data applications?
- Tools, environments and programming support that exist to enable emerging distributed infrastructure to support the requirements of dynamic applications (including but not limited to streaming data and in-transit data analysis)
- Data-intensive dynamic workflow and in-transit data manipulation
- Abstractions and mechanisms for dynamic code deployment and "moving the code to the data"
- Application drivers for end-to-end scientific data management
- Runtime support for in-situ analysis
- System support for high end workflows
- Hybrid computing solutions for in-situ analysis
- Technologies to enable multi-platform workflows
- Daniel S. Katz, University of Chicago & Argonne National Laboratory, USA
- Shantenu Jha, Louisiana State University, USA & e-Science Institute, UK
- Jon Weissman, University of Minnesota, USA
- Gabrielle Allen, Louisiana State University, USA
- Malcolm Atkinson, eSI & University of Edinburgh, UK
- Henri Bal, Vrije Universiteit, Netherlands
- Jon Blower, Reading e-Science Centre, University of Reading, UK
- Shawn Brown, University of Pittsburgh & Pittsburgh Supercomputing Center, USA
- Simon Dobson, University of St. Andrews, UK
- Dennis Gannon, Microsoft, USA
- Keith R. Jackson, Lawrence Berkeley National Lab, USA
- John R. Johnson, Pacific Northwest National Laboratory, USA
- Scott Klasky, University of Tennessee & Oak Ridge National Laboratory, USA
- Bertram Ludäscher, University of California, Davis, USA
- Abani Patra, University of Buffalo, USA
- Manish Parashar, Rutgers & NSF, USA
- Omer Rana, Cardiff University, UK
- Joel Saltz, Emory University, USA
- Domenico Talia, Universita' della Calabria, Italy
Papers are invited in all aspects of software engineering for adaptive systems, for the SEAMS symposium in Hawaii in May 2011. The deadline is now quite close.
CALL FOR PAPERS
6th International Symposium on Software Engineering for Adaptive and Self-Managing Systems (SEAMS 2011) (Sponsored by ACM SIGSOFT and IEEE TCSE)
Waikiki, Honolulu, USA 23-24 May 2011
An increasingly important requirement for a software-based system is the ability to self-manage by adapting itself at run time to handle changing user needs, system intrusions or faults, a changing operational environment, and resource variability. Such a system must configure and reconfigure itself, augment its functionality, continually optimize itself, protect itself, and recover itself, while keeping its complexity hidden from the user.
The topic of self-adaptive and self-managing systems has been studied in a large number of specific areas, including software architectures, fault-tolerant computing, robotics, control systems, programming languages, and biologically-inspired computing.
The objective of this symposium is to bring together researchers and practitioners from many of these diverse areas to engage in stimulating dialogue regarding the fundamental principles, state of the art, and critical challenges of self-adaptive and self-managing systems. Specifically, we intend to focus on the software engineering aspects, including the methods, architectures, algorithms, techniques, and tools that can be used to support dynamic adaptive behavior that includes self-adaptive, self-managing, self-healing, self-optimizing, and self-configuring, and autonomic software.
TOPICS OF INTEREST
We are interested in submissions from both industry and academia on all topics related to this important area. These include, but are not limited to:
- formal notations for modeling and analyzing software self-adaptation
- programming language support for self-adaptation
- reuse support for self-adaptive systems (e.g., patterns, designs, code, etc.)
- design and architectural support for the self-adaptation of software
- algorithms for software self-adaptation
- integration mechanisms for self-adaptive systems
- evaluation and assurance for self-* systems (e.g., run-time verification)
- modeling and analysis of adaptive systems (e.g., run-time models, cost-benefit analysis, architectural styles and patterns, requirements)
- decision-making strategies for self-adaptive and self-organizing systems
- support for run-time monitoring (for requirements, design, performance, etc.)
- model problems and exemplars
- mobile computing
- dependable computing
- autonomous robotics
- adaptable user interfaces
- service-oriented systems
- autonomic computing
PAPER SUBMISSION DETAILS
We are soliciting three types of papers: research papers and experience reports (up to 10 pages, ACM SIG Proceedings Format) and position papers for new ideas (up to 6 pages, ACM SIG Proceedings Format). Research papers should clearly describe the technical contribution and how the work has been validated. Experience reports should describe how an existing technique has been applied to real-world examples, including lessons learned from the experience. New idea papers provide an opportunity to describe novel and promising ideas and/or techniques that might not have been fully validated. All submitted papers will be reviewed by at least three program committee members. Papers must not have been previously published or concurrently submitted elsewhere. The accepted papers will appear in the symposium proceedings that will be published as ACM conference proceedings.
Submission deadline: 12th December 2010
Author notification: 15th February 2011
Camera ready copy: 1st March 2011
General Chair: Holger Giese, HPI/Univ. of Potsdam, Germany
Program Chair: Betty H.C. Cheng, Michigan State University, USA
Publicity Chairs: Basil Becker, HPI/Univ. of Potsdam, Germany; Thomas Vogel, HPI/Univ. of Potsdam, Germany
- Colin Atkinson University of Mannheim, Germany
- Robert Baillargeon Panasonic Automotive, USA
- Luciano Baresi Politecnico di Milano, Italy
- Nelly Bencomo University of Lancaster, UK
- Yuriy Brun University of Washington, USA
- Vinny Cahill Trinity College Dublin, Ireland
- Shang-Wen Cheng Jet Propulsion Laboratory, USA
- Simon Dobson University of St. Andrews, UK
- Gregor Engels University of Paderborn, Germany
- Cristina Gacek City University, UK
- David Garlan Carnegie Mellon University, USA
- Kurt Geihs University of Kassel, Germany
- Carlo Ghezzi Politecnico di Milano, Italy
- Svein Hallsteinsen SINTEF, Norway
- Paola Inverardi University of L'Aquila, Italy
- Jean-Marc Jezequel IRISA-INRIA, France
- Gabor Karsai Vanderbilt University, USA
- Jeff Magee Imperial College London, UK
- Nenad Medvidovic University of Southern California, USA
- John Mylopoulos University of Trento, Italy
- Hausi Müller University of Victoria, BC, Canada
- Sooyong Park University of Sogang, S. Korea
- Anna Perini FBK-IRST, Center for Information Technology, Italy
- Masoud Sadjadi Florida International University, USA
- Onn Shehory IBM-Haifa Research, Israel
- Roy Sterritt University of Ulster, UK
- Danny Weyns Katholieke Universiteit Leuven, Belgium
- Andrea Zisman City University, UK
- Betty H.C. Cheng Michigan State University, USA
- Rogério de Lemos University of Kent, UK
- David Garlan Carnegie Mellon University, USA
- Holger Giese HPI/Univ. of Potsdam, Germany
- Marin Litoiu York University, Canada
- Jeff Magee Imperial College London, UK
- Hausi Müller University of Victoria, Canada
- Mauro Pezzè University of Lugano, Switzerland, and University of Milan Bicocca, Italy
- Richard Taylor University of California, Irvine, USA
Symposia-related email should be addressed to firstname.lastname@example.org
The Lord of the Rings is about men, their deeds and courage in the face of seemingly overwhelming odds. Which makes it strange that, in the main, they get an incredibly raw deal.
I have to say I absolutely love the books: The Hobbit, The Lord of the Rings (LOTR), The Silmarillion, Unfinished Tales, and the other works in the same area. The films were cinematographic masterpieces, although not sufficiently true to the original for my tastes. The problem I have is one of motivation for the men involved.
If you're a man in Middle Earth, you're pretty much guaranteed to be inferior in some significant way to any other intelligent creature you come across. You can't be wise, in any absolute sense: the elves and the wizards live essentially forever, and they have wisdom sewn-up. You can't be strong, because dwarves and elves (again) are stronger and have more stamina. You can't be artistic, because elves (yet again) are the undisputed masters of all the high arts, and will condescend to teach you what crumbs of their learning you can grasp during your incredibly short lifespan. You can't even be relaxed and laid-back, because hobbits will always make you look stressed-out. Evil's not an option, as Sauron is the ultimate baddie and has legions of orcs (who are corrupted elves in the books, and so have many of the advantages of elves without the goodness).
If you're a man, all you can do is die gloriously.
LOTR is essentially an epic poem in the mould of the Iliad. At the risk of gross over-simplification, epic poems only have three sorts of characters:
- Heroes, who do all the things worth reporting
- Love interest, who are pursued (and generally won) by heroes
- Arrow-fodder, who are killed by heroes
Compare this with LOTR. The major characters are obviously Frodo, Gandalf, Aragorn and the rest of the fellowship, plus Saruman. There's Arwen, who (in the book at least, although not the films) is just love interest, and Eowyn. And then there are the minor characters: hordes of orcs to slaughter, Men of Rohan and Gondor — and that's about it.
There are a small number of less-than-major characters: Denethor, Faramir (who's essentially just another love interest to get Aragorn off the hook with Eowyn), Treebeard. And there's another population of less-than-major characters who are major elsewhere: Bilbo, Tom Bombadil, Galadriel, Elrond. All these are bit-players in LOTR but have a major part in either The Silmarillion or one of the other tales. (Elrond is something of an exception, in that his main claim to fame is having been around at lots of significant events. He just never gets a chance to be a protagonist.)
But the thing with being a man in Middle Earth is the way in which your actions are so circumscribed. No man in LOTR actually dies anything other than gloriously. Even Boromir is redeemed through his death. Even the evil men are happy to go down fighting. It's as though men's sole virtue is to have a bad time and then die.
Aragorn escapes this fate, and is the only heroic (in the epic sense) man in the story, but even he is doomed to failure by his mortality, and you can't escape the suspicion that Gondor will collapse as soon as he's dead. From the elves' perspective, destroying Sauron is an absolute good, and they can then all leave for the lands over the Sea which are a remnant of earlier, better times; from men's perspective, it'll probably be a Pyrrhic victory at best.
In many ways LOTR is a story about passing. The elves' time in Middle Earth is past — although I think the films grotesquely overstate their predicament compared to the book — and men can't ever build anything permanent because they just don't have the wisdom/lifespan/art/strength/goodness to cut it. That of course is what sets LOTR apart from "happily ever after" fantasy (like David Eddings' works). By the end of the book, though, everything looks drab, and without Sauron there won't even be any worthwhile evil left. You have to wonder what men will find to motivate themselves, when the best is all in the past and even glorious and worthy death is denied them. One can imagine a lot of drunken fireside reminiscence going on.
If that weren't bad enough, anyone who's read The Silmarillion knows that Middle Earth has been passing for a very long time. Even though in LOTR Sauron is the ultimate in badness, in the greater scheme of things he was only a servant of Morgoth, the really, really ultimate baddie. Even an evil that threatens to engulf the entire world in the Third Age is only a shadow of what evil used to be like in the First Age — modern evil just can't cut it.
It seems a pretty depressing view of history and heroism, but maybe that's the point: that the Fourth Age will be a post-heroic let-down with everyone left dissatisfied and wishing for the days when there were orcs and dragons and Black Riders and magic.