Caught this article (Guardian interviewing Alvaro Cassinelli) amongst my reading today, and it seems to fit quite nicely as the first post of the year here:
…Soon, we realised that, were we to succeed, something very interesting would be happening: although the computer would be “dead”, it would perform fine thanks to the intervention of external, invisible powers. Life would be instilled from the outside; the corpse of the computer would be “possessed”, animated like a puppet by a ubiquitous “ambient intelligence”. And nobody would notice the difference. When looking for a name, I first thought about “poltergeist computing” or “zombie computing”, but the connotations were all too negative. We settled on invoked computing which seemed more appropriate: it makes you think about spiritual invocation (of invisible, higher powers) as well as of the software engineering technical expression: “invoking an application” meaning “to launch it”…
That’s where computing should be in this day and age. I’m not saying there’s no need for dedicated terminals, but we really limit ourselves when we see media only in a device-channel construct, or decide we can’t reach beyond the physics of current devices, so why ask for more (the same points Lanier was putting forth in You Are Not a Gadget).
What these folks are doing should be the template for how we do computing in the 21st century. Speeds and feeds need to go the way of 80s fashion and music (and perhaps those holding to those methods need to keep their own niche for a time as well, but be bludgeoned into not forgetting what year it really is). It’s gotten to the point that I’m considering Code Year, because I’m so frustrated with the pace, speed, and lack of connected (spatial, methodological, logical, and ethical) activity in digital development that there’s just got to be something else (the louder ones amongst us are apparently seeing this too).
Mind you, I write this while having the hardest time getting my mobile (the Nokia N8) into that place where it morphs into whatever’s needed. Maybe it’s because I too have taken on that religion of needing separate devices, and therefore can only (initially) read that Guardian article as “they turned a banana into a computer.” I’ve got to get back to the consistent acting and thinking that sparked my work with groups like Maemo and my contributions to ideas like TAT’s Future of Screens.
We can do computing better. And if we did it like this, might we even do living better?