
A few days ago – while writing the piece on the N9, actually – I was reviewing some comments on Twitter when I saw this one from Martin Geddes:

Weakness of Apps model is many are really “relationships”, “situations”, “contexts”. Belong in the (app) formerly known as “address book”.

For as much as the iPhone has trained people to think in terms of applications – and previous computing models never really moved away from them either – there seems to be a push to revisit how we interact with data: not just in terms of the device or scheme, but in how those interactions are initially categorized and how life evolves from them.

One of my primary complaints about the latest mobile platforms stems from the fact that they really haven’t changed the mental box within which you interact. There have been refinements and fringe-case developments, but for the most part we interact with devices and services through a “button” paradigm – push it and it responds. Very rarely are we asked to explore and learn a new behavior that better represents how we naturally deal with data.

But if you look at the Windows Phone platform (and, to a degree, Timescape on some Sony Ericsson devices), you see something different. You see a different perspective taken towards the “start” of information. On Windows Phone devices, content is organized into “pods.” These pods act as collectors for similar types of data. For example, all your photos, whether they were taken on the device or sit in an online service, reside under the “pod” Pictures. Same with Contacts, Documents, etc. If you will, the source or intention of the data doesn’t matter as much as its primary category. From that pod you can do all that you’d normally do with data, with the incentive of already starting from its near-exact cultural meaning.
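To make that idea a bit more concrete, here is a minimal sketch of grouping content into pods by primary category rather than by the app or service it came from. The names (Item, build_pods, the sample sources) are my own illustrations, not any platform’s actual API:

```python
# Illustrative sketch only: items from any source are grouped by their
# primary category, not by the app or service they came from.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Item:
    category: str   # e.g. "Pictures", "Contacts", "Documents"
    source: str     # e.g. "device camera", "Flickr", "Facebook"
    payload: dict   # the actual content or metadata

def build_pods(items):
    """Collect items by category; the source survives only as metadata."""
    pods = defaultdict(list)
    for item in items:
        pods[item.category].append(item)
    return pods

items = [
    Item("Pictures", "device camera", {"title": "Blueberry picking"}),
    Item("Pictures", "Flickr", {"title": "Vacation set"}),
    Item("Contacts", "Facebook", {"name": "Martin"}),
]

pods = build_pods(items)
# Every picture lands in the same "Pictures" pod, wherever it actually lives.
print([i.payload["title"] for i in pods["Pictures"]])
```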

The N9 looks to be a step in this direction. The idea of swiping – whether between applications/services or just to get around the device – begins to take apart the idea that data sits in a jar; instead, it’s a head-turn (a swipe) away. Speaking personally now, I tend to orient myself spatially first, and then logically/abstractly. Such a metaphor works very well for me, and I can see where it works but also where it has its limitations. Namely, you actually have to do a bit more in terms of organizing and remembering the connections between data. Not that it’s hard, but if something moves around too much, your memory gets fuzzy.

What’s not been totally figured out is how to get to some kind of interface that alleviates that weakness. When you take a picture, there’s a ton of direct and implicit data that comes along with it. For example, I was out picking blueberries the other night and took several pictures. The camera was only able to capture the date, the location, and the subject. It couldn’t also attach to the image the people around me who weren’t online or on a mobile device. And there’s no way to encode other information besides the geo-coordinates unless I use the device/system’s tagging system (which takes away from the moment of taking pictures). There’s an event and a relationship happening there, but little that takes advantage of it.
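As a rough sketch of the kind of record I’m describing, here is a hypothetical data shape that carries the event and the people present alongside what cameras already capture. Everything here (PhotoMoment, the field names, the sample values) is illustrative and assumed; no real camera or EXIF API is being used:

```python
# Hypothetical sketch: the photo record I wish were captured automatically.
# The first three fields are what cameras already record; the last two are
# the relationship/event context that today has to be tagged by hand.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class PhotoMoment:
    taken_at: datetime                  # date/time the camera records
    location: Tuple[float, float]       # geo-coordinates (lat, lon)
    subject: str                        # what the lens actually saw
    event: Optional[str] = None         # e.g. "evening blueberry picking"
    people_present: List[str] = field(default_factory=list)  # incl. offline folks

# Example values are made up for illustration.
photo = PhotoMoment(
    taken_at=datetime(2011, 7, 12, 19, 30),
    location=(36.07, -79.79),
    subject="blueberry bushes",
    event="evening blueberry picking",
    people_present=["my mother", "two neighbors"],
)
# Today, event and people_present get filled in manually after the fact;
# the argument is that this context should ride along with the picture for free.
```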

I’ve got ideas about how to tackle this, but I really need more time and skills than I have in order to explore them. That said, the N9 and Windows Phone 7 devices recognize that we’ve got to have something a bit more to how we interact with these magic wands. I wonder if someone will actually pull it off… there are prospects in the wings which look quite promising.
