Am sitting with my bro LJ on IM while typing this. Have got the Nokia spinning tunes from Last.FM, iPad following the Sixers and action on Twitter, and a PC running the IM session and this post. I live with multiple computing platforms – always have. So when I hear things like “moving from PC to Mac isn’t viable in part because of software repurchases” I start to wonder if I’ve been doing computing all wrong.
Here’s the start of things – when I started with computing in the mid-1990s, I had a PC to work with at home and Macs and PCs to work with at school. I had no choice but to figure out how to live with both in my floppy-driven world, because getting things done required it.
Once I figured that out, there was the issue of applications. There was no way my parents were paying for AutoCAD, Photoshop, or Illustrator for the home PC, so I had to learn how to work quickly at school and leave certain work to be done at home. This got easier once I was only dealing with smaller graphics and papers – then a file just needed to be in Word’s DOC format and I could go from there.
In college, I added the PDA to my platforms, and then it was a matter of making sure I had a means to sync from one platform to the other. I learned how to install sync software in the computer lab – knowing that the next day it would be removed – so that I could transfer documents from the lab to my PDA. That got tiresome, so I eventually got a keyboard for the PDA and let my home PC serve as the backup for my content.
As I moved to connected PDAs and smartphones, the web became my place of residence more and more. Right now, I’m at the point where I am online just about all the time. I’m doing things like asking people to share things in Dropbox, versus sending attachments. When documents or presentations are sent my way, I’m miffed when they aren’t on hosted services, or use some version of PowerPoint that none of the devices around me can read. In a sense, I don’t care much about the platform, I just want the content. The web has made it easy to make sure that’s the only thing that counts.
And yet, I had this discussion with my bro about his potential moves with a new laptop, and we kept running into issues where the platform itself was the problem. We wondered about doing video encoding with a hosted service rather than needing to own the hardware to do it efficiently. We wondered about hardware upgrades to his existing system, because the performance issues there weren’t so much the fault of any single application as of how the operating system is built to work with certain heavier applications. But it didn’t matter – we kept coming back to the point that moving to any other platform would be a problem for him, because certain pieces of content weren’t versatile enough to move elsewhere.
I know that it’s happening in bits and pieces – but I really long for the day when the web is the only platform that matters. Not in the respect that we need to have everything in a cloud we don’t control, but in the respect that in order to get things done, we only need to point our device to it, and life gets done as it needs to.
Shouldn’t computing be that simple? Aren’t we in the future yet?
2 thoughts on “When Is the Web the Only Platform that Counts”
man, you just keep making me think, don’t you?!
Hey, you make me think too (hence the post). I don’t know that there is a single, one-size-fits-all answer, but there are questions about how we do computing that should be asked from time to time. For me, content portability is a big question – specifically, why it isn’t more portable. We really should be past that point.