
Web two dot oh plus one, in the cloud, with bells on…

Written by Zach Beauvais

Jun 9, 2009

Originally appeared on Nodalities Blog: http://blogs.talis.com/nodalities

The tech world is telling a story about the Web and computing, and the mainstream media seem to be catching on. They’re hearing about clouds, wikis, and the history of the World Wide Web. The whole thing reads like some sort of legend…

It was an era, long ago, when the folk of Middle Class plugged in their Mo-Dems and listened to arcane, magical sounds as their £120 beige box enabled a blazing 14.4 kbps connection, and they only had to wait a few minutes to call forth script from anywhere on earth. It was an age that saw the beginnings of email, where people composed messages and sent them down the phone lines at lightning speeds (unless a packet dropped…). This was the time of Web 1.0.

Then, the web collapsed. No one used the internet any more. Modems became paperweights, and millions of metres of ethernet cable were grubbed up to make room for under-floor heating in offices. The world was quiet, and the people of Middle Class forgot what they knew.

Until there dawned the advent of Web 2.0. People re-learned their former ways, and improved upon the innovations of their fathers. Instead of sites and pages, they began to use “Web Apps” which accomplished Tasks, and the people became their masters. The great titan Google was made, and he knew all and directed the world toward knowledge. The elves of the web taught men the ways of blogging and messaging and eventually (when they’d mastered all these things with wiki-training to boot) Social Media and Networking.

Only, that’s not exactly how it happened, is it? Many commentators and Alpha Geeks have divided the story of the web into convenient phases, and they’ve roughly settled around a versioning metaphor common to software. Have a look at your favourite browser, and you’ll see a version number (Safari 4 for me, if you’re interested) which lets you know how many iterations have been and gone before. There are certainly noted differences, and turning points, where people phased out their dependence on one thing for the convenience and utility of something better. Tim O’Reilly, who coined the phrase Web 2.0, wrote a much-linked post in 2005 trying to explain and crystallise some of the trends he was seeing which were different from the first few years of the web. The fact that he had to clarify what he meant, and that it took the non-geek world three years to catch up, testifies to the notion that the change was gradual. It makes me think that we missed out all the .1s to .10s in the version numbers, and many alpha and beta tests along the way.

Now we are engaged in the great Web 3.0, where we are applying the logic of the past to the present and guessing at the future. Only, because no one is actually releasing versions of the web like a good, reliable software company should, the story is much more complicated—and interesting!

There are notable trends, with backers and bloggers riding various waves. But it seems to me that the point of this is a convergence. The mobile web is bringing new sorts of information to people, and they can make use of this info wherever they happen to be because of advances in devices and connectivity. As phones and web-enabled devices get better, so too do the chips we seem to have embedded all over the place, and we can now begin to have a clearer picture of what we do through the information we gather from our heaters, cars, and pedometers. Also, as more objects become connected, the grunt-work of number-crunching and storage is becoming commoditised into big, efficient, utility-like cloud services, which host and work with our collected information much more effectively than the gadget in your hand could ever hope to do. Others, like ourselves, talk about the Semantic Web, which allows for an evolution from a bunch of connected documents to the explicit connections between bits of information.
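To make that last idea a little more concrete: in the Semantic Web model, instead of publishing a page that merely mentions things, you publish machine-readable statements (triples) that name the things themselves and the relationships between them. Here’s a minimal sketch using Python’s rdflib library; the example.org URIs and the “worksFor” relationship are made-up illustrations, not any real dataset:

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import FOAF, RDF

# A hypothetical namespace for our example resources; these URIs are
# illustrative placeholders, not real published data.
EX = Namespace("http://example.org/")

g = Graph()

# Each statement is a triple: (subject, predicate, object). Rather than
# a document that merely mentions Zach, we get explicit, queryable
# connections between identified things.
g.add((EX.zach, RDF.type, FOAF.Person))
g.add((EX.zach, FOAF.name, Literal("Zach Beauvais")))
g.add((EX.zach, FOAF.knows, EX.alice))
g.add((EX.zach, EX.worksFor, EX.talis))

print(g.serialize(format="turtle"))
```

The syntax isn’t the point; the point is that the connections are explicit, so software can follow them from one bit of information to the next.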

But, I see a trend there which is common to all candidates: information. The web allowed for information to be shared, then worked on collaboratively. Now, I see this information becoming useful in and of itself… as data.

Walt Mossberg talks about Web 3.0 as if it is riding on the backs of mobile and connected devices. And I think it probably is. Tim Berners-Lee recently spoke to the BBC about the future of the web, including an incredible vision of pixels everywhere, where any surface could display information. He’s also repeatedly talked about the future of the web being semantic (he invented the term, let’s not forget), where Linked Data is the web done right. And who am I to argue with the inventor of the Web?

But I don’t think there’s so much a conflict or competition as a coming together here. If there will be a Web 3.0 (and it seems a likely, media-friendly label), I think it will include all of these trends, centred on data. The connected devices allow us access to cloud computing and storage (computing and storage of data…). Many chips gather data about us, which we can use to personalise our view on the web of data, and the linking of this data through semantics lets it all be calculable, programmable, and useful. It kind of reminds me of a computer, you know… The chips and our collective use of web applications are inputs and sources, and the various devices we use are displays and UIs onto a massive, scalable CPU in the cloud. Linked Data could be the Operating System, allowing and enabling anything to be connected and programmed.
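If Linked Data is the Operating System in that metaphor, then a query language like SPARQL is one way of “programming” it. Continuing the toy graph sketched above (still assuming the made-up example.org URIs), a query can follow those explicit connections the way a program calls an OS API:

```python
# Assumes the Graph `g` built in the earlier sketch.
query = """
    PREFIX foaf: <http://xmlns.com/foaf/0.1/>
    PREFIX ex:   <http://example.org/>

    SELECT ?name WHERE {
        ?person ex:worksFor ex:talis .
        ?person foaf:name   ?name .
    }
"""

for row in g.query(query):
    print(row.name)  # prints: Zach Beauvais
```

You ask a question of the data, not of any particular website, and anything that has published the right connections can answer.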

Web 3.0, to me, is a convergence of the trends, and it’s all about data. It’s not a simple story, and any convenient label is too convenient to be comprehensive, but I’m pretty sure the next things will all centre on our ability to make use of and personalise vast chunks of previously opaque data.

Image: “#Black rain : Convergence” by FredArmitage via Flickr, Creative Commons License.
