Over the past year or so, cloud computing has become all the rage. Everyone is talking about it as the next great hope for the IT industry, in general, and communications, in particular. But I have to wonder if those who see cloud computing as a sort of miracle worker really know what it is. If you ask 10 people to supply a definition of cloud computing, you’ll likely get 10 divergent responses.
But cloud computing is not a technology play. In fact, I would be hard-pressed to identify a single piece of new technology that is fundamental to the cloud. Nor, unlike Twitter or Facebook, is it a social-psychology phenomenon in any real sense: there is no “man-on-the-street” movement driving the uptake of, or need for, cloud computing. It is one of those rare beasts: a practical, common-sense-driven initiative.
Simply put, cloud computing makes much more efficient use of resources. In the early stages, those resources are essentially processing power and storage, but the focus of cloud will increasingly converge on efficient use of software resources from a bewildering array of sources. The concept of a user gaining access to, and paying for, these resources on a per-use basis makes great economic sense for everyone: from the lone mobile game developer in his garage to the uber-large-scale financial institution. It also happens to be industry-changing. Unless someone spots a fatal flaw in the concept, over the next 10 years we will move from a predominantly distributed computing and storage world to a centralized one.
What makes cloud so interesting is that every one of the global vertical industries (telecom, financial, retail, etc.) has to have two conversations about it. First, how do we become a cloud user to run our operations more efficiently? Second, how do we leverage our existing platform assets to become a cloud provider?