In the last week there were a handful of interesting developments in the ‘cloud’ space. Early last week, Microsoft officially launched their Azure cloud service. Shortly after, and perhaps in response, Amazon lowered pricing on their cloud offerings. And finally, it seems Oracle has decided to better articulate their integral role in clouds. I thought it would be worthwhile checking out Google Trends to see how Amazon, Google, Microsoft and Oracle are faring in cloud “mind space”. If you’re not familiar with Trends, it gives you a graphical, chronological view of how often different terms are searched for on Google.
The problem with “cloud”, though, is that it has become a term applied to a myriad of technologies. Behind the overuse and misuse of the term lies a fundamental trend that many believe will redefine how individuals and companies consume computing resources.
How do clouds, and in particular public clouds, change things? Today, many tech startups don’t buy physical servers anymore – they simply provision the appropriate services in Amazon’s cloud and pay as they go. Many software companies now run proof-of-concepts (POCs) in the cloud rather than on customer premises. And what happens as the technology improves and pricing declines? Adoption will accelerate and spread from startups to small businesses and, eventually, to medium-sized businesses. And some day, yes, to some cross-section of the computing requirements of large enterprises too. There will be a day when the majority of businesses in North America no longer own any servers, and instead consume computing resources the same way they do their utilities.
So what does this do to the business models of chip manufacturers, server vendors and software companies? How will the incumbents adapt their businesses to this new reality? What old technologies are no longer relevant? And what new technologies will be needed in this new paradigm?