The web hosting industry has undergone many changes and encountered several innovations along the way, but none caused as much disruption as the ‘cloud’ and its scalable approach to cost.
What is the background to scalability in cloud computing?
The concept of web hosting used to be quite straightforward in the early days of the industry: if you needed website hosting, you rented part of a server (Shared Hosting) and hosted your site. If your site was successful and attracted more traffic, you moved from a pooled solution to something like a Virtual Private Server (VPS) that provided more space, or you went the whole nine yards and reserved a server that was dedicated to your needs and yours to use in its entirety (Dedicated Server hosting). Few went so far as to purchase their own hardware and rent space in a data centre (colocation), but this has always been an option.
You simply bought what you thought you needed and that was it.
However, as the internet became more and more popular, companies became more focused on their websites as a marketing tool. The question then became one of resources. If a website unexpectedly received a spike in traffic and there were not enough resources available (RAM, data transfer, disk space, etc.), the website would crash and users (meaning potential customers) would have no access to its content.
To overcome this issue, companies would invest in more resources. They would upgrade hosting accounts and spend money to ensure that their website was online 24/7. So, while a site might only require shared hosting for the bulk of the time, a company would purchase a VPS just in case. Likewise, if a company needed only VPS hosting, it would upgrade to a Dedicated Server account, just to ensure that if a spike in traffic occurred, its hosting would have the power to deal with it.
Paying for what you don’t need
So, essentially, businesses paid for resources they didn’t normally need, keeping them in reserve for the odd occasion when they did. Clearly, as a business model, this was akin to paying a costly retainer – one that any business in its right mind would avoid if possible.
With the need for increased resources came other costs – having staff on board to maintain the service, possibly even maintaining a full IT department. And as servers came to be used for more than just hosting websites, costs spiralled – remember Lotus Notes? It kept the documentation of entire companies organized so that it was at the fingertips of any staff member, but it was costly, often requiring dedicated staff to manage the solution. Ultimately, great swathes of resource-hungry business activity, from accounting to human resources and beyond, found their way onto servers. Such convenience came at a cost – usually a great one.
Then came the cloud – a paradigm shift.
The cloud offered the promise of addressing the excessive cost of IT. Of course, there is myth and lore regarding just how the cloud first came about, but the fact is that some major companies (and many point to Amazon’s retail business here) recognized that they maintained a lot of server resources that they didn’t always need. Someone then had the idea of selling these unused resources for other businesses to use.
That was possibly the ‘eureka moment’ for hosting – rather than rent individual machines (or parts thereof) to individuals and businesses, why not join all the servers together in a network (to form a ‘Server Farm’) and rent out server space and resources?
With all server resources available to all customers at all times, there came what has been one of the chief drivers and selling points of the cloud: Scalability.
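Scalability is easiest to see in miniature. The sketch below is purely illustrative – the function name, thresholds and instance counts are hypothetical, not any provider’s real API – but it shows the kind of threshold rule a cloud platform evaluates continuously: add capacity when load spikes, shed it when load falls, and bill the customer only for what is actually running.

```python
# Illustrative sketch only: a simple threshold-based autoscaling rule of the
# kind cloud platforms apply automatically. All names and numbers here are
# hypothetical assumptions, not a specific provider's API.

def desired_instances(current: int, cpu_utilisation: float,
                      scale_out_at: float = 0.75,
                      scale_in_at: float = 0.25,
                      min_instances: int = 1,
                      max_instances: int = 10) -> int:
    """Return how many instances the pool should run next."""
    if cpu_utilisation > scale_out_at:         # traffic spike: add capacity
        return min(current + 1, max_instances)
    if cpu_utilisation < scale_in_at:          # quiet period: shed capacity
        return max(current - 1, min_instances)
    return current                             # normal load: do nothing

# A spike grows the pool from 2 to 3 instances; a lull shrinks it back.
print(desired_instances(current=2, cpu_utilisation=0.90))  # -> 3
print(desired_instances(current=3, cpu_utilisation=0.10))  # -> 2
```

The contrast with the old model is the point: instead of permanently paying for a Dedicated Server to cover the occasional spike, capacity expands and contracts with demand, and so does the bill.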