Rent vs. Buy? The Cloud Conundrum

Over the long run, is cloud computing a waste of money? Some startups and other “asset-lite” businesses seem to think so. However, for specific use cases, cloud computing makes a lot of sense, even over the long haul.

A Wired magazine article describes how some Silicon Valley startups are migrating from public clouds to on-premises deployments. Yes, read that again: cash-poor startups are saying “no” to the public cloud.

On the whole, this trend seems counterintuitive. That’s because it’s easy to see why capital-constrained startups would be enchanted with public cloud computing: little to no startup costs, no IT equipment to buy, no data centers to build, and no software licensing costs. For startups, public cloud computing makes sense for all sorts of applications, and it’s easy to see why entrepreneurs would begin with public clouds and stick with them for the foreseeable future.

However, after an initial “kick the tires” experience, various venture-backed firms are migrating away from public clouds.

The Wired article cites how some startups are leaving the public cloud for their own “fleet of good old fashioned computers they could actually put their hands on.” That’s because, over the long run, it’s generally more expensive to rent computing resources than to buy them. The article mentions how one tech startup “did the math” and came up with internal annual costs of $120K for the servers it needed, vs. $320K in public cloud costs.
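
To make that “did the math” comparison concrete, here is a minimal sketch in Python using the $120K and $320K annual figures cited above. The multi-year horizons are my own illustrative assumption, not something from the article.

```python
# Rent-vs-buy sketch using the annual figures cited in the Wired article.
# The multi-year horizons are illustrative assumptions only.
ON_PREM_ANNUAL = 120_000  # annual cost of owned servers, per the startup
CLOUD_ANNUAL = 320_000    # equivalent annual public cloud spend

for years in (1, 3, 5):
    on_prem = ON_PREM_ANNUAL * years
    cloud = CLOUD_ANNUAL * years
    print(f"{years} yr: on-prem ${on_prem:,} vs. cloud ${cloud:,} "
          f"(cloud premium: ${cloud - on_prem:,})")
```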

For another data point, Forbes contributor Gene Marks cites six of his clients who analyzed the costs of the public cloud vs. an on-premises installation monitored and managed by a company’s own IT professionals. The conclusion? Overall, it was “just too expensive” for these companies to operate their workloads in the public cloud, as opposed to capitalizing new servers and covering the monthly operating costs themselves.

Now, to be fair, we need to make sure we’re comparing apples to apples. For an on-premises installation, hardware server costs may be significantly less over the long run, but it’s also important to include costs such as power, floor space, cooling, and the employee costs of monitoring, maintaining, and upgrading equipment and software. In addition, there are sometimes “hidden” costs: employees spending cycles procuring IT equipment, effort spent on capacity sizing, and the hassle of going through endless internal capitalization loops with the Finance group.
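
As a rough sketch of what an apples-to-apples tally might look like, the snippet below folds those “hidden” line items into the on-premises side. Every dollar figure here is a hypothetical placeholder for illustration; real numbers vary widely by region, facility, and staffing model.

```python
# Hypothetical fully loaded on-premises TCO; all figures are placeholders.
HARDWARE_AMORTIZED = 120_000  # annual server cost, purchase spread over lifespan
HIDDEN_COSTS = {
    "power and cooling": 25_000,
    "floor space": 15_000,
    "staff time (monitor/maintain/upgrade)": 60_000,
    "procurement and capitalization overhead": 10_000,
}

tco = HARDWARE_AMORTIZED + sum(HIDDEN_COSTS.values())
print(f"Naive hardware-only cost: ${HARDWARE_AMORTIZED:,}/yr")
print(f"Fully loaded on-prem TCO: ${tco:,}/yr")
```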

Thus, cloud computing still makes a lot of financial sense, especially when capacity needs aren’t linear, when there is a need for “burst capacity,” or when demand is unplanned (as it often is with fickle customers). And don’t forget about use cases such as test and development, proof of concept, data laboratory environments, and disaster recovery.

Another consideration is resource utilization. As I have stated before, if you plan on using IT resources for a brief period of time, cloud computing makes a lot of sense. Conversely, if you plan on operating IT resources at 90-100% utilization, continuously and year-round, it probably makes sense to acquire and capitalize IT assets instead of choosing “pay per use” cloud computing models.
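
One way to see the utilization argument is to compare a fixed annual cost for owned hardware against a pay-per-use bill that scales with hours consumed. Both rates in the sketch below are hypothetical placeholders, not quotes from any provider; the point is simply where the two lines cross.

```python
# Hypothetical breakeven: pay-per-use cloud vs. owned hardware.
# Both rates are illustrative placeholders, not real provider pricing.
CLOUD_RATE_PER_HOUR = 2.00  # pay-per-use cost of an equivalent server
OWNED_ANNUAL_COST = 10_000  # fully loaded annual cost of an owned server
HOURS_PER_YEAR = 8_760

for utilization in (0.10, 0.25, 0.50, 0.90):
    cloud_cost = CLOUD_RATE_PER_HOUR * HOURS_PER_YEAR * utilization
    cheaper = "cloud" if cloud_cost < OWNED_ANNUAL_COST else "owned"
    print(f"{utilization:>4.0%} utilization: cloud ${cloud_cost:>9,.2f}/yr "
          f"vs. owned ${OWNED_ANNUAL_COST:,}/yr -> {cheaper} wins")
```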

Ultimately, the cloud rent vs. buy decision comes down to more than just the price of servers. Enterprises should be careful to understand their use cases for cloud vs. on-premises IT. In addition, watch for hidden costs, and be wary of TCO calculations that underestimate how much time and effort it really takes to get an IT environment up, running, and performing.
