Rob Cotter
Chief Technology Officer | Dec 17, 2014

In today’s economy, every business, no matter the industry, must constantly do more with less. The days of “local optimization” are over.

For example, the conventional approach of optimizing the costs and cycle times of a single group within IT does not yield the kind of quantum leap that enterprises need to substantially change their business models.

In fact, it is often beneficial to increase the costs of one group so that revenue can increase or the overall costs of the organization can go down.

I find “local optimization” thinking to be fairly pervasive and, all too often, I see people trying to save money in places they should actually be spending more.

Let’s look at two current technology examples to illustrate: flash storage and cloud computing. Flash storage can be more expensive than SAS or SATA storage, and public cloud can be more expensive than private cloud, yet people frequently use the wrong metric to compare the options.

In the case of flash, people often focus on $/GB. For archive data, that may be the right basis of comparison, but production data calls for completely different metrics. For example: how many software releases could I ship per year? How much could I reduce the customer loss rate on my web site, and how would that affect my revenue?

For cloud computing, the same logic applies. Ask yourself: how can leveraging the cloud increase my ability to deliver services to my customers? How many more releases of software or new features can I introduce per year? How many more market segments can I enter?

To further drive the point home, let’s say that the fixed run rate for a software company is $1M per year and the company is able to release two software products per year. Let’s also say that the software development team believes their bottleneck is a lack of compute resources as well as software build and testing times. They believe that if they could reduce those bottlenecks, they could cut software release schedules in half (i.e. increase to four releases per year). Finally, let’s also say the cost to reduce these bottlenecks is $250,000 per year.
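The scenario above can be worked through directly. This short Python sketch simply restates the arithmetic; all figures are the illustrative numbers from the example, not real data:

```python
# Illustrative numbers from the example: a software company with a fixed
# run rate of $1M/year shipping two releases per year.
base_run_rate = 1_000_000      # annual fixed cost ($)
base_releases = 2              # releases per year today

# Proposed investment to remove compute/build/test bottlenecks.
extra_spend = 250_000          # additional annual cost ($)
new_releases = 4               # releases per year after the investment

cost_per_release_before = base_run_rate / base_releases
cost_per_release_after = (base_run_rate + extra_spend) / new_releases
reduction = 1 - cost_per_release_after / cost_per_release_before

print(f"Before: ${cost_per_release_before:,.0f} per release")  # $500,000
print(f"After:  ${cost_per_release_after:,.0f} per release")   # $312,500
print(f"Reduction: {reduction:.1%}")                           # 37.5%
```

The key observation is that total spend goes up while the unit cost of the output the business actually cares about, a shipped release, goes down.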

Under the old way of doing things in this simple example, the cost per software release is $500K ($1M ÷ 2). Under the new way, total spend rises to $1.25M but yields four releases, so the cost per release drops to $312,500. In other words, by increasing our storage and cloud computing costs by $250,000, we reduced the cost per software release by 37.5%. Clearly this is a fictitious and rather simple example, but the point is important: $/GB or $/CPU-hr may not be the right metric for comparing options; the right metric may well be software releases per year or incremental revenue per year. It behooves us all to step back and look at the big picture. If you are ready to move your business forward, contact us today and let’s talk about it.