Accommodating a business’s storage needs all too often creates a snowball effect of infrastructure costs. This is particularly true for businesses mining Big Data for business intelligence purposes – but also in industries such as healthcare, media production, and utilities – where data growth frequently more than doubles the amount of storage required each year. The end result of this massive recurring expenditure is a shift of funds away from projects that actually matter.
Every terabyte of unnecessary media, particularly for high-performing applications, can cost your organization thousands of dollars to acquire, and three to ten times that much over its life cycle. Unfortunately, most businesses go about the storage expansion process the wrong way, and that mistake is costly. The crux of the issue is accurate capacity planning and effective capacity management.
When asked why a 70% storage expansion was planned for this year, one infrastructure manager replied: “because that’s what we did last year.” So why is this method used so frequently? Because it is easy and safe: the size of this year’s ‘bucket’ of available storage simply becomes the basis for next year’s ‘bucket’. All too often, storage expansion becomes a self-fulfilling prophecy disconnected from what actually matters – the individual drivers of new data. By focusing on data growth, rather than storage growth, organizations can trim the fat from their infrastructure budget.
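The contrast between the two approaches can be sketched in a few lines of code. The sketch below is purely illustrative: the data sources, footprints, and growth rates are invented assumptions, not figures from the blueprint, and the 15% headroom is an arbitrary safety margin.

```python
# Hypothetical sketch: project next year's capacity from per-source data
# growth drivers instead of blanket-scaling last year's total "bucket".
# All figures below are illustrative assumptions, not real benchmarks.

# Current footprint (TB) and expected annual growth rate per data source.
sources = {
    "transactional_db": {"tb": 40,  "annual_growth": 0.20},
    "analytics_lake":   {"tb": 120, "annual_growth": 0.50},
    "media_archive":    {"tb": 80,  "annual_growth": 0.05},
}

def project_capacity(sources, headroom=0.15):
    """Sum each source's projected footprint, then add a safety
    headroom to guard against under-provisioning."""
    projected = sum(s["tb"] * (1 + s["annual_growth"]) for s in sources.values())
    return projected * (1 + headroom)

current_total = sum(s["tb"] for s in sources.values())   # 240 TB today
driver_based = project_capacity(sources)                 # 358.8 TB
blanket = current_total * 1.70                           # 408.0 TB ("same as last year")

print(f"driver-based plan: {driver_based:.1f} TB vs blanket plan: {blanket:.1f} TB")
```

Even with generous headroom, planning from the individual growth drivers comes in below the blanket 70% expansion in this example, because slow-growing sources (like the archive) no longer inherit the growth rate of the fastest one.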
This approach sounds great in theory, and it does work in practice, but there is a risk: if growth estimates are too conservative, under-provisioned storage can interrupt your ability to conduct ongoing business. Hence the critical role of accurate, granular capacity planning. The options currently available for capacity planning are either limited in view or unreasonably expensive. This is something we have attempted to address with a new planning tool in our recent blueprint, “Tackle Explosive Data Growth on a Tight Storage Budget” (click on the infographic below to go to the project blueprint).
As a side benefit, understanding the sources and requirements of individual data creators does more than ease the strain on your IT budget. It also lets you make smarter long-term investment decisions around performance and other end-user requirements, because for the first time you truly understand the inner workings of your ‘buckets’.
A change in mindset is the critical first step: move from reacting to data growth to actually mitigating it and shaping how it occurs.