
Using Analytics to Rein in Storage Costs
Get the most bang for your storage buck

by Steve Polilli, Media Relations, SAS | Monday, May 02, 2011

Article Written by Stephen Sanger and Robert Woodruff

The current economic environment is pressuring businesses to squeeze savings out of existing IT assets. Storage, in particular, has become one of the fastest growing IT expenditures due to the exponential increase in an organization’s structured and unstructured data. Cloud computing, virtualization, and myriad complex storage technologies further complicate how storage is being consumed and monitored. 

However, the emerging use of business analytics to optimize storage environments is giving IT managers the ability to go beyond typical storage monitoring. Analytics generates the intelligence needed to optimize corporate storage environments: a clearer understanding of the business value derived from every corporate data asset, along with the insight to develop and drive storage policies for provisioning, tiering, archiving, and data retention.

Furthermore, analytical insight supports decisions about buying and deploying the right amount of IT storage at the right time, while optimizing existing investments, realizing cost savings, and ensuring that internal and external customer service level commitments are met.

According to the recent Gartner report, Controlling the Cost of Storage Growth Requires Storage Modernization¹, “the implications of this are obvious – if users want to control costs resulting from the growth of storage, modernization of the storage environment is required. However, it's important to ‘look before leaping’, which often requires improvements in processes and management.”

To effectively optimize storage, companies need to ensure that the storage allocated to business units or functions is actually being used optimally and, through the use of analytics, determine the value of the data being stored. This, however, is the challenge. To make truly informed storage optimization decisions, IT managers need to go beyond basic storage management and understand who is using what data, how it is being used, and what it costs.

Across all industries, organizational processes create a lot of data and files that are often redundant and inefficiently stored – not to mention backed up. The storage industry today is touting tiered storage as a means of controlling costs by allocating categories of data to different types of storage media. While tiering can help control costs, it is not appropriate in every situation and, applied indiscriminately, can make matters worse. This is where storage analytics can help.

With analytics, IT managers can see whether departments or users are making optimal use of the space they have been allocated. From there, they can determine whether there is a pressing need to purchase additional storage or implement tiering strategies – be it more servers or cloud computing services – or whether the existing environment can be optimized through thin provisioning or data de-duplication, otherwise known as buying back space or “living off the fat”.
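To make this concrete, here is a minimal Python sketch of that kind of utilization check. The department names, capacities, and thresholds are hypothetical; a real implementation would pull these figures from storage management tools rather than hard-coding them.

```python
# Illustrative sketch, not SAS's implementation: given per-department
# allocated vs. used capacity, flag where space can be reclaimed through
# thin provisioning or de-duplication before buying more storage.
# All figures and thresholds below are hypothetical.

allocations_gb = {
    "finance":   {"allocated": 2000, "used": 600},
    "marketing": {"allocated": 1500, "used": 1400},
    "rnd":       {"allocated": 4000, "used": 1100},
}

RECLAIM_THRESHOLD = 0.50   # below 50% utilization: "live off the fat"
EXPAND_THRESHOLD = 0.85    # above 85% utilization: consider buying or tiering

for dept, cap in allocations_gb.items():
    utilization = cap["used"] / cap["allocated"]
    if utilization < RECLAIM_THRESHOLD:
        action = "reclaim via thin provisioning / de-duplication"
    elif utilization > EXPAND_THRESHOLD:
        action = "evaluate additional storage or tiering"
    else:
        action = "no action"
    print(f"{dept:10s} {utilization:5.0%}  -> {action}")
```

The point of even a toy model like this is that the decision to buy is driven by measured utilization, not by a department's request for more space.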

Looking at the right data is the key to effective storage analytics. Along with pulling in data from storage devices, data is also derived from the associated servers, workstations and applications using the storage. Perhaps the most important information needed is the business data describing how storage has been allocated and configured relative to the needs of the organization. Known as metadata, this information relates storage to computers, applications, departments, customers and business functions, and it contains the information needed to calculate true costs and charge-back. When these data sources are combined, an accurate view of storage utilization emerges.
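As a rough illustration of how those sources combine, the following Python sketch joins device-level usage with business metadata to produce a simple per-department charge-back view. The volume names, tiers, and per-gigabyte rates are invented for the example.

```python
# Illustrative sketch: join device-level usage with business metadata
# to produce a per-department charge-back view. Field names and rates
# are hypothetical assumptions.

usage = [  # as reported by storage devices / servers
    {"volume": "vol01", "used_gb": 800},
    {"volume": "vol02", "used_gb": 250},
    {"volume": "vol03", "used_gb": 1200},
]

metadata = {  # business metadata: who owns which volume, at what tier
    "vol01": {"department": "finance",   "tier": "fast",    "cost_per_gb": 0.90},
    "vol02": {"department": "finance",   "tier": "archive", "cost_per_gb": 0.10},
    "vol03": {"department": "marketing", "tier": "fast",    "cost_per_gb": 0.90},
}

chargeback = {}
for row in usage:
    meta = metadata[row["volume"]]
    cost = row["used_gb"] * meta["cost_per_gb"]
    chargeback[meta["department"]] = chargeback.get(meta["department"], 0.0) + cost

for dept, cost in sorted(chargeback.items()):
    print(f"{dept}: ${cost:,.2f}/month")
```

Without the metadata layer, the device data alone can say only how full a volume is; it is the join that turns raw capacity numbers into costs a business unit can be held accountable for.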

Attention is increasingly turning to private clouds as a more cost-effective, secure alternative to public clouds for delivering services on demand. SAS has deployed its own private cloud – originally to support the construction of software demonstrations – and it now supports the education department, research and development testing, marketing, partners, and technical support services. As this environment grows to meet an array of ever-changing requirements, large chunks of storage must be purchased, but only after analytics have confirmed that the existing environment is optimized and the extra space is warranted.

When it comes to monitoring the storage capacity and utilization of a cloud environment, matters are further complicated by multiple sources of data and layers of abstraction: storage, network, SAN fabric, virtualization, provisioning and the environment's orchestration layer. This is where analytics can cut through the complexity and inform optimization decisions. The metrics needed to optimize a cloud environment are different from, and more numerous than, those of traditional computing models. To optimize the entire environment, data from all of these sources can be brought into a Capacity Management Database (CMD), where analytics can be applied to achieve economies of scale while meeting service level commitments. In a virtualized environment, analytics are also effective at determining how much of a server can be partitioned to serve another function without impacting its overall performance.
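A minimal sketch of that consolidation step might look like the following, which uses an in-memory SQLite database as a stand-in for a real Capacity Management Database. The layer names, resources, and figures are hypothetical.

```python
# Illustrative sketch: consolidate capacity metrics from several layers
# (storage array, SAN fabric, hypervisor) into one capacity management
# database so analytics can run across all of them at once.
# Table, column, and resource names are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a persistent CMD
conn.execute("""CREATE TABLE capacity_metrics (
    layer TEXT, resource TEXT, capacity_gb REAL, used_gb REAL)""")

samples = [
    ("storage_array", "array01", 50000, 41000),
    ("hypervisor",    "esx01",    8000,  5200),
    ("provisioning",  "pool01",  12000,  3100),
]
conn.executemany("INSERT INTO capacity_metrics VALUES (?,?,?,?)", samples)

# Once the layers share one schema, a single query answers a
# cross-layer question: where is utilization highest?
for layer, resource, pct in conn.execute("""
        SELECT layer, resource,
               ROUND(100.0 * used_gb / capacity_gb, 1) AS pct
        FROM capacity_metrics ORDER BY pct DESC"""):
    print(f"{layer:15s} {resource:10s} {pct}% used")
```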

When deciding whether to invest in a public or private cloud environment, or simply stick with a traditional computing model built on dedicated servers, organizations need to understand the value of the data they hold, whether a public cloud will offer the right level of security for that data, and which environment will provide the best economies of scale. Analytics can be used to forecast the growth rate of an organization's data and compare the total cost of ownership of storing it in a traditional environment versus a public cloud.
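The forecasting side of that comparison can be sketched very simply. The Python example below fits a compound monthly growth rate to hypothetical historical capacity figures and projects a 36-month cost comparison; the growth data and per-terabyte rates are assumptions, not published prices.

```python
# Illustrative sketch: fit a compound growth trend to historical
# capacity and compare projected total cost of ownership (TCO) for
# in-house storage versus a public cloud. All figures are assumed.

monthly_used_tb = [40, 44, 49, 54, 60, 66]  # past six months of usage

# Average compound month-over-month growth rate.
growth = (monthly_used_tb[-1] / monthly_used_tb[0]) ** (1 / (len(monthly_used_tb) - 1))

HORIZON_MONTHS = 36
IN_HOUSE_PER_TB = 25.0  # $/TB/month incl. hardware, power, admin (assumed)
CLOUD_PER_TB = 40.0     # $/TB/month public-cloud rate (assumed)

in_house = cloud = 0.0
tb = monthly_used_tb[-1]
for _ in range(HORIZON_MONTHS):
    tb *= growth  # project next month's footprint
    in_house += tb * IN_HOUSE_PER_TB
    cloud += tb * CLOUD_PER_TB

print(f"Projected growth: {growth - 1:.1%}/month")
print(f"36-month TCO  in-house: ${in_house:,.0f}   cloud: ${cloud:,.0f}")
```

A production forecast would of course use a proper time-series model and real cost data, but even this simple projection shows how quickly compounding growth dominates the decision.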

There are many ways to slice, dice and augment a storage environment to optimize space and increase performance, while lowering expenses. But true optimization can only be realized by applying robust analytics, combining multiple data sources to understand the data you have, its value and where it should ultimately reside.

About the Authors

Stephen Sanger has spent over 18 years at SAS and currently manages the Service Availability Management and Center for IT Analytics teams. He specializes in applying IT Service Management (ITSM) and IT analytics best practices to SAS's environment. He also provides domain expertise for the development, testing, and marketing of SAS's IT-specific products, including creating case studies and speaking engagements. He holds a BS in Human Factors Engineering.

Robert Woodruff has many years of experience designing, building and leading complex software engineering and analytics projects, and currently serves as a Senior Analytic Engineer with SAS's Center for IT Analytics. An expert with SAS's Enterprise BI Server environment, he holds a BS in Computer Science and an MBA.

 

1. Gartner, Controlling the Cost of Storage Growth Requires Storage Modernization, March 15, 2010, G00173027.
