Storage: Reducing Costs and Improving Efficiency with Storage Virtualization

By Dick Weisinger

While unstructured data in the enterprise is growing at breakneck speed, the actual dollars enterprises spend on new storage hardware have remained relatively flat over the last 10 years. Moore's law, applied to the increasing density of storage media, seems to be keeping hardware costs down. But Hitachi's Hu Yoshida argues that people aren't looking at the big picture. In reality, storage costs are spiraling because of huge increases in operational costs. Yoshida estimates that companies are seeing operational expenses for storage grow by as much as 78 percent per year.

One cost factor is infrastructure: the cost to power and cool an ever-increasing pool of storage hardware devices is rising rapidly. Another is labor. Companies are increasingly running on 24/7 schedules, which wreaks havoc on maintenance, so maintenance and storage-media migration projects are often scheduled during low-volume off-peak hours, incurring additional labor costs.

Yoshida suggests that companies need to start factoring these kinds of overhead into the true cost of operating a megabyte of storage. Metrics tied solely to the hardware purchase price are deceptive.
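To make the point concrete, here is a minimal sketch (in Python, with invented placeholder figures) of how a per-megabyte cost metric shifts once operational expenses are folded in alongside the purchase price:

```python
# Hypothetical illustration: the "true" cost per megabyte once operational
# overhead (power, cooling, maintenance labor) is included, rather than
# just the hardware purchase price. All figures are made-up placeholders.

def cost_per_mb(hardware_cost, capacity_mb, annual_opex, years):
    """Total cost of ownership per megabyte over the device's lifetime."""
    total_opex = annual_opex * years          # power, cooling, labor, ...
    return (hardware_cost + total_opex) / capacity_mb

purchase_only = cost_per_mb(50_000, 10_000_000, 0, 5)
with_opex     = cost_per_mb(50_000, 10_000_000, 30_000, 5)
print(f"purchase-price metric: ${purchase_only:.4f}/MB")
print(f"with operating costs:  ${with_opex:.4f}/MB")
```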

To reduce the total cost of storage, Yoshida advocates storage virtualization. Such an approach provides much more agility: storage can be provisioned as needed, usage is allocated more efficiently, and there is no need to set aside large blocks of storage for the anticipated future growth of application data. Yoshida estimates that this kind of over-provisioning, with per-application allocations, leaves companies on average with 32 percent of their storage sitting unused, earmarked only for future growth. And with technologies like hot backups, the volumes set aside to replicate the primary copy of the data duplicate that unused space as well.
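A rough sketch of the contrast Yoshida is drawing, using made-up numbers, might look like this: thick provisioning dedicates a growth buffer to each application up front, while thin provisioning draws from a shared virtual pool only as data is actually written.

```python
# Hypothetical sketch contrasting up-front (thick) allocation with thin
# provisioning from a shared virtual pool. App names, sizes, and the 1.5x
# growth buffer are invented for illustration.

apps = {"crm": 400, "erp": 250, "mail": 150}   # actual data written, in GB

# Thick: each app gets a dedicated block sized for anticipated growth.
thick_allocated = {app: used * 1.5 for app, used in apps.items()}

# Thin: storage is drawn from a shared pool only as data is written.
thin_allocated = dict(apps)

thick_total = sum(thick_allocated.values())
unused = thick_total - sum(apps.values())
print(f"thick provisioning: {thick_total:.0f} GB allocated, "
      f"{unused / thick_total:.0%} sitting unused")
print(f"thin provisioning:  {sum(thin_allocated.values()):.0f} GB allocated")
```

With these numbers the thick approach leaves roughly a third of its allocation idle, in line with the 32 percent figure Yoshida cites.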

Yoshida explained that Hitachi's goal in the area of storage virtualization is “to try to create a virtual pool of resources that are easily accessible and can dynamically provision for today’s datacenter environment.”

He explains that IT managers also need better insight into how to rank the relative value of the data they are storing. That would let administrators channel data into tiered storage layers and achieve better storage cost efficiencies.
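As a loose illustration (the tier names, prices, and access-frequency thresholds below are assumptions for this sketch, not anything Hitachi prescribes), such a ranking might route cold data to cheaper tiers based on how often it is accessed:

```python
# Hypothetical sketch: rank datasets by access frequency, then channel
# them to storage tiers with different per-GB costs. All values invented.

TIERS = [("tier-1 (fast disk)", 1.00),
         ("tier-2 (commodity)", 0.40),
         ("tier-3 (archive)",   0.10)]   # (label, $/GB per month)

datasets = [("orders-2024", 500, 950),   # (name, size in GB, accesses/month)
            ("logs-2022",   800,   4),
            ("hr-records",  120,  60)]

def pick_tier(accesses_per_month):
    """Crude value ranking: hot data to tier 1, cold data to archive."""
    if accesses_per_month > 100:
        return TIERS[0]
    if accesses_per_month > 10:
        return TIERS[1]
    return TIERS[2]

for name, size_gb, freq in datasets:
    tier, dollars_per_gb = pick_tier(freq)
    print(f"{name}: {tier}, ~${size_gb * dollars_per_gb:,.2f}/month")
```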
