IDC on What’s Driving The Data Deluge

Monsoon. Do you hear the rain? – Monsun (Photo credit: alles-schlumpf)

We’re now producing an always-on data stream that’s being monitored, analyzed, and copied across the enterprise. IDC estimates that last year the amount of information created and replicated surpassed 1.8 zettabytes (ZB), or 1.8 trillion gigabytes (GB), and is expected to grow at a 45% CAGR from 2010 to 2015. Big Data, indeed. And a Big Problem as well.
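To put that 45% CAGR in perspective, here’s a minimal sketch of the compound-growth arithmetic. The function name and the four-year horizon are our own illustration, not from the IDC report:

```python
# Hypothetical sketch: projecting data volume under compound annual growth.
# Starting point (1.8 ZB in 2011) is from IDC; the projection itself is ours.
def project_volume(start_zb: float, cagr: float, years: int) -> float:
    """Compound-growth projection: start * (1 + cagr) ** years."""
    return start_zb * (1 + cagr) ** years

# 1.8 ZB growing 45% per year for 4 years:
print(round(project_volume(1.8, 0.45, 4), 1))  # → 8.0
```

At that rate, the installed base roughly quadruples in four years, which is why the cost curve below matters so much.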

Hoping to get some objective and really smart people to take a closer look at how organizations can reduce their storage infrastructure cost and complexity, we posed a few pointed questions to Laura DuBois, IDC’s Vice President for Storage, including this one:

What’s the root cause of the exploding cost of storage for an IT organization? What’s the effect on business?

In her recent report published here, Laura explains that the data explosion is actually composed of two growth curves: one for live production data, and one driven by the growing number of systems making copies of everything in production, known as copy data. According to her research, the amount of disk storage consumed by copy data (from archive, backup, and recovery) is a huge and growing share of overall capacity, averaging between 30% and 40% of total disk storage installed. Copy data is what’s driving more and more IT budgets into the red, as the costs of storage, licensing, infrastructure maintenance, and the dedicated staff time required to manage all that complexity expand.
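A quick back-of-the-envelope illustration of what that 30%–40% share means in practice. The 500 TB installed base here is an assumed example, not a figure from the report:

```python
# Hypothetical illustration of IDC's copy-data finding: if copies made for
# archive, backup, and recovery average 30%-40% of installed disk, a shop
# with (say) 500 TB installed is dedicating a large slice to copy data.
def copy_data_range(total_tb: float, low: float = 0.30, high: float = 0.40):
    """Return the (low, high) TB range consumed by copy data."""
    return total_tb * low, total_tb * high

lo, hi = copy_data_range(500)  # 500 TB is an assumed example
print(lo, hi)  # → 150.0 200.0
```

In other words, on those assumptions a third or more of the disk budget is going to copies of data, not the production data itself.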

Laura goes on to explain key strategies organizations are using to deal with these issues, and we’re pleased to be among them. But there’s a lot of insight in this report, beyond the usual blah-de-blah about our enabling technology.

Read her recommendations for yourself, and let us know what you think right below, or on Twitter, Facebook, or LinkedIn.
