Survey: Storage budgets shrink while capacity balloons
July 3, 2014
Storage budgets are down while capacity balloons, and flash storage use is widespread but not yet in all-flash arrays.
Meanwhile, there is significant interest in private and public cloud, though not yet among a majority, and most remain unconvinced by the idea of the software-defined datacentre.
Those are the findings of the latest wave of interviews conducted by 451 Research’s TheInfoPro service with storage and IT professionals at more than 250 mid-sized and large enterprises worldwide.
Below are the top 10 takeaways from the research.
Storage budgets still shrinking
Average storage budgets have shrunk for the second successive year. Large enterprise budgets dropped by an average of 22% to $14.5m, while mid-sized enterprise budgets fell by 19% to $1.3m.
At the same time, capacity continues to grow, and storage continues to account for an increasing portion of the overall IT budget – now up to 13.5% on average, versus 9.5% last year – which suggests storage budgets are not falling as fast as IT spending overall.
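To see why, here is a minimal sketch of the arithmetic; the symbols below are introduced purely for illustration and are not survey figures. Let $S$ be storage spend, $I$ total IT spend and $p = S/I$ storage’s share of the budget. If storage spend falls to $S' = (1-\delta)S$ while its share rises to $p' > p$, then

\[
\frac{I'}{I} = \frac{S'/p'}{S/p} = (1-\delta)\,\frac{p}{p'} < 1-\delta ,
\]

so total IT spending must have fallen by a larger fraction than storage spending did.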
But that does not change the fact that enterprise storage appears to be entering a new phase of constrained spending.
Data growth the biggest pain point
The number one pain point for storage professionals, by some distance, is dealing with rapid capacity growth. The average storage professional manages 285TB of primary storage capacity, compared with 260TB in last year’s study, and 215TB in 2012.
Therefore, it’s no surprise that storage pros spend most of their time dealing with tech refreshes and capacity expansion. On average, organisations spend as much of their storage budgets on operational overhead as they do on actual products. And migrating data between systems during refreshes/upgrades is a particularly challenging issue.
Storage optimisation technologies such as thin provisioning, data deduplication and compression are now common functions. As the data deluge increases, the question is whether these technologies are enough to stem the tide of data growth, or whether software-defined approaches to enterprise storage, which promise to relieve the management burden as much as the capital expense, should now be considered.
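To give a flavour of how one of these functions claws back capacity, below is a minimal sketch of content-hash block deduplication in Python. The fixed 4KB chunk size, function names and sample data are illustrative assumptions rather than details of any particular product; production systems typically use variable-length chunking and persistent metadata.

import hashlib

CHUNK_SIZE = 4096  # illustrative fixed block size; real systems often use variable-length chunking

def dedupe(data, chunk_size=CHUNK_SIZE):
    # Split data into fixed-size chunks and store each unique chunk only once.
    # Returns a chunk store keyed by content hash plus a recipe (list of hashes)
    # from which the original data can be reassembled.
    store = {}
    recipe = []
    for offset in range(0, len(data), chunk_size):
        chunk = data[offset:offset + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # identical chunks are stored only once
        recipe.append(digest)
    return store, recipe

def restore(store, recipe):
    # Reassemble the original data from the recipe of chunk hashes.
    return b"".join(store[digest] for digest in recipe)

if __name__ == "__main__":
    # Highly repetitive data, as in backups of many similar systems, dedupes well.
    data = (b"A" * 8192 + b"B" * 4096) * 100
    store, recipe = dedupe(data)
    assert restore(store, recipe) == data
    logical = len(data)
    physical = sum(len(chunk) for chunk in store.values())
    print("logical %d bytes -> physical %d bytes (%.0f:1 before metadata)"
          % (logical, physical, logical / physical))

On the highly repetitive sample data, of the kind produced by backing up many similar systems, the sketch reports a reduction ratio well above 100:1 before metadata overhead, which is why real-world deduplication ratios depend so heavily on how redundant the workload is.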
Backup and DR still high on the agenda
Backup and disaster recovery (DR) feature prominently as major storage projects. Though they lag tech refresh/capacity expansion by some distance, the level of focus here reflects the degree to which technologies such as cloud and virtualisation have affected data protection and business continuity functions.
Research respondents said more staff time is spent on backup administration than on storage administration.
Adoption of data deduplication technologies remains strong, while deployment of dedicated backup appliances is also growing. Meanwhile, the term copy data management is entering the storage professional’s vocabulary as an increasingly visible issue.
Incredibly, one in five respondents said their business had no disaster recovery plan in place, even though 40% told us they had suffered an unplanned storage outage in the past 12 months.
Storage performance a growing issue
Storage performance is increasingly cited as a challenge, with 21% more respondents citing performance as a key pain point in this study than in last year’s.
This seems to be particularly the case in server and desktop virtualisation scenarios. Notably, IT organisations are not looking for performance at any cost, but rather performance balanced with affordability.
To support this, “implementing flash” emerged as a major project for the first time in the latest study. Adoption of flash in all its forms remains strong, particularly as a tier in a hybrid array. Databases, virtual desktop infrastructure (VDI) and analytics are the top three applications for which flash is deployed.
2014 is the year of the all-flash array
TheInfoPro service has said for some time that flash changes everything in enterprise storage. The latest research provides proof points that this is starting to filter its way into real-world deployments.
The driver? Organisations say storage performance is a growing pain point while the all-flash array has captured the imagination of the industry as a solution.
Though most organisations have already deployed flash as a tier in existing arrays, all-flash arrays are top of TheInfoPro Wave 18 Heat Index, leaping 16 places from a year ago.
Just 8% of organisations have an all-flash array in use today, but 22% intend to implement the technology in the next 18 months, which is up substantially from a year ago.
Pure Storage was the second most exciting supplier overall for storage professionals, marking the first time a startup has registered such a lofty position. Overall, flash technologies occupy three of the top four places on this year’s Heat Index.
Private cloud storage remains hot for a minority
Cloud storage is heavily focused on private cloud (ie, on-premises). It’s the second hottest technology on the Heat Index for the second year running.
However, homegrown offerings still rank highly, implying suppliers do not yet offer complete solutions, although IT organisations do view storage virtualisation products as an important component.
Larger enterprises are the early adopters of cloud storage, having the skills in-house to configure and deploy systems. Meanwhile, object storage suppliers attract some interest as a cloud technology, supporting TheInfoPro’s thesis that object storage is poised for broader enterprise-level adoption.
Public cloud still to take off
Public cloud storage is still exploratory for many enterprises, though there is a difference in adoption levels between large and mid-sized businesses.
Small and mid-sized enterprises, with smaller IT teams, seem to be quicker to implement external cloud storage, especially for file-sharing and synchronisation purposes. Here, interest and adoption are around the same for external cloud as for private cloud.
Converged infrastructure interest stalled?
The notion of converged infrastructure remains a hot topic in the server and networking worlds, though interest at the storage level showed signs of slowing.
This is not to say there isn’t interest – around 19% of enterprises have already adopted converged offerings – and only a third of respondents indicated their organisations do not have a strategic goal of convergence.
However, 25% of respondents identified organisational limitations as among the inhibitors to moving to converged systems. That proportion declined from the previous year’s results, but still beat out cost, the next highest factor.
For these customers, the transition will have to be led by a motivated CIO or CFO, or guided by a partner that can smooth the way by transforming processes for them.
Offerings from Cisco and VCE are the most widely deployed converged infrastructure products. As converged offerings mature and as a new wave of hyper-converged products hits the market in 2014 – most notably VMware’s VSAN – we’ll be watching this space closely.
Most sceptical about software-defined datacentre
Software-defined is part of the lexicon in 2014, but it has yet to assume the status of a clear strategic goal. Only 17% of interviewees agreed they are strategically planning to move to a software-defined datacentre.
The majority do not yet plan to move towards a software-defined datacentre, but this outlook is changing. During this study, we asked storage professionals to define software-defined storage. In the first quarter, the answers were predominantly sceptical, with interviewees characterising software-defined storage as marketing hype.
But in the second quarter, real definitions began to appear. The three most common definitions of software-defined storage are: storage virtualisation (abstracting the location of the data); storage managed from the virtual server layer; hardware abstraction (abstracting the requirements of the underlying hardware).
It is interesting to note that only one of these specifies management, and in that case the management occurs at a higher level in the stack. Increasingly, hypervisors and hypervisor-level management tools are moving data and even deciding where data is stored. Storage and data management is increasingly moving to a virtual machine-centric model.
Tape gives way to disk as backup target
Predictions of the death of tape are almost as old as the industry itself. But, could a combination of disk-based backup and archiving, object storage and a cloud-based business model finally combine to deliver on this prediction?
Probably not, but organisational reliance on tape for backup purposes appears to be waning dramatically. The research found that tape accounted, on average, for just 40% of an organisation’s backup capacity. That’s down from 54% in the previous study, and the first time it has dipped below half.
The economics of tape at significant scale mean it still has a useful purpose for some long-term data retention requirements, but the emergence of economical disk-based approaches is certainly marginalising tape’s role in backup.