
We’ve seen the benefits that digital businesses can get by adopting DevOps methods for application development. At the same time, the substantial importance of business applications cries out for the fastest, most efficient development resources. If DevOps is about collaboration and process, application execution and output are all about the power and capabilities of technology.
At or near the top of that list is the ability to capture the productivity gains that come from accelerating Test Data Management (TDM).
In part, Gartner defines TDM as “the process of creating realistic test data for use in development, testing and training.” They also include “support for masking sensitive data.” Their assessment of the current state of TDM in application development is that somewhere between 20% and 50% of organizations are getting it right. That means those developers get up-to-date access to production data, appropriately masked and secured. The other half or more have little or no automation and are using handcrafted data without the necessary security and privacy. Effectively, they are missing out on a significant competitive edge.
Timely access to production data has long been a challenge. To get what they need, and to avoid the delays of provisioning and refreshing full data sets, developers frequently fall back on dummy data. For example, a 500GB sample might be used to represent a 5TB database. As a result, quality issues surface as applications move from development through QA to final User Acceptance Testing (UAT). Realistically, what works perfectly in development and QA environments can still fail the production readiness test.
It’s the dummy data. It doesn’t test at scale. It’s out of date and incomplete. It won’t expose boundary conditions. So problems can go undetected for weeks until the application finally reaches user acceptance. Then it gets sent back. The result is poor quality and long delays before a final product ships. Using low-fidelity test data may seem to save storage space and time, and the resources required for creating, masking, refreshing and processing endless production copies can be overwhelming. However, the negative impact on application quality and speed makes for an unproductive trade-off.
Rapid, well-crafted application development rests on the quality of test data. Gartner’s point, and ours, is that full production data sets should be available for development, testing and QA. The data should be appropriately masked and secured to maintain regulatory compliance. And automation should be in place to refresh it frequently for complete and accurate performance testing.
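To make the masking point concrete, here is a minimal sketch in Python of deterministic masking applied to a production extract before developers touch it. The column names, file names and salt are illustrative assumptions, not part of any specific product; real TDM tooling would apply equivalent transformations automatically on every refresh of the test environment.

```python
import csv
import hashlib

# Columns assumed (for illustration only) to carry personally identifiable data.
SENSITIVE_COLUMNS = {"full_name", "email", "ssn"}

def mask_value(value: str, salt: str = "per-environment-secret") -> str:
    """Deterministically pseudonymize a value so that joins and
    uniqueness constraints still hold after masking."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return digest[:12]

def mask_csv(src_path: str, dst_path: str) -> None:
    """Copy a production extract, replacing sensitive columns with masked values."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            for column in SENSITIVE_COLUMNS & set(row):
                row[column] = mask_value(row[column])
            writer.writerow(row)

if __name__ == "__main__":
    # Hypothetical file names; in practice this step runs as part of each
    # automated refresh, so developers only ever see the masked copy.
    mask_csv("customers_prod_extract.csv", "customers_masked.csv")
```

Because the masking is deterministic, the same production value always maps to the same masked value, so referential integrity across tables survives the transformation and tests behave the same way from one refresh to the next.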
That’s where data virtualization can help. Application development teams frequently struggle to stay cost-effective while still gaining the required speed, flexibility, simplicity and scale. They need immediate business value, lower costs, and improved agility. By capitalizing on data virtualization, they gain a resource to transform physical, infrastructure-bound assets into virtualized, portable, instantly accessible data. In the process they’re controlling cost, risk, time and quality. They’re meeting regulatory pressures and showing senior management they know how to deliver strategic business value.
Virtualization of Test Data Management has reshaped traditional mindsets to accelerate time-to-value capabilities and build new opportunities that extend efficiency and expand profits. It’s just the edge that any board of directors is looking for.