More and more of our customers are using Actifio to optimize their approach to test data management, breaking free of traditional infrastructure limitations to dramatically cut time to market for new applications and features. For an independent perspective on the trend, we asked IDC analyst Melinda Ballou to take a look and share her perspective on critical application development and test functions. In the guest post below, Melinda highlights how software creation and deployment are directly influenced by the quality of data lifecycle management. Citing recent IDC research, she notes how data quality directly affects application development speed, quality, control, and cost. And, as she points out, all of these are addressable through new approaches built on automation and data virtualization.
Optimizing Test Data Management
By Melinda-Carol Ballou, Program Director, Application Life-Cycle Management & Executive Strategies, IDC
Software availability drives business optimization and competitive position on what IDC refers to as the 3rd platform: multi-modal deployment across, and leveraging of, mobile, cloud, social, and big data analytics environments. The 3rd platform is now moving to encompass embedded environments and “systems of systems,” or the Internet of Things (IoT). These trends both drive and depend on evolving software development models and approaches, including agile and DevOps for end-to-end deployment across complex environments that require frequent, rapid, iterative testing and continuous release management. To accomplish this successfully, access to current, reliable data is critical.
It starts with data lifecycle management, which encompasses data management and quality from application inception through software testing and deployment. It covers every use of the data across its lifecycle, up to eventual decommissioning (to the extent that software is ever actually decommissioned).
Poor data lifecycle management creates problems for business-critical, agile, and DevOps approaches to software creation and deployment. Inadequate access to current, accurate data hampers software creation, quality, and release management on many levels, but four primary challenges can stymie a company’s software efforts:
Speed – traditional methods of getting copies of data for development, user acceptance testing (UAT), and quality assurance (QA) teams are sluggish and can slow down production systems. Typically, teams must either pull the data from backup systems (which aren’t built for rapid data access) or hit their production systems with additional I/O, risking impact to live customers and users (not a desirable option).
Quality – because it is slow and difficult to get access to complete, real (not faked or synthetic), current data, testing and development are nearly always out of sync with the “reality” of operational/production data and the actual level of complexity involved.
Control – Regulatory compliance and customer protection demand obfuscation of sensitive data, and weak data control carries security risks of its own. All too frequently, development and test groups hang onto whatever data they can get, storing it under their desks or wherever is most convenient. And according to recent survey information, this data is often not masked or obfuscated, so the risk to the enterprise is a full data breach. All it takes is one disgruntled, underpaid developer with a copy of your customer database and an axe to grind, and the corporate data could be on the Internet, acquired by a shady hacker syndicate, or worse.
Cost – Enterprises frequently combat these issues by building large, expensive hardware infrastructure dedicated solely to development environments, duplicating complete data copies for multiple teams. Yet the required data may still be unavailable in the required form to every team that needs it, so even with hundreds of terabytes consumed by dedicated developer copies, the needs of some developers, testers, or contractors remain unmet.
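The masking the control challenge calls for can be done deterministically, so the same real value always maps to the same pseudonym and referential integrity across tables survives. A minimal sketch of that idea, assuming records are simple Python dicts (the field names, key, and helper functions here are invented for illustration, not any particular product's API):

```python
import hashlib
import hmac

# Hypothetical secret, kept outside the test environment in practice.
MASKING_KEY = b"example-masking-key"

def mask_value(value: str) -> str:
    """Replace a sensitive value with a stable pseudonym.

    HMAC makes the mapping deterministic: the same customer ID masks to
    the same token in every table, preserving joins in the test data set
    without exposing the real value.
    """
    digest = hmac.new(MASKING_KEY, value.encode(), hashlib.sha256)
    return digest.hexdigest()[:12]

def mask_record(record: dict, sensitive_fields: set) -> dict:
    """Return a copy of the record with sensitive fields pseudonymized."""
    return {
        k: mask_value(str(v)) if k in sensitive_fields else v
        for k, v in record.items()
    }

customer = {"id": "C-1001", "email": "jane@example.com", "plan": "gold"}
masked = mask_record(customer, {"id", "email"})
```

The non-sensitive fields pass through untouched, which matters for the quality challenge above: masked test data should still look and behave like production data.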
Inadequate access to current test data can lead to failures of core systems and to the data breaches we see increasingly in public venues (even as the majority of these problems remain hidden but are equally pernicious). These problems are occurring with ever-greater frequency and leading to serious loss of revenue, reputation, and customer trust.
Low-quality data in testing and test management undermines a company’s business: it stretches development timelines and complicates the often difficult internal politics between testing, operations, and development teams.
So, how can organizations address these data management challenges? IDC has seen increasing options for automation to resolve test data management problems using copy data virtualization capabilities [see “Worldwide Automated Software Quality Market Shares, 2014: Acquisitions Dominate Market Transition,” IDC #256714]. One option is to make virtual copies of complete (and automatically maskable) data that are kept current and refreshed from production, with data updated incrementally and iteratively for development, operations, UAT, and test/integration teams.
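The mechanics behind virtual copies can be pictured as a copy-on-write overlay: each team reads through to a shared golden image and keeps only its own changes locally, while incremental refreshes update the shared base for everyone. The sketch below is a conceptual illustration only, not Actifio's implementation; the class names and the dict-based "storage" are invented for clarity:

```python
class GoldenCopy:
    """Stand-in for a point-in-time capture of production data.

    A real copy-data-virtualization product would track storage blocks;
    a dict of key -> value keeps this sketch self-contained.
    """
    def __init__(self, data: dict):
        self.data = dict(data)

    def refresh(self, changes: dict) -> None:
        # Incremental refresh: only changed records are applied,
        # so keeping the golden copy current from production is cheap.
        self.data.update(changes)


class VirtualCopy:
    """A team's writable view of the golden copy.

    Reads fall through to the shared base; writes land in a small
    private overlay (copy-on-write), so no full duplicate is made.
    """
    def __init__(self, base: GoldenCopy):
        self.base = base
        self.overlay = {}

    def read(self, key):
        return self.overlay.get(key, self.base.data.get(key))

    def write(self, key, value):
        self.overlay[key] = value


golden = GoldenCopy({"order:1": "shipped", "order:2": "pending"})
qa = VirtualCopy(golden)    # QA team's copy, no data duplicated
dev = VirtualCopy(golden)   # dev team's copy

dev.write("order:2", "cancelled")        # only dev sees this change
golden.refresh({"order:1": "returned"})  # every copy sees the refresh
```

Because each virtual copy stores only its overlay, many teams can be provisioned against one golden image, which is the storage and speed win the cost and speed challenges above call for.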
Just as IDC has seen virtualization benefit other areas such as servers and networks, test data virtualization and automation bring a variety of potential benefits:
- Hardware cost savings, plus the flexibility to rapidly provision virtual infrastructure and grant access to multiple teams (which can improve efficiency through broader resource availability);
- Management through a single test data management system, which can mean improved support for role-based access controls, data masking integration, and regulatory compliance, leading to better control;
- Current, refreshed, complete data, leading to improved quality;
- Automated workflows, leading to greater responsiveness and increased efficiency.
IDC research indicates that the challenges of increased complexity and high-end development across diverse platforms (mobile, social, embedded) increase software problems and the need for iterative development combined with data lifecycle automation [“Worldwide Automated Software Quality Forecast, 2015–2019: Competitive Velocity to Fuel Future Growth,” IDC #256812]. Poor lifecycle coordination between development, test, and data can mean increased costs, debilitating performance consequences, downtime, customer data breaches, and defects both pre- and post-deployment. Yet across many organizations there is inadequate visibility into and understanding of these issues, even as they battle the complexity of development and deployment across the 3rd platform.
Companies cannot let optimism mask the need for change. They must become better educated about the business impacts and labor costs of poor data lifecycle management. Multimodal sourcing, software design and deployment demand it.
Businesses should evaluate their current status in coordinated data lifecycle process change and automation, with organizational approaches that evolve outmoded testing and data management practices. This need bridges industries: poorly managed, problematic multi-modal software damages brand perception above and beyond the individual failures. The situation demands a response that encompasses data quality to drive business agility.
Calls to action:
- Evaluate your organization’s current strategies for data, development, and quality from design through deployment, then transition to effective data lifecycle processes and to the evaluation and adoption of automated tools
- Establish a combined Data and Quality Life-Cycle strategy that assesses, adopts and leverages test data management automation (along with pragmatic processes) to obtain application quality, compliance, and customer support benefits
- Drive toward an effective quality development and data lifecycle strategy to help cut costs, increase efficiency and business agility, and sustain your brand against competitive challenges