Data-intensive applications can hobble DevOps velocity, but one IT team keeps fresh test data available to its developers with a tool that rapidly clones large data stores.
Large enterprises lug data-heavy legacy apps with them to DevOps. To keep rapid development on track, teams must take fresh approaches to IT operations at the deepest levels of infrastructure.
For ActiveHealth Management Inc., a New York-based subsidiary of Aetna International, a large health insurer in Hartford, Conn., that problematic app was a 150 TB Oracle database, deployed on a six-node Oracle Real Application Cluster to produce analytics reports on member data. The volume of data on such a complex server infrastructure presented a major obstacle to the company's planned implementation of a continuous DevOps test process in early 2017: a manual refresh of database test data from a traditional backup copy of the cluster would have required an estimated minimum of 350 hours of work, spread over 30 days.
“Our QA team wanted live real-time data in our lower test/dev environments,” said Conrad Meneide, then the vice president of infrastructure at ActiveHealth, now executive director of affiliate infrastructure services at Aetna. “But a 150 terabyte production database takes an insurmountable amount of time to copy, and importing a full copy of that data to a test environment would require a costly storage footprint.”