Chances are high that every enterprise has a mission-critical application built around a database such as Oracle, HANA, MS SQL Server, MySQL, PostgreSQL, Db2 or MongoDB. And these enterprises' #1 priority is to release new features as quickly as possible. Why? Because faster releases reduce time-to-market, let them differentiate themselves from the competition, and deliver their roadmap to customers sooner.
But the #1 application development challenge most enterprises face is that they can't test against high-fidelity copies of these mission-critical databases quickly enough. The digital economy has pushed so much data into these databases that cloning them for Dev/QA/UAT testing is no longer easy. Creating dozens of physical copies is not a viable solution, both because of storage costs and because of the time it takes to create them.
Testing against subsetted copies of production databases reduces cloning time and storage footprint, but it creates a different problem: Dev/QA/Integration teams end up testing against small datasets where the APIs and queries work just fine. The moment the same code runs in a UAT environment against a very large 10+ TB database, they discover defects and performance issues that are hard to fix. They must then either delay the release or ship it with known defects.
So what’s the solution?