The transition of application development to a comprehensive DevOps environment is a natural but not necessarily smooth progression, and quality is no less important than speed. The intent is simple: accelerate quality software development. Catch and fix bugs early. Minimize last-minute surprises. Release products on time.
If business differentiation comes in part from the quality and sophistication of applications, DevOps helps. It’s designed to remove barriers between operations and engineering in order to improve quality. It can smooth and speed communications and help integrate development, operations, and QA functions. Customer satisfaction improves and profits increase. That’s a lot. It means that DevOps, done right, is a connected, interactive, and collaborative unit that makes businesses more successful. But DevOps is also a classic “easier said than done” target, with some tough organizational and technical challenges.
Timely access to production data has been one of those challenges. To avoid the delays involved in obtaining full data sets, developers have frequently resorted to using dummy data: a 500 GB sample might be used to represent a 5 TB database. But quality issues pop up as applications move from development through QA to final User Acceptance Testing (UAT). What works perfectly in development and QA environments can still fail the production readiness test.
It’s the dummy data. It doesn’t test at scale. It’s out of date and incomplete. It won’t expose boundary conditions. So problems can go undetected for weeks, until the application finally reaches user acceptance testing and gets sent back. The result is poor quality and long delays in getting to a final product.
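The boundary-condition problem above can be sketched with a small, entirely hypothetical example: validation logic that passes against a development sample but fails against the full data set, because a rare boundary value only exists at production scale. The data, the 32-bit limit, and the function name are illustrative assumptions, not anything from a real system.

```python
import random

# Hypothetical "production" table of 1,000,000 amounts, plus one rare
# boundary value that only exists at full scale (illustrative data only).
random.seed(42)
full_data = [random.randint(1, 99_999) for _ in range(1_000_000)]
full_data.append(2_147_483_647)  # boundary row lurking in production

# A small development subset (analogous to the 500 GB sample of a 5 TB
# database). Taking the first rows keeps the example deterministic; the
# boundary row at the end is excluded, just as sampling tends to miss it.
sample = full_data[:10_000]

def fits_in_32_bit_cents(amount):
    # Hypothetical validation rule: amount in cents must fit a signed 32-bit int.
    return amount * 100 <= 2_147_483_647

print(all(fits_in_32_bit_cents(a) for a in sample))     # True: dev "passes"
print(all(fits_in_32_bit_cents(a) for a in full_data))  # False: fails at scale
```

The check succeeds throughout development and QA, then fails the first time the application meets the full data set, which is exactly the failure mode dummy data invites.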
Take the example of an insurance company developing a new broker application. The dummy data sample didn’t adequately represent the actual broker segmentation and geographic regions, but that wasn’t discovered until final testing with a full data set sent the whole project back to the beginning. Everyone wants high quality in the final product. But to achieve it, test and production environments need to match as closely as possible, including data sets. Consistency and speed depend on careful infrastructure build, deployment, and management.
In the old model, interactions between development and QA were frustrating, even for discovering, demonstrating, and correcting simple application problems. For example, say a QA engineer needs to reproduce a bug for a developer. He can’t afford to waste time, but he also doesn’t want to disturb the environment until the developer sees it. Now suppose they’re in different time zones. It’ll be a long coffee break.
Managing that same scenario with Actifio, the QA engineer can freeze the environment for developer review. At the same time, QA can remount the original testing snapshot and continue working without a break. The same approach works when production support needs to reproduce and debug problems in an active customer-facing application: a near-instant copy can be created to quickly find the root cause and get it fixed.
Application development is a complex and demanding process. Anything that makes it simpler is welcome and appreciated. With access to multiple near-instant virtual data copies, quality issues and time delays are substantially reduced. With minimal storage consumption, applications gain the fundamentals of scalability, consistency, data control, automation, and ease of use. All of that helps DevOps improve quality while accelerating development and release cycles. That’s a lot.