The concept of combined development and operations (DevOps) has emerged from Agile approaches as the preferred method for enterprise application development. It's a time-to-market accelerator, and it has clearly shown better innovation results than earlier sequential practices. It's also a model with historical precedent. For more than four millennia, a flexible "design-build" project delivery system has been used in building construction. "Master builders" have used it to cut expenses, accelerate schedules, and encourage innovation through a flexible construction process. So, if you're adopting a DevOps building framework, there are a few thousand years of validation to back you up.
Because applications are integral to business transactions, there's a heavy burden on application developers to deliver modern, integrated, worry-free applications in very short time frames. And they need to iterate quickly with new features, functions, and competitive benefits. It's about higher quality and faster advantage over the competition. They need simple and direct access to all of the technical resources, and they need access to production data. Their ability to access needed data quickly and efficiently affects business flow, which in turn affects revenue and the lifecycle, the very viability, of an enterprise.
While DevOps and associated Agile approaches are proving their acceleration value, much of the discussion has focused on how to make a transition from prior development models. How do we maintain skills? How do we productively blend operations and development expertise? What cultural and organizational shifts are required? What are the processes and tools that can facilitate those changes? And embedded in all of that are lots of questions about who does what. Is it cooperation among teams or a wholly new and integrated team?
Then there is the data. More than any functional or technical consideration, applications are about data, and developing and testing new code depends on easy access to it. To get that access, we need to rethink the functional relationships that govern storage, compute, network, and systems administration of all kinds. DevOps is about simultaneous advances, the continuous improvement made possible by freeing software from hardware. So applications, new or legacy, need data separated from the physical architecture, and that requires new data management tools and practices. Until simple and rapid data access is a routine matter, limits on development speed and application mobility will remain.
The way most development progresses today, gaining access to a single database copy entails multiple steps and weeks of waiting. It's even harder with today's capacities measured in terabytes and petabytes. Gaining access to large data sets typically requires multiple teams to provision storage, network, and OS before the app developer can get at the data. It's a technical problem, a security problem, and a business problem. So we start by getting the data. Sources will vary, touching virtual and physical platforms, databases, hypervisors, and operating systems. If storage and compute resources and security permissions are not already in place, the timeline stretches even longer and costs increase. If the database is very large, the timeline extends again and may involve added infrastructure costs.
Any sensitive data must be protected. That can mean intensive manual work for data masking to protect privacy and ensure security. Reliable data control requires a hard line between production and dev/test environments, and a fully integrated workflow is needed to guarantee that only correctly masked data is accessible. If masking is skipped or ignored, there can be no confidence that data will be protected from hacking or potentially harmful leaks.
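To make the masking idea concrete, here is a minimal sketch of the kind of transformation a masking step applies before production data reaches a dev/test copy. The field names and masking rules are hypothetical illustrations, not Actifio's implementation; real masking tools also preserve data formats and referential integrity across tables.

```python
import hashlib

# Hypothetical masking rules for a couple of sensitive fields.
# The goal: values in dev/test copies must not reveal the
# original production values.

def mask_email(value: str) -> str:
    """Replace the local part with a stable hash; keep the domain."""
    local, _, domain = value.partition("@")
    digest = hashlib.sha256(local.encode()).hexdigest()[:8]
    return f"user_{digest}@{domain}"

def mask_ssn(value: str) -> str:
    """Keep only the last four digits."""
    return "***-**-" + value[-4:]

def mask_record(record: dict) -> dict:
    """Return a copy of the record with sensitive fields masked."""
    masked = dict(record)
    if "email" in masked:
        masked["email"] = mask_email(masked["email"])
    if "ssn" in masked:
        masked["ssn"] = mask_ssn(masked["ssn"])
    return masked

row = {"id": 42, "email": "jane.doe@example.com", "ssn": "123-45-6789"}
print(mask_record(row))
```

Because the email hash is stable, the same production address always maps to the same masked address, so joins and lookups in test data still behave consistently even though the real value is gone.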
With all of the delays and multiple steps, businesses are understandably impatient. Missed opportunities mean lost revenues. They see costs and inefficiencies further compounded when more applications are added and the number of development teams increases. Later, after releasing applications into production, the delays continue in change management, approval, and provisioning processes.
Now, imagine a change.
Imagine you have a data virtualization platform that already has copies of the production data. Imagine that you can non-disruptively access those copies and repurpose them for development and test environments. Imagine all of the hardware and software pieces virtually in place and ready with security and data masking already set.
No need to imagine. As one Actifio client put it: "I heard about Actifio and my first thought was 'It can't be that easy.'" But actually, it can be.
We’ll show you how in next week’s post.