
Applications are central to the efficient and successful operation of every business and government agency. Their success depends on four attributes: Speed, Quality, Control, and Cost. Speed is fundamental to timely execution of objectives. Quality is vital to successful implementation and long-term viability. Control, the command of data use, security, access, and process, is crucial to safe operations. And cost, of course, is a pivotal consideration in nearly all budget decisions.
We’ve addressed each of these four elements in previous blog posts. Our customers see how DevOps can apply data virtualization to advance the management priorities that propel successful application development. The growing interest in Actifio’s contribution to enhanced DevOps prompted us to reprise some key points here. Links to earlier posts are included in each section below.
Speed
The need for speed is critical to DevOps, and the pressure for faster execution steadily increases. Rapid application development and continuous refresh are high priorities. Responding quickly to new requirements, improving capabilities, and accelerating application development all advance business objectives. That means taking advantage of every potential enhancement to DevOps that helps drive quality, innovation, efficiency, and speed. Always speed.
Standard DevOps procedures haven’t kept pace with the need for speed. But now, with data virtualization enabling immediate data access, long delays can be eliminated. Data virtualization reduces the time and cost of finding and reproducing problems, and of fixing and testing patches. With a self-service environment, developers get near-immediate data access and simple, regular refreshes whenever they want them. No waiting.
For DevOps, it’s an alternative that allows rapid development, testing, release, and refresh of applications, with as much as a 90% reduction in provisioning times. Data virtualization gives application development an exceptional edge.
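To make the speed point concrete, here is a minimal Python sketch of the copy-on-write idea behind copy data virtualization: a clone records pointers to shared blocks instead of duplicating data, so provisioning is near-instant and only changed blocks consume new space. The class and method names are illustrative only, not Actifio’s implementation or API.

```python
# Minimal sketch of the copy-on-write idea behind copy data virtualization.
# Class and method names are illustrative only, not any vendor's actual API.

class GoldenCopy:
    """One physical 'golden' copy of production data, stored as blocks."""
    def __init__(self, blocks):
        self.blocks = dict(enumerate(blocks))  # block_id -> data

class VirtualClone:
    """A near-instant clone: shares the golden copy's blocks until written."""
    def __init__(self, golden):
        self.golden = golden
        self.overrides = {}  # only the blocks this clone has changed

    def read(self, block_id):
        # Reads fall through to the shared golden copy unless overridden.
        return self.overrides.get(block_id, self.golden.blocks[block_id])

    def write(self, block_id, data):
        # Writes consume new space only for the changed block.
        self.overrides[block_id] = data

golden = GoldenCopy(["cust-001", "cust-002", "cust-003"])
dev_clone = VirtualClone(golden)              # provisioning copies nothing
dev_clone.write(1, "cust-002-TEST")
assert dev_clone.read(1) == "cust-002-TEST"   # the clone sees its change
assert golden.blocks[1] == "cust-002"         # production data is untouched
```

In this model, creating a clone takes constant time regardless of data size, which is why provisioning delays largely disappear.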
It means speed.
Quality
Transitioning application development to a comprehensive DevOps environment is a natural but not necessarily smooth progression, and quality is no less important than speed. The intent is simple: accelerate quality software development. Catch and fix bugs early. Minimize last-minute surprises. Release applications on schedule.
DevOps, done right, overcomes organizational and technical challenges. It’s a connected, interactive and collaborative unit, focused on the quality that makes businesses more successful.
Because virtualized data provides timely access to production data, it resolves a persistent quality challenge by eliminating the use of dummy data. That means fewer bugs popping up as applications move from development through QA to final User Acceptance Testing (UAT). Testing with complete data sets exposes boundary conditions and defects that were previously difficult to spot. Problems that would have gone undetected for weeks are corrected early, eliminating long delays.
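As a simple illustration of the boundary-condition point, consider this hypothetical Python sketch: hand-made dummy data is too clean to trigger a defect, while a production-like data set, with its empty and international records, exposes it immediately. The function and the sample records are invented for this example.

```python
# Hypothetical sketch: a defect that clean dummy data hides but a
# production-like data set exposes. Function and records are invented.

def normalize_phone(raw: str) -> str:
    """Naive normalizer that assumes every record is a 10-digit US number."""
    digits = "".join(ch for ch in raw if ch.isdigit())
    return f"({digits[0:3]}) {digits[3:6]}-{digits[6:10]}"

dummy_data = ["5551234567", "555-987-6543"]               # hand-made, all valid
production_like = ["5551234567", "", "+44 20 7946 0958"]  # real-world variety

for rec in dummy_data:
    print(normalize_phone(rec))   # all outputs look correct; the bug hides

for rec in production_like:
    # The empty and non-US records silently produce malformed output;
    # exactly the boundary conditions full data sets expose before UAT.
    print(normalize_phone(rec))
```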
Control
The third of our four essential application development elements is control. Quite simply, it means that data is available to authorized users for sanctioned purposes and fully restricted from any unauthorized use or purpose. We know that developers get more complete and accurate results when working with full production data sets, and copy data virtualization gives them that access, both immediate and secure. Creating fewer copies diminishes the chance of illicit access; masking sensitive data and maintaining an audit trail reduce the overall risk further.
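As a rough illustration of how masking and auditing fit together, here is a minimal Python sketch that masks sensitive fields before handing a data set to a developer and records the action in an audit trail. The field names, hashing scheme, and log format are assumptions for this example, not a description of any product’s internals.

```python
# Minimal sketch of masking plus an audit trail on a provisioned copy.
# Field names, hashing scheme, and log format are illustrative assumptions.

import hashlib
import json
from datetime import datetime, timezone

SENSITIVE_FIELDS = {"ssn", "email"}

def mask(value: str) -> str:
    # One-way hash: masked values stay consistent but are not reversible.
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def provision_masked_copy(rows, user):
    masked = [
        {k: mask(v) if k in SENSITIVE_FIELDS else v for k, v in row.items()}
        for row in rows
    ]
    # Stand-in for an append-only audit log entry.
    audit_entry = {
        "user": user,
        "action": "provision_masked_copy",
        "rows": len(rows),
        "at": datetime.now(timezone.utc).isoformat(),
    }
    print(json.dumps(audit_entry))
    return masked

rows = [{"name": "Ada", "ssn": "123-45-6789", "email": "ada@example.com"}]
print(provision_masked_copy(rows, user="dev-team-1"))
```

The same pattern generalizes: every provisioning event leaves a record, and sensitive values never reach the developer in the clear.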
Audit logs and access controls work alongside additional security protocols such as intrusion detection and integrity monitoring. Essential technical standards pair with multiple levels of data security to address physical, virtual, and cloud environments. It’s fast, simple to operate, and reinforces broader enterprise security strategies.
All of these secure, efficient capabilities are available for local, remote, or cloud development, including replication optimization, continuous updates, and automated data masking. They can be deployed efficiently in remote office locations or in a secure cloud. The critical thing is that data stays under control.
Cost
When data virtualization technology is applied to DevOps, capacity requirements shrink by terabytes. Costly purchases are averted. Storage space can be reclaimed or reallocated, and budgets repurposed to recover expenses that would otherwise be wasted. Self-service access reduces DBA workloads. And energy consumption falls along with the data center storage footprint. The truism of IT budgets always applies: “Do more with less.” Data virtualization opens the potential for new thinking – even transformational change.
We know that adopting DevOps is a highly effective way to improve application quality and time-to-market. Adding data virtualization to that model only compounds the benefits. A single golden copy replaces many physical copies. Instead of multiplying physical copies that drive up costs, virtualized data avoids massive storage expenses and decreases or even eliminates burdens previously placed on DBAs. It introduces self-service, near-instant access to automatically provisioned database environments, along with masked data sets. In effect, costs are reduced in multiple dimensions: staff costs, capital costs, delay costs, complexity costs, and, most important in the eyes of many agency leaders, the cost of time.
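A back-of-the-envelope calculation shows the scale of the storage arithmetic behind the golden-copy point. All figures below, the database size, the number of team copies, and the change rate, are assumptions chosen for illustration, not measured results.

```python
# Back-of-the-envelope storage arithmetic for the "single golden copy"
# point above. All figures are assumptions for illustration only.

db_size_tb = 5            # assumed production database size
team_copies = 8           # assumed copies: dev, QA, UAT, analytics, etc.
change_rate = 0.05        # assumed fraction of blocks each copy rewrites

physical_tb = db_size_tb * team_copies                            # full copies
virtual_tb = db_size_tb + db_size_tb * change_rate * team_copies  # golden + deltas

print(f"physical copies: {physical_tb:.1f} TB")                 # 40.0 TB
print(f"virtualized:     {virtual_tb:.1f} TB")                  # 7.0 TB
print(f"savings:         {1 - virtual_tb / physical_tb:.0%}")   # ~82%
```

Even with conservative assumptions like these, one golden copy plus per-clone deltas cuts the storage footprint by a large majority.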
Call it a soft benefit, but as staff become more efficient, they’re less stressed. Lower stress means happier and more productive people.
The IT world has changed. Data matters. Not infrastructure. Data virtualization drives the new model of DevOps.