Application Recipes and Refinements

Think of applications as the essential services that feed a business – food. Of course food isn't one thing, one ingredient, but a very long list of fruits and vegetables and meat and spices, and, for some of us, beer. With few exceptions, the ingredients are more satisfying when combined and cooked. In fact, cooking has often been used as an apt analogy for application development. It follows a cycle of need, preparation, creation, testing, troubleshooting and adjustment. Some recipes are favored and refined. Some are retired for good. End-user feedback is always the critical test.

Like cooking, we know that the flow of application development isn't just about new creations, but about the continuous refining, troubleshooting, updating and tweaking of favored recipes. Once deployed, there will be new features to add, user experiences and suggestions to consider. More paprika? And in all of it – cooking or apps – turnaround times are always a factor.

As business applications have become more strategic, so has managing the flow of data that powers them. Data virtualization is already being used to address the range of challenges encountered at every stage of the application lifecycle. It serves the whole Application Development & Maintenance (ADM) continuum that creates and then supports our applications – especially when rapid adjustments are needed.

It's in the post-release phase that we often get the most useful application insights. That's where we get pointers for new or refined features, improved performance, cost reduction, stronger compliance and efficiency. Once the app is up and running, there's also an expectation of stability, uptime, responsive service and rapid problem resolution. If in application development we create a product, it's in maintenance and support services that we learn how well we did.

The compelling case for data virtualization rests not only on its support of initial application development but also on what it offers once an application is completed and rolled out. Data virtualization simplifies and accelerates application care and feeding: updates, security, bug fixes, and performance improvements.

For example, one client, a multi-national big-project construction company, was struggling with distinct shortcomings in their data management environment. A major sore point was the painful inefficiency in data provisioning for application development and testing. This applied to new applications as well as production apps in need of immediate troubleshooting or feature upgrades. The most severe obstacles were the time and complexity involved. That also created excessive staff, resource and process costs. As they implemented data virtualization for backup, they realized that same virtualization approach could greatly smooth data provisioning for applications. It could accelerate initial development as well as ongoing maintenance and support.

Initial success with this approach came when their helpdesk system needed an upgrade. They used virtualized data to build the application development and test environment. They had a new environment up and running in an hour and a half instead of the weeks their old system would have taken. Their developers were both amazed and excited. And it may seem a soft benefit, but their developers' morale and productivity improved as wait time and friction decreased.

"Development teams are very happy because Actifio makes it all so much easier. People don't have to wait to get something running. Now we can get started in a few hours instead of weeks."

In a status report to management they listed out the benefits they experienced from data virtualization:

  • Expenses saved – data protection, reductions in hardware and software licensing – €1,170,000.
  • Time saved – operations, backup speed, implementation speed, protection and recovery, application development and refresh – weeks not wasted.
  • Operations simplified, stabilized and accelerated – easy fine-tuning of SLAs & fast reliable performance.
  • Data restoration speed – data recovery moved from hours and days to minutes.
  • Simultaneous functions – backup and live data access enable parallel operations, application testing, analytics, data migrations and upgrades without disruption.
  • Global operations support – immediate data access through three data centers to remote offices in eighty locations.

Their initial objective was to improve backup and recovery capabilities. What they achieved was a comprehensive, application-centric approach to automating the full lifecycle of application data management.
