Actifio is now part of Google Cloud.

Building Data Bridges to Hybrid Multi-Cloud

Guest post by John Webster of the Evaluator Group

As we advance further into the era of cloud computing, enterprise end users are increasingly realizing that on-premises information technology (IT) components will remain for a while. This leads them to conclude that a hybrid cloud architecture with extensions to multiple public clouds is a practical choice for the near future.

Evaluator Group conducted a survey of enterprise IT administrators that sought out use cases for hybrid cloud architectures. These were, in order of how often they were cited:

  1. Disaster Recovery / Business Continuance
  2. Data Protection
  3. Test and Development
  4. Data Archive
  5. Data Sharing / Content Repository
  6. Tier 1 Applications
  7. Application Mobility
  8. Analytics

Evaluator Group also asked respondents to estimate spending increases on public cloud storage, per use case, over the next two years. First, it is interesting to note that spending increases of at least 20% are expected across all of the cited use cases for cloud storage. However, more significant increases are planned for at least five of them, shown below in order of magnitude:

  1. Tier 1 Applications
  2. Disaster Recovery / Business Continuance
  3. Data Protection
  4. Data Sharing / Content Repository
  5. Test and Development
  6. Analytics
  7. Archive
  8. Application Mobility

When taken in combination, these two sets of survey results confirm a cloud adoption strategy now practiced by a majority of enterprise IT organizations. Note that Tier 1 Applications – i.e. running critical business applications in the cloud – moves from sixth position as a use case to the top position when spending over the next two years becomes a consideration.

Two data-center-to-cloud foundations have already been laid for most enterprise IT organizations. One is the disaster recovery capability seen in the survey data. Once IT administrators understand its workings and confirm its functionality and reliability, they progress to building additional use cases on the hybrid cloud data foundation. Depending on priorities, these often include the enhancement of existing data protection processes, content distribution, digital archives, and test and development, all the way up to building Tier 1 applications in the cloud.

The other, object storage, has long been a staple in the public cloud. Object storage plus the ubiquity of the S3 API gives IT administrators the ability to move data seamlessly across the divide. In other research conducted by the Evaluator Group, we saw that one of the biggest inhibitors to the advancement of hybrid cloud architectures was data center-to-cloud interoperability. Object storage not only breaks down that barrier but, as we shall see, it also facilitates higher-order hybrid cloud use cases such as analytics.
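To make the point concrete, here is a minimal sketch of why the S3 API's ubiquity matters: the same client code path can target an on-premises object store or a public cloud bucket, with only the endpoint changing. The endpoint URL and bucket names below are hypothetical, and the actual transfer (shown in comments) assumes the boto3 library.

```python
# Sketch: one S3 code path for both sides of the hybrid divide.
# Only the endpoint differs; the API calls are identical.

def s3_client_config(target: str) -> dict:
    """Return kwargs for boto3.client(...) for a given target.

    The on-premises endpoint URL is a hypothetical example.
    """
    endpoints = {
        "cloud": {},  # default public cloud S3 endpoint
        "onprem": {"endpoint_url": "https://s3.datacenter.example.com"},
    }
    return {"service_name": "s3", **endpoints[target]}

print(s3_client_config("onprem"))

# With boto3 installed and credentials configured, the identical copy
# logic then serves both sides (bucket and key names are illustrative):
#
#   import boto3
#   src = boto3.client(**s3_client_config("onprem"))
#   dst = boto3.client(**s3_client_config("cloud"))
#   body = src.get_object(Bucket="backups", Key="db/full-0001")["Body"].read()
#   dst.put_object(Bucket="dr-copies", Key="db/full-0001", Body=body)
```

Because the protocol is the same everywhere, no translation layer is needed when data crosses from the data center to the cloud.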

Bridging SAP HANA & Mission Critical Applications to the Cloud

An instructive case in point demonstrating the viability of the DR data bridge for establishing Tier 1 applications in the cloud is SAP HANA. Using SAP S/4HANA in conjunction with the HANA database enables companies to modernize their processes, enhanced by the ability to make real-time decisions thanks to SAP HANA's real-time processing capabilities. However, transitioning major business applications and their user groups from traditional SAP platforms to HANA will have profound implications for the entire organization and strain IT resources. For this reason, running HANA in the cloud has become a primary consideration. And this is where the on-prem to cloud data bridge, implemented for DR/BC, becomes a foundational SAP migration tool.

Consider the critical capabilities that must be present in backup and DR tools to migrate critical apps to the cloud:

  • Manage database and unstructured data copies in any public cloud or on-premises using any storage
  • Application-consistent, block-level incremental-forever data capture so that the target cloud environment is updated with the latest changes from production
  • Flexibility to store data in block storage for mission-critical apps as well as object storage to reduce costs for longer-term retention
  • Ability to test rapidly and repeatedly in a sandbox environment without stopping the ongoing replication to cloud
  • Ability to bring up multiple VMs in a pre-defined order
  • Minimization of downtime
  • Continued data protection in the cloud, post-migration
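One of the capabilities above, bringing up multiple VMs in a pre-defined order, can be sketched in a few lines. The tiers and VM names here are invented for illustration; a real DR tool would replace the list append with calls to the cloud provider's instance-start API and readiness checks between tiers.

```python
# Sketch of ordered failover: boot VM groups tier by tier, so that,
# for example, databases come up before the app servers that need them.

def failover(boot_plan):
    """Boot VM tiers in order; VMs within a tier may start in parallel."""
    started = []
    for tier in boot_plan:          # each tier is a list of VM names
        for vm in tier:
            started.append(vm)      # stand-in for a cloud "start instance" call
    return started

# Hypothetical plan: database first, then app servers, then web tier.
plan = [["hana-db"], ["app-01", "app-02"], ["web-01"]]
print(failover(plan))
```

The ordering guarantee is what matters: dependent services never start before the services they rely on.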

Clearly, implementing a continuously functioning DR/BC capability using a hybrid cloud architecture is a sound strategy. It gives the enterprise an enhanced ability to survive adverse natural and man-made events while building a data bridge to the cloud to support future use cases such as instantiating SAP HANA in the cloud.

Viewing Object Storage as a First-Class Citizen

The other connection to the cloud, the data connection, is object storage. In my previous blog I mentioned the ubiquity of object storage and its growing ability to support Tier 1 applications. By far, more data in virtually any public cloud is stored in object form than as blocks or files. The equally ubiquitous S3 API provides a standard way to access this data whether on premises or in the cloud.

For years, object storage has been a common denominator across all public clouds, and as such it has matured into a storage platform that enterprises can now rely on. It offers durability that challenges on-premises storage for availability and can easily be replicated to multiple availability zones. Its cost at scale is significantly less than cloud-based block and file storage. As such, it has made an ideal target for backup copies of data as well as cold data that is, in the minds of users, retained forever.

Unfortunately, this has led to a categorization of cloud object storage as something of a graveyard where data goes, never to be seen or used again. It’s time to look at this highly scalable, inexpensive object storage through a different lens – one that starts with a view to leveraging cloud object storage for an expanding list of use cases that now includes Tier 1 applications and analytics/IoT.

A critical enabler of the transition from second to first class will be a technology that does real-time object-to-block protocol conversion. Take the on-site data protection use case as an example. In a traditional backup scenario that uses object storage as a target, block-based data is stored in object form. During a restore, that process is run in reverse: data is restored from object to block storage, which typically takes a long time and increases the Recovery Time Objective (RTO). What if the enterprise IT user could mount backups from object storage nearly instantly, as if they were a block device, and recover VMs, file systems, and databases directly? This eliminates the need to restore to block storage as a separate step while delivering near-instant access to data.

At the center of this near-instant mount and recovery would be a read/write cache that absorbs all writes and serves each read locally after the first read from object storage. Such technology thus delivers SSD block storage-like performance at object storage costs.
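The caching behavior described above can be sketched in a few lines of Python. This is a toy model, not any vendor's implementation: a dict stands in for the S3-resident backup image, and the class names and block contents are invented for illustration.

```python
# Minimal sketch of a read/write cache fronting an object-resident
# backup image: reads fall through to object storage once, then are
# served locally; writes are absorbed so the backup stays immutable.

class CachedBlockDevice:
    def __init__(self, object_store):
        self.object_store = object_store   # backup image, keyed by block number
        self.cache = {}                    # local SSD-like read/write cache
        self.object_reads = 0              # trips to object storage

    def read(self, block):
        if block not in self.cache:        # first read: fetch from object storage
            self.cache[block] = self.object_store[block]
            self.object_reads += 1
        return self.cache[block]           # later reads hit the cache

    def write(self, block, data):
        self.cache[block] = data           # writes never touch the backup image

backup = {0: b"boot", 1: b"data"}          # stand-in for an S3-resident backup
dev = CachedBlockDevice(backup)
dev.read(1); dev.read(1)                   # second read served from the cache
dev.write(0, b"patched")                   # the backup copy stays unchanged
print(dev.object_reads, backup[0])         # → 1 b'boot'
```

The key property is visible in the output: repeated reads cost one object-storage round trip, and writes land only in the cache, so the same backup image can back many mounts at once.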

Extending this process to the cloud, enterprise IT could provision many near-instant mounts of the backup image from object storage to multiple development and test environments in the cloud (or on-premises, for that matter). This enables the rapid provisioning of database copies to development environments running in cloud VMs or containers simply by reusing the backups that are already stored in the cloud. As a result, backup storage costs are reduced significantly, and friction between DevOps and Operations staff that can delay application development is reduced. Furthermore, the backup of on-premises VMs could be done directly to cloud object storage. This would certainly improve upon, if not eliminate, the cost and management burden of on-premises backup storage.

The analytics use case is equally fertile ground. Cloud analytics services such as Google Cloud BigQuery and Amazon Redshift could be loaded with real production data sets that originate simply as on-premises backup copies. Using this technology, the standard, everyday backup process now becomes one that easily generates copies of production data for use in cloud-based DevOps, test, and analytics environments. In short, the object storage data bridge can be foundational to all of the cloud storage use cases presented at the beginning of this discussion.
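As a rough illustration of the analytics path, backup files already sitting in cloud object storage can be pointed at directly by a BigQuery load job. The bucket, dataset, table, and snapshot names below are all invented, and the load call (shown in comments) assumes the google-cloud-bigquery client library and exported files in a format BigQuery can read, such as Avro.

```python
# Sketch: turn a backup location in object storage into a load-job URI.

def backup_uri(bucket: str, app: str, snapshot: str) -> str:
    """GCS wildcard URI covering one application's backup snapshot files."""
    return f"gs://{bucket}/{app}/{snapshot}/*.avro"

uri = backup_uri("backup-bucket", "orders", "2024-01-15")
print(uri)

# With google-cloud-bigquery installed, that URI feeds a load job:
#
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   client.load_table_from_uri(
#       uri,
#       "analytics_sandbox.orders",  # hypothetical dataset.table
#       job_config=bigquery.LoadJobConfig(source_format="AVRO"),
#   ).result()  # wait for the load to finish
```

No separate restore or export pipeline is needed: the backup copy itself becomes the analytics data source.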

Hybrid, Multi-Cloud as a Destination

One of the attractions of public clouds is that they offer enterprises the ability to scale infrastructure and associated services at a speed that would be difficult, if not impossible, for even large enterprise IT organizations to match. But scaling infrastructure is only part of the challenge faced by enterprise IT today. Hybrid clouds are now multi-cloud environments, and enterprise data is increasingly distributed across multi-cloud architectures. It is the responsibility of each enterprise IT organization to secure, protect, and manage this increasingly fractured data environment. Building on-premises-to-cloud data bridges allows enterprises to meet these responsibilities while expanding their business application portfolios.
