
Reduce Cloud Costs: How to Flatten the Curve

Cloud Backup and Cloud DR Best Practices

Cloud services offer an excellent opportunity to innovate and deliver new capabilities to market faster. However, without proper inspection and process, they can also lead to remarkably high bills. Let’s look at the major categories of cloud cost components you can inspect to control or reduce spending.

Reduce Cloud Computing Costs such as AWS EC2, Azure and GCP VMs

  1. Look for 24×7 compute consumption in the cloud. Question your DevOps team about whether machines really need to run 24×7, especially dev and QA machines.
  2. If your team determines that some applications need steady-state, always-on capacity, compare On-Demand vs. Reserved Instance pricing. Reserved Instance pricing can deliver up to 70% savings. Here is an example from an AWS pricing screenshot taken in July 2020. (Display 1)
  3. Ask your team if they are running containers in the cloud, such as on Google Kubernetes Engine, Amazon Elastic Kubernetes Service, or Azure Kubernetes Service. If those clusters run workloads 24×7, the costs will be very high.
[Display 1: AWS On-Demand vs. Reserved Instance pricing comparison, July 2020]

Even though some of the above may sound like no-brainers, I have seen many teams follow these expensive practices anyway.
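To put rough numbers on the scheduling and reserved-capacity points above, here is a small sketch. The $0.10/hour rate, the 730-hour month, and the 12×5 schedule are illustrative assumptions; the 70% figure is the reserved-instance savings ceiling mentioned above.

```python
# Illustrative comparison of 24x7 vs. scheduled dev/QA compute cost.
# The $0.10/hour on-demand rate is an example figure, not a quote
# from any provider's price list.

HOURS_PER_MONTH = 730          # average hours in a month
ON_DEMAND_RATE = 0.10          # $/hour for a hypothetical dev/QA VM
RESERVED_DISCOUNT = 0.70       # up-to-70% reserved savings cited above

def monthly_cost(hours_per_week_on, rate=ON_DEMAND_RATE):
    """Cost of a VM that runs only a fraction of the week."""
    fraction_on = hours_per_week_on / (24 * 7)
    return HOURS_PER_MONTH * fraction_on * rate

always_on = monthly_cost(24 * 7)     # runs 24x7
work_hours = monthly_cost(12 * 5)    # 12 hours/day, 5 days/week
reserved = always_on * (1 - RESERVED_DISCOUNT)

print(f"24x7 on-demand:      ${always_on:.2f}/month")
print(f"12x5 scheduled:      ${work_hours:.2f}/month")
print(f"24x7 reserved (70%): ${reserved:.2f}/month")
```

Even before negotiating discounts, simply powering dev/QA machines down outside working hours cuts the bill by roughly two thirds in this sketch.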

Reduce Cloud Storage Costs such as AWS EBS, AWS S3, Azure Disk, Azure Blob, GCP Persistent Disk, and GCS

  1. None of the cloud vendors publish discounts for cloud storage commitments the way they do for cloud compute. However, I have built many TCO/ROI models with enterprise customers, who reported discounts of up to 28% on storage and up to 40% on data transfer charges. So be sure to have your procurement managers ask for such discounts.
  2. Cloud vendors offer many storage tiers. For example, AWS offers S3 Standard, S3 Standard – Infrequent Access, S3 One Zone-IA, and S3 Glacier, as shown in the picture below. Ask your team to use auto-tiering to schedule data movement from expensive to inexpensive tiers. (Display 2)
  3. Look to eliminate redundant technologies. For example, is your team using native cloud snapshots and also using a backup product to back up those same cloud IaaS and PaaS components? Eliminate redundancy by picking a specialized data management tool that can manage snapshots, cloud backup and DR, and test data from a single pane of glass.
  4. Block storage can be 10x or more expensive than object storage. For example, GCP Persistent Disk costs $170/TB/month, while GCS Nearline costs $10/TB/month. So avoid using block storage for basic jobs like DBAs dumping database backups; instead, look for a data management tool that can write to GCS Nearline, AWS S3, or Azure Blob storage and deliver rapid DR from it.
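Using the per-TB figures above, a quick sketch shows what moving backup dumps off block storage saves. The 10 TB backup footprint is a hypothetical example:

```python
# Monthly cost of parking 10 TB of database backup dumps on block
# storage vs. object storage, using the per-TB figures cited above.

PD_PER_TB = 170        # GCP Persistent Disk, $/TB/month (figure above)
NEARLINE_PER_TB = 10   # GCS Nearline, $/TB/month (figure above)

backup_tb = 10         # hypothetical backup footprint

block_cost = backup_tb * PD_PER_TB
object_cost = backup_tb * NEARLINE_PER_TB
savings = block_cost - object_cost

print(f"Block storage:  ${block_cost}/month")
print(f"Object storage: ${object_cost}/month")
print(f"Savings:        ${savings}/month")
```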
[Display 2: AWS S3 storage tier pricing]
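The auto-tiering described in step 2 can be expressed as an S3 lifecycle rule. A minimal sketch, where the 30/90-day thresholds and the `backups/` prefix are illustrative assumptions:

```python
# Sketch of an S3 lifecycle rule that auto-tiers objects from S3
# Standard to Standard-IA after 30 days and to Glacier after 90 days.
# The day thresholds and prefix are illustrative; applying the policy
# requires boto3 and AWS credentials.

lifecycle_rule = {
    "ID": "tier-down-old-data",
    "Status": "Enabled",
    "Filter": {"Prefix": "backups/"},   # hypothetical key prefix
    "Transitions": [
        {"Days": 30, "StorageClass": "STANDARD_IA"},
        {"Days": 90, "StorageClass": "GLACIER"},
    ],
}

lifecycle_configuration = {"Rules": [lifecycle_rule]}

# With boto3, this would be applied as:
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="my-bucket",
#       LifecycleConfiguration=lifecycle_configuration)
```

Once the rule is in place, the tiering happens automatically; no scheduled jobs or manual data movement are needed.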

Reduce Cloud Costs for data warehouse and analytics such as Google BigQuery, AWS Redshift, Azure Synapse Analytics

  1. Go serverless. In this day and age, it makes little sense to manage your own infrastructure and run custom ETL on-premises. It’s best to rearchitect around cloud services like Google BigQuery, which supports both ETL and ELT. This reduces labor and infrastructure costs and lets your small team focus on extracting value from data analytics.
  2. Some legacy vendors build data warehouse solutions on traditional RDBMS databases whose license costs tend to be very high. By using serverless cloud solutions, you can stop paying through the nose for legacy database licenses.
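As a rough model of the serverless difference: with on-demand query pricing you pay per TB scanned and nothing when idle, instead of paying for a provisioned cluster 24×7. The $5/TB rate below is BigQuery's historical on-demand price; check current pricing before relying on it.

```python
# Rough serverless cost model: pay per TB scanned, nothing when idle.
# The $5/TB rate is BigQuery's historical on-demand price (an
# assumption here); actual rates vary by region and change over time.

ON_DEMAND_PER_TB = 5.0     # $/TB scanned (historical figure)

def monthly_query_cost(tb_scanned_per_month):
    """Query cost for a given monthly scan volume."""
    return tb_scanned_per_month * ON_DEMAND_PER_TB

# A hypothetical team scanning 40 TB/month:
cost = monthly_query_cost(40)
print(f"${cost:.2f}/month")
```

Compare that against a provisioned warehouse cluster that is billed around the clock whether or not anyone runs a query.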

Eliminate on-premises data center costs

Even though this category does not reduce cloud costs, it will help your teams deliver better SLAs faster at lower cost and get you out of the data center business.

  1. Reduce your on-premises database infrastructure and license costs by migrating to cloud PaaS databases such as Azure SQL or Cosmos DB, AWS Aurora or DynamoDB, or GCP Cloud SQL. Identify inefficient backup architectures where backups go to expensive local block storage or legacy dedup appliances; instead, find a data management solution that can back up directly to infinitely scalable, inexpensive cloud object storage such as AWS S3, Azure Blob, or GCS Nearline.
  2. Eliminate all the compute, storage, and networking in your DR data center by finding a data management solution that can deliver on-demand DR in AWS, Azure, or GCP using inexpensive cloud object storage.
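The on-demand DR model above can be sketched in back-of-the-envelope terms: in steady state you pay only for object-storage copies, and you pay for compute only during DR drills or an actual failover. All rates and sizes below are illustrative assumptions, not vendor quotes:

```python
# Back-of-the-envelope economics of on-demand DR: object storage in
# steady state, compute only during drills. All figures are
# illustrative assumptions.

OBJECT_STORAGE_PER_TB = 10      # $/TB/month, cold object tier (assumed)
DR_COMPUTE_PER_HOUR = 2.00      # $/hour for the recovered fleet (assumed)

protected_tb = 50               # hypothetical data under protection
dr_drill_hours_per_month = 8    # one monthly DR drill

steady_state = protected_tb * OBJECT_STORAGE_PER_TB
drill_compute = dr_drill_hours_per_month * DR_COMPUTE_PER_HOUR
total = steady_state + drill_compute

print(f"Steady-state storage: ${steady_state}/month")
print(f"DR drill compute:     ${drill_compute:.2f}/month")
print(f"Total:                ${total:.2f}/month")
```

Contrast the total with a standby DR data center, where the full compute, storage, and network footprint is paid for around the clock.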

I hope you found this helpful!


Here is more information about cloud backup and recovery: download our Cloud Backup and Recovery Checklist.
