
5 Data Management Challenges In Every SAP Environment


SAP has been the global market leader in ERP solutions for decades. It dominates the Fortune 500, where roughly 80% of companies run on SAP. Ranked 25th on Forbes' "World's Most Valuable Brands" list, SAP has helped digitize business processes across logistics, finance, and sales departments.

In this article, we’ll focus on the major hurdles that users face with managing large SAP databases and some of the ways to overcome these key challenges.

SAP Databases

Non-HANA Environments:
Users have a wide choice of underlying databases: Oracle, MS SQL Server, Sybase (owned by SAP), or DB2. Once users standardize on a specific vendor, they typically run separate databases for OLTP (Online Transaction Processing) and OLAP (Online Analytical Processing). Oracle has historically been the most popular choice, and many customers are now looking to migrate to SAP HANA.

SAP HANA Environments:
SAP HANA, on the other hand, is a fundamentally different architecture: an in-memory database. Instead of storing data row by row, HANA stores it column by column, which lends itself to very high compression rates. In many environments, users configure a single HANA database for both OLTP and OLAP. In such deployments, because of the OLAP workload, you should expect the HANA database to exceed 1TB and grow quickly.
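Why column-wise storage compresses so well can be seen with a toy example. The sketch below uses simple run-length encoding on a repetitive column; it is an illustration of the general principle only, not HANA's actual compression scheme (HANA combines dictionary encoding with several advanced compression methods):

```python
# Illustrative sketch (not HANA's implementation): a column often contains
# long runs of repeated values, so run-length encoding (RLE) shrinks it
# dramatically, whereas row-wise storage interleaves columns and breaks runs.

def rle_encode(column):
    """Compress a list of values into (value, run_length) pairs."""
    runs = []
    for value in column:
        if runs and runs[-1][0] == value:
            runs[-1] = (value, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((value, 1))              # start a new run
    return runs

# Hypothetical "country" column from an orders table, sorted by country.
country_column = ["DE"] * 5000 + ["US"] * 3000 + ["FR"] * 2000

encoded = rle_encode(country_column)
print(len(country_column))  # 10000 values stored individually row-wise
print(encoded)              # [('DE', 5000), ('US', 3000), ('FR', 2000)]
```

Three (value, count) pairs replace 10,000 stored values, which is the intuition behind the high compression rates columnar databases report.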

There are two main factors which determine the size of a SAP database:

  1. OLAP tends to store a lot of historical information. This increases the size of the database.
  2. Some environments are designed to keep a lot of information in the OLTP database. Even when large tables could be archived, DBAs often choose not to, either because of organizational policies to keep all data online or for fear that developers might change the schema in the future.

Large, mission critical databases in SAP environments bring their own set of challenges.

Typical Challenges in Managing SAP Databases

  1. Impact on Database Performance: Most enterprise backup products perform, if not daily full backups, then weekly full and daily incremental backups. For medium and large databases, recurring full backups severely impact production database performance. For example, for a 5TB database with 100GB of daily changes, backing up the full 5TB instead of just the changes generates 50x the production storage I/O, with a corresponding impact on production database performance.
  2. Time-Consuming Operational Recovery & DR: For scenarios such as an accidental deletion, a ransomware attack, or a failed upgrade, users need to restore to a previous point in time, and quickly. Imagine making a database schema change that breaks production: you want to roll back to a previous point in time to minimize downtime.
    If the primary site goes down, users need to resume operations quickly on a secondary DR site. If they can't restore quickly from that site, the losses to business operations can be huge.
  3. Slow & Expensive DB Cloning: Across the SAP landscape (Dev, QA, Prod), users typically want multiple copies across multiple release cycles so they can test several environments simultaneously, accelerating test cycles and shortening release cycles and time to market. Traditional solutions are not only slow at creating full physical copies; each copy also requires the same amount of storage as the original database, driving up storage costs. In addition, users with different roles (governed by RBAC) want to create copies in a self-service manner, which is again difficult to achieve with traditional solutions.
  4. Delayed Analytics: For reporting and rich insights, SAP users typically copy their production database and feed the copies to analytics engines. But what if multiple users want to run "What-If" analysis on those copies, altering data that other users depend on? That approach is not just slow; it can also produce erroneous results.
  5. Inheriting On-prem Challenges in the Public Cloud: The promise of the public cloud is the speed with which infrastructure gets provisioned. But while the public cloud solves the infrastructure problem, access to data in the public cloud is still slow. SAP users in the public cloud face the same challenges of slow backups, long recovery times, and long delays in creating copies for test/dev. Accessing data stuck in cloud silos remains as big a challenge as it is for SAP on-premises.
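The 50x figure behind challenge 1 is simple arithmetic. The back-of-envelope sketch below uses the article's example numbers (a 5TB database with 100GB of daily change); these are illustrative figures, not measurements from any specific environment:

```python
# Back-of-envelope comparison of daily full vs. incremental backup I/O,
# using the article's example figures (assumed, not measured).

db_size_gb = 5000      # 5TB production database
daily_change_gb = 100  # ~100GB of changed data per day (~2% change rate)

# A daily full backup reads the entire database from production every day.
full_backup_io_gb = db_size_gb

# An incremental-forever backup reads only the changed data.
incremental_io_gb = daily_change_gb

reduction = full_backup_io_gb / incremental_io_gb
print(f"Daily full backup:  {full_backup_io_gb} GB read from production")
print(f"Incremental backup: {incremental_io_gb} GB read from production")
print(f"I/O reduction:      {reduction:.0f}x")  # 50x
```

The same ratio applies to backup windows: if the backup target can absorb data at a fixed rate, moving 50x less data finishes roughly 50x sooner.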

What is the Ideal Solution?

  1. Incremental Forever Backup: What if a solution eliminated full backups even for multi-TB databases? A solution that backs up only the 100GB that changed out of a 5TB database gives users fast backups, lowers the impact on their production apps, and reduces production storage I/O by 50x.
  2. Recovery in Minutes: What if a solution let users instantly mount a multi-TB SAP database to another SAP application server in a secondary data center or a public cloud such as GCP, AWS, or Azure, so that if the primary site goes down, operations are back up and running within minutes? This would make businesses far more resilient. And in cases where an SAP admin rolls out a patch that corrupts the database, users could still get back to a previous point-in-time copy in minutes.
  3. Instant Database Thin Clones: What if a solution reused the backup copy to instantly create multiple thin clones of the production SAP database without consuming extra storage? Users could then, subject to role-based access control, self-service provision thin clones into their test/dev environments to accelerate app testing and release cycles and stay ahead of the competition.
  4. Efficient Analytics: What if a solution could reuse the backup copy to provision multiple database thin clones to analytics teams, so they can change data and run "What-If" scenarios without stepping on each other's work?
  5. Any Cloud, Any Data Center, Same Benefits: An ideal solution would deliver all of the above benefits consistently, in your data center or in any of the public clouds such as Google Cloud, AWS, Azure, or IBM Cloud.
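The thin-clone idea in point 3 rests on copy-on-write: every clone shares the read-only backup image and privately stores only the blocks it modifies. The sketch below is a minimal illustration of that principle, not any vendor's implementation:

```python
# Minimal copy-on-write sketch of a "thin clone" (illustrative only).
# All clones share the backup's blocks; each clone pays storage only
# for the blocks it actually changes.

class ThinClone:
    def __init__(self, base_blocks):
        self.base = base_blocks  # shared, read-only backup image
        self.overlay = {}        # this clone's modified blocks only

    def read(self, block_id):
        # Serve modified blocks from the overlay, everything else from base.
        return self.overlay.get(block_id, self.base[block_id])

    def write(self, block_id, data):
        # Copy-on-write: only now does the clone consume its own storage.
        self.overlay[block_id] = data

# Hypothetical backup image with three logical blocks.
backup = {0: "orders", 1: "customers", 2: "invoices"}
qa_clone = ThinClone(backup)
dev_clone = ThinClone(backup)

dev_clone.write(1, "customers-masked")  # dev masks data; QA is unaffected
print(qa_clone.read(1))        # customers
print(dev_clone.read(1))       # customers-masked
print(len(dev_clone.overlay))  # 1 block of extra storage, not a full copy
```

Because clones materialize instantly (no data is copied up front) and grow only with their own changes, many test, dev, and analytics copies can coexist on a fraction of the storage a full physical copy would need.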

Ideally, enterprises should have one solution that delivers all of the above capabilities, instead of juggling multiple point tools: one for backup, another for DR, and yet another for DB cloning. A single solution reduces both operational burden and licensing costs.

In a large enterprise, SAP is one of the most mission-critical applications. It is time to modernize SAP data management by adopting a solution that overcomes the major hurdles outlined above and works symbiotically with the critical SAP application, anywhere: in an on-prem data center or in any public cloud!
