Added value of distributed data environments
With the growing number of cloud transformation projects, companies need to establish consistent data management.
How this works with a data fabric approach can be demonstrated with SAP backup and restore in the cloud. Existing SAP customers need to protect their data continuously in order to maintain operations.
Enterprise applications place high demands on availability, security and performance, regardless of the environment in which they run.
Data management has to meet these requirements, but the task is becoming more complex, because preparing for a cloud future is moving ever higher up the agenda.
There are other reasons why the move to the cloud is so urgent: there are clear signs that enterprise applications, SAP's included, are gradually migrating there.
The Walldorf-based company is pursuing a cloud-first strategy and now offers many products and new functions only in the cloud. Moreover, without cloud storage, the meaningful use of IoT, Big Data, Machine Learning and AI will hardly be possible.
This is because sensors and machines produce huge amounts of data in a very short time, so data volumes are growing exponentially and the resulting flood is hard to handle.
Local storage systems can no longer cope with this. Edge computing also comes into play as a supplement to the cloud in IoT scenarios: edge systems filter and analyze data directly at the point of origin before transferring the results to the cloud.
Anyone who wants to stay competitive should therefore consider operating a distributed infrastructure in the future. Hybrid multiclouds, i.e. a mixture of different public cloud services, private clouds and local systems, are increasingly emerging as the preferred model.
This gives enterprises the ability to choose the best solution from all the options for each use case. IDC predicts that by 2024, about 90 percent of global enterprises will have a multicloud management strategy.
Migrate SAP to the Cloud
Until now, moving SAP systems to the cloud has mostly been a step-by-step process. Many companies first move the backup to the cloud. This allows them to gain initial experience and prepare the infrastructure for a later migration of the production systems.
The companies also aim to use the agile, flexible cloud environment for development and testing in order to get new projects up and running faster, and then to move the workloads back on-premises when they go live. With traditional means, however, this plan can hardly be realized.
The associated effort can be avoided if companies use a data fabric approach, which enables seamless access to and exchange of data in a distributed data environment.
This is because the data fabric creates uniform, consistent data management that extends from on-premises into the cloud. Not for nothing did the analyst firm Gartner declare the data fabric one of the most important technology trends for 2019.
For SAP users, this means that they can migrate their SAP systems to the cloud without changing the familiar operating concept. This is because data management with a data fabric in the cloud works identically to on-premises.
Even in a heterogeneous IT landscape, data is always available at the required speed where it is needed - regardless of where it comes from.
This gives companies the flexibility they need to move workloads back and forth between different environments as needed. Vendors like NetApp have been pursuing this concept for several years.
Combined with Cloud Connected Flash solutions, users get much of what they need as they move to the cloud: ease of use, high availability, agility and high performance.
Hana backups
When enterprise applications run in the cloud, the same high level of fault tolerance must be guaranteed there as in the company's own data center. Among other things, this requires fast backups and fast restores, which is particularly difficult to achieve in large database environments such as SAP Hana.
Traditional approaches such as streaming backup or backup to tape are not efficient enough for mission-critical data: a Hana data backup is typically a full backup.
With large databases and usually limited bandwidth, such a backup takes hours. As a result, usually only one or two backups can be created per day, which is far too few for business-critical systems.
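A rough back-of-the-envelope calculation makes the problem tangible. The database size and effective throughput in the following sketch are illustrative assumptions, not values from a real installation.

```python
# Illustrative figures only: an assumed 8 TB Hana database and roughly
# 500 MB/s of effective backup throughput, not measurements from a real system.
db_size_gb = 8 * 1024        # database size in GB
throughput_gb_per_s = 0.5    # effective backup throughput in GB/s

backup_seconds = db_size_gb / throughput_gb_per_s
print(f"One full backup takes about {backup_seconds / 3600:.1f} hours")  # roughly 4.6 hours
```

Even two such backups per day leave recovery points many hours apart.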
Snapshot-based backups, on the other hand, can be created and stored in the storage system within seconds without affecting database performance.
This speed is possible because the backups are taken in the storage layer and therefore do not consume any server resources. As a result, enough backups can be made at short intervals throughout the day, and restoring from a snapshot also completes within a very short time.
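The following sketch shows how such a storage snapshot backup can be orchestrated for a single-container Hana system. The host name, credentials and the storage-side snapshot script are placeholders, and the handling of hdbsql output is simplified for brevity.

```python
"""Minimal sketch of a storage-snapshot backup for SAP Hana.

Assumptions: a single-container Hana system reachable via hdbsql; the host,
user, password and the storage snapshot script are placeholders.
"""
import subprocess

HDBSQL = ["hdbsql", "-n", "hana-host:30015", "-u", "BACKUP_OPERATOR", "-p", "<password>"]

def sql(statement: str) -> str:
    """Run one SQL statement via hdbsql and return its raw output (parsing simplified)."""
    result = subprocess.run(HDBSQL + [statement], check=True, capture_output=True, text=True)
    return result.stdout.strip()

# 1. Ask Hana to prepare an internal database snapshot.
sql("BACKUP DATA CREATE SNAPSHOT COMMENT 'storage snapshot backup'")

# 2. Look up the ID of the prepared snapshot in the backup catalog.
backup_id = sql("SELECT BACKUP_ID FROM M_BACKUP_CATALOG "
                "WHERE ENTRY_TYPE_NAME = 'data snapshot' AND STATE_NAME = 'prepared'")

# 3. Trigger the snapshot on the storage system (vendor-specific placeholder script).
subprocess.run(["./create_storage_snapshot.sh", "hana_data_volume"], check=True)

# 4. Confirm the snapshot so Hana records it as a valid backup in the catalog.
sql(f"BACKUP DATA CLOSE SNAPSHOT BACKUP_ID {backup_id} SUCCESSFUL 'snap-{backup_id}'")
```

If the storage snapshot fails, the prepared snapshot would instead be closed as unsuccessful so that Hana discards it.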
Disaster Recovery in Hybrid Landscapes
Backup, however, is only the first step in protecting data in SAP systems. Disaster recovery in distributed environments poses a particular challenge: if an entire data center is affected by a failure, companies must be able to restore the backups.
It is not just a matter of not losing any data. The data must also fit together when restored, otherwise processes in the enterprise applications will no longer work. This means that data distributed across different systems must be backed up consistently at the same time - across data center and cloud boundaries.
One possible solution is so-called consistency groups: all systems that work together are combined in such a group and backed up under a common backup plan.
The backup runs automatically and synchronously for all members at the same time. If a system fails, the same data state can be restored everywhere simultaneously, so consistency is guaranteed.
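A consistency group can be thought of as a shared backup plan over several systems. The following sketch only illustrates the idea; the system names, volume names and the snapshot() call are hypothetical and stand in for a storage vendor's actual API.

```python
"""Minimal sketch of a consistency-group backup plan (all names hypothetical)."""
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class System:
    name: str      # SAP system ID
    volume: str    # data volume backing this system

# All systems that exchange data are members of one consistency group.
CONSISTENCY_GROUP = [
    System("ERP", "vol_erp_data"),
    System("BW", "vol_bw_data"),
    System("EWM", "vol_ewm_data"),
]

def snapshot(volume: str, label: str) -> None:
    """Placeholder for the vendor-specific snapshot call."""
    print(f"snapshot {volume} as {label}")

def backup_consistency_group(group: list[System]) -> str:
    """Back up all members under one label so they share a single restore point."""
    label = datetime.now(timezone.utc).strftime("cg-%Y%m%dT%H%M%SZ")
    # A real storage system would fence I/O or take the group snapshot atomically;
    # the loop here only illustrates that every member gets the same label.
    for system in group:
        snapshot(system.volume, label)
    return label

backup_consistency_group(CONSISTENCY_GROUP)
```

On restore, every member would be rolled back to the snapshots carrying the same label, which is what keeps the data state consistent across systems.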
With automated cloning, test environments can be deployed within minutes at the push of a button. Thanks to the data fabric concept, companies remain flexible across cloud boundaries.
For example, they can clone an SAP system in AWS and make it available in Azure in less than ten minutes. With a data fabric, they are well positioned for new technologies and the associated data growth as well as for SAP's cloud-first strategy. What seems certain is that in the future we will primarily see hybrid multicloud environments in corporate IT.
Technical measures for cloud operation
Migrating systems to the cloud or moving them back again can involve considerable technical effort. Most companies today operate their SAP systems on enterprise storage, using either block storage via a Storage Area Network (SAN) or iSCSI, or Network Attached Storage (NAS) with the Network File System (NFS). In the cloud, the Fibre Channel protocol is not available in combination with NetApp. When switching to classic cloud storage such as Elastic Block Store (EBS) from AWS or Azure SSD Managed Disks (Premium Disks), IT managers must therefore adapt the SAP system configuration at the cloud provider.
There are also significant changes in operations. High availability for EBS or Premium Disks is achieved with several redundant copies, typically three. It is often necessary to combine these disks with Logical Volume Manager (LVM) striping to meet capacity or performance requirements, as shown in the sketch below. This affects storage management, for example when it comes to backup, tuning or resizing.
The alternative is a data fabric, which significantly simplifies migration and operations. This approach unifies data management and ensures seamless data transfer between on-premises systems, private clouds and public cloud services.
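To illustrate the striping step mentioned above, the following sketch combines three attached cloud block devices into one striped logical volume. The device names, volume group and logical volume names, stripe size and mount point are assumptions, not values from a specific deployment.

```python
"""Minimal sketch: combine several cloud block devices into one striped LVM volume.

Assumptions: three already attached disks with the device names below, the lvm2
and xfsprogs tools installed, and root privileges; all names are placeholders.
"""
import subprocess

DISKS = ["/dev/sdc", "/dev/sdd", "/dev/sde"]  # attached EBS volumes or Premium Disks

def run(cmd: list[str]) -> None:
    """Print and execute a command, aborting on the first error."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

run(["pvcreate", *DISKS])                                  # register the disks as physical volumes
run(["vgcreate", "vg_sapdata", *DISKS])                    # one volume group spanning all disks
run(["lvcreate", "-i", str(len(DISKS)), "-I", "256",       # stripe across all disks, 256 KiB stripe size
     "-l", "100%FREE", "-n", "lv_sapdata", "vg_sapdata"])
run(["mkfs.xfs", "/dev/vg_sapdata/lv_sapdata"])            # file system for the SAP data volume
run(["mount", "/dev/vg_sapdata/lv_sapdata", "/sapdata"])   # placeholder mount point
```

Because backups, tuning and resizing then operate on this striped volume rather than on a single disk, the additional management effort described above becomes visible in day-to-day operations.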
Migrating systems to the cloud or moving them back again can involve considerable technical effort. Most companies today operate their SAP systems with enterprise storage. They either use a Storage Area Network (SAN)/iSCSI block storage or Network Attached Storage (NAS) with the Network File System (NFS). In the cloud, the Fibre Channel protocol in combination with NetApp is not intended. When switching to classic cloud storage such as Elastic Block Store (EBS) from AWS or Azure SSD Managed Disks (Premium Disks), IT managers must therefore adapt the SAP system configuration at the cloud provider. There are also serious changes in operations. High availability for EBS or premium disks is achieved with several - typically three - redundant copies. It is often necessary to combine these premium disks using Logical Volume Manager striping technology to meet capacity or performance requirements. This impacts storage system management, for example, when it comes to backup, tuning or resizing. The alternative is the Data Fabric to significantly simplify migration and operations. This approach unifies data management and ensures seamless data transfer between data on-premises, in the private cloud or in public cloud services.