
Data management - a chore or a worthwhile investment?

The flood of data cannot be managed without intelligent data management. Nevertheless, only a few companies take advantage of all the opportunities offered by new technologies.
Frank Gundlich, Datavard
September 18, 2018

Data management is not a popular topic, even though most companies are aware that it is essential in the face of an ever-rising flood of data.

In practice, however, very few companies make full use of the possibilities of (automated) data management, often because they lack both the resources and a suitable technical solution.

At the latest when a Hana migration threatens to become too expensive because the database has grown too large, data cleansing becomes essential before the migration can take place.

Reasons for data growth

There are various reasons that have led to ever greater data growth in recent years, such as the increase in business transactions and new digital business processes, as well as legal regulations that require archiving over longer periods of time.

These include the regulations of the Sarbanes-Oxley Act for listed companies and provisions of Basel III for financial institutions. In the USA, there are the Securities and Exchange Commission (SEC) guidelines for the control of securities trading, and in the EU, the Data Retention Directive.

Other examples are FDA regulations in the pharmaceutical and food sector, HIPAA in the healthcare sector, as well as GDPdU (DE), ElDI-V (CH), BAO (AT) and the FRCP.

About the value of data...

The more the volume of data in a system grows, the smaller the percentage of actively used, valuable data becomes. Older, historical (cold) data in particular is often only retained for possible audits.

However, cold data is still managed in the same way as actively used (warm and hot) data. On average, only 7 to 15 percent of the data in a database is productively used and ready for reporting.

The rest is master data, temporary data and historical data (older than two years). SAP BW systems in particular accumulate data that requires proactive, sophisticated data management.

On average, 20 to 30 percent of an SAP database is occupied by temporary data. This data is generated with every SAP transaction and interaction and loses its value very quickly after it is created.

Temporary data is typically found in logs (application logs, change logs), staging (PSA and change logs), communication protocols (IDocs, RFC logs) and administration data (requests).

...and their costs

For a long time, the easy and seemingly cheap answer to a full database was simply to buy more storage. That has changed in recent years.

One reason is that additional storage alone does not solve the problem: data that is rarely or never used goes through the same processes and incurs the same costs as productive data. For example, every GB of data in a productive system is replicated seven times within the system landscape.


Across our customers, we see average data growth of 32.5 percent per year. At this rate, the size of a system roughly quadruples in just five years if no data cleansing takes place.

Conversely, in an example system with an initial size of 774 GB, around 3.6 TB can be saved over five years, provided the right measures for intelligent data management are implemented.
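For readers who want to check the arithmetic, the following minimal Python sketch reproduces the compound growth under the stated assumptions (32.5 percent annual growth, a 774 GB starting system, roughly seven copies of every GB across the landscape). It illustrates the trajectory only; the exact 3.6 TB savings figure depends on which measures are applied and is not derived here.

```python
# Minimal sketch: compound data growth without cleansing.
# Assumptions taken from the article: 32.5 % annual growth, 774 GB start,
# every GB replicated about 7 times across the system landscape.
initial_gb = 774
annual_growth = 0.325
years = 5
copies_in_landscape = 7

size_gb = initial_gb * (1 + annual_growth) ** years
print(f"System size after {years} years: {size_gb:,.0f} GB "
      f"(~{size_gb / initial_gb:.1f}x the initial size)")
print(f"Effective footprint across the landscape: "
      f"{size_gb * copies_in_landscape / 1024:.1f} TB")
```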

A bloated database does not only cause unnecessary costs for the required storage space; it also drives up the costs of maintenance, licenses and backups.

Today, for example, we see considerable variation in the cost of SAP Hana, between 50,000 and 150,000 euros depending on the customer and setup. Cloud, hosting, Tailored Datacenter Integration (TDI), appliances, HA/DR: all have an impact on cost, performance and scalability.

Data garbage affects systems

If the system is polluted with data garbage, system performance often deteriorates as well. This can be particularly unpleasant when navigating through the main transactions and during reporting.

Here, too, data ballast costs time and money; time that is then missing for data cleansing, which in turn produces even more data garbage. A vicious circle.

Contrary to the assumption that a lack of data management does not matter to the end customer, data consistency often suffers as well, as the example of one of our customers shows.

The IT department of the hospital and nursing home bed manufacturer Stiegelmeyer received feedback from the sales staff that the system was confusing due to duplicates in the master data and that the right customer could not be found right away. In addition, the system performance collapsed at times.

The reason for this was an evolved system landscape with many in-house developments. We determined the weak points and optimization possibilities with a system scan. For example, 700 in-house developments were not being used and could be shut down. There was also potential for optimization in data archiving: 85 percent of the data inventory was redundant and could be archived.

How does the database stay lean?

We look at five starting points for keeping databases lean and effective:

1. Avoid data generation, or keep raw data only in a central data lake. The less unused data is stored in the SAP system, the better its performance and the leaner its operation.

2. Delete and shut down unused data, applications and reports: what is not needed should be deleted.

3. Automated housekeeping: lean SAP systems without manual effort? Rule-based and equipped with best practices, an automated housekeeping solution can delete up to 35 percent of database content without losing business knowledge (a minimal sketch of such a rule set follows after this list).

4. Offload warm and cold data to a scalable, less expensive medium: different concepts and approaches for big data can be combined thanks to modern technologies such as Hadoop, for example by offloading documents and files from SAP to Hadoop. Transactional and analytical data can be offloaded as well. Thanks to certified interfaces, the data remains accessible even though it is no longer stored in the primary database.

5. Use selective copying: test and validation systems do not require all the data from a production system.

Selectively copying process-validated data from a production system to a test system makes non-production systems leaner. This speeds up the copy process, so it can be repeated more often, and reduces hardware and personnel costs.
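To make the rule-based housekeeping from point 3 more tangible, here is a minimal, hypothetical sketch: retention rules per object type are applied to exported record metadata and produce deletion or archiving work lists. The object names and retention periods are illustrative assumptions, not SAP defaults or any vendor's actual rule set.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Rule:
    object_type: str      # e.g. "application_log", "psa", "idoc" (illustrative)
    retention_days: int   # records newer than this stay untouched
    action: str           # "delete" or "archive"

RULES = [
    Rule("application_log", 90, "delete"),
    Rule("change_log", 180, "delete"),
    Rule("psa", 30, "delete"),
    Rule("idoc", 365, "archive"),
]

def due_records(records, rule, today=None):
    """Return the records of one object type that have exceeded their retention."""
    today = today or date.today()
    cutoff = today - timedelta(days=rule.retention_days)
    return [r for r in records
            if r["object_type"] == rule.object_type and r["created"] < cutoff]

# Usage: feed in metadata exported from the system, get back work lists.
records = [{"id": 4711, "object_type": "psa", "created": date(2018, 1, 5)}]
for rule in RULES:
    for rec in due_records(records, rule):
        print(f"{rule.action}: {rule.object_type} record {rec['id']}")
```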

Savings in data management

Typically, efficient data management can reduce system size by 30 to 50 percent and cut the data growth rate by 60 percent.

The first priority is optimal data classification in order to decide whether and how it is best stored: Hot data must be available quickly; it is kept in the main memory and, for example, migrated directly during a migration to SAP Hana.

Warm data can be moved to a secondary database with adequate performance but lower cost. Cold data is stored, archived or deleted as cost-effectively as possible.

Today, the classification of data is typically supported by usage statistics and value analysis, which assist the data steward in their role as information lifecycle manager.
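As a simple illustration, the following sketch classifies data as hot, warm or cold purely by its last access date; the thresholds are assumptions, and real tools combine such usage statistics with value analysis as described above.

```python
from datetime import date

def classify(last_access: date, today: date = None) -> str:
    """Hypothetical hot/warm/cold classification by last access (thresholds assumed)."""
    today = today or date.today()
    age_days = (today - last_access).days
    if age_days <= 90:
        return "hot"    # keep in the primary (in-memory) database
    if age_days <= 730:
        return "warm"   # candidate for a cheaper secondary database
    return "cold"       # archive, store cheaply, or delete

print(classify(date(2018, 8, 1), today=date(2018, 9, 18)))  # -> "hot"
```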

In addition to creating a data catalog that describes where which data is stored, in what format and with what level of importance, a data steward deals with the identification and provision of data, the creation and maintenance of reference data, and the consistent quality of master data.
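A data catalog entry along these lines can be modelled very simply; the fields below (location, format, importance, steward) are an illustrative assumption, not a fixed standard.

```python
from dataclasses import dataclass

@dataclass
class CatalogEntry:
    dataset: str      # logical name of the data set
    location: str     # where the data is stored (system, database, archive)
    data_format: str  # e.g. "table", "parquet", "archive file"
    importance: str   # e.g. "hot", "warm", "cold"
    steward: str      # responsible data steward

catalog = [
    CatalogEntry("sales_orders", "ERP production DB", "table", "hot", "J. Doe"),
    CatalogEntry("sales_orders_2015", "Hadoop archive", "parquet", "cold", "J. Doe"),
]
for entry in catalog:
    print(f"{entry.dataset}: {entry.importance} data in {entry.location}")
```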

With the help of efficient automated data management, the KION Group was able to remove 30 percent of data ballast, increasing system performance by an average of 25 percent.

Microsoft Azure offers enterprise-ready solutions in the cloud with SAP on Azure and a wide range of database and storage options. Even very large systems can be operated effectively thanks to the scalability of SAP Hana, Hadoop and SQL Server. Hybrid scenarios with secondary storage in the cloud and the primary database on-premises are becoming increasingly popular as a way to advance the cloud strategy.

Conclusion

Intelligent data management is indispensable for containing data growth, ensuring consistently good system performance and saving costs. It is worth overcoming the existing obstacles.

A centralized ILM is the key to more transparency, less effort and reliable execution of all necessary tasks. In addition, organizational approaches such as the installation of data stewards and data catalogs are important building blocks.

New technologies, intelligently combined and complemented by analytical tools, form the basis of modern data and information lifecycle management.

Effective data management is becoming an important competitive factor for companies in times of rapidly growing data lakes, so that they can make the most of the opportunities and added value of the available data.

https://e3magpmp.greatsolution.dev/partners/datavard-gmbh/


Criteria for effective data management

Measure the value of the data: with the help of analytics tools, data must be properly classified so that its value can be accurately determined. Modern approaches such as machine learning help with this today.

  • The possibility of automated handling, storage, archiving and deletion of data according to its value
  • Aligning the value of the data with its direct and indirect costs
  • Regular execution that also takes new objects into account
  • Simplification of data management through automation
  • Transparency through central, cross-system documentation, rules, scheduling and monitoring
  • Security through resource-saving scheduling and alternatives to radical retention or deletion

 


Positive effects of a clever ILM

  • Direct and indirect cost savings by reducing system size and slowing data growth
  • Increased performance
  • Simplified system administration and faster completion of time-consuming tasks such as backup and recovery, system copies and upgrades
  • Direct and indirect cost savings when operating the Business Warehouse Accelerator and Hana

 

Frank Gundlich, Datavard


