Generative AI in SAP On-Premise
The future of AI looks promising: companies dream of greater productivity, lower costs and competitive advantages. Chatbots make SAP functions easier to operate, support more efficient knowledge management, and assist with support requests and programming. The potential of AI-supported SAP functions lies above all in easier access to SAP applications through natural language: company-specific organizational requirements can be given to the applications in plain text, as a "prompt", without any programming knowledge.
Further instructions then trigger the execution of pending tasks, and the conversation with the chatbot is refined until the desired result is achieved. With access to company-specific documents, detailed analyses and evaluations can also be carried out; the results appear on screen not only as text but, if required, as AI-generated graphics, tables or images. Used in this way, chatbots form the basis for sound business decisions in an increasingly complex market environment.
Generating added value from SAP business data
The path to using AI in SAP begins with defining use cases and goals. The question occupying the industry is: where does AI add value in connection with business data from SAP? A first area of application is chatting with SAP documents, for example to query business data; a second scenario is the easier operation of complex SAP processes. A decision must be made internally in advance as to whether the chatbot may access company-specific documents and thus extract and process the knowledge they contain. At this point at the latest, the crucial question arises: cloud or on-premise?
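To make the first scenario more concrete, the sketch below shows how a chatbot request against company documents might be assembled: a deliberately naive keyword retrieval picks the most relevant passages and combines them with the user's question into a single prompt. The document store, file names and texts are purely illustrative placeholders, and the actual model call is left out because it depends on the hosting decision discussed next.

# Naive retrieval-augmented prompt assembly for chatting with company documents.
# The document store is a plain dict here; a real setup would use a vector index.
DOCUMENTS = {
    "travel_policy.txt": "Employees book trips via the SAP travel workflow ...",   # placeholder text
    "posting_rules.txt": "Invoices above 10,000 EUR require a second approval ...",  # placeholder text
}

def retrieve(question: str, top_k: int = 2) -> list[str]:
    """Return the documents sharing the most words with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        DOCUMENTS.items(),
        key=lambda item: len(q_words & set(item[1].lower().split())),
        reverse=True,
    )
    return [text for _, text in scored[:top_k]]

def build_prompt(question: str) -> str:
    """Combine the retrieved passages and the user question into one prompt."""
    context = "\n---\n".join(retrieve(question))
    return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"

print(build_prompt("Who has to approve invoices above 10,000 EUR?"))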
On-premise or cloud: security versus innovation?
Many companies run SAP on-premise because, for them, the security advantages outweigh the disadvantages. At the same time, local solutions can be tailored to a company's individual needs, albeit with corresponding effort - a decisive reason not simply to switch to the cloud. After all, the more data is moved to the cloud, the less control remains over data security. Even though cloud solutions promise greater scalability, access to modern technologies and lower operating costs, the risks are too high for many. When it comes to AI, companies running SAP on-premise therefore appear to face an impasse: switch to the cloud, introduce a different ERP, or remain cut off from AI.
SAP use cases do not need server farms
For SAP consulting companies, it is clear that offerings must also exist for on-premise. The Milliarum AI Construction Kit was recently launched as an infrastructure solution developed specifically for companies that want to add AI-based functions to their SAP system. Thanks to preconfigured delivery packages, chatbots are ready for use immediately, and the first module-specific chatbots for SAP On-Premise will follow shortly. The aim is flexibility: companies should be able to select a large language model (LLM) of their choice and connect it locally to their own AI system.
These considerations rest on the assumption that, when weighing security against innovation, local AI models are sufficient for most companies: to chat with SAP data, there is no need for a gigantic chatbot that speaks over 100 languages and was trained with several hundred billion parameters if, in the end, only German is used for a limited set of use cases. Nor should a server farm be necessary for most use cases in the SAP environment.
Sustainable use of generative AI in the SAP environment
AI in SAP also requires choosing a host that meets the security requirements. Companies can opt for the big players on the market, such as OpenAI or Microsoft Azure; local open-source LLMs on the company's own hardware are the alternative.
Beyond the monetary aspect, running models locally also saves a great deal of electricity compared to the huge data centers on which cloud-based solutions typically rely. Cloud solutions, in turn, require no company-owned hardware and are usually billed by usage, depending on the number of words entered. Companies quickly reach a cost level that is scarcely below that of a local solution. With the right preparation, a cloud-based solution can be up and running in just one day.
Local solutions, on the other hand, require hardware to be purchased and integrated into the company network, and the AI model must be set up and trained with the company's own parameters. Although the introduction is more involved, operation meets the company's own security requirements and the running costs are significantly lower than with cloud-based solutions, where every API call is billed.
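What "connecting a local open-source LLM" can look like in practice is sketched below. This is a minimal example, assuming an inference server such as Ollama or a llama.cpp server exposes an OpenAI-compatible chat endpoint inside the company network; host, port and model name are placeholders, not part of any specific product.

import requests

# Assumption: a local inference server (e.g. Ollama or a llama.cpp server) exposes
# an OpenAI-compatible chat endpoint on the company network; no data leaves it.
LOCAL_LLM_URL = "http://localhost:11434/v1/chat/completions"  # placeholder host/port
MODEL = "llama3"  # placeholder name for any locally installed open-source model

def ask_local_llm(question: str) -> str:
    """Send a single prompt to the locally hosted model and return its answer."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": question}],
        "temperature": 0.2,  # low temperature for factual, reproducible answers
    }
    response = requests.post(LOCAL_LLM_URL, json=payload, timeout=120)
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_local_llm("Summarize the open purchase orders listed below: ..."))

Because the endpoint speaks the same protocol as the large cloud providers, switching between a local model and a hosted one is largely a matter of changing the URL and the credentials, which keeps the choice of host reversible.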
Autonomous SAP processes to make AI "intelligent"
The key to the added value of generative AI in SAP lies in the data: the better the data, the more powerful the AI. Preparing this data is a key step on the road to AI; bringing in SAP documents, by comparison, is less time-consuming. The business data is already available in SAP, and merely retrieving it does not yet make the AI "intelligent". Rather, the point is to extract data from SAP, process it, and feed it back into SAP. Experts are also concerned with whether, and to what extent, the use of AI actually delivers productivity gains.
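The extract-process-feed-back loop can be pictured roughly as follows. This is a simplified sketch, not a concrete SAP API: the OData service, entity sets and field names are hypothetical, authentication and CSRF handling are omitted for brevity, and it reuses the ask_local_llm() helper from the earlier sketch.

import requests

# Placeholder values; a real system would use its own gateway host, OData service
# and proper authentication (and a CSRF token for write access), not basic auth.
SAP_BASE = "https://sap-gateway.example.com/sap/opu/odata/sap/ZPROJECT_SRV"
AUTH = ("service_user", "secret")

def read_project_texts(project_id: str) -> list[str]:
    """Extract descriptive texts of a project from a (hypothetical) OData entity set."""
    url = f"{SAP_BASE}/ProjectTextSet?$filter=ProjectId eq '{project_id}'&$format=json"
    rows = requests.get(url, auth=AUTH, timeout=60).json()["d"]["results"]
    return [row["TextLine"] for row in rows]

def summarize_and_write_back(project_id: str) -> None:
    """Feed the extracted texts to the LLM and post the summary back into SAP."""
    texts = read_project_texts(project_id)
    summary = ask_local_llm(  # helper from the earlier local-LLM sketch
        "Summarize the following SAP project documentation in three sentences:\n"
        + "\n".join(texts)
    )
    # Write the generated summary back as a new (hypothetical) entity.
    requests.post(
        f"{SAP_BASE}/ProjectSummarySet",
        json={"ProjectId": project_id, "Summary": summary},
        auth=AUTH,
        timeout=60,
    )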
While AI is mainly used today as a pure chat function and in programming environments - for creating evaluations directly from SAP data and for creating and changing SAP objects - it will in future be possible to drive complete SAP processes through natural language. The use of generative AI in the SAP environment is still in its infancy: its actual potential, whether cloud-based or on-premise, has yet to be proven.
1 comment
Berater
Cloud-based solutions must be more efficient than an on-premise solution running on a server at 50% utilization, simply because resources are shared. Whether the cost advantages are passed on to the user is another matter.