
Process Mining in the Cloud

Deliver Instant Insight into Process Performance with SAP Process Mining by Celonis, Cloud Edition

 

by Keith Grayson, Senior Director of Database and Data Management Solution Management, SAP

 

As organizations continue to expand their global reach, optimizing business processes becomes increasingly important to ensure quality customer experiences and improve margins. In the past, process analysis and change were usually performed by business analysts and consultants using workshops, interviews, and job shadowing over extended periods of time. In modern organizations, however, these methodologies and practices fall short of delivering the insights needed to identify best practices and quantify the benefits of change. The metrics used to measure the performance of diverse, globally distributed business units are often not the same, so there is no consistent basis for comparison. Interviewees might also present an idealized view of what is going on, which can skew results, and consulting engagements are often viewed as one-off activities, without concrete plans to quantify the results of any changes. So how can SAP customers achieve the level of visibility and transparency required to optimize their business processes, as well as support constantly changing regulations, a growing move toward service-based business models, and increasing requirements for business-to-business and business-to-government integrations?

Process mining solutions — such as the SAP Process Mining application by Celonis — have emerged as useful tools for supporting these types of initiatives by enabling businesses to drill down into their key business processes, uncover inefficiencies, and pinpoint areas for improvement. Previously, SAP Process Mining was available only as an on-premise deployment. This article introduces a new software-as-a-service (SaaS) edition — SAP Process Mining by Celonis, cloud edition — that delivers all the functionality of the on-premise version with some additional features that take advantage of the speed and flexibility of the cloud. In this article, line-of-business owners, business process analysts, and enterprise architects will get an introduction to the major components of this new edition, learn how it differs from the on-premise edition, and understand the different deployment options that are available to meet the unique needs of individual businesses.

Before diving into the details of this new application, it will be helpful to understand how process mining works, the roles it can play, and where the new cloud edition of SAP Process Mining fits in.

Understanding Process Mining

Process mining is an analytical discipline for discovering and visualizing business processes using the raw data — or “digital footprints” — captured from enterprise application audit trails and logs. Because the visualizations are constructed from the raw data, without the influence of preconceptions or preexisting process models that structure and limit the results, they deliver accurate, valuable insights about how business processes are operating. Views of the data can range from high-level schematic diagrams of process variants all the way down to text-based reports at a very granular level — an order item in a specific purchase request in an end-to-end procure-to-pay process, for instance.
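To make the idea concrete, the following sketch reconstructs process variants from a toy event log. It is a minimal illustration of the discipline, not how SAP Process Mining itself is implemented, and the column layout, case IDs, and activity names are invented for the example.

```python
from collections import Counter, defaultdict

# A minimal sketch of process discovery: each event log row is
# (case_id, activity, timestamp). The columns here are illustrative,
# not the actual schema used by SAP Process Mining.
events = [
    ("PO-1001", "Create Purchase Requisition", "2018-10-01T09:00"),
    ("PO-1001", "Create Purchase Order",       "2018-10-01T11:30"),
    ("PO-1001", "Receive Goods",               "2018-10-04T08:15"),
    ("PO-1001", "Pay Invoice",                 "2018-10-20T16:00"),
    ("PO-1002", "Create Purchase Order",       "2018-10-02T10:00"),
    ("PO-1002", "Receive Goods",               "2018-10-06T13:45"),
    ("PO-1002", "Pay Invoice",                 "2018-10-30T09:30"),
]

# Group events by case and order them by timestamp to reconstruct
# the path ("variant") each case actually took through the process.
cases = defaultdict(list)
for case_id, activity, ts in events:
    cases[case_id].append((ts, activity))

variants = Counter(
    tuple(activity for _, activity in sorted(trace))
    for trace in cases.values()
)

for variant, count in variants.most_common():
    print(f"{count} case(s): " + " -> ".join(variant))
```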

A process mining deployment can also help support the adoption of robotic process automation (RPA) platforms. Automation is fundamentally a rule-based activity, and automation activities tend to fail when the rules they follow require more manual intervention than the manual processes that they replace. An optimal automation setup requires insight into the exact process flows that are followed by the individuals responsible for them — when people are involved in business processes, they often fill in the gaps in documented process steps in their own ways, and these manual interventions can be missed as the process is automated. Process mining helps identify process flows and variants, including undocumented paths and exception handling, so that organizations can decide which steps need to be reproduced in the automation. Process mining can be the difference that makes an automation project successful.

While a process mining deployment demonstrates clear and compelling business value once it is configured and integrated into a business landscape, the technical implementation can represent a significant part of the project. With the on-premise SAP Process Mining product, the digital footprints — in the form of process event logs from the different connected applications — need to be loaded into an SAP HANA data platform, and the initial setup and configuration of the integrations, the configuration of the application to use the correct data tables, and the screen setup can take weeks before the application can be used productively to deliver business process insights. While large enterprises with dedicated business process centers of excellence and centralized IT operations have the appropriate resources for this type of implementation, organizations that are less mature can find it more challenging. At the same time, many businesses are moving to cloud-first IT investment strategies, which means that an on-premise approach does not align with their organizational goals.

To address the needs of organizations that do not have the resources for an on-premise implementation or are pursuing a cloud-first strategy, SAP and its partner Celonis have introduced SAP Process Mining by Celonis, cloud edition, which can be up and running and productive in hours instead of weeks.

Process mining can be the difference that makes an automation project successful.

Introducing SAP Process Mining by Celonis, Cloud Edition

Cloud-based solutions not only offer businesses a number of economic and technical advantages — such as removing the need to invest in and manage an IT infrastructure and interdependent software release cycles — they also provide richer, more personalized services than on-premise solutions, such as centralized user collaboration, messaging, and alerting features. SAP Process Mining by Celonis, cloud edition, brings these benefits to SAP customers by extending the process mining functionality of the on-premise SAP Process Mining application with cloud-based capabilities. The speed of provisioning and productivity provided by the cloud-based edition leads to a rapid deployment with a low dependency on skilled technical resources for integration, which in turn leads to a quick return on investment and an accelerated time to value.

Released in October 2018, the cloud edition is available from SAP and is a complete managed SaaS application provided by Celonis — via its Intelligent Business Cloud environment — to support the rapid analysis of business processes, such as order-to-cash, in system landscapes. Figure 1 shows a visualization of an example order-to-cash process in the application — in this case, measuring on-time delivery.

 

Figure 1 — A visualization of the order-to-cash process measuring on-time delivery in SAP Process Mining by Celonis, cloud edition

 

As you can see, the screen shown in the example consists of a number of separate elements, which are configurable. In this order-to-cash example, at the top of the screen are the headline numbers in terms of performance, number of items delivered, and order value. On the left is the Process Explorer, which shows the process variants, or the paths taken through the end-to-end business process. These process variants can be analyzed in depth, showing cycle times between steps and the frequency of each variant, for instance, and animations can even be used to show the path of a specific process variant in the visualization. At the upper right is the range of times taken to deliver orders from order receipt, and at the lower right is a more detailed breakdown of the performance of specific distribution channels. If you decide to focus on a subset of process variants or a subset of the business process, the screen updates dynamically to reflect the new selection. You can also drill down into more detail on any of the selected areas.
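As a rough illustration of how headline numbers like these can be derived from case data, here is a short sketch that computes delivery lead times and an on-time rate. The field names, the five-day target, and the sample orders are assumptions for the example, not values taken from the product.

```python
from datetime import datetime
from statistics import mean

# Illustrative order-to-cash cases; all values are made up.
orders = [
    {"order": "SO-1", "received": "2018-10-01", "delivered": "2018-10-04", "value": 1200.0},
    {"order": "SO-2", "received": "2018-10-02", "delivered": "2018-10-09", "value": 880.0},
    {"order": "SO-3", "received": "2018-10-03", "delivered": "2018-10-06", "value": 450.0},
]

TARGET_DAYS = 5  # hypothetical on-time delivery threshold

def lead_time(order):
    fmt = "%Y-%m-%d"
    received = datetime.strptime(order["received"], fmt)
    delivered = datetime.strptime(order["delivered"], fmt)
    return (delivered - received).days

lead_times = [lead_time(o) for o in orders]
on_time = sum(1 for days in lead_times if days <= TARGET_DAYS)

print(f"Items delivered: {len(orders)}")
print(f"Order value:     {sum(o['value'] for o in orders):.2f}")
print(f"On-time rate:    {on_time / len(orders):.0%}")
print(f"Avg lead time:   {mean(lead_times):.1f} days")
```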

The cloud edition includes all of the functionality of the on-premise edition — including functionality to support data integration, process discovery, and process analytics — as well as some brand-new capabilities, such as prebuilt connectors, an Intelligent Business Apps store for accessing ready-to-use content, a Transformation Center for process optimization support, and a Cloud Teams concept for collaboration. Figure 2 provides an overview of the major components of the cloud edition, with the areas in blue indicating how the cloud edition expands the functionality available with the on-premise edition, which is shown in gray.

 

Figure 2 — The key components of the cloud edition of SAP Process Mining, with cloud-only functionality in blue

 

A previous SAPinsider article covered the on-premise edition of SAP Process Mining in detail — here, we’ll take a closer look at the new capabilities that are included in the cloud edition.

Intelligent Business Apps Store

The Intelligent Business Apps store (Figure 3) is a key new service provided in the solution. It is accessed through the service’s home screen and offers prebuilt content, including out-of-the-box connectors and preconfigured analytics, that dramatically decreases the time to productivity of the solution. The store provides two types of prebuilt content: prebuilt processes, including out-of-the-box connectors to enable access to process details, and prebuilt apps to support process analysis.

 

Figure 3 — The new Intelligent Business Apps store provides prebuilt content that speeds productive use of the solution

 

The Intelligent Business Apps store offers predefined process definitions for commonly used processes, such as order to cash and accounts receivable, that can be customized using a wizard-based tool, as well as preconfigured connectors to support extracting process data from typically connected systems. To connect SAP Process Mining to a system, select a process and choose to install it. This allows you to either create a new connection or use an existing connection that will be presented as a tile on the configuration screen. If you choose to connect to a new system, the wizard-based tool takes you to a screen to enter the connection details.

These connections use scripts and adapters to extract the required process event details from the connected source system, and are controlled by data jobs that trigger the data extraction and prepare the data for process mining. All connections aggregate data through an event collection component that feeds data pools. The data pools cluster data connections (which handle the connection required to extract the process event data); data jobs (which extract the data from the connected systems and prepare it for process mining); and schedules (which determine when the data extraction runs). After setting up the connection, you can adjust details such as the process start date, currency, primary language, and the scheduling for the data extraction runs (weekly, nightly, hourly, or on a custom schedule, for instance) as needed. You also do not have to start the initial data extraction run immediately after you configure the connection — you can choose to delay the job to ensure it takes place during an off-peak time period, for example.
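The snippet below is a schematic sketch of how a data pool could group these three elements. It is plain illustrative data, not the actual Celonis configuration format, and every name and parameter in it is invented.

```python
# A schematic representation (not the Celonis configuration format) of how
# a data pool groups the three elements described above: data connections,
# data jobs, and schedules. All names and parameters are illustrative.
data_pool = {
    "name": "Procure-to-Pay",
    "data_connections": [
        {
            "id": "erp-prod",
            "type": "sap-erp",          # assumed connector type
            "host": "erp.example.com",  # placeholder host
            "client": "100",
        }
    ],
    "data_jobs": [
        {
            "id": "extract-p2p-events",
            "connection": "erp-prod",
            "tasks": ["extract", "transform"],  # pull events, then shape them for mining
            "start_date": "2018-01-01",         # process start date chosen at setup
            "currency": "EUR",
            "language": "EN",
        }
    ],
    "schedules": [
        {
            "job": "extract-p2p-events",
            "frequency": "nightly",   # weekly, nightly, hourly, or custom
            "start_at": "02:00",      # off-peak window
        }
    ],
}

print(f"Pool '{data_pool['name']}' runs {len(data_pool['data_jobs'])} job(s) "
      f"on a {data_pool['schedules'][0]['frequency']} schedule.")
```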

The Intelligent Business Apps store also offers apps that consist of predefined analyses and objectives that focus on key areas of interest in a business process. Analyses, which can be selected and installed via the store, are predefined screens that visualize data about business processes in ways that are relevant to specific stakeholders in the defined business process. They show performance against commonly measured industry benchmarks. For example, an analysis could cover days payable outstanding in the accounts payable process, or automation rates in the warehouse management process. The analyses are accessed through the Process Analytics screen in the SAP Process Mining application. Objectives, which are installed in a similar way to analyses, can be thought of as strategic initiatives: sets of key performance indicators (KPIs) that are core to the strategic aims of a specific business process, such as days payable outstanding in the accounts payable business process. Objectives are shown and tracked using the Transformation Center (more on this in a moment). Both analyses and objectives are fully configurable if required.
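For readers unfamiliar with the KPI mentioned above, days payable outstanding is commonly calculated as average accounts payable divided by cost of goods sold, multiplied by the number of days in the period. A minimal sketch with made-up figures:

```python
# Days payable outstanding (DPO) is one of the KPIs mentioned above.
# A common formulation: DPO = average accounts payable / cost of goods
# sold * number of days in the period. The figures below are invented.
def days_payable_outstanding(avg_accounts_payable, cost_of_goods_sold, period_days=365):
    return avg_accounts_payable / cost_of_goods_sold * period_days

dpo = days_payable_outstanding(avg_accounts_payable=2_500_000,
                               cost_of_goods_sold=18_000_000)
print(f"DPO: {dpo:.1f} days")
```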

The prebuilt content for mapping processes, configuring the event collection component, analyzing processes, and supporting the Transformation Center is the central benefit of the cloud edition of SAP Process Mining. The aim is to provide both a rapid out-of-the-box solution that enables process mining for specific, standardized business processes as well as the flexibility to implement a complete enterprise-wide process mining platform for complex, non-standard, end-to-end processes.

Transformation Center

The Transformation Center is accessed via a menu option on the main SAP Process Mining solution screen. It is the go-to view for continuous improvement, where specific company objectives are set and actual achievement against them is tracked on a continuous basis. It serves as a central repository for all KPIs that are relevant to the business processes being mined. These KPIs can be configured by authorized users or installed as part of the predefined objectives available as apps in the Intelligent Business Apps store.

Using the new Transformation Center, you can perform quantitative analysis on end-to-end business processes by tracking their performance against predefined objectives that can be installed from the Intelligent Business Apps store and adapted as needed. Objectives are measured against KPIs that can be added to the objective from any process or data model. You can even connect multiple KPIs from different data models within objectives. For example, a global procure-to-pay process objective might include KPIs at a global level and also regional KPIs that are combined from regional processes and data models. You can also attach milestones, which are a combination of target value and target date, to KPIs to measure readiness for system migration — for example, migration to a cloud-based system architecture or from SAP ECC to an SAP S/4HANA system landscape. Think of the Transformation Center as an engine for measuring the business impact of the process insights and recommended actions delivered by SAP Process Mining.
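The sketch below illustrates the objective, KPI, and milestone structure described here, with a simple check of whether each milestone has been met. The objective name, target values, and dates are invented for the example and do not come from the product.

```python
from datetime import date

# A sketch of the objective/KPI/milestone structure described above.
objective = {
    "name": "Reduce days payable outstanding",
    "kpis": [
        {
            "name": "DPO (global)",
            "current": 52.0,
            "milestones": [
                {"target_value": 48.0, "target_date": date(2019, 3, 31)},
                {"target_value": 45.0, "target_date": date(2019, 9, 30)},
            ],
        }
    ],
}

today = date(2019, 4, 15)
for kpi in objective["kpis"]:
    for m in kpi["milestones"]:
        due = m["target_date"] <= today
        met = kpi["current"] <= m["target_value"]   # lower DPO is better here
        status = "met" if met else ("missed" if due else "in progress")
        print(f"{kpi['name']}: target {m['target_value']} by {m['target_date']} -> {status}")
```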

Figure 4 shows the performance of business processes against objectives visualized over time in the Transformation Center. As you can see, the example shown is tracking days payable outstanding, which is a strategic finance indicator. The performance is measured over time in a number of different ways, both in text format and graphically, and the responsible owners are also shown. Actions can be assigned to team members and tracked.

 

Figure 4 — Days payable outstanding visualized over time in the Transformation Center

 

Cloud Teams

Cloud Teams is an enhanced capability that enables collaboration between cross-functional teams on business processes and accountability for hitting performance objectives via defined business process owners. The use of Cloud Teams is integrated into the Transformation Center. With this new functionality, you can assign KPIs defined in the Transformation Center to owners to make it clear who is responsible for specific outcomes in a team, and you can set up actions to assign tasks and initiatives to teams. For example, when a customer service request or outage is affecting a KPI, an action can be assigned to a team, whose members are then alerted. These capabilities are underpinned by alerting and messaging functionality that enables communication among individuals and teams.

In the application, teams are distinguished by a logo and a name. There is also a privacy policy that governs how new team users can be invited — there can be public teams, where anybody can invite anybody else, and there can be private teams, where only team administrators can invite new members. In addition, there are domain-based restrictions where members can invite others with the same domain email address, or those with domain email addresses from a set defined in the solution. Roles within the defined teams are administrators, analysts, and members (note that these roles are specific to SAP Process Mining and have no link to any of the connected applications or roles within an organization). Administrators can add new users, invite new team members, and manage user permissions. Analysts can create, use, and edit the processes that they have access to, according to the access policies. Members can be assigned actions, but don’t have the capability to create or edit elements of the processes that they have access to.
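The following sketch models the invitation policies and roles described above in simplified form. The permission sets and policy names are assumptions for illustration rather than the product’s actual access model.

```python
# A simplified sketch of the Cloud Teams rules described above.
ROLE_PERMISSIONS = {
    "administrator": {"invite", "manage_permissions", "create", "edit", "view"},
    "analyst":       {"create", "edit", "view"},
    "member":        {"view", "receive_actions"},
}

def can_invite(team, inviter_role, invitee_email):
    domain = invitee_email.split("@")[-1]
    if team["privacy"] == "public":
        return True                               # anybody can invite anybody else
    if team["privacy"] == "private":
        return inviter_role == "administrator"    # only administrators invite
    if team["privacy"] == "domain-restricted":
        return domain in team["allowed_domains"]  # invitee's email domain must match
    return False

team = {"name": "O2C Improvement", "privacy": "domain-restricted",
        "allowed_domains": {"example.com"}}

print(can_invite(team, "analyst", "colleague@example.com"))   # True
print(can_invite(team, "analyst", "outsider@other.org"))      # False
```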

The addition of Cloud Teams enables collaboration on previously siloed business processes — for example, in complex end-to-end business processes where cross-functional teams contribute, such as in order-to-cash processes and customer service processes — and the ability to address shortfalls and improvements to business processes in a collaborative way.

Deployment Options Tailored to Data Security Needs

The cloud edition of SAP Process Mining by Celonis can be deployed in two different ways: either fully on the cloud or with data remaining on premise. Figure 5 provides an overview of the architecture of the cloud edition and of the two deployment options.

 

Figure 5 — Architecture and deployment options for the cloud edition of SAP Process Mining by Celonis

 

A native cloud solution, SAP Process Mining by Celonis, cloud edition, leverages the Celonis Intelligent Business Cloud environment. It is hosted on the multi-tenant Celonis Cloud through Amazon Web Services (going forward, the intent is to broaden this to include other hyperscaler providers). Communications with connected systems are through encrypted HTTPS/TLS to a specified IP address at the customer’s organization. Celonis is ISO 27001-certified for providing secure services and has a comprehensive information security management system in place.

In a full cloud deployment, the data is extracted from the source systems through a secured connection, typically protected on the customer side by a firewall and other security infrastructure, and transferred to the Celonis Intelligent Business Cloud environment. In some scenarios, however, key business process data cannot and should not be transferred across corporate, national, or regional borders — in heightened information security scenarios, such as government, state-run businesses, government contractors, financial services, and other regulated industries, or scenarios involving regulations or other requirements for safeguarding personal data. There can also be architectural reasons and data volume constraints that require process event data to remain local within the SAP HANA data platform.

For organizations that need to keep data local, there is a “leave the data in place” deployment option. This allows organizations to leverage the new functionality in the cloud edition, including the Transformation Center and the Cloud Teams functionality, while keeping the process event data that is mined on premise within required borders in SAP HANA. In this deployment scenario, an on-premise connector is installed locally to the SAP HANA instance, communicating with SAP HANA through Java Database Connectivity (JDBC). This connector connects to the Celonis Cloud tenant via a secure HTTPS/TLS channel. The SQL queries that would normally prepare and gather the data for the business process visualizations from the cloud-based database layer are instead executed on the SAP HANA platform through the on-premise connector.
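A highly simplified sketch of this flow follows: the SQL runs locally against SAP HANA and only the aggregated result travels over HTTPS/TLS to the cloud tenant. The helper functions, the canned result, and the endpoint URL are placeholders and do not reflect the actual connector implementation.

```python
import json
import urllib.request

def run_on_hana(sql):
    # In the real connector this step goes through JDBC to the local
    # SAP HANA instance; here we return a canned aggregate for illustration.
    return [{"variant": "Order->Deliver->Invoice", "cases": 1243, "avg_days": 6.4}]

def push_result(tenant_url, payload):
    # Only the aggregated, prepared result leaves the local network,
    # over an encrypted HTTPS/TLS channel to the cloud tenant.
    req = urllib.request.Request(
        tenant_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

rows = run_on_hana("SELECT variant, COUNT(*) AS cases FROM process_events GROUP BY variant")
# push_result("https://tenant.example.cloud/api/results", rows)  # placeholder endpoint
print(rows)
```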

Ensuring Process Mining Success

SAP Process Mining by Celonis, cloud edition, extends the process mining functionality of the SAP Process Mining solution with cloud-based capabilities that enable rapid insight into and optimization of complex processes. The solution provides prebuilt content — available via an Intelligent Business Apps store — that can be used to connect both SAP and non-SAP applications. The standardized processes and apps extract the required data, transform that data into the necessary format, and install predefined analyses and objectives without the need for time-consuming configuration of separate tools. It also includes a Transformation Center that provides quantitative analysis of end-to-end business processes, measuring their performance against objectives and milestones, and a Cloud Teams functionality that allows team members to collaborate with each other to address business process inefficiencies together. Enterprises can differentiate themselves through data-driven business process analysis, measurement, and actions, creating a high-performance collaborative platform for continuous improvement.

Using the cloud edition of SAP Process Mining delivers a transformational level of insight into end-to-end business processes with speed and ease, allowing organizations to shine a spotlight on key outcomes and gain accurate, quantitative insights that can be shared across the entire organization. A tool can only take you so far on its own, however — you must set up that tool for success. Here are some best practices for ensuring a successful process mining deployment in your organization:

  • End-to-end business process analysis requires an empowered center of excellence team that has the authority and support of the business to champion and coordinate the use of the tool across organizational boundaries.
  • A process mining platform that can be shared with business owners and business users on the ground is the most effective deployment model. It gets everybody speaking the same language, enables continuous improvement, and measures the effectiveness of changes.
  • Process mining can be used to benchmark the automation ratio of business processes and identify areas for improvement. It is equally important to use process mining to map out as-is processes to ensure the success of automation projects, such as the implementation of an RPA platform.

With thoughtful planning for your process mining deployment, you can step confidently in the direction of becoming a data-driven intelligent enterprise. Learn more at https://www.sap.com/products/process-mining.html.


Keith Grayson
Keith Grayson (keith.grayson@sap.com) is a Senior Director of Database and Data Management Solution Management at SAP. He is responsible for a set of products in SAP’s Enterprise Information Management and middleware portfolio that help organizations build best-practice capabilities to deliver process and integration excellence. He has been with SAP for 12 years, initially specializing in identity management, single sign-on, and cybersecurity. Prior to joining SAP, Keith worked in the technology industry for 19 years at both large companies and startups, including Atos, HP, Cyberguard, and Siemens.


Ericsson Delights Customers with DevOps SAP Integration

After 140 years in business, global technology giant Ericsson knows a thing or two about staying at the forefront of a changing industry. Learn how the company has modernized its IT delivery approach, and increased customer satisfaction, by adopting a continuous delivery model for its digital business support systems portfolio, and how it used the ActiveControl solution by Basis Technologies to integrate SAP applications into this new model.




Overcome Hidden Tax Compliance Barriers To Your SAP Central Finance Migration

For some SAP customers planning a move to Central Finance, tax compliance may not be a priority. Unfortunately, that can be a costly oversight. Download this white paper to learn how to prevent these roadblocks and what a successful migration to SAP Central Finance looks like.




The Many Faces of the Future of SAP Analytics – Part 2 (Do You Still Need, or Want, an Enterprise Data Warehouse?)

As promised in my last post, I’d like to continue the conversation about the future of SAP analytics in its many forms. In Part 1, I introduced this series of discussions with a trip in the Way Back Machine to look at the early days of SAP analytics solutions (LIS, HRIS, SIS, etc.) and the impact the introduction of SAP Business Warehouse (SAP BW) had on how we designed and delivered analytics in an SAP environment.

A lot has happened since those early days, and this topic has been of interest to me (and my former clients) for quite a while. Here in Part 2, I’ll look at how changes in thought process, user requirements, and technologies have affected the way we think about enterprise data warehouses (EDWs) — and whether we still need them.

The Legacy of EDWs

The vast majority of EDWs in SAP landscapes today should be considered “legacy” data warehouses. While these legacy data warehouses were built with good intentions and with the most advanced designs of the time, technologies and needs have changed, and what made sense then no longer meets current goals, meaning that it’s time to plan for their retirement.

The primary objective of the EDWs of the 1990s and 2000s was a single view of the truth (SVOT). But was the SVOT goal ever realistic? No, not really. Why not? A few reasons:

  • Batch loading meant latency in the data (and therefore gaps in data)
  • Data models forced pre-aggregation and defined data relationships (therefore limiting creativity and new ways of looking at data)
  • There was no way to access raw data and create new relationships or models on demand
  • Do we really all have the same “view”? If so, then why do we have so many reports and types of analyses?

Should we instead aim for a single version of the truth? Again, I challenge that thought. There is only one truth, not alternative versions of it.

With that in mind, let’s look at data warehouses under a slightly different lens: as a single source of the truth. This makes us look at the EDW as an environment in which trusted data resides and which becomes the platform for all reports and analysis. It encourages us to view the data in multiple ways to uncover the true answer to different questions.

Another goal of legacy EDWs was to push pre-aggregated data and reports to consumers of information. Because of technical constraints, we had (or allowed) limited ability to create new data models and therefore could not form or drive out new data relationships to align with our dynamic business environments. All of this limited the EDW’s user community, restricting it mostly to casual consumers of information or those who subscribed to reports generated from the warehouse. Today’s users demand more flexibility and more than casual consumption of data.

The Future of EDWs

Looking at the goals and objectives as well as the constraints and challenges of the EDWs of the past, how do we move forward with them? Should we move forward with them at all? In my opinion, yes, there is a place for EDWs today and tomorrow in your overall enterprise information architecture or analytics strategy.

Why do I feel that EDWs continue to make sense? An EDW will:

  • Be a source of enterprise-relevant information from multiple transaction systems and external sources (as appropriate or required) — not just data from a single transaction system
  • Be a collection of time-variant enterprise-relevant data from multiple sources (necessary for meaningful time-based trend analyses)
  • Be pre-aggregated and pre-integrated into business-aligned groupings and structures
  • Address the 80% of the user base who want/need to subscribe and have information pushed to them in the form of reports, dashboards, and alerts

Modern data warehousing designs need to address these requirements and include unstructured data, address real-time access, and provide scalability and flexibility — things our legacy data warehouses simply could not deliver. SAP offers two solutions for addressing today’s data warehousing needs: SAP BW/4HANA and enterprise SAP HANA (also known as SAP HANA “sidecar”).

There are pros and cons for both SAP BW/4HANA and enterprise SAP HANA. Much like legacy SAP BW, SAP BW/4HANA offers pre-defined content and connectivity to the SAP transaction system, SAP S/4HANA. Using enterprise SAP HANA as a data warehouse offers the most flexibility, as it is a “build from scratch” approach, but not every organization wants or needs to build from scratch.

When looking at both SAP BW/4HANA and enterprise SAP HANA for your data warehousing needs, assess each against the criteria that are most important to you. Here is an example assessment that I created recently for a very similar discussion that might be helpful as you prepare your own analysis:

Example assessment criteria for SAP BW/4HANA and enterprise SAP HANA
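The table itself is not reproduced here, but as a rough sketch of how such an assessment can be structured, the snippet below weights a handful of hypothetical criteria and scores the two options. The criteria, weights, and scores are entirely invented; substitute your own.

```python
# Entirely hypothetical criteria, weights, and 1-5 scores; a sketch of how
# you might structure your own side-by-side assessment, not the author's table.
criteria = {
    # criterion: (weight, SAP BW/4HANA score, enterprise SAP HANA score)
    "Predefined business content":   (0.30, 5, 2),
    "Modeling flexibility":          (0.25, 3, 5),
    "SAP S/4HANA connectivity":      (0.20, 5, 4),
    "In-house skills available":     (0.15, 4, 2),
    "Non-SAP / unstructured data":   (0.10, 3, 5),
}

def weighted_score(option_index):
    return sum(weight * scores[option_index]
               for weight, *scores in criteria.values())

print(f"SAP BW/4HANA:        {weighted_score(0):.2f}")
print(f"Enterprise SAP HANA: {weighted_score(1):.2f}")
```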

Key Takeaways

To close out the discussion for this week, here are a few takeaways for you to noodle on:

  • Enterprise-relevant information is — and always will be — a need for your organization
  • A data warehouse (or EDW) still serves a key role in your overall enterprise analytics framework because it:
    • Brings together data from multiple sources
    • Provides time-variance to data
    • Organizes data into models optimized for users
  • A data warehouse is not a data lake (which will be next week’s discussion) because a data lake:
    • Is raw data
    • Is all data, not just data known to have enterprise relevance
  • And finally, as with our previous discussions, there are multiple tools and options to meet data warehousing requirements. Assess each of them against your key criteria to determine the best option for your organization.

Thank you for joining me on this journey to the future and please do not hesitate to reach out with your comments, feedback, and questions via email (penny.silvia@wispubs.com) or Twitter (@pennysilvia).




Putting Ethics into Practice with Your SAP Security Strategy

How to Ensure an Ethical Implementation of Artificial Intelligence

 

by Guido Wagner, Chief Development Architect, SAP SE

 

With any new technology, it is important to consider the potential risks and threats it can bring, and to develop a solid strategy for avoiding them before that technology is widely used. As artificial intelligence (AI) moves toward becoming a standard technology in daily business — such as the use of conversational user interface technology in call centers and financial close processes driven by machine learning — it is critical that organizations ensure they have a well-thought-out plan in place for mitigating any potential issues.

While some visionary minds, such as Elon Musk and Stephen Hawking, have issued warnings about the longer-term existential risks AI can pose to society, in the near term, organizations must consider the general impact AI can have on business operations. For example, automated decision making that doesn’t consider extenuating circumstances, the perception of misuse of personal data, and machine learning based on faulty data can significantly affect the security of a company’s business model and reputation. With AI, companies increasingly need to balance the safety and privacy of people with the pursuit of growth and success.

This article shows security and risk managers and decision makers who are considering the use of AI in their organization’s software system landscape how to mitigate the risks posed by AI software by expanding existing security standards with a clearly defined set of digital ethics. It first looks at some typical ethical challenges businesses using AI-based software face and how these challenges can be mitigated. It then provides an overview of SAP’s own guiding principles for developing and deploying AI-based software and offers advice for how to get started developing a strategy for ethical AI in your own organization.

Automated decision making that doesn’t consider extenuating circumstances, the perception of misuse of personal data, and machine learning based on faulty data can significantly affect the security of a company’s business model and reputation.

— Guido Wagner, Chief Development Architect, SAP SE

Security Meets Ethics

So what are some of the ethical challenges businesses face when it comes to AI software? Here, we look at a few typical examples — specifically, ethical challenges around safety, privacy, and bias when using AI software — and some ways you can start to think about mitigating these risks.

Ethical Challenges Around Safety and Privacy

One business area that involves ethical challenges is the balance between profit and safety risks. Imagine you are on vacation and a hurricane is on its way to your location. You want to fly home, but the airfare is much higher than usual because some pricing software detected a spike in bookings without factoring in the potential impact on your safety. Similar scenarios have occurred with the price of drinking water during times of water shortage. In addition to creating safety risks, these scenarios can also tarnish the company’s public image. A mitigating measure for these scenarios would be to factor the event (the hurricane or water shortage, for instance) into the price-finding algorithm (via automated rules or a rule that triggers an alert for human intervention, for example) and to provide unambiguous instructions for handling the situation.
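A sketch of what such a rule might look like appears below. The event feed, region names, and the decision to freeze surge pricing and request human review are assumptions for illustration, not a prescribed design.

```python
# A sketch of the mitigation described above: the pricing logic checks for a
# declared emergency in the region and either freezes surge pricing or routes
# the decision to a human. Event feeds, thresholds, and names are invented.
EMERGENCY_EVENTS = {"hurricane", "flood", "water-shortage"}

def propose_price(base_price, demand_factor, active_events, region):
    surge_price = base_price * demand_factor
    if any(e in EMERGENCY_EVENTS for e in active_events.get(region, [])):
        # Rule: no automated surge pricing during a declared emergency;
        # flag the case for human review instead.
        return {"price": base_price, "needs_review": True,
                "reason": "emergency event active in region"}
    return {"price": surge_price, "needs_review": False, "reason": None}

events = {"florida": ["hurricane"]}
print(propose_price(base_price=220.0, demand_factor=3.5,
                    active_events=events, region="florida"))
print(propose_price(base_price=220.0, demand_factor=1.4,
                    active_events=events, region="ohio"))
```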

Another area of ethical challenges that can influence how a company is perceived is data privacy. There are many existing regulations about the usage of personal data — such as the EU’s General Data Protection Regulation (GDPR) — that do permit the use of anonymized data that cannot be connected to an identifiable person. However, many people do not want even anonymized data about their behavior used for any purpose. While this is not generally a legal compliance issue, it is a good example of the concept of farther-reaching “ethical compliance,” and of how even just the perception that a company is violating people’s privacy can cause a damaging lack of trust in that business. Addressing this type of scenario requires additional standards, and may also call for academic research to understand how factors such as cultural backgrounds can impact ethical requirements, and how technological solutions can mitigate the impact.

Ethical Challenges Around Bias

Other types of ethical challenges have more technical roots, such as machine-learning software that bases its behavior on a certain selection of data. One well-known example is the chatbot “Tay,” which learned slang from reading posts on Twitter and eventually began to post inflammatory and offensive tweets. The reason this happened is simple: the bot was trained using the data of only a small sample of human culture — the language typically used on Twitter by people with specific interests and goals. Another example would be a hiring app that is trained using data that consists almost exclusively of profiles of male applicants and employees — in this scenario, the system (and hiring process) will be weighted against anyone who doesn’t fit that same profile.

This is one aspect of what is called “bias,” and while avoiding it can be a significant challenge for software developers and AI trainers, it is critical from a security perspective. Imagine what might happen if AI-based software is trained with an unbalanced data set, whether unintentionally or with fraudulent intent. While this might not be a huge problem in the case of simple image recognition, what happens if it’s a neural network that a business uses to run prediction routines, and that managers rely on to make decisions? Recommendations from such a system will be misleading, even if the algorithms are correct, which can affect not only individuals, but also business operations and the company’s public image.

If there is fraudulent intent involved, it can be hard to identify or prove, since technologies such as neural networks do not work as transparently as previously used analytical algorithms — neural networks do not contain a well-defined set of algorithms, so it is nearly impossible to tell how exactly they calculated a result. Because of this missing transparency, it is often not possible to reproduce how a trained deep-learning system comes to a particular result.

The best chance to mitigate the risk of bias is to deeply understand the use cases of a system when designing it, and to ensure the completeness of the training data. Systems designed in this way will have limitations — that is, boundaries within which they work to ensure a low bias level — that are clearly communicated by the manufacturer or trainer of the system. Another possibility is to create qualifiers — such as a “bias index of learned data” — that could be helpful for evaluating the quality of AI recommendations. This is an idea that needs research and standardization to become a valid approach, however.
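As one illustration of what a “bias index of learned data” could look like, the sketch below scores the balance of a training set’s label distribution using normalized entropy, where 1.0 means perfectly balanced and values near 0 mean a few classes dominate. This is an improvised metric for discussion, not a standardized measure.

```python
from collections import Counter
from math import log

# A very simple balance score for training labels: normalized entropy of the
# label distribution. 1.0 = perfectly balanced, near 0 = dominated by few classes.
def balance_index(labels):
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    if k < 2:
        return 0.0
    entropy = -sum((c / n) * log(c / n) for c in counts.values())
    return entropy / log(k)

# Echoes the hiring example above: a training set dominated by one group.
hiring_training_labels = ["male"] * 940 + ["female"] * 60
print(f"Balance index: {balance_index(hiring_training_labels):.2f}")  # roughly 0.33
```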

A Framework for Digital Ethics

The ethical challenges surrounding AI-based software are a key consideration for any organization that is moving forward into a digital business model, including SAP. One way SAP is taking action to address these challenges is by participating in broad discussions about establishing norms around the use of AI-based software that range across industries, academia, and politics. For example, SAP is engaged with the Partnership on AI, the European Commission’s High-Level Expert Group on Artificial Intelligence, and the Council on the Responsible Use of AI.

Another way SAP is addressing these challenges is by creating a framework of ethical values that goes beyond legal compliance to take into account the effect AI-based software can have on people’s quality of life. As a starting point, in the summer of 2018, SAP founded an AI Ethics Steering Committee and created a set of seven guiding principles for developing and deploying AI-based software, such as its SAP Leonardo Machine Learning portfolio. See the sidebar “SAP’s Guiding Principles for AI” below for an overview of these principles.

 

 

SAP’s Guiding Principles for Artificial Intelligence

 

SAP organizes its policies for developing and deploying software based on artificial intelligence (AI) around seven guiding principles, which are summarized here (the complete text of the principles is available here):

  1. We are driven by our values. SAP actively supports a variety of UN-defined ethical precepts, including the Guiding Principles on Business and Human Rights, and developed the SAP Global Human Rights Commitment Statement that spells out its commitment. Furthermore, SAP’s AI Ethics Steering Committee operates based on this principle.

  2. We design for people. SAP strives to make the user experience of its software human-centered, where users can interact with systems just as they interact with other humans. SAP seeks to be inclusive with its AI software, and to empower and augment the talents of its diverse users. To achieve this goal, SAP actively drives co-innovation with customers.

  3. We enable businesses beyond bias. Bias must be seen as a major risk when creating AI-based advisory or decision-making systems. SAP asks its teams to gain a deep understanding of the business problems they are trying to solve, and the data quality this demands. SAP is investigating new technical methods for mitigating biases and is open to co-innovation opportunities in this area.

  4. We strive for transparency and integrity in all that we do. SAP customers will always remain in control of the deployment of SAP products, and SAP will clearly communicate the intended purpose of its products, their capabilities, and their limitations.

  5. We uphold quality and safety standards. Ensuring the quality of AI-based products and the safety of humans when using them is at least as important as for any other product. AI-based products will be subject to quality assurance processes, which will be reviewed and adapted on a regular basis.

  6. We place data protection and privacy at our core. Data protection is relevant to ethical behavior beyond legal compliance requirements — it also includes consideration of the impact AI can have on people’s quality of life. SAP will communicate clearly how, why, where, and when customer and anonymized user data is used in its AI-based software. Together with partners, SAP will continue to research the development of the next generation of privacy-enhancing technologies.

  7. We engage with the wider societal challenges of AI. SAP is aware that AI is one of the major drivers of what is widely known as the “future of work” discussion, and that AI has the potential to cause ethical dilemmas in business software. SAP will continue to consider and discuss the social impact and economics of AI across industries, borders, cultures, and philosophical and religious traditions.

 

These principles are maintained and executed by SAP’s AI Ethics Steering Committee, which consists of nine executive managers, including the heads of design, machine learning, legal, data protection, and sustainability. The AI Ethics Steering Committee is supported by an external AI Ethics Advisory Panel, where academic experts contribute not only from the point of view of IT, but also from the perspectives of biology, theology, law, and other sciences and areas of study.

The AI Ethics Steering Committee is also supported internally by a diverse expert group and collaborates with an internal “think cell” that handles questions about business ethics in digitalization scenarios. This type of setup has several advantages. For example, it facilitates a close connection to the board and direct access to employees who define, build, and implement AI. It also connects the committee with relevant discussions on ethics outside of SAP and ensures that upcoming questions around business ethics are involved in all AI-related work.

Developing Principles and Putting Them into Practice

Developing a set of guiding principles for digital ethics is an important first step for any software provider working with AI technology. There are three tasks in particular that are key to paving the way to ethical, sustainable AI practices based on guiding principles similar to the ones outlined by SAP: gathering requirements for ensuring the implementation of ethical AI, adding ethical AI information to your existing standards and policies, and monitoring and auditing AI activities. Let’s take a closer look.

Gathering Requirements for Ensuring the Implementation of Ethical AI

To build a successful set of principles, it is a good practice to first gather requirements based on customer feedback and academic discussions. Questions might include: In which cases must a system involve a human decision maker? What should the human-machine interaction look like? Which processes must be logged or monitored? Which parameters must be customizable to enable ethical system behavior? Within the purchasing process, for example, it could be a requirement to define a certain level of fair-traded goods and instruct the AI-based software to choose vendors accordingly. It could also be a requirement to ask users before using their personal data, even if the data is anonymized. These types of requirements must be gathered in close collaboration with customers, AI providers, and people who handle business ethics questions (executive leaders, portfolio or compliance managers, and sustainability departments, for instance).

Checklists can also be helpful for identifying the requirements needed to ensure ethical AI. Checklist items should include questions related to human involvement in AI, such as the end user’s cultural values, how the end user’s current context is evaluated, and situations in which the end user will want AI functionality turned off. Additional checklist items should focus on AI algorithms and boundary conditions, such as how “learn and forget” processes should be monitored to detect fraudulent activities, how a minimum training status can be determined, and to what extent computational results must be reproducible. Checklist items should also consider legal compliance requirements (such as data privacy regulations), how to unveil hidden override directives, and how to assess the potential long-term impact of AI operations. Will humans — or humanity — lose knowledge or capabilities? How can behavioral changes of the AI system be detected (due to hacking activities, for instance)?
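One practical way to work with such questions is to capture them as a structured checklist that reviewers answer per use case, as in the sketch below. The categories paraphrase the questions above, and the sample answers are invented.

```python
# A sketch of how the checklist questions above could be captured as a
# structured, reviewable artifact. Answers here are illustrative examples.
ethics_checklist = [
    {"area": "human involvement",
     "question": "In which cases must a human make the final decision?",
     "answer": "Any rejection of a candidate or credit application."},
    {"area": "human involvement",
     "question": "When will the end user want AI functionality turned off?",
     "answer": "On request, and automatically during declared emergencies."},
    {"area": "algorithms and boundaries",
     "question": "How is the 'learn and forget' process monitored?",
     "answer": "Training data changes are logged and reviewed weekly."},
    {"area": "compliance and long-term impact",
     "question": "How are behavioral changes of the system detected?",
     "answer": "Baseline outputs are compared against a frozen reference model."},
]

unanswered = [item for item in ethics_checklist if not item["answer"]]
print(f"{len(ethics_checklist)} items, {len(unanswered)} still open")
```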

The requirements you gather will help you identify the areas in which you need to operationalize your guiding principles. The next step is to transform the requirements into additions that you make to your existing product standards and corporate policies.

Adding Ethical AI Information to Existing Standards and Policies

Product standards and policies are proven to help ensure quality, including security aspects. Your organization’s definition of ethical AI — and how to monitor it — can be included in implementation and operations standards as well as in security and audit policies to ensure widespread awareness and understanding across the business.

Adding this information to policies and standards yields practical instructions for everyone involved in the AI life cycle. The information must include patterns for human-machine interaction in specific situations, customization parameters to fulfill specific cultural requirements, and procedures to overrule AI (if reasonable and secure).

Monitoring and Auditing AI Activities

Automated controls — for tracking the level of fair-traded goods in a purchasing process or the use of anonymized, human-related data, for instance — can help with monitoring AI activities and supporting audits of the AI system’s behavior by ensuring that procedures are being followed. For example, automated controls could monitor price-finding algorithms for scenarios such as water shortages and apply rules for handling cases in which human health might be affected (to stop any automated price increases, for instance). You can also support audits of the AI system’s behavior by reviewing any available information about why an AI system came to a particular decision. Keep in mind that evaluating a user’s current situation is as important as assessing the potential risks related to alternative actions.
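The sketch below shows what one such automated control might look like for the fair-trade example: it tracks the share of fair-traded spend selected by the purchasing system and raises an alert when the share falls below a policy threshold. The threshold and the purchase data are invented for illustration.

```python
# A sketch of an automated control like the ones described above.
FAIR_TRADE_MINIMUM = 0.40  # hypothetical policy: at least 40% fair-traded spend

purchase_decisions = [
    {"po": "4500001", "amount": 12_000, "fair_trade": True},
    {"po": "4500002", "amount": 30_000, "fair_trade": False},
    {"po": "4500003", "amount": 8_000,  "fair_trade": True},
]

total = sum(p["amount"] for p in purchase_decisions)
fair = sum(p["amount"] for p in purchase_decisions if p["fair_trade"])
ratio = fair / total

if ratio < FAIR_TRADE_MINIMUM:
    print(f"CONTROL ALERT: fair-trade share {ratio:.0%} is below the policy "
          f"minimum of {FAIR_TRADE_MINIMUM:.0%}; route to purchasing manager for review")
else:
    print(f"Fair-trade share {ratio:.0%} within policy")
```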

Another method for auditing the reasonable operability of AI is to turn off the AI algorithms and use the raw data and different calculation methods to generate the results. If the resulting data is different from the AI-based results, something obviously is wrong. Turning off AI functionality and providing more basic data to the user can also be a requirement — humans sometimes do not want to rely solely on a system’s output, but rather want to validate their gut feelings and come to their own decisions. Of course, using “exit doors” — that is, turning off AI algorithms — is not always possible, especially if immediate action is required, such as with high-speed trading. In cases where trend-setting decisions are required — such as adjusting a product portfolio, closing a branch office, or making a decision about investments — the ability to turn off AI algorithms at least for test purposes may help to avoid misuse or identify fraudulent changes by hackers or competitors. The results of such analyses must become part of product standards to ensure that business managers can rely on AI-based proposals.
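The following sketch illustrates this kind of comparison: an AI-based forecast is recomputed with a simple, transparent baseline, and cases that diverge beyond a tolerance are flagged for review. The data, the baseline method, and the tolerance are assumptions for the example.

```python
# A sketch of the "exit door" audit described above: recompute the result with
# a transparent baseline and flag AI predictions that diverge too far from it.
TOLERANCE = 0.15  # flag deviations larger than 15%

cases = [
    # (case id, AI-based forecast, inputs for the baseline calculation)
    ("Q1-demand", 1050.0, [980, 1010, 1005]),
    ("Q2-demand", 1850.0, [990, 1020, 1040]),
]

def baseline_forecast(history):
    return sum(history) / len(history)   # naive moving average

for case_id, ai_value, history in cases:
    base = baseline_forecast(history)
    deviation = abs(ai_value - base) / base
    flag = "REVIEW" if deviation > TOLERANCE else "ok"
    print(f"{case_id}: AI={ai_value:.0f}, baseline={base:.0f}, "
          f"deviation={deviation:.0%} -> {flag}")
```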

In general, the process of AI auditing will be of high interest to insurance agents and lawyers when it comes to liabilities based on a system’s decisions and proposals, but it is also relevant and useful for any organization that is planning on using AI-based software. It is important to be aware of any potential issues related to the use of AI-based systems, since these issues represent risks. It is good practice to be proactive about how to mitigate these risks using additional security-related measures, such as implementing random reviews of AI behaviors, controls, and audits, or specifying human-machine interaction schemas for situations in which someone must make a decision relevant to a person’s ethical attitude.

Summary

When it comes to AI-based software, to ensure the lowest possible level of risk for human life and the highest possible level of integrity in modern enterprises, it is crucial to have security management policies in place. These policies should include standards and directives that not only are based on legal compliance regulations, but also incorporate well-thought-out guidelines for the ethical use of these solutions. Just as effective security management requires collaboration across company lines, implementing a successful digital ethics strategy for AI requires integrating ethical principles across your organizational standards, policies, and behaviors. AI is increasingly influencing people’s lives, and security and digital ethics can help ensure it is a positive impact.

 

 


Guido Wagner

Guido Wagner (guido.wagner@sap.com) is responsible for innovation projects in SAP Design. He focuses on user experience optimization in a digitalized business environment. Preparing for the future of work through sustainable artificial intelligence that improves the way people live is his passion. Share your thoughts with Guido on LinkedIn at https://www.linkedin.com/in/guido-wagner-693965/.







How SAP Shops Can Mitigate the Risks of Accounts Payable Compliance

Accounts payable (AP) automation is supposed to save SAP customers money, but developments in digital tax may well make it a source of penalties and supply chain interruptions. With tax authorities all over the world seeking to increase revenues and close tax gaps, AP is becoming a new target for audits.

Many SAP customers probably think of accounts receivable (AR) processes when they create plans to ensure tax compliance — and rightly so. The supplier of goods or services carries the most important indirect tax responsibilities, such as properly clearing an invoice or accurately charging indirect tax.  

In most countries with indirect taxes — such as value-added tax (VAT), goods and services tax (GST), or sales and use tax — however, a much greater challenge arises in the AP process. In its capacity as a buyer of goods or services, a company must verify that its diverse supplier ecosystem sends compliant invoices, which it can confidently book into its ERP systems and use as a reliable basis for tax reporting. It’s often the buyer that carries the largest risk, as noncompliance, in addition to penalties and fines, can lead to denial of input tax reclaim in certain cases.

Supplier Errors and AP Proliferation

Companies might trust their suppliers to deliver the goods they have ordered, but can they trust them enough to risk losing a significant portion of the invoice value in irrecoverable tax because of supplier errors in the invoicing process? Miscalculation of indirect taxes, improper handling of electronic invoicing (e-invoicing), and non-compliant VAT reporting can hit cash flow and operations in multiple ways.

Complicating the situation further is the diversity and proliferation of AP systems themselves. Most SAP customers employ multiple types of systems, each of which is tuned to deal with a different supplier or category of supplier. Each of those systems may require a different approach to tax compliance, increasing the potential for errors and corrupt data. 

Understanding how to manage AP processes is critical for SAP shops that want to maximize their investment in AP automation and ensure that their AP systems are saving them money, and not costing it.

Indirect Tax Determination

There has never been any doubt about the importance of trading partners determining the right taxing jurisdiction and applying the right tax rate for every item in a supply chain. In an increasingly real-time tax control environment, however, getting invoices right the first time becomes vitally important to keep supply chains moving. When a tax administration needs to approve a company's invoices before that business and its trading partners can proceed with the next step in the transaction, including physically dispatching the goods in many cases, having the wrong tax rate on invoices doesn't just expose the company to fines; it can damage operations substantially.

Most of the attention for sales and use tax falls on the “sales tax” side of the house, as the seller is responsible for accurately assessing tax on goods sold. But AP processes play a big role in indirect tax compliance, and tax on purchases is often low-hanging fruit for auditors. In the US, for instance, states and the IRS continue to digitize efforts at tax collection and enforcement. 

The Wayfair Supreme Court decision on state sales taxes has complicated this process for many suppliers that may not have worried about properly calculating tax in the past. With new economic nexus standards, sellers need to figure out if they are getting it right, while purchasers need to make sure that they are. With the potential for error, regulators will be looking increasingly at not just how much companies charge for sales tax but also how much they pay.

Globally, paying too little in indirect tax can trigger an audit, so AP departments must be able to verify that their suppliers are calculating and assessing tax correctly — and, for those purchases where the supplier may not charge tax, have solid processes and technology in place to accurately self-assess use tax where applicable.

The risk goes beyond audits, however. While paying too little in indirect tax can lead to regulatory trouble, paying too much can dent profitability and negatively impact cash flow. Again, blindly relying on suppliers to determine indirect tax rates correctly is not a sustainable plan. SAP shops need to ensure that their AP systems are tuned to provide clarity and accuracy on tax rates paid as well as on rates charged.

E-Invoicing Compliance

Outside the US, particularly in Latin America and increasingly in Europe, SAP customers are dealing with massive new government initiatives to enforce VAT compliance through e-invoicing. In a growing number of countries, the government steps in directly to validate an invoice before a buyer and seller can complete a transaction.

Again, much of the practical responsibility of compliance in that scenario falls on AR, but the buyer often carries the highest risks. Companies that purchase goods can recoup VAT expenses after transactions are complete, but to do so, they need to prove their VAT deductions are compliant. If an SAP customer can’t substantiate a deduction, the company might not get its money back. Furthermore, if an invoice upon receipt from a supplier turns out not to be compliant in a country that requires real-time clearance, AP can’t accept the invoice and may have to trigger a variety of complex correction processes.

As a result, AP processes in the e-invoicing scenario have a direct effect not only on cash flow but on supply chain efficiency. Again, ensuring both compliance and data purity in AP e-invoicing processes is critical — and each country that mandates e-invoicing has a different set of rules, penalties, and deadlines. The complexity involved is significant.

VAT Reporting

VAT reporting has undergone significant change in recent years and continues to evolve quickly. What was once a largely paper-based process has moved almost entirely online in many countries, and the advent of digital reporting is creating even more complexity. The first natural step from paper to electronic reporting was e-filing: periodic tax returns are submitted on a standardized electronic form, while other income data is filed electronically and matched annually to identify differences from previously filed information.

The move from paper to e-filing strengthened tax authorities' VAT reporting enforcement capabilities, reducing the potential for fraud and supporting more thorough and frequent audits that cast a spotlight on incorrect information. Digital tax reporting takes that concept to another level: companies submit accounting data to support filings in a defined electronic format and on a defined timetable, enabling real-time audits.

A growing number of EU members use the Standard Audit File for Tax (SAF-T) standard for this type of reporting. Still, there are many different approaches with variations in reporting frequency, granularity, scope, and responsibility for data reported to tax administrations. Latin American countries such as Mexico and Brazil use standards of their own, adding to the complexity of the process.

The general trend is for tax administrations to collect transaction-level data with greater frequency, and ideally as close as possible to the actual data interchange — and then to use this information for triangulation and electronic VAT auditing. Governments access additional data and begin to match data across tax types, and potentially across taxpayers and jurisdictions, in real time. The tax authority can then aggregate data and register it in the government’s central database.

With that data, the government can subsequently trigger, substantiate, and prosecute audits; validate deductions; and assess penalties for non-compliance, late registration, and invoicing discrepancies. As such, VAT reporting can have a major and immediate financial impact, potentially leading to penalties for lack of compliance and cash-flow issues due to loss or delay of VAT deductions. AP departments cannot simply rely on their suppliers to provide correct invoicing information; they have a responsibility to ensure their own compliance in reporting payments.

The Importance of Compliant Archiving

Archiving is the element that centralizes original transaction evidence, such as digital invoices, and creates a searchable repository to protect against audits and risk of non-compliance. Compliant archiving has historically been viewed as important for SAP customers to prove compliance in countries where the taxpayer still carries a large part of the burden to prove that transactions that are reported and accounted for have really taken place.

Interestingly, the market is also adopting robust archiving methods in countries that are rapidly moving to “e-assessment,” whereby tax administrations send taxpayers statements, instead of the other way around. In such markets, it becomes even more important to have strong evidence if you want to challenge a payment request that the tax administration has calculated from your real-time transactions. With governments turning to such extreme digital methods to boost tax revenues, compliant archiving will only grow in importance.

Globally, e-invoicing compliance is changing rapidly, with governments increasingly requiring invoices in structured electronic formats for compatibility with real-time clearance. The e-invoice archive should serve as the basis for an SAP customer’s entire e-invoicing compliance strategy. Most countries that require e-invoicing also require an e-archive, so it’s in the SAP customer’s best interest to go beyond simple document storage and pursue an enterprise-wide evidence strategy.

Supporting AP and AR Processes

SAP customers need to cover their bases across the globe. That means integrating disparate AP systems with SAP software so that companies can manage multiple e-invoicing and digital reporting mandates in countries across the world and still maintain full control of tax-relevant AP data.

SAP shops also need to be able to move at the pace of regulatory change as mandates shift and develop, rather than trying to keep up with new regulations through resource-intensive in-house updates. AR processes might get more hype in terms of compliance, but failure to properly address AP compliance also exacerbates the potential for risk of audits and negative financial consequences.

Christiaan Van Der Valk is Vice President of Strategy at Sovos. Elected a World Economic Forum Global Leader for Tomorrow in 2000, Christiaan is an internationally recognized voice on e-business strategy, law, policy, best practice, and commercial issues.




SAP S/4HANA: State of the Market Webinar

The promise of digital transformation is just one driver, alongside SAP's increasing investment in the platform and its looming plans to end ongoing support for ECC.


Introducing ABAP Platform 1809

An On-Premise ABAP Platform Fully Optimized for SAP HANA and Modern Application Development

 

by Karl Kessler, Vice President of Product Management ABAP Platform, SAP SE

 

At the SAP TechEd conferences in the fall of 2018, SAP introduced SAP Cloud Platform ABAP environment, which is a platform-as-a-service (PaaS) offering for ABAP development that went live in September 2018.1 While the announcement of this cloud-based environment has generated a considerable amount of excitement and interest among SAP customers, there are more than 100,000 existing productive installations of the on-premise ABAP environment, and to support these customers going forward, SAP delivered another technology at the same time: version 1809 of the on-premise ABAP platform.

ABAP platform 1809 is the technological foundation for SAP S/4HANA 1809, and these two technologies were shipped together in September 2018, with ABAP platform 1809 as an embedded delivery. ABAP platform 1809 includes an ABAP programming layer that is fully optimized for SAP HANA and for the development and enhancement of modern applications, including support for Internet of Things (IoT) and machine-to-machine (M2M) communication. It also delivers a wide range of valuable improvements to various ABAP development tools to increase efficiency.

This article examines some of the key innovations delivered with ABAP platform 1809 — including improvements to the Eclipse-based ABAP development tools and optimizations that leverage the full functionality of SAP HANA and SAP S/4HANA — and how these innovations help you develop sophisticated applications that meet your digital business needs. Before we take a closer look at these new capabilities, however, we’ll first review the release strategy for the ABAP stack, which involves some changes required to fully support SAP HANA functionality. These changes are important to understand so that you can properly plan for the adaptations that are necessary when moving from traditional SAP NetWeaver and SAP Business Suite deployments.

SAP’s Release Strategy for the ABAP Stack

Up to version 1809, innovations for the ABAP stack have been developed based on one common codeline that has served all existing ABAP-based products. These products include the SAP Business Suite applications (such as SAP ERP) and SAP NetWeaver hubs (such as SAP Gateway) that run on SAP NetWeaver Application Server (SAP NetWeaver AS) ABAP 7.5, which includes the traditional ABAP programming model and software component SAP_ABA. SAP S/4HANA on premise and SAP S/4HANA Cloud are also based on this common codeline, with SAP S/4HANA on premise running on SAP NetWeaver AS ABAP 7.5x (that is, 7.5, 7.51, and 7.52) and its new version of the SAP_ABA software component, and SAP S/4HANA Cloud running on a cloud-based version of the platform.

As of ABAP platform 1809, SAP plans to focus all innovation on SAP S/4HANA. Since SAP NetWeaver AS ABAP must be able to run on all officially supported database platforms, it takes a least-common-denominator approach to database functionality and cannot benefit from the latest innovations and advantages that are specific to SAP HANA. SAP S/4HANA on premise and SAP S/4HANA Cloud, on the other hand, support SAP HANA only and are fully optimized for that database and its features. With the focus on SAP S/4HANA — and with SAP NetWeaver AS ABAP 7.51 and 7.52 adopted only by SAP S/4HANA 1610 and 1709, respectively — the latest enhancement package for SAP Business Suite (enhancement package 8) is based on SAP NetWeaver 7.5.2

So, how does SAP plan to deliver innovations to SAP S/4HANA on premise and SAP S/4HANA Cloud while delivering continuous improvements to the existing installed base of SAP Business Suite customers without disrupting their business? Figure 1 provides an overview of the planned path forward. As you can see, the SAP S/4HANA and SAP Business Suite codelines are separated. On the left is the SAP Business Suite and SAP NetWeaver codeline, which is based on SAP NetWeaver AS ABAP 7.5x. It is intended to deliver improvements for the ongoing operation of SAP Business Suite and SAP NetWeaver deployments — including the SAP NetWeaver hubs, SAP NetWeaver add-ons, and custom code developed by SAP customers and partners — and will include all maintenance efforts, security fixes, and selective downports (of version 4.0 of the OData protocol to older SAP NetWeaver releases, for example). As a result, there is no standalone 7.53 version of SAP NetWeaver AS ABAP.

 

Figure 1 — The release strategy for the ABAP platform going forward

 

In parallel, all innovations for the ABAP stack are placed into a common innovation codeline, shown on the right, that serves SAP S/4HANA on premise, SAP S/4HANA Cloud, and the recently released SAP Cloud Platform ABAP environment. This innovation codeline takes full advantage of the underlying SAP HANA database and data platform, with its capabilities fully exposed on the ABAP platform layer. In each case — SAP S/4HANA, SAP S/4HANA Cloud, and SAP Cloud Platform ABAP environment — the new SAP_ABA layer is used, which contains the extended material code field (which is now 40 characters instead of the 18 characters supported by the classical SAP_ABA layer used by SAP Business Suite).

The platform naming follows the nomenclature used by SAP S/4HANA, meaning that ABAP platform 1809 is the foundation for SAP S/4HANA 1809 on premise, while ABAP platform 1808 is the foundation for the cloud solutions SAP S/4HANA Cloud 1808 and SAP Cloud Platform ABAP environment 1808. New cloud versions (for example, 1811 in November 2018, 1902 in February 2019, 1905 in May 2019, and 1908 in August 2019) are shipped every quarter and are consolidated in the yearly on-premise delivery in September, which will be the 1909 version in September 2019. This new release strategy opens up a completely new way to quickly deliver SAP HANA innovations together with the ABAP stack. In traditional deployments, the back-end version determines the speed of innovations — now ABAP innovations can be deployed in the cloud much faster.

It is important to note that unlike SAP NetWeaver AS ABAP, ABAP platform 1809 is not shipped independently but only indirectly (embedded) with the shipment of SAP S/4HANA, although it can be patched separately. The development capabilities delivered by this platform are compelling reasons to move to this SAP S/4HANA release, which is an easy upgrade from previous on-premise SAP S/4HANA releases (1511, 1610, and 1709). Existing SAP Business Suite implementations can be migrated with the support of SAP tools and methodologies3 — and with maintenance and support for SAP Business Suite set to end in 2025, it is a good idea to consider making this move sooner rather than later.

With a solid understanding of the release strategy, let’s now look at some of the key innovations delivered with ABAP platform 1809. We’ll first examine improvements made to the Eclipse-based ABAP development tools, and then look at optimizations that fully leverage the capabilities of SAP HANA and SAP S/4HANA. This article covers the most prominent examples of the delivered innovations — details on additional improvements are available from the ABAP community site and the release notes for ABAP platform 1809.

Improvements in the Eclipse-Based ABAP Development Tools

ABAP platform 1809 enhances the Eclipse-based ABAP development tools in several ways to help improve developer efficiency. Improvements in the areas of maintaining enhancements, analyzing runtime errors, defining lock objects, and editing transport objects are particularly notable. Let’s take a closer look.

Maintaining Modifications and Enhancements

Prior to ABAP platform 1809, it was not possible to maintain modifications within the Eclipse-based ABAP development tools. To be more precise, you could directly modify a standard SAP program using the source code editor tool in the Eclipse workspace, but that meant losing the ability to track changes with the Modification Assistant, which is not available in the Eclipse-based tools. This approach is not recommended, since without the Modification Assistant, the system has no record of where you changed the standard version, which results in significantly more effort when you need to apply a support package or upgrade your system to a higher release. With ABAP platform 1809, you can maintain modifications directly inside the Eclipse-based tools, although creation and deletion of modifications are not yet supported.

In addition, when implementing enhancements that insert into or replace standard logic (allowing you to add functionality without modifying the standard source code if an enhancement spot is available), developers previously still had to switch to the SAP GUI-based ABAP Editor (transaction SE38) within the ABAP Workbench, even if they started their development work in the Eclipse-based source code editor, because only the SAP GUI-based tool could fully handle enhancements.

With ABAP platform 1809, the Eclipse workspace includes a native enhancement implementation editor tool that is fully capable of displaying and changing enhancements directly within the workspace. You can now freely edit the code and implement enhancements to standard SAP programs without leaving the ABAP development tools environment. During support package implementation, the system will help you identify the place where the enhancement was made and support you in readjusting the enhancement if necessary. Creation and deletion of enhancement implementations are not yet supported.

Figure 2 shows the enhancement implementation editor in the Eclipse workspace. A pop-up window shows an enhancement implementation consisting of several lines of ABAP code — including regular ABAP data declarations, a SELECT statement, and a READ TABLE operation — that is located at an enhancement spot defined by the application.

 

Figure 2 — The enhancement implementation editor enables you to display and change enhancements directly within the Eclipse workspace
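
For orientation, the following is a minimal, purely illustrative sketch of the kind of enhancement implementation shown in Figure 2. The enhancement name and the data objects are assumptions; the ENHANCEMENT block itself sits at an enhancement spot provided by the standard application.

" Illustrative sketch only: the enhancement name zei_check_flights is assumed
ENHANCEMENT 1 zei_check_flights.    " implementation at a standard enhancement spot
  " regular ABAP data declarations
  DATA lt_flights TYPE STANDARD TABLE OF sflight.
  DATA ls_flight  TYPE sflight.

  " read the flights for the current carrier
  SELECT * FROM sflight
    INTO TABLE lt_flights
    WHERE carrid = 'LH'.

  " look up one specific connection
  READ TABLE lt_flights INTO ls_flight WITH KEY connid = '0400'.
ENDENHANCEMENT.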

 

Analyzing Runtime Errors

Every ABAP developer is familiar with the traditional handling of ABAP runtime errors. When an ABAP runtime error occurs — and there are often numerous error conditions in a development or productive ABAP runtime environment — the system shows the location in the source code where the error occurred that caused the ABAP program to abort. The system allows you to switch to the debugging environment, analyze the call stack from when the program terminated abnormally, and inspect the local variables, the global tables, structures and fields, and other valuable information.

Previously, these analysis tools were only available in the SAP GUI-based environment. The Eclipse-based tools would implicitly switch to SAP GUI when a runtime error occurred that prevented the ABAP program from continuing its execution. As of ABAP platform 1809, the Eclipse-based tools can directly handle this situation without switching to SAP GUI. Remember that in SAP Cloud Platform ABAP environment, where the same Eclipse-based ABAP development tools are used, SAP GUI is not supported at all, meaning that using the runtime analysis capabilities of the Eclipse workspace is the only way to gain useful information for error analysis. However, this is also helpful on premise, since the error analysis is presented in the Eclipse workspace directly rather than in an embedded SAP GUI screen.

Figure 3 shows an example runtime error analysis in the Eclipse workspace. The identified error is an occurrence of a division by zero that causes program termination if not caught on a higher-order level. The information displayed includes details of the error analysis and, most important, the location in the source code where the error occurred. It also includes details about the call stack (such as the active function modules and method invocations), which are displayed at the bottom.

 

Figure 3 — As of ABAP platform 1809, the Eclipse-based ABAP development tools can handle runtime analysis within the Eclipse workspace
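
As a minimal illustration of the error class analyzed in Figure 3, the following sketch shows an integer division by zero; without the TRY/CATCH block, the statement terminates the program with a runtime error that can now be inspected directly in the Eclipse workspace. The variable names are illustrative.

" Illustrative sketch: catching the division-by-zero error analyzed in Figure 3
DATA(lv_divisor) = 0.
TRY.
    DATA(lv_result) = 100 / lv_divisor.   " raises CX_SY_ZERODIVIDE when lv_divisor is 0
  CATCH cx_sy_zerodivide.
    " handle the error here instead of letting the program terminate
ENDTRY.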

 

Defining Lock Objects

From its very beginning, the ABAP stack was optimized to handle large volumes of transactional workload. With the in-memory technology of the underlying SAP HANA database platform, this can easily be achieved for traditional reporting as well as for transactional access, such as SQL insert, update, and delete operations.

In a transactional environment with multiple users, database records must be locked before updates are written back to the database. In a simple client-server scenario, each parallel user could use database locks to achieve isolation when shared resources are accessed. Using database locks in interactive environments such as SAP Business Suite and SAP S/4HANA is more problematic. In this case, a database lock would have to be held during several steps and screen changes to avoid parallel updates from other users logged on to the system. However, the ABAP stack prevents database locks from being held over several screen changes, since a work process that executes an ABAP program execution request will free any locks by issuing a commit when the screen is sent to the dialog user for further input (for example, a pop-up that the user must answer). This means that database locks cannot be used in the ABAP stack to provide exclusive access to database resources (to update database rows or fields, for example).

For this reason, ABAP developers use logical lock objects, accessed through enqueue operations, to synchronize access to shared database tables. If a resource is free, an enqueue operation creates a logical lock in the central lock table of the SAP system. If the resource is already locked, the enqueue operation fails and the dialog user can try again later. The dialog user is not put on hold, which avoids deadlocks caused by user sessions waiting for each other to close a transaction. After the update operation, the enqueues are removed from the shared objects to allow access by other users again. You can define a lock object for one particular database table, or you can define combined lock objects that span multiple database table entries.

Previously, enqueue lock objects could not be defined in the Eclipse-based development tools — you had to define them in the ABAP Dictionary via transaction SE11. With ABAP platform 1809, lock objects can be defined from within the Eclipse workspace using a native Eclipse-based editor tool. This allows you to stay in the ABAP development tools environment without the need to switch to SAP GUI and back again. When you activate a lock object successfully, an enqueue function module is created that you can use inside your ABAP program logic to obtain a logical lock.

Figure 4 shows an example lock object for the well-known flight data model. On the left, it lists the tables the lock objects depend on, and on the right, it shows the generated lock function modules that an application programmer needs to call inside the application logic. In addition, it lists the arguments (key fields) that are passed when locking one or several rows.

 

Figure 4 — ABAP platform 1809 enables developers to define lock objects from within the Eclipse workspace using a native Eclipse-based editor tool
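
To illustrate how the generated function modules are used, here is a minimal sketch that assumes a lock object named EZSFLIGHT has been activated for the SFLIGHT table, generating the function modules ENQUEUE_EZSFLIGHT and DEQUEUE_EZSFLIGHT; the object and parameter names follow the standard naming pattern but are illustrative.

" Illustrative sketch: the lock object name EZSFLIGHT and its generated
" function modules are assumptions based on the standard naming pattern
CALL FUNCTION 'ENQUEUE_EZSFLIGHT'
  EXPORTING
    mode_sflight   = 'E'            " exclusive lock
    carrid         = 'LH'
    connid         = '0400'
  EXCEPTIONS
    foreign_lock   = 1              " another user already holds the lock
    system_failure = 2
    OTHERS         = 3.
IF sy-subrc <> 0.
  " resource is locked by another user; ask the dialog user to try again later
  RETURN.
ENDIF.

" ... perform the update on the locked rows ...

CALL FUNCTION 'DEQUEUE_EZSFLIGHT'
  EXPORTING
    mode_sflight = 'E'
    carrid       = 'LH'
    connid       = '0400'.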

 

Editing Transport Objects

When you start to edit your development objects in the Eclipse workspace, a transport request pop-up will ask you to assign the changed object to an already-existing transport request or to create a new transport request on the fly. However, sometimes developers must change an implementation several times before transporting their development requests to the consolidation system. In addition, developers often want to manually remove development artifacts from the transport request so that unneeded artifacts are not transported to consolidation, the productive environment remains clean, and there are no name clashes with other developers.

In these cases, it is useful to be able to manually edit transport requests. It is also helpful if, after longer periods of development, a developer can create a mass transport that comprises all development artifacts of a particular project and submit a complete transport, to ensure that a consistent target state is sent to consolidation. Again, this requires the ability to manually edit transport requests.

Prior to ABAP platform 1809, transport objects could not be edited in the Eclipse-based development environment — you would have to switch to the Workbench Organizer (transaction SE09). With ABAP platform 1809, you can view and edit the details of a transport request using the Eclipse-based transport request editor tool. With this tool, all changed objects are collected in a list where each list element is of a certain type, such as an ABAP class, an ABAP program, a table definition, or a lock object.

Figure 5 shows an example transport request in the editor tool. The header information of the transport request is displayed in the upper half of the screen, and the lower half shows the list of changed objects contained in the transport request. Using this editor tool, an ABAP developer can remove certain elements or add missing elements manually, but in most cases the system adds the items automatically whenever a developer creates or changes a development object.

 

Figure 5 — The transport request editor tool enables developers to view and edit the details of a transport request from within the Eclipse workspace

 

Optimizations for SAP HANA and SAP S/4HANA

In addition to delivering improvements to the Eclipse-based ABAP development tools, ABAP platform 1809 includes optimizations that help SAP customers take full advantage of the underlying capabilities of SAP HANA and the features of SAP S/4HANA. Here we look at three key optimizations: the ability to use SAP HANA hierarchy and abstract entities in core data services, support for adapting custom code for SAP S/4HANA, and enabling M2M communication.

Using SAP HANA Hierarchy and Abstract Entities in Core Data Services

With ABAP platform 1809, for the first time, the ABAP stack can focus exclusively on SAP HANA without having to ensure that corresponding features are available on other database platforms as well. One key feature that is supported only by SAP HANA is hierarchies, which are not well handled on the SQL level since you need to model them as tables. In principle, you could define a two-column table in SQL, with a column for parent and a column for child, but this approach becomes unwieldy if you need to compute the descendants (such as grandchildren) — in this case, you would have to define join operations of the table with itself, known as self-joins, which are not easy to handle.

In traditional ABAP, these types of hierarchies were typically handled on the application server level using internal tables. With ABAP platform 1809, you can use the SAP HANA hierarchy functions in-memory on the database level using the new keyword DEFINE HIERARCHY (see Figure 6). Supported by core data services, you can easily navigate to subordinate levels and access nodes and branches of a given hierarchy without introducing internal tables on the ABAP language level. The new hierarchy functions can also be used in ABAP SQL, which is the name for Open SQL as of ABAP platform 1809; the SQL layer no longer focuses on supporting every database platform, but instead on fully optimizing the ABAP layer on top of SAP HANA.

 

define hierarchy DEMO_CDS_PARENT_CHILD 
  with parameters 
    p_id : abap.char(2) 
  as parent child hierarchy( 
    source 
      DEMO_CDS_PARENT_CHILD_SOURCE 
      child to parent association _relat 
      start where 
        id = :p_id 
      siblings order by 
        parent 
      multiple parents allowed 
    ) 
    { 
      id, 
      parent, 
      $node.hierarchy_rank        as h_rank, 
      $node.hierarchy_tree_size   as h_tree_size, 
      $node.hierarchy_parent_rank as h_parent_rank, 
      $node.hierarchy_level       as h_level, 
      $node.hierarchy_is_cycle    as h_is_cycle, 
      $node.hierarchy_is_orphan   as h_is_orphan, 
      $node.node_id               as h_node_id, 
      $node.parent_id             as h_parent_id 
    }

Figure 6 — Defining a hierarchy with core data services
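
The hierarchy defined in Figure 6 can then be consumed like any other CDS entity with parameters. The following is a minimal sketch of such an ABAP SQL query; the selected fields correspond to the aliases in Figure 6, and the parameter value and target variable are illustrative.

" Illustrative sketch: reading the CDS hierarchy from Figure 6 in ABAP SQL
SELECT id, parent, h_level, h_rank
  FROM demo_cds_parent_child( p_id = 'A' )
  ORDER BY h_rank
  INTO TABLE @DATA(lt_hierarchy).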

 

Another important improvement is the definition of abstract entities in core data services using the new keyword DEFINE ABSTRACT ENTITY. Abstract entities, like database entities, have a structure consisting of fields, but they are not stored in a table. They correspond to structure definitions in the traditional ABAP Dictionary and can be used in metadata extensions to define the user interface, similar to how structures are used in Dynpro and Web Dynpro development.
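
As a point of reference, a minimal sketch of such a definition might look as follows; the entity and field names are purely illustrative.

// Illustrative sketch: an abstract entity with assumed name and fields
define abstract entity zdemo_booking_params
{
  carrier_id : abap.char(3);
  discount   : abap.dec(5,2);
  remark     : abap.char(60);
}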

Adapting Custom Code for SAP S/4HANA

Making the transition from SAP Business Suite to SAP S/4HANA means adapting your custom code for SAP S/4HANA,4 which uses a different data model than SAP Business Suite. ABAP Test Cockpit is the central entry point for beginning this transformation and shepherding you through the adaptation. It has long been a strong analysis tool for identifying incompatibilities between custom SAP Business Suite code and SAP S/4HANA; previously, however, you had to fix the identified errors manually when making the conversion to SAP S/4HANA.

With ABAP platform 1809, ABAP Test Cockpit offers quick fixes for commonly identified issues that you can apply with a single mouse click. Figure 7 shows an example that often occurs in the SAP HANA context: a binary search operation in ABAP requires a sorted result, but a standard SQL SELECT does not guarantee any sort order. In this case, you need to append the addition ORDER BY PRIMARY KEY, which the quick fix functionality does for you. This approach saves a significant amount of development time and also increases code quality.

 

Figure 7 — The quick fix functionality for commonly identified issues, such as sorting, included with ABAP platform 1809 saves significant time and effort when adapting custom code for SAP S/4HANA
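
The following minimal sketch, with illustrative table and variable names, shows the corrected pattern described above: the SELECT is given the ORDER BY PRIMARY KEY addition so that the subsequent binary search operates on a predictably sorted internal table.

" Illustrative sketch of the corrected pattern from Figure 7
SELECT * FROM sflight
  WHERE carrid = 'LH'
  ORDER BY PRIMARY KEY              " addition applied by the quick fix
  INTO TABLE @DATA(lt_flights).

READ TABLE lt_flights INTO DATA(ls_flight)
  WITH KEY connid = '0400' fldate = '20190101'
  BINARY SEARCH.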

 

In addition, ABAP platform 1809 introduces highly graphical SAP Fiori-based reporting applications, such as the one shown in Figure 8, to support you with scoping and analysis capabilities and to guide you through the transition. Objects that are out of scope are not imported, which helps get rid of unused code.

 

Figure 8 — ABAP platform 1809 provides graphical SAP Fiori-based reporting and analysis applications to support the custom code adaptation process

 

Enabling M2M Communication

Due to the growing popularity of IoT scenarios, the Message Queuing Telemetry Transport (MQTT) protocol, which was designed to support M2M communication in industrial IoT scenarios, has generated a lot of interest for enabling real-time data updates and the ability to react to events on the fly. MQTT supports this capability by offering a lightweight publish-and-subscribe infrastructure where clients can send and receive messages through a message broker.

With ABAP platform 1809, the ABAP stack can act as an MQTT client that can publish messages and receive events, similar to the ABAP channels infrastructure. Figure 9 shows the implementation of an MQTT client that connects to a public MQTT broker and publishes a message to that broker.

In this way, MQTT can be used to extend SAP S/4HANA and SAP Cloud Platform in an event-driven way, taking advantage of the IoT and machine learning capabilities included with SAP S/4HANA, and enabling real-time analytics and responsiveness with the speed provided by SAP HANA.

 

METHOD constructor.
  TRY.
      " create the MQTT client for the given broker URL
      cl_mqtt_client_manager=>create_by_url(
        EXPORTING
          i_url           = 'ws://broker.hivemq.com:8000/mqtt'
          i_event_handler = me
        RECEIVING
          r_client        = mo_mqtt_client ).

      " establish the connection
      mo_mqtt_client->connect( ).
    CATCH cx_mqtt_error.
      " to do: error handling, e.g. write error log!
  ENDTRY.
ENDMETHOD.

METHOD publish.
  TRY.
      " create message with specific quality of service (QoS)
      DATA(lo_mqtt_message) = cl_mqtt_message=>create( ).
      lo_mqtt_message->set_qos( if_mqtt_types=>qos-at_least_once ).
      lo_mqtt_message->set_text( iv_message ).

      " publish message to topic
      mo_mqtt_client->publish(
        EXPORTING i_topic_name = iv_topic
                  i_message    = lo_mqtt_message ).
    CATCH cx_mqtt_error.
      " to do: error handling, e.g. write error log!
  ENDTRY.
ENDMETHOD.

Figure 9 — Implementing an MQTT client that connects to a public MQTT broker and publishes a message to that broker

 

Summary

While the cloud in general — along with SAP Cloud Platform ABAP environment — garners significant attention, a large number of on-premise ABAP-based productive systems are still running in SAP customer landscapes to support SAP Business Suite and SAP S/4HANA on premise. ABAP platform 1809 is intended to support these customers going forward.

ABAP platform 1809 is a solid foundation for SAP S/4HANA and SAP S/4HANA Cloud, and takes full advantage of SAP HANA functionality. With the optimizations and the improved Eclipse-based ABAP development tools included with ABAP platform 1809, SAP customers and partners can develop custom applications and enhance standard SAP code to position themselves for the digital future.

 

1 For an in-depth look at SAP Cloud Platform ABAP environment, see the SAPinsider article “Take Your ABAP Skills to the Cloud” (Issue 3 2018) available at SAPinsiderOnline.com. [back]

2 For a detailed examination of the SAP NetWeaver release strategy, see the SAPinsider article “Planning Your SAP NetWeaver Upgrade Strategy” (Issue 1 2018) available at SAPinsiderOnline.com. [back]

3 Learn more about converting from SAP Business Suite to SAP S/4HANA in the SAPinsider articles “Making the Move to SAP S/4HANA” (Issue 1 2017) and “A Simplified Way to Bring Your Custom Code to SAP S/4HANA” (Issue 2 2018) available at SAPinsiderOnline.com. [back]

4 For a detailed look at this process, see the SAPinsider article “A Simplified Way to Bring Your Custom Code to SAP S/4HANA” (Issue 2 2018) available at SAPinsiderOnline.com. [back]

 

Karl Kessler

Karl Kessler (karl.kessler@sap.com) joined SAP SE in 1992. He is the Vice President of Product Management ABAP Platform — which includes SAP NetWeaver Application Server, the ABAP Workbench, the Eclipse-based ABAP development tools, and SAP Cloud Platform ABAP environment — and is responsible for all rollout activities.