SAP Business Data Cloud, Datasphere and Microsoft Fabric

Ulrich Christ

Episode #279

Introduction

In episode 279 of our SAP on Azure video podcast we talk about SAP Business Data Cloud, Datasphere and Microsoft Fabric.

With all the excitement about SAP Business Data Cloud, I recently searched for Microsoft Fabric and SAP integration and found several older blog posts from Ulrich Christ, with 90,000, 115,000 and almost 200,000 views. So there is definitely an interest in integrating SAP data with Fabric. Because of that I am really happy to have Ulrich with us again to provide an update on the latest SAP integration with SAP Business Data Cloud, Datasphere and Fabric.

Want to register for the planned preview of SAP Business Data Cloud Connect for Microsoft Fabric? Sign up here: http://aka.ms/SAPinFabricPrpr

Find all the links mentioned here: https://www.saponazurepodcast.de/episode279

Reach out to us with any feedback or questions.

#Microsoft #SAP #Azure #SAPonAzure #Fabric #Datasphere #OneLake

Summary created by AI

  • Overview of SAP Data Integration Approaches in Microsoft Fabric:
  • Ulrich provided Holger and Goran with a comprehensive overview of the various approaches for integrating SAP data into Microsoft Fabric, detailing shortcut, mirroring, and copy job methods, and explaining their use cases and technical distinctions.
    • Shortcut Integration Method: Ulrich explained that the shortcut method allows users to connect to external data sources, such as AWS S3 or Azure Data Lake, and access the data in OneLake without physically moving it. This lightweight approach is suitable when the data is already in a data lake and offers immediate access, though some internal optimizations are not available.
    • Cross-Cloud Mirroring: Ulrich described cross-cloud mirroring as a turnkey solution that replicates data from a source database into OneLake, creating a mirror database. This method involves configuring a connection and selecting tables to replicate, with the engine keeping the replica up to date in near real time. The mirrored data benefits from OneLake’s optimized storage format, such as V-order Parquet.
    • Copy Job for Large-Scale Data Movement: Ulrich outlined the copy job approach, which is designed for petabyte-scale data movement and offers extensive configuration options, including table mapping and filtering. This method provides more control over data movement and can be orchestrated within pipelines for complex scenarios.
    • Applicability to SAP Data: Ulrich emphasized that all these integration approaches—shortcuts, mirroring, and copy job—are applicable to SAP data, allowing users to select the most appropriate method based on their specific requirements and use cases.
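To make the shortcut approach more concrete, here is a minimal sketch of the request body for Fabric's Shortcuts REST API (`POST /v1/workspaces/{workspaceId}/items/{lakehouseId}/shortcuts`), which attaches an ADLS Gen2 location to a lakehouse without moving the data. The account URL, container name and connection GUID below are placeholder values for illustration, not details from the episode:

```python
import json

def build_adls_shortcut(name: str, account_url: str, container: str, connection_id: str) -> dict:
    """Build the request body for the Fabric Shortcuts REST API
    (POST /v1/workspaces/{workspaceId}/items/{lakehouseId}/shortcuts),
    targeting an ADLS Gen2 location."""
    return {
        "path": "Files",  # where the shortcut appears inside the lakehouse
        "name": name,
        "target": {
            "adlsGen2": {
                "location": account_url,        # e.g. https://<account>.dfs.core.windows.net
                "subpath": f"/{container}",
                "connectionId": connection_id,  # GUID of a Fabric cloud connection
            }
        },
    }

# Hypothetical values for illustration only; POST this body with a bearer token
# against the workspace/lakehouse you want the shortcut to live in.
body = build_adls_shortcut("sap-landing", "https://contoso.dfs.core.windows.net",
                           "datasphere-outbound", "00000000-0000-0000-0000-000000000000")
print(json.dumps(body, indent=2))
```

Because a shortcut is only a pointer, deleting it later does not touch the underlying files in the source storage account.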
  • SAP Connectivity Options and Partner Ecosystem in Fabric:
  • Ulrich discussed with Holger and Goran the built-in SAP connectors in Fabric, the integration of partner solutions, and the alignment with SAP’s own integration tools, highlighting the maturity and adoption of these options.
    • Built-In SAP Connectors: Ulrich explained that Microsoft Fabric includes built-in SAP connectors, previously available in Azure Data Factory, now integrated for out-of-the-box use. These connectors support a wide range of SAP data integration scenarios and have seen significant adoption, with thousands of customers using them at scale.
    • Partner Ecosystem and Open Mirroring: Ulrich described how Microsoft collaborates with partners who offer specialized SAP data integration solutions. These partner tools can ingest data into Fabric’s landing zones, and the open mirroring feature allows third-party tools to participate in the end-to-end change data capture process.
    • Alignment with SAP’s Integration Tools: Ulrich highlighted that Microsoft also works closely with SAP to provide integration options that align with SAP’s data strategy, particularly through SAP Business Data Cloud and SAP Datasphere, ensuring best practices and recommended approaches are available to customers.
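For context on how partner tools plug into open mirroring: Fabric expects each mirrored table folder in the landing zone to contain a `_metadata.json` that names the key columns, with change files alongside it carrying a `__rowMarker__` column (0 = insert, 1 = update, 2 = delete). A minimal sketch, using a hypothetical SAP table folder as the example:

```python
import json
import os
import tempfile

def write_open_mirroring_metadata(table_dir: str, key_columns: list) -> str:
    """Create the _metadata.json that Fabric open mirroring expects in each
    table folder of the mirrored database's landing zone. Change files dropped
    next to it carry a __rowMarker__ column (0=insert, 1=update, 2=delete)."""
    os.makedirs(table_dir, exist_ok=True)
    path = os.path.join(table_dir, "_metadata.json")
    with open(path, "w", encoding="utf-8") as f:
        json.dump({"keyColumns": key_columns}, f)
    return path

# Hypothetical landing-zone layout for an SAP table extracted by a partner tool:
landing = os.path.join(tempfile.mkdtemp(), "MARA")  # material master, for illustration
meta = write_open_mirroring_metadata(landing, ["MATNR"])
print(open(meta).read())
```

A third-party extractor that writes its deltas in this shape lets Fabric's mirroring engine take over the rest of the change data capture pipeline.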
  • SAP Business Data Cloud Connect and Zero Copy Sharing:
  • Ulrich presented to Holger and Goran the upcoming SAP Business Data Cloud Connect integration for Microsoft Fabric, detailing its zero copy sharing capabilities, harmonized data models, and the planned preview and general availability timeline.
    • Zero Copy Sharing Concept: Ulrich explained that SAP Business Data Cloud Connect enables zero copy sharing, allowing Fabric to access SAP data in Business Data Cloud without physically moving it. This shortcut-like approach provides direct access to harmonized SAP data products managed by SAP.
    • Harmonized Data Models: Ulrich described how SAP Business Data Cloud creates harmonized data models across various SAP applications, solving the challenge of integrating disparate data sources such as S/4HANA and SuccessFactors. This harmonization simplifies analytics and reporting for customers with multiple SAP systems.
    • Integration Timeline and Preview Registration: Ulrich announced that the preview for SAP Business Data Cloud Connect is planned for the first half of the calendar year, with general availability expected later in the year. He encouraged interested customers to register for the preview to gain early access.
    • Use Cases and Limitations: Ulrich noted that while zero copy sharing is valuable for scenarios focused on SAP data, it does not eliminate the need for data movement in cases where data transformation is required. The approach is particularly beneficial for customers leveraging multiple SAP cloud applications.
  • Mirroring and Copy Job Integration with SAP Datasphere:
  • Ulrich demonstrated to Holger and Goran the mirroring and copy job integration between SAP Datasphere and Microsoft Fabric, explaining the technical setup, data flow, and the flexibility offered by these methods for SAP data replication and transformation.
    • Mirroring via SAP Datasphere: Ulrich detailed how SAP Datasphere’s Premium Outbound Integration can replicate data from SAP sources into Azure Data Lake Gen2, which Fabric’s mirroring engine then processes to create a mirrored database in OneLake. This setup supports both initial snapshots and incremental data loads with configurable frequency.
    • Technical Flow and Configuration: Ulrich walked through the process of setting up a replication flow in Datasphere, connecting to SAP source systems, and landing data in Azure Data Lake. On the Fabric side, users create a lakehouse and shortcut to the storage container, then configure a mirrored database with minimal setup.
    • Copy Job for Advanced Control: Ulrich explained that the copy job method provides more granular control over data replication, including table selection, column mapping, and scheduling. This approach supports a wider range of destinations beyond lakehouses, such as SQL databases and Snowflake.
    • Scalability and Adoption: Ulrich reported that both mirroring and copy job integrations are in public preview and have already seen adoption at terabyte scale per month, demonstrating their scalability and effectiveness for large SAP data workloads.
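Once replicated tables land in OneLake as Delta, they are addressable through OneLake's ADLS Gen2-compatible endpoint, which is how notebooks and external Spark clients reach them. A small sketch of the URI pattern, with hypothetical workspace, lakehouse and table names:

```python
def onelake_table_uri(workspace: str, lakehouse: str, table: str) -> str:
    """Build the ABFS URI under which a lakehouse table is reachable via
    OneLake's ADLS Gen2-compatible endpoint (onelake.dfs.fabric.microsoft.com)."""
    return (f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse}.Lakehouse/Tables/{table}")

# Hypothetical names for illustration; inside a Fabric notebook you would then
# read the Delta table with something like:
#   df = spark.read.format("delta").load(uri)
uri = onelake_table_uri("sap-analytics", "sap_mirror", "FinanceDocuments")
print(uri)
```

The same addressing scheme is what makes the replicated SAP data consumable by any tool that speaks the ADLS Gen2 protocol, not just Fabric's own engines.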
  • Business Process Solutions for SAP Data in Fabric:
  • Ulrich introduced Holger and Goran to Business Process Solutions, a set of pre-built data models, dashboards, and AI agents for SAP data in Microsoft Fabric, designed to accelerate time to value and simplify analytics across business processes.
    • Pre-Built Data Models and Dashboards: Ulrich described how Business Process Solutions provide pre-built data models for enterprise business applications, along with ready-made dashboards and reports in Power BI, enabling rapid insights without extensive custom development.
    • End-to-End Data Integration and Transformation: Ulrich explained that these solutions cover the full data flow from integration, through technical transformations such as surrogate key creation and currency conversion, to consumption in analytics and AI scenarios, with built-in security and authorization features.
    • Use Cases and Extensibility: Ulrich noted that Business Process Solutions address common SAP scenarios in finance, sales, and procurement, and can be used out-of-the-box or as learning tools for customers to understand best practices in SAP data transformation and analytics.
    • Upcoming Deep Dive: Holger announced that a future session will feature a colleague providing an in-depth exploration of Business Process Solutions, inviting attendees to stay tuned for more detailed information.