
Organizations that run their Dynamics 365 Business Central or Dynamics NAV workloads on-premises often accumulate large amounts of data from a high volume of transactions. Over time, hosting all of that data in Business Central online can become too expensive.

In addition, many organizations want to glean insights from their ERP data by combining it with external data sources, such as IoT data, or by applying AI and machine learning capabilities. In both cases, having direct access to the information hosted inside Business Central online can be useful.

You can access data in Business Central using REST APIs, of course. However, we’re piloting a way to host the information in an Azure Data Lake outside of Business Central. This capability will allow you to:

  • Maintain a lower-cost alternative data warehouse that syncs to your production data.
  • Run analytics without disrupting Business Central operations.

Our solution opens up some interesting possibilities for organizations using Business Central. If you’d like to try it out, we’re making it available as a proof of concept at aka.ms/bc2adls.

Configuration guidance

There are two parts to the solution:

  • A Business Central extension that pushes incremental data from the database to Azure Storage
  • An Azure Synapse pipeline that reconstructs the full dataset from the increments pushed by the extension

Configure the extension with an Azure Storage account and access credentials, and then determine the tables and fields to export. Each time the export process runs, the Business Central extension places the inserts, updates, and deletions that were made since the last export in a container in the Azure data lake.
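
To make the moving parts concrete, here is a minimal AL sketch of the values you gather before configuring the extension. It is illustrative only: nothing in it calls a bc2adls object, the configuration itself is entered on the extension's setup page, and the object ID, sample values, and the assumption that the credentials come from an Azure AD app registration are placeholders; the instructions at aka.ms/bc2adls describe the exact requirements.

```al
// Illustrative only: collects the values you will enter on the bc2adls setup
// page. Nothing here calls the extension's own objects.
codeunit 50100 "Lake Export Config Notes"
{
    procedure ShowWhatToCollect()
    var
        StorageAccount: Text;
        ContainerName: Text;
        ClientId: Text;
        ClientSecret: Text;
        TablesToExport: List of [Integer];
    begin
        // Azure Storage account and the container that will receive the
        // incremental exports.
        StorageAccount := 'mydatalakestorage';
        ContainerName := 'business-central';

        // Access credentials, assumed here to be an Azure AD app registration
        // that is allowed to write to the storage account. Never hard-code a
        // real secret like this outside of a sketch.
        ClientId := '<application (client) id>';
        ClientSecret := '<client secret>';

        // Tables whose data (and, per table, selected fields) you plan to export.
        TablesToExport.Add(Database::Customer);
        TablesToExport.Add(Database::"Sales Invoice Header");
        TablesToExport.Add(Database::"G/L Entry");

        Message('Ready to configure %1/%2 with %3 tables on the setup page.',
            StorageAccount, ContainerName, TablesToExport.Count());
    end;
}
```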

The Azure Synapse pipeline assimilates updates from multiple exports and creates a replica data warehouse in Common Data Model format. If an entity doesn’t yet exist in the data warehouse, the pipeline creates it.

Use case 1: Periodic exports of Business Central data

The export process can run as a recurring background batch job or as a one-time operation, allowing you to maintain your data in the lake over time. Combine it with a recurring run of the Azure Synapse pipeline to keep the final data in the lake up to date. Tune the recurrence frequency to match the delay you can accept between when changes are made in Business Central and when they show up in the lake.
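
Purely as an illustration of what a recurring background batch job looks like in Business Central, the sketch below uses the standard job queue to run a codeunit on a schedule. It is not part of the proof of concept, which schedules the export in its own way; the object IDs, including the hypothetical export codeunit 50111, are placeholders.

```al
// A sketch of scheduling a recurring background job with the standard
// Business Central job queue. Object IDs are placeholders; 50111 stands in
// for the codeunit that starts the export in your setup.
codeunit 50101 "Schedule Lake Export"
{
    procedure ScheduleHourlyExport()
    var
        JobQueueEntry: Record "Job Queue Entry";
    begin
        JobQueueEntry.Init();
        JobQueueEntry."Object Type to Run" := JobQueueEntry."Object Type to Run"::Codeunit;
        JobQueueEntry."Object ID to Run" := 50111; // hypothetical export codeunit
        JobQueueEntry."Recurring Job" := true;
        // Tune this to the delay you can accept between a change in
        // Business Central and its arrival in the lake.
        JobQueueEntry."No. of Minutes between Runs" := 60;
        JobQueueEntry."Earliest Start Date/Time" := CurrentDateTime();
        Codeunit.Run(Codeunit::"Job Queue - Enqueue", JobQueueEntry);
    end;
}
```

A schedule trigger on the Azure Synapse pipeline covers the other half of the recurrence.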

Use case 2: Business Central data archive

Over time, older information in the Business Central database may need to be removed to make space for new operational data. Our solution duplicates the Business Central information in Azure Storage, while giving you the option to skip exporting deletions. This is especially valuable for auditing and historical tracking scenarios, because records remain available in the lake even after they have been removed from Business Central.

To skip deletions for specific tables, uncomment or edit the relevant lines in the subscriber to the OnAfterOnDatabaseDelete event.
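
The snippet below is not the proof of concept's source; it only illustrates the general shape of such a subscriber, assuming the base application's GlobalTriggerManagement events, placeholder object IDs, and a hypothetical TrackDeletion helper. In bc2adls itself, you adjust the existing subscriber instead.

```al
// Illustration only, not the bc2adls source: the shape of a subscriber that
// ignores deletions for selected tables so those rows stay in the lake.
// Database trigger events fire only for tables enabled through
// OnAfterGetDatabaseTableTriggerSetup, which the real extension takes care of.
codeunit 50102 "Skip Archive Deletions Demo"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::GlobalTriggerManagement, 'OnAfterOnDatabaseDelete', '', false, false)]
    local procedure OnAfterOnDatabaseDelete(RecRef: RecordRef)
    begin
        // Keep ledger entries in the lake even after they are deleted
        // from Business Central.
        if RecRef.Number in [Database::"G/L Entry", Database::"Cust. Ledger Entry"] then
            exit;

        // For every other table, record the deletion so the Synapse pipeline
        // can apply it to the data warehouse.
        TrackDeletion(RecRef);
    end;

    local procedure TrackDeletion(RecRef: RecordRef)
    begin
        // Hypothetical placeholder; the real logic lives in the extension.
    end;
}
```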

Use case 3: Richer analytics

Using the Business Central data that resides in Azure Data Lake Storage as the source for analytics decouples those workloads from the Business Central database, which can then focus on running operational workloads.

Because the data is stored in the Common Data Model format, it can be consumed by a growing number of analytics services, such as Power BI and Azure data services. You can also query your Business Central data in the lake with familiar T-SQL syntax through Azure Synapse serverless SQL pool endpoints.

Next steps

Get more information, try out our solution, and read some instructions at aka.ms/bc2adls.

Information about the proof of concept is also linked from the BCTech repository. We encourage your contributions and suggestions.
