Rapid Reporting: Speeding Up Power BI Reports for Business Central

There’s nothing that puts a damper on productivity quite like slow-running reports. Everyone, at some point or another, has sat around watching a progress bar burn daylight during the workday. Sometimes it’s not a big deal if you can task-switch while waiting for the results; other times your entire process is held captive by a looping refresh or loading animation.

While there are certainly a variety of ways to make a Power BI report refresh faster, this article will focus on two key areas, specifically in the context of Dynamics 365 Business Central: data source streamlining and OData filtering.

Data Source Streamlining

It is important to note that even if your queries can refresh in parallel, the more queries you have, the longer the report will take to refresh. This is particularly true where you have transformations, merges, and functions running against those queries. Therefore, the name of the game is to pare down the query count as much as possible.

To give you an example, here at Catapult we report off of the Job module in Business Central. We have hundreds, if not thousands, of Jobs in the system. Naturally we do close out Jobs when projects end, but some of our reports look at closed Job data as well as live data.

So, if we were to, say, look at Jobs, Job Tasks, Resources, associated Planning Lines, and Job Ledger Entries, the default minimum querying approach might look like this:

Which might refresh just fine, but for our purposes adds a lot of unnecessary bloat and processing overhead.

First off, some of those tables use composite primary keys, meaning that to create relationships you would most likely:

  • Link Job to Job Task using the Job No. (e.g. ABC);
  • Create a ‘key’ as a Custom Column in the Job Task query that combines the “Job No.” and “Job Task No.” (e.g. ABC-01);
  • Create a matching ‘key’ Custom Column in the Job Planning Lines and Job Ledger Entries queries;
  • Create relationships for all of the above in Power BI.

And you’d be pulling in all Jobs, all Job Tasks, all Resources regardless of whether your Planning Lines or Ledger Entries are referencing them.
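As a sketch of that key-building step, the Custom Column in the Job Task query might look like this in Power Query (the query and field names here are assumptions based on typical web service column names):

```m
// In the Job Task query: combine "Job No." and "Job Task No." into one key,
// so "ABC" and "01" become "ABC-01". Query and field names are illustrative.
AddedKey = Table.AddColumn(
    JobTasks,
    "Job Task Key",
    each [Job_No] & "-" & [Job_Task_No],
    type text
)
```

The same expression would then be repeated in the Job Planning Lines and Job Ledger Entries queries so the relationship keys match on both sides.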

Second, for our purposes, all we really want is the Description, Job Posting Group, and Bill-to Customer Name from the Job and just the Description from the Job Task.

At this time we need to think about streamlining our data source. Having each of those entities as different queries is an example of a normalized data structure. I won’t get into normalization, but the tl;dr is that a relational database should have data structured this way in order to minimize redundancy and maximize data integrity.

So how do we avoid querying the entire Job/Job Task pages? In my mind, there are two solutions:

  1. Create a function to retrieve only the relevant Jobs/Job Tasks;
  2. Surface the lookups to the fields I want on the published Power BI page.

While the function approach is programmatically more elegant for retrieving only the relevant Jobs, it does not necessarily speed up our reports; in many cases, it might even increase processing overhead, leading to slower reports:

The simplest solution is to denormalize my intended ‘starting point’ queries, the Planning Lines and Ledger Entries; namely, to surface the relevant Job, Job Task, and Resource information onto the Job Planning Line and Job Ledger Entry pages.

After that, I would probably append the two into a single query, designated by “Source” (i.e. “Planning Line”, “Ledger Entry”), in a similar way that I’d handle a “Budget vs. Actuals” report.
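As a sketch of that append step (assuming the two queries are named JobPlanningLines and JobLedgerEntries), the Advanced Editor code might look like:

```m
// Tag each row with where it came from, then stack the two queries
// into a single fact table. The query names are assumptions.
let
    PlanningTagged = Table.AddColumn(JobPlanningLines, "Source", each "Planning Line", type text),
    LedgerTagged   = Table.AddColumn(JobLedgerEntries, "Source", each "Ledger Entry", type text),
    Combined       = Table.Combine({PlanningTagged, LedgerTagged})
in
    Combined
```

The added “Source” column then lets a single visual slice between planned and actual figures, just as it would in a “Budget vs. Actuals” report.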

To surface those fields, I (or my Microsoft Partner) can create two custom pages, adding new fields such as “PBI_Job_Description”, “PBI_Job_Task_Description”, and so on.

You can then publish these pages and consume them in Power BI over web services:
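A minimal consuming query might look like the following (the tenant ID and web service name below are placeholders; use the OData V4 URL from your own Web Services page):

```m
// Pull the published page into Power BI over OData.
// <tenant-id> and PBI_JobPlanningLines are placeholders.
let
    Source = OData.Feed(
        "https://api.businesscentral.dynamics.com/v2.0/<tenant-id>/Sandbox/ODataV4/"
            & "Company('CRONUS%20Canada%2C%20Inc.')/PBI_JobPlanningLines",
        null,
        [Implementation = "2.0"]
    )
in
    Source
```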

And then you can proceed to report using all the fields you just pulled in.

OData Filtering

When people think of filtering data in Power BI, they usually think of the Excel-like column filters, such as the below:

The one thing to note is that this filtering happens after the data has already been pulled from Business Central, so you still spend the full time downloading all the unfiltered data on every refresh.

But the great thing about objects published through web services is that they support the OData protocol’s conventions for filtering at the source itself, so that your dataset comes in “pre-filtered”.

Using the General Ledger Entries (page 20), for example, against a demo instance, I see that if I pull the page into Power BI as is, using the OData URL on the Web Services page:

I retrieve a whole lotta data:

But I can append OData URL filters to my base URL (http://docs.oasis-open.org/odata/odata/v4.01/odata-v4.01-part1-protocol.html#_Toc31358947).

For example, I can take the base OData URL that Business Central provides:

https://api.businesscentral.dynamics.com/v2.0/74f755a1-3473-4c24-9e5b-9f23a968211e/Sandbox/ODataV4/Company('CRONUS%20Canada%2C%20Inc.')/PBI_GeneralLedgerEntries

And append a filter to only see G/L Account 64450:

https://api.businesscentral.dynamics.com/v2.0/74f755a1-3473-4c24-9e5b-9f23a968211e/Sandbox/ODataV4/Company('CRONUS%20Canada%2C%20Inc.')/PBI_GeneralLedgerEntries?$filter=G_L_Account_No eq '64450'

Which returns only 17 rows of data:

And I can stack my filters, too. Let’s say I only care about the 2020 transactions to that same G/L Account:

https://api.businesscentral.dynamics.com/v2.0/74f755a1-3473-4c24-9e5b-9f23a968211e/Sandbox/ODataV4/Company('CRONUS%20Canada%2C%20Inc.')/PBI_GeneralLedgerEntries?$filter=G_L_Account_No eq '64450' and Posting_Date ge 2020-01-01 and Posting_Date le 2020-12-31

 I now only see 5 rows of data:

But these are hard-coded filters, you may protest. Well, you have options:

Automate with Formula

For a report that is dynamically driven off the current day (say, a report that is always viewed Year To Date), you can use simple M-language formulas (https://docs.microsoft.com/en-us/powerquery-m/) in the query code (the “Advanced Editor” in Power Query) to create variables called “FromDate” and “ToDate”:
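As a sketch, those two variables might be defined like this inside the query’s `let` block:

```m
// Today's date, fixed at the start of the refresh.
Today    = Date.From(DateTime.FixedLocalNow()),
// Format both dates as YYYY-MM-DD strings for the OData filter.
FromDate = Date.ToText(Date.StartOfYear(Today), "yyyy-MM-dd"),
ToDate   = Date.ToText(Today, "yyyy-MM-dd"),
```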

In this example, I’m forcing the dates into the YYYY-MM-DD format that I need. FromDate uses the “StartOfYear” Date function to retrieve the beginning of the year, and the “FixedLocalNow” DateTime function returns the current local date and time.

So, at the time of writing this post, the formulas yield January 1st of this year and February 26th respectively:

Now all you need to do is wrap these values in the filtering syntax (what I like to do is build the filter string using a variable) and then set it as the source URL:
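As a sketch of that wrapping step, assuming a BaseUrl variable holding the OData URL from the Web Services page, plus the FromDate and ToDate variables from earlier:

```m
// Build the $filter clause from the date variables, then use the
// combined URL as the query source. BaseUrl is assumed to hold the
// web service URL from the Web Services page.
FilterString = "?$filter=Posting_Date ge " & FromDate
    & " and Posting_Date le " & ToDate,
Source = OData.Feed(BaseUrl & FilterString, null, [Implementation = "2.0"])
```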

And then I get the filtered data in my report. Truth be told, I’m simplifying the steps that I personally go through for this. My own reports typically have the Business Central tenant ID parameterized, as well as the environment type (Sandbox/Production/etc.) and the web service name, but that’s a matter of personal coding preference. (Note that there are limitations to what you can parameterize when publishing the report to PowerBI.com, but OData filters won’t be an issue.)

Automate with Parameters

You can also set the date dynamically and allow your end user to set the report date before refreshing (i.e. an “as at” date). You can do this by first creating a parameter:

And then using the parameter value in your formula (similar to the previous step):
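Assuming the parameter is named AsAtDate and is of type Date (the name is hypothetical), the ToDate line simply swaps in the parameter value:

```m
// "AsAtDate" is a Power Query parameter of type Date; the name is an assumption.
ToDate = Date.ToText(AsAtDate, "yyyy-MM-dd"),
```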

Then when a user goes to the report, they can Edit Parameters:

And refresh the report. Note that (at the time of this post) changing the parameters is only available to end users in Power BI Desktop, not in cloud/published reports. You can, as an administrator, modify the parameters of the report when refreshing the Dataset:

But understandably that’s not quite the same thing as the above. Hopefully someday soon Microsoft will implement run-time parameter setting in the cloud. Here’s hoping!

Closing the Loop

While not every report needs to run at lightning speed to be useful – in fact, many Power BI reports are scheduled to refresh after-hours daily rather than in real time – there are many use cases where you might actually be refreshing a report frequently (e.g. around month- or year-end), which is when the refresh time becomes apparent. This holds particularly true for consolidated reports in Excel using Power Query.

Regardless of your refreshing frequency, I think the pursuit of efficient and streamlined reports is always a worthy endeavour and with a little bit of thought and design, you can say goodbye to the days of staring at a dreadfully slow progress bar or mesmerizing spinning refresh wheel!

Got questions about Microsoft Dynamics 365 Business Central or Power BI? Schedule a demo and one of our experts will walk you through the solution.
