Measuring Power BI Report Page Load Times

If you’re performance tuning a Power BI report, the most important thing you need to measure – and the thing your users certainly care about most – is how long it takes for a report page to load. Yet this isn’t something that is available anywhere in Power BI Desktop or in the Service (though you can use browser dev tools to do this), and developers often concentrate on tuning just the individual DAX queries generated by the report instead. Usually that’s all you need to do, but running multiple DAX queries concurrently can affect the performance of each one, and there are other factors (for example, geocoding in map visuals or displaying images) that affect report performance, so if you do not look at overall page render times you might miss them. In this post I’ll show you how you can measure report page load times, and the times taken for other forms of report interaction, using Performance Analyzer in the Service and Power Query.

Consider the following series of interactions with a published Power BI report:

The report itself isn’t really that important – just know that there are a series of interactions with a slowish report while Performance Analyzer is running. Here’s what Performance Analyzer shows by the end of these interactions:

Here’s a list of the interactions captured:

  • I changed from a blank report page to a page with a table visual, where the table visual was cached and displayed immediately
  • I then refreshed the table visual on that page by clicking the Refresh Visuals button in the Performance Analyzer pane
  • I changed to the next page in the report and all the visuals on that page rendered
  • I changed the slicer on that new page
  • I clicked on the bar chart to cross-filter the rest of the page

As you can see from the screenshot above, Performance Analyzer tells you how long each visual takes to render within each interaction but it doesn’t tell you how long each interaction took in total. In a lot of cases you can assume that the time taken for an interaction is the same as the time taken for the slowest visual to render, but that may not always be true.

So how can you use Performance Analyzer to measure the time taken for these interactions? How can you measure the amount of time taken to render a page in a report?

To solve this problem I created a Power Query query that takes the event data JSON file that you can export from Performance Analyzer and returns a table showing the amount of time taken for each interaction. Here’s the M code for this query:

let
    Source = Json.Document(File.Contents("C:\PowerBIPerformanceData.json")),
    ToTable = Table.FromRecords({Source}),
    Events = ToTable{0}[events],
    EventTable = Table.FromList(Events, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(EventTable, "Column1", {"name", "start", "id", "metrics", "end"}, {"name", "start", "id", "metrics", "end"}),
    #"Expanded metrics" = Table.ExpandRecordColumn(#"Expanded Column1", "metrics", {"sourceLabel"}, {"sourceLabel"}),
    #"Added Custom1" = Table.AddColumn(#"Expanded metrics", "UserActionID", each if [name]="User Action" then [id] else null),
    #"Added Custom2" = Table.AddColumn(#"Added Custom1", "UserActionLabel", each if [name]="User Action" then [sourceLabel] else null),
    #"Changed Type" = Table.TransformColumnTypes(#"Added Custom2",{{"start", type datetime}, {"end", type datetime}, {"UserActionID", type text}, {"sourceLabel", type text}, {"UserActionLabel", type text}}),
    #"Filled Down" = Table.FillDown(#"Changed Type",{"UserActionID", "UserActionLabel"}),
    #"Filtered Rows" = Table.SelectRows(#"Filled Down", each [start] > #datetime(1970, 1, 2, 0, 0, 0)),
    #"Filtered Rows1" = Table.SelectRows(#"Filtered Rows", each [end] > #datetime(1970, 1, 2, 0, 0, 0)),
    #"Grouped Rows" = Table.Group(#"Filtered Rows1", {"UserActionID", "UserActionLabel"}, {{"Start", each List.Min([start]), type nullable datetime}, {"End", each List.Max([end]), type nullable datetime}}),
    #"Added Custom" = Table.AddColumn(#"Grouped Rows", "Duration", each [End]-[Start], type duration),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"UserActionID"})
in
    #"Removed Columns"
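For reference, the exported file the query parses looks roughly like this. This is a hypothetical, simplified fragment: the event names, ids, timestamps and labels are made up for illustration, other fields are omitted, and only the fields the query expands (name, start, end, id and metrics.sourceLabel) are shown:

{
  "events": [
    {
      "name": "User Action",
      "id": "123",
      "start": "2023-01-01T10:00:00.000Z",
      "end": "2023-01-01T10:00:12.140Z",
      "metrics": { "sourceLabel": "ChangePage" }
    },
    {
      "name": "Execute DAX Query",
      "id": "124",
      "start": "2023-01-01T10:00:00.500Z",
      "end": "2023-01-01T10:00:03.200Z",
      "metrics": {}
    }
  ]
}

Only "User Action" events carry an id and sourceLabel that identify an interaction, which is why the query fills those values down onto the other events before grouping.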

Here’s the output of this query for the interactions shown above:

Some notes about this query:

  • You will need to change the Source step to point to the JSON file you have exported from Performance Analyzer
  • Each interaction is represented by a row in the table and identified by the UserActionLabel column
  • I’m calculating the durations by finding the minimum start time and the maximum end time for all events associated with an interaction and subtracting the former from the latter
  • There’s a bug (which hopefully gets fixed at some point) where some events have start and end dates in 1970, so I have filtered out any dates that are obviously wrong
  • The Duration column shows how long each interaction took and uses the Power Query duration data type, which is formatted as days.hours:minutes:seconds
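If you’d rather work with a plain number of seconds – which is easier to sort, chart or compare – you could append an extra step like the following to the query above. This is just a sketch: the step name #"Removed Columns" refers to the final step of the query shown earlier, the new column name is my own choice, and Duration.TotalSeconds is a standard M function that converts a duration to a decimal number of seconds:

// Hypothetical extra step: add a numeric column holding the duration in seconds.
// #"Removed Columns" is the final step of the query shown above.
#"Duration In Seconds" = Table.AddColumn(
    #"Removed Columns",
    "Duration In Seconds",
    each Duration.TotalSeconds([Duration]),
    type number
)

You would then return #"Duration In Seconds" in the query’s in clause instead of #"Removed Columns".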

The example above is fairly complex, showing several different kinds of interaction. If you just want to find the amount of time taken to render all the visuals on a page, you can click the Refresh Visuals button in Performance Analyzer to refresh all the visuals on the page – it may not give you a 100% “cold cache” page render but it will be good enough. I’m not a web developer, but I think to really do things properly you’ll need to open the report on a blank page in the browser, do an “Empty Cache and Hard Reload”, go to edit mode in the report, enable Performance Analyzer, then move to the page you want to test.

If you’re testing a DirectQuery model then you’ll also want to include the overhead of opening connections (which can be substantial); the only way I have found to do that is either to wait at least an hour for any connections in the pool to be dropped or, if you’re using a gateway, to restart it.

One last point: while you can use Performance Analyzer both in Power BI Desktop and in the browser, the behaviour of Power BI may be different in these two places, so always measure the performance of published reports in the browser, because that’s where your users will be using your reports.

Here’s what clicking the Refresh Visuals button in Performance Analyzer to refresh all the visuals on a page looks like:

This results in a single interaction and a single row in the output of the Power Query query above:

In this case you can see that the page refresh took 12.14 seconds.

As you will have realised by now, getting the amount of time it takes to load a report page isn’t straightforward and there are a lot of factors to take into account. Nonetheless using Performance Analyzer in this way is much better than not measuring page load times at all or (as I’ve seen some people do) using a stopwatch. If you try this and find something interesting please let me know: I’m doing a lot of testing with Performance Analyzer and learning new things all the time.

8 thoughts on “Measuring Power BI Report Page Load Times”

  1. Well, this only measures the time the individual visuals take to load their data and adds them up. It doesn’t measure the time the report really takes to load – that is, from clicking on the report name in the portal to everything being shown. So it doesn’t consider any logic in the visuals (which might be relevant when you use “smart” visuals like Zebra BI), when visuals are loaded (which might be relevant when you use hidden visuals), the time to load complex reports (all the metadata, when reports have many sheets), etc.
    The only way we found is to “remote control” a browser and then really see how it behaves and measure it. I didn’t build the code, so please excuse my “high level” answer, but I’ll pass this post on to our experts – maybe they can share more details…

    1. No, the point is that it doesn’t just measure the time the visuals take to load their data – if you start on a blank page and then move to a page with visuals, it really does measure the time to render all the visuals on that page including all of the non-data related operations. You’re right that in order to open Performance Analyzer the report already needs to be open and that means you miss some metadata operations that happen when the report first opens, but those operations are ones that you don’t typically have any way of tuning.

  2. Hey Chris!

    How would this approach differ from this PBI tool, which can do a similar job of analyzing the performance of page loads? It works off the same premise of using the JSON file from the performance recordings to visualize it, whereas it looks like you’re loading it into Power Query to analyze it there.

    Not sure if you’re aware of this tool or not (even though you’re rightfully name-dropped multiple times in it).

    https://smartpowerbi.co.uk/2021/10/12/advanced-power-bi-performance-analyser-by-smartpowerbi/

    1. There’s no difference – apart from the fact that that tool was created when you could only use Performance Analyzer in Desktop. There can be differences in performance and behaviour of reports between Desktop and the Service so having Performance Analyzer in the Service means you can now easily do this kind of performance measurement there.

  3. New to Power BI and trying to achieve similar report page load time monitoring via Fabric EventHouse KQL queries. With multiple operations taking place and no obvious common id to tie them together, it seems a lot more complicated than it should be!
