Why You Should Optimise Your Power BI Premium Reports And Refreshes For CPU Time As Well As Duration

When you tune a Power BI report or dataset refresh your first job is to make it run faster – users hate slow reports and late data. However, if you are using Power BI Premium you also need to think about reducing the amount of CPU used by a query or refresh as well, and in this post I’ll explain why.

First of all, what are the metrics I’m talking about? For query performance tuning, if you connect SQL Server Profiler to Power BI Desktop or the Power BI Service (as described here, for example) and look at the Query End event, you can see all the DAX queries executed when a report is rendered. The TextData column gives the query text, the Duration column shows how long the query took to run in milliseconds, and the CPUTime column is a representation of the amount of CPU resources used by the query.

The Command End event gives you similar information for refreshes.

The same data is available in Log Analytics, albeit with slightly different column names: EventText holds the query text, DurationMs the duration and CpuTimeMs the CPU time.
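For example, a Kusto query along the following lines returns the most CPU-hungry queries from the last day. This is only a sketch: it assumes the PowerBIDatasetsWorkspace table that the Power BI/Log Analytics integration writes to, so check the table name in your own workspace.

```kusto
// Sketch: the ten most CPU-intensive DAX queries in the last 24 hours,
// using the column names described above.
PowerBIDatasetsWorkspace
| where TimeGenerated > ago(1d)
| where OperationName == "QueryEnd"
| project TimeGenerated, EventText, DurationMs, CpuTimeMs
| top 10 by CpuTimeMs desc
```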

As I said, it’s the duration that, traditionally, you would have wanted to reduce. In fact, in the past, CPU Time didn’t even return accurate values so it wasn’t useful – although that changed recently.

So why is CPU Time so important? In Power BI Premium gen1 you have a fixed set of resources that all your queries and refreshes compete for; if you try to use more CPU than you have available then performance suffers. Power BI Premium gen2 works differently (as explained in detail here and here): you no longer have a fixed set of resources available, but if you use more CPU than your capacity allows you will either pay extra (if you enable autoscale) or get throttled. The Premium Capacity Metrics app gives you a lot of information about CPU usage across your Premium capacities. Power BI Premium Per User is based on the Power BI Premium gen2 platform and some of the same considerations exist for it as for Premium gen2, although they aren’t fully documented or implemented yet and I can’t go into details here.
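If you are sending telemetry to Log Analytics, one rough way to see how query and refresh CPU usage stack up over time is to aggregate CpuTimeMs by hour. Again, this is only a sketch assuming the PowerBIDatasetsWorkspace table and the QueryEnd/CommandEnd operation names; the Premium Capacity Metrics app remains the authoritative view of capacity CPU usage.

```kusto
// Sketch: total CPU time per hour, split by operation type, to see
// when queries and refreshes are competing for capacity CPU.
// Not a replacement for the Premium Capacity Metrics app.
PowerBIDatasetsWorkspace
| where TimeGenerated > ago(7d)
| where OperationName in ("QueryEnd", "CommandEnd")
| summarize TotalCpuMs = sum(CpuTimeMs) by bin(TimeGenerated, 1h), OperationName
| order by TimeGenerated asc
```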

As a result it’s important for the overall cost and performance of a Premium capacity that all the queries and refreshes that run on it use as little CPU as possible.

One last point: while Duration and CPU Time are linked, the relationship is not necessarily linear. For example, you could tune a Storage Engine-heavy DAX query and achieve a big reduction in CPU Time but a smaller reduction in Duration, because the Storage Engine can do a lot of work in parallel across multiple cores; on the other hand, tuning a Formula Engine-heavy DAX query might result in an equal reduction in both Duration and CPU Time. Similarly, because the overall time taken to refresh a dataset is often determined by the time taken to refresh its biggest table, you might be able to significantly reduce the CPU Time for a refresh without changing its duration much.

The CPU used by background operations like refresh is smoothed over a 24-hour period in Premium gen2, so there is no need to try to slow down a refresh to avoid a spike in CPU usage that might lead to throttling or autoscaling. It is still true, though, that reducing the amount of CPU used by a refresh at any given point in time means more resources are available for any queries running on the same node at the same time.
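One way to spot this parallelism in your own Log Analytics data is to compare CPU Time with Duration. A sketch like the following (again assuming the PowerBIDatasetsWorkspace table) ranks queries by their CPU-to-duration ratio: a ratio well above 1 often indicates heavily parallelised Storage Engine work, while a ratio close to 1 suggests a query dominated by the single-threaded Formula Engine.

```kusto
// Sketch: CPU-to-duration ratio per query. A high ratio suggests heavy
// parallel Storage Engine work; a ratio near 1 suggests mostly
// single-threaded Formula Engine work.
PowerBIDatasetsWorkspace
| where TimeGenerated > ago(1d)
| where OperationName == "QueryEnd" and DurationMs > 0
| extend CpuToDurationRatio = todouble(CpuTimeMs) / DurationMs
| project TimeGenerated, EventText, DurationMs, CpuTimeMs, CpuToDurationRatio
| order by CpuToDurationRatio desc
```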

[Thanks to Akshai Mirchandani and Giri Nair for the information in this post]

10 responses

  1. Hi Chris,
    One quick question: if a Log Analytics workspace corresponds to a Premium workspace 1:1, how do we find the overall performance of a capacity when there are many workspaces, and hence many Log Analytics workspaces? Is there an app to consolidate the view?
  3. Hey Chris, long-time fan! I’ve been meaning to ask a question, since I haven’t been able to get a good answer to this. In Power BI, when it comes to dataflows and the enhanced compute engine (ECE): if you have dataflows linked across workspaces, does the ECE benefit us? We can’t really figure this one out. We’re trying to keep dataflows in separate workspaces because we want separate refresh schedules, and we can’t really tell from the timings whether the ECE is actually helping us. Any help would be greatly appreciated.

  4. Hey Chris,

    You mentioned the CPU time is now accurate in Log Analytics. Does this mean SE CPU for Queries is now being logged?

    • Not exactly: CPU Time for queries includes everything that can happen when a query runs, including FE, SE and Power Query activity. You should be able to work out SE CPU Time by looking at the SE-related events inside a query though.

      • Hey Chris,

        Thanks!

        Another question as I dig into the Refresh CPU consumption of our datasets, why do I get a much smaller CPU time for Refresh Operations vs the sum of CPU time for the refresh Suboperations?

        e.g. the QueryEnd – Batch Command has a CPU Time of 54,000 ms, while the sum of the Suboperations (EventSubClass of ProgressReportEnd) CPU Time is MUCH higher, in this case 4,100,000 ms?

      • Remember that suboperations are often nested and a parent operation’s CPU Time will be the sum of its child operations plus some extra of its own. Are you sure you’re not including both parent and child operations underneath the Batch command?

  5. Hello Chris, I’m trying to work with the Log Analytics application, and when I analyse the Query End event I see a user called “Power BI Service”. How can this “user” be explained? I thought a Query End event containing DEFINE MEASURE … could only come from a user interacting with my report, no? I also have a Query End operation with its EventText, but these values don’t filter to the related queries. Is that normal?

    Thanks,
