In the first post in this series I showed how any Power BI dataset refresh started via the Power BI portal or API is limited to 2 hours in Shared capacity and 5 hours in Premium capacity, and how you could work around that by running a refresh via Premium’s XMLA endpoint feature. In the second post in this series I showed how some M functions allow you to set timeouts. However, even if you initiate a refresh via the XMLA endpoint you may still get a timeout error and in this post I’ll discuss another reason why: the External Command Timeout.
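As a reminder of the XMLA endpoint approach, a refresh can be started by running a TMSL command in an XMLA query window in SQL Server Management Studio, connected to the workspace's XMLA endpoint. This is a minimal sketch; the dataset name "SalesDataset" is a placeholder for your own dataset:

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "SalesDataset"
      }
    ]
  }
}
```

A refresh started this way is not subject to the 2-hour/5-hour limit on refreshes started from the portal or the API, but, as described below, each individual partition is still subject to the External Command Timeout.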
This property is a hangover from Analysis Services (you can see it documented here). It represents the amount of time the Analysis Services engine inside Power BI will wait to get data from a data source. Exactly how it behaves depends on the data source: it may limit the amount of time it takes to get the first row of data from the source, or it may limit the amount of time it takes to get all the rows of data. In Power BI Premium it is set to five hours, which means that no single partition can take more than about five hours to refresh. In the first post in this series I worked around this by creating a dataset with multiple partitions, each of which took about an hour to refresh. However, when I tried to refresh a dataset with a single partition that took more than five hours via the XMLA Endpoint from SQL Server Management Studio, I got the following error:

The error message here is:
Timeout expired. The timeout period elapsed prior to completion of the operation.. The exception was raised by the IDbCommand interface.
In this case I saw the same error in the Error event in Profiler:
…and in other cases, when testing a different source, I got a different error in Profiler in the Progress Report Error event:
The message here is:
Error processing partition ‘<pii>SlowCommand-2114bb81-69d3-4fe4-9d54-6b2661b0c497</pii>’ of table ‘<pii>SlowCommand</pii>’ [Internal Error Code: ‘0xc112001a’].
There’s no way to avoid the External Command Timeout. Instead, what you need to do is either change your partitioning strategy so each partition refreshes in under five hours or tune your data source, M code or gateway (if you’re using one) so that data is returned to Power BI faster.
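One way to implement the partitioning strategy mentioned above is to create partitions via the XMLA endpoint with a TMSL createOrReplace command, so that each partition's source query returns its slice of data well within the five-hour limit. This is a hedged sketch, not a definitive implementation: the database, table, server and column names are all hypothetical, and the M expression would need to match your own source:

```json
{
  "createOrReplace": {
    "object": {
      "database": "SalesDataset",
      "table": "Sales",
      "partition": "Sales 2021"
    },
    "partition": {
      "name": "Sales 2021",
      "source": {
        "type": "m",
        "expression": [
          "let",
          "    Source = Sql.Database(\"myserver\", \"mydb\"),",
          "    SalesTable = Source{[Schema = \"dbo\", Item = \"Sales\"]}[Data],",
          "    FilteredRows = Table.SelectRows(SalesTable, each [OrderDate] >= #date(2021, 1, 1) and [OrderDate] < #date(2022, 1, 1))",
          "in",
          "    FilteredRows"
        ]
      }
    }
  }
}
```

Each partition defined this way refreshes with its own External Command Timeout, so splitting a slow table into several partitions that each load in under five hours avoids the error even if the table as a whole takes longer.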
In Shared capacity I believe the External Command Timeout is set to two hours (again, to match the overall refresh timeout), but it's much less important there: you can't create partitions manually (the only way a dataset in Shared capacity can be partitioned is by setting up incremental refresh), and there's no XMLA Endpoint, so there's no way to work around the two-hour overall refresh limit anyway.
[Thanks, as always, to Akshai Mirchandani for a lot of the information in this post]
UPDATE August 2022: the ExternalCommandTimeout property can now be set as a workspace-level server property. For more details see https://powerbi.microsoft.com/en-us/blog/analysis-services-server-properties-in-power-bi-premium-are-now-in-public-preview/
I keep forgetting how handy it is to connect profiler to the XMLA endpoints. There are other things that have mysterious timeouts too. I would love to investigate why “overwrite” publishing of a small report to the PBI Service will take a long time before hitting an internal timeout of some kind. Even a small report can take a lot of time to “overwrite”. I often get so frustrated that I just delete (or rename) the original dataset and publish a “fresh” one. Of course then I have to repair a few things related to the gateway, security, etc.
I would hope Profiler can give me some visibility into why my “overwrite” operations are slow; we shall see….
Server logging is a common thing to find in software – especially in services. It would be awesome if the PBI service would provide some basic audit logs to capture when operations begin and end (like the beginning and ending of a publish operation, along with the size of the pbix). This type of log is something I could send to tech support.
Great post as always Chris. Have to say though, that I think if you have single commands that are running longer than a few hours as part of a refresh, that the bigger issue is the design of the system 🙂
Exactly…
Hello Chris,
In my organization, many users pull data from Analysis Services in import mode (I know it isn’t a perfect approach, but it is what it is) into Power BI and mash it up with other data sources, e.g. SharePoint. As a result, there are a lot of timeouts.
How would you approach such a requirement, where we have SQL Server Analysis Services, we want to export some data from it and then merge it with files located on SharePoint in Power Query?
Thank you for your answer and regards!