Calling The Power BI REST API From Microsoft Flow, Part 2: Refreshing A Dataset When a Data Source Changes

For a while now I’ve had an idea stuck in my head: wouldn’t it be cool to build a Power BI solution where a user could enter data into an Excel workbook and then, as soon as they had done so, they could see their new data in a Power BI report? It would be really useful for planning/budgeting applications and what-if analysis. I had hoped that a DirectQuery model using the CData Excel custom connector (mentioned here) might work but the performance wasn’t good enough; using Flow with the Power BI REST API (see Part 1 of this series for details on how to get this set up) gets me closer to my goal, even if there’s still one major problem with the approach. Here’s how…

First of all, I have two Excel files saved in a OneDrive For Business folder:


The contents of each Excel file are more or less the same: a table containing budget data for four quarters:


These are the files that my hypothetical users will be editing. I also have a very basic Power BI dataset that imports the data from these two files, and a report to display the data:


Now for the Flow. For the trigger I use the OneDrive For Business "When a file is modified" trigger, which fires when a file in the above folder is changed:


I had to turn on Concurrency Control for the trigger and set the Degree of Parallelism property to 1, to make sure only one instance of the Flow ran at any one time – there’s no point trying to refresh the dataset if another instance of the Flow is already running.


Next I have an action from the Power BI custom connector created in my last post. It calls the Power BI API’s Get Refresh History In Group endpoint and gets details of the last refresh of the dataset:
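For anyone who wants to see what this action does under the covers, here's a minimal Python sketch of calling the same endpoint directly over HTTP. The workspace GUID, dataset GUID and AAD bearer token are placeholders you'd have to supply yourself; only the URL shape and the `$top=1` parameter come from the REST API documentation:

```python
# Sketch: calling the Get Refresh History In Group endpoint directly.
# GROUP_ID, DATASET_ID and ACCESS_TOKEN below are placeholders, not real values.
import json
import urllib.request


def refresh_history_url(group_id, dataset_id, top=1):
    """Build the refresh-history URL; $top=1 asks for only the latest refresh."""
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes?$top={top}")


def get_latest_refresh(group_id, dataset_id, token):
    """Return the parsed JSON body of the refresh-history response."""
    req = urllib.request.Request(
        refresh_history_url(group_id, dataset_id),
        headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The custom connector action does exactly this behind the scenes; the bearer token comes from the Azure AD app registration set up in Part 1.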


This returns a JSON value containing the details of the last refresh; the JSON goes to a Parse JSON action so later on I can get the Status of the last refresh. This action requires a schema, which I generated from a sample payload given in the Power BI REST API documentation. I had to delete endTime from the list of required properties in the schema because it won’t be returned if the most recent refresh is currently in progress:
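To illustrate why endTime has to be made optional, here's a small Python sketch (standing in for Flow's Parse JSON action) that pulls the Status out of a sample payload modelled on the one in the documentation. The sample values are illustrative only; note that the in-progress entry carries no endTime property at all:

```python
import json

# A sample payload modelled on the refresh-history documentation.
# The "Unknown" status indicates a refresh that is still in progress,
# which is why it has no endTime property.
sample = json.loads("""
{
  "value": [
    {
      "refreshType": "ViaApi",
      "startTime": "2018-09-10T09:35:00Z",
      "status": "Unknown"
    }
  ]
}
""")

last_refresh = sample["value"][0]
status = last_refresh["status"]
# .get() tolerates the missing endTime, just as relaxing the schema does in Flow
end_time = last_refresh.get("endTime")
```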


Next there’s an Apply to each to loop over each refresh returned by the API; since only the most recent refresh is returned I know this will only ever execute once.


Inside the Apply to each I have a Condition that looks at the Status property of the most recent refresh and checks whether the Status is Completed:


If the Status is indeed Completed, then the Flow calls the Refresh Dataset In Group endpoint to refresh the dataset:
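Again, for comparison, here's a hedged sketch of what that refresh call looks like as a raw HTTP request: a POST to the documented Refresh Dataset In Group endpoint, with the GUIDs and token as placeholders (an empty JSON body is enough to request a refresh):

```python
# Sketch: calling the Refresh Dataset In Group endpoint directly.
# group_id, dataset_id and token are placeholders you must supply.
import urllib.request


def refresh_url(group_id, dataset_id):
    """Build the documented Refresh Dataset In Group URL."""
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes")


def trigger_refresh(group_id, dataset_id, token):
    """POST an empty JSON body to request a refresh; 202 Accepted on success."""
    req = urllib.request.Request(
        refresh_url(group_id, dataset_id),
        data=b"{}",
        method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status
```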


If the Status of the last refresh is not Completed (either because it failed for some reason, or because there is already a refresh in progress), there is a delay of one minute and the Flow then starts itself again:
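Putting the condition, the refresh call and the retry together, the Flow's overall logic amounts to something like the loop below. This is only a sketch: get_last_refresh_status and trigger_refresh are stand-ins for the two API calls, the one-minute delay becomes a plain sleep, and max_attempts is added just to stop the sketch spinning forever (the real Flow restarts itself rather than looping, but the effect is the same):

```python
import time


def refresh_when_free(get_last_refresh_status, trigger_refresh,
                      delay_seconds=60, max_attempts=10):
    """Mirror the Flow's logic: refresh once the previous refresh has
    finished successfully, otherwise wait a minute and check again.

    get_last_refresh_status and trigger_refresh are stand-ins for the
    Get Refresh History In Group and Refresh Dataset In Group calls.
    """
    for _ in range(max_attempts):
        if get_last_refresh_status() == "Completed":
            trigger_refresh()
            return True
        time.sleep(delay_seconds)  # the Flow uses a one-minute Delay action
    return False
```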


As I’ve said before, I’m not an expert on Flow but this seems to work reliably with the limited testing I’ve done on it. Here it is in action:


So what is the problem with this approach? It's that if you are using Power BI Pro you can only refresh a dataset eight times a day, and if you are using Power BI Premium you can only refresh a dataset forty-eight times a day. There have been rumours that this forty-eight times a day refresh limit for Premium has been lifted, or might be lifted soon, or doesn't apply when you refresh via the API, but I specifically tested this and got the following error message: "Invalid dataset refresh request. Number of refresh requests in last 24 hours exceeded limit" (interestingly this didn't appear as a failed refresh in the refresh history of the dataset). Since it's reasonable to assume that any Excel file will be updated more than forty-eight times per day in a busy period, I don't think this approach is usable in the real world; all the same, I'm happy with it as a proof-of-concept showing what might be possible if the refresh limit is lifted in the future.

13 responses

    • Thanks Craig! I’ve got a lot of admiration for the work you’ve done with PowerShell and Flow, and was meaning to give you a mention in a future post in this series.

  1. Chris, this is a great tutorial. Thanks for blogging about it. I’ve often thought about this same use-case. When working with financial and risk time series data, what-if analysis is very important, and Power BI makes this very difficult outside of very basic what-if (WI) analysis. Hopefully we’ll see more enhancements built into the platform itself in the coming years to assist in these advanced scenarios; in the meantime I’ll probably give something like this a try. You could even enhance this by having different files for different users, and apply per-user row-level security so that they only see their own WI information.

  2. Hi Chris, just a doubt: is it correct that you are refreshing the Power BI service (manually or through scheduled refresh) to get the updated values in the visualization? If so, then what is the difference between this approach and connecting to the Excel files in OneDrive, where any refresh will bring in the new data from the OneDrive Excel file? Am I missing something?

    • Gopa, my understanding is that OneDrive refresh can only be used for models built using the report builder, not for reports built in Power BI Desktop. Considering the feature disparity, that’s a big reason to use this.

      Chris, what I want to figure out is whether there is a way to build in a delay, so that if you are using a file in OneDrive that has autosave on, or that is likely to be edited many times a day, you can have it fall back to the sort of every-90-minutes schedule you might otherwise use to cover the whole work day.

  3. Maybe a way to get around the limit is to add some logic to throttle frequent refreshes. Store the refresh count somewhere and maybe check against a table to see if you hit the limit for whatever pacing you want to set per hour (i.e. allow more prime shift, less off-shift). If you’re out of refreshes, just delay until the next refresh period. I don’t know Flow well enough to know how difficult that may be, but it seems like it should be doable from your example.

    • I’ve just seen a comment on LinkedIn from a Microsoft employee stating that they are testing unlimited refreshes via the API as a private preview with some customers. So with a bit of luck this limitation will be gone in a few months…
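      [For what it's worth, the pacing idea in the comment above could be sketched roughly like this. Everything here is hypothetical: the allowance table, the per-hour budgets and the function names are made up for illustration, chosen so the daily total stays under the forty-eight-refresh limit:]

```python
from collections import defaultdict

# Hypothetical pacing: more refreshes allowed during prime shift (09:00-17:00),
# fewer off-shift. 8 hours x 4 + 16 hours x 1 = 48, the daily Premium limit.
ALLOWANCE_PER_HOUR = {h: 4 if 9 <= h < 17 else 1 for h in range(24)}

refresh_counts = defaultdict(int)  # (date, hour) -> refreshes already used


def may_refresh(date, hour):
    """Return True and consume one refresh if this hour's budget allows it;
    otherwise return False, meaning: delay until the next refresh period."""
    if refresh_counts[(date, hour)] < ALLOWANCE_PER_HOUR[hour]:
        refresh_counts[(date, hour)] += 1
        return True
    return False
```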

  4. Pingback: Calling The Power BI REST API From Microsoft Flow, Part 3: Finding All Data Sources Used In A Power BI Tenant « Chris Webb's BI Blog

  5. Pingback: Calling The Power BI REST API From Microsoft Flow, Part 2: Refreshing A Dataset When a Data Source Changes |
