Refreshing A Power BI Semantic Model With Eventstreams, Pipelines And Activator

Following on from my last post where I showed how to send data from Power Automate to a Fabric Eventstream, in this post I’m going to show how to use it to solve one of my favourite problems: refreshing a Power BI semantic model that uses an Excel workbook stored in OneDrive as a source when that Excel workbook is modified.

Now before I go on I want to be clear that I know this is a ridiculously over-engineered and expensive (in terms of CUs) solution, and that you can do almost the same thing just using Power Automate or in several other ways – see my colleague Mark Pryce-Maher’s recent videos on using Fabric Open Mirroring with Excel for example. I’m doing this to teach myself Fabric Eventstreams and Activator and see what’s possible with them. Please excuse any mistakes or bad practices.

To start off, I built a Power BI semantic model containing a single table, connected to an Excel workbook stored in OneDrive for Business:

I then built a Fabric Pipeline to refresh the semantic model, using the newly-enhanced Semantic Model Refresh activity:

Next, I created a new Fabric Eventstream and built a slightly more sophisticated version of the Power Automate flow from my previous post: it ran every 30 seconds, checked whether the Excel file had been modified, and if it had, sent an event to the Eventstream:

The Compose action in this flow generates some JSON that is sent to the Eventstream and contains the name of the file that has been altered and a dummy value called EventCounter that always contains 1:
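To make the shape of that event concrete, here’s a minimal Python sketch of the payload the Compose action produces. The field names FileName and EventCounter come from the flow described above; the helper function and the example file name are mine, and the exact JSON the Compose action emits may differ slightly:

```python
import json

def build_change_event(file_name: str) -> str:
    # FileName identifies the modified workbook; EventCounter is a dummy
    # value that is always 1, so events can be summed in Activator windows
    payload = {"FileName": file_name, "EventCounter": 1}
    return json.dumps(payload)

# Example event for a hypothetical workbook
print(build_change_event("Sales.xlsx"))
```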

I then connected the Eventstream to a new Activator:

In the Activator, I created a new object called ExcelWorkbook. I used the FileName field as the unique identifier and added the EventCounter field as a property:

Finally I created a new rule on the EventCounter property that aggregated the events over 1 minute windows and then fired when the total number of events in each window changed to the value 0 and stayed at 0 for 1 minute:

This is where the real value of Activator’s rules comes in. As I said, it’s easy to refresh a semantic model when an Excel file changes using other approaches. The problem with doing so is that Excel files in OneDrive for Business autosave on a regular basis, so if you make several changes to the file over the space of several minutes, you’ll end up trying to refresh the semantic model all the time. This can lead to excessive CU usage on your capacity because you’re refreshing too often, as well as to situations where one refresh fails because another is already in progress. What the rule above does is only trigger a refresh when changes to the Excel workbook have been detected (so the Total is greater than 0), there has then been a minute where no changes have been detected (so the Total changes to 0), and there has been a further minute where no changes have been detected (so the Total stays at 0). Since Activator rules can trigger Fabric items, I hooked my rule up to the Pipeline shown above to run the refresh:
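To make the rule’s behaviour concrete, here’s a small Python simulation of the windowed logic – my own sketch of the debounce pattern, not Activator’s actual implementation. Given a list of per-minute event totals, it returns the windows where the rule would fire, i.e. where there was activity, then the total changed to 0, then stayed at 0 for a further minute:

```python
def windows_that_fire(window_totals):
    """Given per-minute event counts, return the indices of the windows
    where the rule fires: activity, then a quiet window (total changes
    to 0), then a second quiet window (total stays at 0)."""
    fired = []
    for i in range(2, len(window_totals)):
        activity = window_totals[i - 2] > 0   # changes were detected
        went_quiet = window_totals[i - 1] == 0  # total changed to 0
        stayed_quiet = window_totals[i] == 0    # total stayed at 0
        if activity and went_quiet and stayed_quiet:
            fired.append(i)
    return fired

# Edits in minute 0, then two quiet minutes: the rule fires at minute 2
print(windows_that_fire([3, 0, 0, 0]))  # → [2]
```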

This screenshot of the Definition tab of the rule shows a pattern of events where there were four one-minute windows where the Excel file was edited and the rule fired three times:

The problem with this rule is that if the Excel file is edited constantly for a long period, so that there is no period of inactivity, the semantic model won’t get refreshed until that period of activity is over, which might be too long to wait. I guess the solution would be another rule that detects periods of, say, 10 minutes when the file has been edited constantly and refreshes the model even if there is no period of inactivity.
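Continuing the simulation above, that fallback rule could be sketched like this – again my own illustration of the idea, not anything Activator provides out of the box. The 10-window threshold is the hypothetical value suggested above:

```python
def long_activity_fires(window_totals, max_busy_windows=10):
    """Return the indices where a fallback rule would fire because the
    file has been edited continuously for max_busy_windows consecutive
    one-minute windows, with no quiet window in between."""
    fired = []
    busy = 0
    for i, total in enumerate(window_totals):
        busy = busy + 1 if total > 0 else 0
        if busy == max_busy_windows:
            fired.append(i)
            busy = 0  # reset after firing so we don't refresh every minute
    return fired

# Ten straight minutes of edits: the fallback fires at minute 9
print(long_activity_fires([1] * 10))  # → [9]
```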

All in all I think this shows how the combination of Power BI/Power Platform and new features in Fabric can be used to solve old problems in new, interesting ways. I’m looking forward to learning more about Fabric Real-Time Intelligence and Activator!

[Thanks to James Hutton from the Activator team for his help with this post]

Send Data From Power Automate To Fabric Using Eventstreams

Fabric’s Real-Time Intelligence features are, for me, the most interesting things to learn about in the platform. I’m not going to pretend to be an expert in them – far from it – but they are quite easy to use and they open up some interesting possibilities for low-code/no-code people like me. The other day I was wondering if it was possible to send events and data from Power Automate to Fabric using Eventstreams and it turns out it is quite easy to do.

Here’s a really simple illustration. I created a new Eventstream in a Fabric workspace, then clicked the “Use custom endpoint” option and published the Eventstream so it was ready to send data to:

Then, in Power Automate, I built a super-simple flow with a “Manually trigger a flow” trigger and then the “Send event” action from the Event Hubs connector:

This works because you can send data from Power Automate to the Eventstream custom endpoint using the Event Hubs protocol. To set up the connection to the Eventstream in Power Automate, I entered a name for the connection and chose the “Access Key” authentication type. I then copied the “Connection string-primary key” value from the Eventstream custom endpoint, shown in the first screenshot above (you need to click the eye icon next to it in order to copy it to the clipboard), and pasted it into the “Connection String” property in the Power Automate connector:
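As an aside, that connection string follows the standard Event Hubs format: semicolon-separated Key=Value pairs such as Endpoint, SharedAccessKeyName and SharedAccessKey (the EntityPath segment, where present, holds the Event Hub name). Here’s a small stdlib-only Python sketch that pulls one apart; the values shown are made up, not real credentials:

```python
def parse_connection_string(conn_str: str) -> dict:
    # Event Hubs connection strings are semicolon-separated Key=Value
    # pairs; split on the first "=" only, because the access key value
    # itself can contain "=" padding characters
    parts = {}
    for segment in conn_str.split(";"):
        if segment:
            key, _, value = segment.partition("=")
            parts[key] = value
    return parts

# A made-up example in the standard Event Hubs format
example = ("Endpoint=sb://example.servicebus.windows.net/;"
           "SharedAccessKeyName=key_1;"
           "SharedAccessKey=abc123=;"
           "EntityPath=es_mystream")
print(parse_connection_string(example))
```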

Then I configured the “Send event” action like so:

I had to manually paste the Event Hub Name, shown in the first screenshot above, as a custom value because I got an error when I tried to use Power Automate’s dropdown box to find the name. I put the Timestamp from the “Manually trigger a flow” trigger into the Content property.

After saving the flow I was then able to trigger it from my browser or the Power Automate mobile app, and every time the flow ran it sent the time of the run to my Eventstream:

Very basic, but it proves the point. What could you use this for? Power Automate has a huge number of connectors for Microsoft and third-party services: for example, you can trigger a flow when a file stored in OneDrive for Business is modified or when new content is published to an RSS feed. Sending these events and their related data on to a Fabric Eventstream not only means you can capture them for analysis in OneLake or an Eventhouse, it also means you can send them on to Fabric Activator, which can apply sophisticated rules to alert you in Teams or via email when something happens and specific conditions are met, run Fabric items such as notebooks, or even trigger Power Automate flows. I’ll explore some of these scenarios in more detail in future blog posts.