Finding All Selected Items In An Excel Slicer Connected To SSAS, Power BI Or The Excel Data Model Using Dynamic Arrays

The big news in the world of Excel right now is the introduction of dynamic arrays. They’re only available in the Office 365 click-to-run version of Excel and, at the time of writing, only to people on the Office Insiders programme, but eventually they’ll be available to anyone running Excel for Office 365 on their desktop. There are already lots of blog posts about them, including this overview by Jon Acampora, and you’ll probably also want to download Bill Jelen’s detailed mini-book on them here, which is free for the rest of 2018. Now I’m not an Excel expert by any stretch of the imagination, but I’m as excited as anyone else about them because they will be incredibly useful for anyone building reports using Excel cube functions against Analysis Services, the Excel Data Model/Power Pivot and Power BI. Bill Jelen’s book has a short section on this subject, but the possibilities are limitless…

Here’s one example of how they can be used. A while ago I blogged about how to use a regular array formula and the TextJoin() Excel function to get all the selected items from a slicer. Dynamic arrays make this problem much easier to solve. Take the following table loaded into the Excel Data Model:

[Screenshot: the source table loaded into the Excel Data Model]

Now, say you have a PivotTable built from this and a slicer (called Slicer_Fruit) connected to it:

[Screenshot: the PivotTable with the Slicer_Fruit slicer connected to it]

It’s possible to use the CubeSet() function to get the set of selected items in a slicer using the following formula:

=CUBESET("ThisWorkbookDataModel",Slicer_Fruit,"Slicer Set")

Assuming this formula is in cell H1, you can then get the number of items in this set using CubeSetCount():

=CUBESETCOUNT($H$1)

Assuming this is in cell H2, all you need to do to get a comma-delimited list of all the selected items in the slicer via this set is:

=
 TEXTJOIN(
  ", ", 
  TRUE, 
  CUBERANKEDMEMBER(
   "ThisWorkbookDataModel",
   $H$1, 
   SEQUENCE($H$2,1)
  )
 )
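
So, for example, if the only items selected in the slicer were Apples and Oranges (I’m making these fruit names up for the purposes of illustration), the formula would return the text:

Apples, Oranges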

[Screenshot: the three formulas and the resulting comma-delimited list]

Here it is in action:

[Animated GIF: the comma-delimited list updating as the slicer selection changes]

It works as follows:

  • The new Sequence() function is used to create a dynamic array of numbers from one to the number returned by the CubeSetCount() function.
  • The CubeRankedMember() function gets the name of each of the items in the set using the numbers returned by Sequence().
  • TextJoin() then concatenates all of the names returned by CubeRankedMember() into a single comma-delimited list.
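
As an aside – and this is a variation I’m adding here, not something from my original post – if you want the selected items spilled into a range of cells rather than concatenated into a single piece of text, you should be able to remove the TextJoin() and let CubeRankedMember() spill on its own:

=
 CUBERANKEDMEMBER(
  "ThisWorkbookDataModel",
  $H$1, 
  SEQUENCE($H$2,1)
 )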

You can download the sample workbook here (remember it will only work if you have a version of Excel installed that includes dynamic arrays!).

Calling The Power BI REST API From Microsoft Flow, Part 2: Refreshing A Dataset When a Data Source Changes

For a while now I’ve had an idea stuck in my head: wouldn’t it be cool to build a Power BI solution where a user could enter data into an Excel workbook and then, as soon as they had done so, they could see their new data in a Power BI report? It would be really useful for planning/budgeting applications and what-if analysis. I had hoped that a DirectQuery model using the CData Excel custom connector (mentioned here) might work but the performance wasn’t good enough; using Flow with the Power BI REST API (see Part 1 of this series for details on how to get this set up) gets me closer to my goal, even if there’s still one major problem with the approach. Here’s how…

First of all, I have two Excel files saved in a OneDrive For Business folder:

[Screenshot: the two Excel files in a OneDrive For Business folder]

The contents of each Excel file are more or less the same – a table containing budget data for four quarters:

[Screenshot: the table of budget data in one of the Excel files]

These are the files that my hypothetical users will be editing. I also have a very basic Power BI dataset that imports the data from these two files, and a report to display the data:

[Screenshot: the Power BI report displaying the budget data]

Now for the Flow. For the trigger I use a OneDrive For Business “When a file is modified” trigger that fires when a file in the above folder is changed:

[Screenshot: the “When a file is modified” trigger]

I had to turn on Concurrency Control for the trigger and set the Degree of Parallelism property to 1, to make sure only one instance of the Flow ran at any one time – there’s no point trying to refresh the dataset if another instance of the Flow is already running.

[Screenshot: the trigger’s Concurrency Control settings]

Next I have an action from the Power BI custom connector created in my last post. It calls the Power BI API’s Get Refresh History In Group endpoint and gets details of the last refresh of the dataset:

[Screenshot: the Get Refresh History In Group action]
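
For reference, the endpoint being called here looks something like this – the group (ie workspace) and dataset GUIDs come from the action’s parameters, and I’m assuming the $top parameter is what restricts the results to the single most recent refresh:

GET https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes?$top=1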

This returns a JSON value containing the details of the last refresh; the JSON goes to a Parse JSON action so later on I can get the Status of the last refresh. This action requires a schema, which I generated from a sample payload given in the Power BI REST API documentation. I had to delete endTime from the list of required properties in the schema because it won’t be returned if the most recent refresh is currently in progress:

[Screenshot: the Parse JSON action and its schema]
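
To give you an idea of what the Parse JSON action has to handle, here’s a trimmed-down example of the kind of JSON the endpoint returns. The field names come from the sample payload in the documentation, but the values shown here are invented:

{
  "value": [
    {
      "refreshType": "ViaApi",
      "startTime": "2018-10-13T07:35:08.53Z",
      "endTime": "2018-10-13T07:36:16.35Z",
      "status": "Completed"
    }
  ]
}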

Next there’s an Apply to each action to loop over each refresh returned by the API; since only the most recent refresh is returned, I know this will only ever execute once.

[Screenshot: the Apply to each action]

Inside the Apply to each I have a Condition that checks whether the Status property of the most recent refresh is Completed:

[Screenshot: the Condition checking the Status property]

If the Status is indeed Completed, then the Flow calls the Refresh Dataset In Group endpoint to refresh the dataset:

[Screenshot: the Refresh Dataset In Group action]
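
Under the covers this is just a POST to the following endpoint; according to the documentation it can also take an optional request body with a notifyOption property, but none is needed here:

POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes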

If the Status of the last refresh is not Completed (either because it failed for some reason, or because there is already a refresh in progress), there is a delay of one minute and the Flow then starts itself again:

[Screenshot: the Delay action and the action that restarts the Flow]

As I’ve said before, I’m not an expert on Flow, but this seems to work reliably in the limited testing I’ve done on it. Here it is in action:

[Animated GIF: data entered in Excel appearing in the Power BI report after refresh]

So what is the problem with this approach? It’s that if you are using Power BI Pro you can only refresh a dataset eight times a day, and if you are using Power BI Premium you can only refresh a dataset forty-eight times a day. There have been rumours that this forty-eight-times-a-day refresh limit for Premium has been lifted, or might be lifted soon, or doesn’t apply when you refresh via the API, but I specifically tested this and got the following error message: “Invalid dataset refresh request. Number of refresh requests in last 24 hours exceeded limit” (interestingly, this didn’t appear as a failed refresh in the refresh history of the dataset). Since it’s reasonable to assume that an Excel file will be updated more than forty-eight times per day in a busy period, I don’t think this approach is usable in the real world; all the same, I’m happy with it as a proof-of-concept showing what might be possible if the refresh limit is lifted in the future.

Calling The Power BI REST API From Microsoft Flow, Part 1: Creating A Flow Custom Connector

Playing around with Microsoft Flow recently, I was reminded of the following blog post from a few months ago by Konstantinos Ioannou about using Flow to call the Power BI REST API to refresh a dataset:

https://medium.com/@Konstantinos_Ioannou/refresh-powerbi-dataset-with-microsoft-flow-73836c727c33

I was impressed by this post when I read it, but I don’t think I understood quite how many exciting possibilities this technique opens up for Power BI users until I started to use it myself. The Power BI dev team are making a big investment in the API, yet most Power BI users, myself included, are not developers and can’t easily write code (or PowerShell scripts) to call it. With Flow, however, you can use the API without writing any code at all and solve a whole series of common problems easily. In this series of blog posts I’m going to show a few examples of this.

Before that, though, there is something to add to what Konstantinos says about creating the custom connector. In his post he describes how you can add individual actions to the custom connector, but with Flow you can also create a custom connector from an OpenAPI definition, and it turns out that there is an OpenAPI definition file for the Power BI REST API available here. As a result you can create a Flow custom connector with (at the time of writing) a whopping 116 actions by following the instructions in Konstantinos’s post, but using the “Import an OpenAPI file” option instead of the “Create from blank” option and uploading the Power BI REST API OpenAPI definition file. You’ll find that the text for the summaries and descriptions of many of the actions needs cleaning up – in particular, Flow doesn’t allow actions with names that end with a dot, so you’ll have to go through each one and remove that character – but doing this is still much faster and easier than adding the actions manually.

[Screenshot: the custom connector’s list of actions]

And one last thing to point out: as Konstantinos mentions in his post, if you get a 404 error while testing the connector, just wait a few minutes!

Testing Power BI Premium Features With Power BI Embedded

It is very difficult for an organisation to decide whether to buy Power BI Premium or to stick with Power BI Pro. Power BI Premium represents a big financial commitment, but how do you know whether the benefits will be worth the costs involved? This was a question that Melissa Coates and I dealt with in great detail in the second version of our white paper “Planning A Power BI Enterprise Deployment”, and I strongly suggest you read the relevant section (pages 37-42) if you are considering buying Premium.

However, it’s one thing to read a white paper and another to actually test Premium yourself – and right now there is no way of trying Premium before you buy it. The more features like SSRS paginated reports or linked entities in dataflows get added to Premium, the more likely it is that you will want to test them before buying. So what do you do? The answer is to use Power BI Embedded instead. It’s not exactly the same thing as Premium (the setup experience is different, for one thing), but there are three significant facts to point out:

  • I have been assured by several senior Microsoft employees that all new Premium features will be available in both the Premium (EM and P) SKUs and the Embedded (A) SKUs. So, for example, when the SSRS paginated report feature is released it will be available in both Premium and Embedded.
  • The resources available (in terms of the number of v-cores and memory) in the various Premium SKUs mirror those available in the Embedded SKUs. For example, a P1 Premium SKU has the same resources available as an A4 Embedded SKU.
  • With Power BI Embedded, unlike Power BI Premium, you only need to pay for what you use: you can pause a Power BI Embedded capacity when you are not using it and pay nothing.

To sum up: if you want to test Premium features before you buy, all you need to do is create a new Power BI Embedded capacity in the Azure portal and assign a Workspace to it – you’ll then get access to all the Premium features. When you’ve finished, just pause the capacity. You’ll still need to pay while you’re testing, but it will be a fraction of the cost of buying Premium.

Can You Use Power Query In Power BI Or Excel To Write Data To A Data Source?

The short answer: no.

The long answer:

It is, technically, possible to use Power Query (in either Power BI or Excel) to write data to a data source; for example, here’s an old post I wrote showing how to run an UPDATE statement to update data in a table in SQL Server. However, because the Power Query engine may evaluate a value or statement more than once when a query is executed, it is not safe or supported to use it to create, insert, update, delete or otherwise change data in a data source – it should only be used to read data. You may see some functions listed in the M function reference that suggest otherwise, but they aren’t supported either, and indeed may not work at all.
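
To illustrate the kind of thing I mean, here’s a minimal hypothetical sketch along the lines of that old post – the server, database, table and column names are all made up, and remember, this is exactly what you shouldn’t do:

let
    //Connect to a SQL Server database
    Source = Sql.Database("myserver", "mydatabase"),
    //Run a native SQL statement that changes data
    //UNSUPPORTED: the engine may evaluate this more than once!
    RunUpdate = Value.NativeQuery(
        Source, 
        "UPDATE dbo.MyTable SET MyColumn = 1")
in
    RunUpdate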

Thanks to Matt Masson of the dev team for providing the official position on this.

A Function To Generate The M Code For A Table Type

This is going to sound obscure, and it is, but recently I’ve been using the #table() intrinsic function in M a lot – specifically the version that takes a table type as its first parameter (as I describe here). Because it’s a bit of a pain to write the M code for a table type by hand, I’ve written an M function that takes a table and returns the text of the M code needed to define that table’s type. Here it is:

(InputTable as table) as text =>
let
    //Get the schema of the input table: one row per column,
    //with each column's name, position and type name
    Source = 
        Table.Schema(InputTable),
    //Sort the rows into the original column order
    SortRows = 
        Table.Sort(
            Source,
            {{"Position", Order.Ascending}}),
    //Keep only the column names and type names
    RemoveColumns = 
        Table.SelectColumns(
            SortRows,
            {"Name", "TypeName"}),
    //Build a "Name = TypeName" string for each column,
    //quoting the column name where necessary
    AddCustom = 
        Table.AddColumn(
            RemoveColumns, 
            "TypeNames", 
            each 
            Expression.Identifier([Name]) & " = " & [TypeName]),
    //Combine the strings into a single comma-delimited
    //list wrapped in square brackets
    Output = 
        "[" & Text.Combine(AddCustom[TypeNames], ", ") & "]"
in
    Output

Nothing complex here, but now I’ve posted this I know that in the future I’ll be able to Google for it when I’m working onsite with a customer and I need it!

To give you an idea of how it works, take the table that is returned by the following M expression, which calls the public TripPin OData web service:

OData.Feed(
"https://services.odata.org/TripPinRESTierService/Airports", 
null, 
[Implementation="2.0"])

[Screenshot: the Airports table returned by the TripPin OData web service]

Passing this table to the function above returns the following text, the M code for a record that lists the names of the columns in this table and their data types, suitable for use with #table:

[Name = Text.Type, IcaoCode = Text.Type, 
IataCode = Text.Type, Location = Record.Type]
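
You can then paste this text into the first parameter of #table() after the type table keywords. For example, here’s how you could use it to define an empty table with the same structure as the Airports table above:

#table(
    type table 
    [Name = Text.Type, IcaoCode = Text.Type, 
    IataCode = Text.Type, Location = Record.Type],
    {}
)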

Using Power Query And Microsoft Flow To Automate The Creation Of CSV Files

I was super-excited when, a few days ago, the long-awaited integration of Power Query into Microsoft Flow was announced. I was then gutted when work commitments meant I couldn’t blog about it right away and Erik Svensen beat me to it with this excellent post showing how to push data from SQL Server via Power Query and Flow to a Power BI streaming dataset:

https://eriksvensen.wordpress.com/2018/09/25/powerquery-everywhere-now-in-microsoftflow-as-well/

Anyway, the integration of Power Query into Flow opens up so many interesting possibilities that I couldn’t resist writing about it myself today. For example: something I am asked about a lot is whether it is possible to use Power Query to automate the creation of CSV files. You can’t do this in a supported way in Power Query in Excel or Power BI (and yes, I know about all of the slightly hacky ways people have done it, like this), but it is absolutely something you can do with Power Query in Flow.

Here’s how:

[Screenshot: the complete Flow]

First of all I have a Schedule – Recurrence trigger, which allows you to run a flow on a schedule. In this case I’m setting the Flow to run once a day at 6am UK time:

[Screenshot: the Recurrence trigger settings]

Next I have the all-important Transform data using Power Query action. At the moment you can only use Power Query with SQL Server data sources, and (at least when I tested it) it only seems to work with Azure SQL Database data sources and not on-premises SQL Server databases via a gateway – hopefully this gets fixed soon. Clicking the Edit Query button:

[Screenshot: the Transform data using Power Query action]

…opens up the Power Query online editor, which will be familiar to anyone who has used Power Query in Excel or Power BI Desktop, and where you can build your query:

[Screenshot: the Power Query online editor]
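
Just to give you an idea of what a query might look like, here’s a simple hypothetical example – the server, database, table and column names are all made up:

let
    Source = 
        Sql.Database(
            "myserver.database.windows.net", 
            "mydatabase"),
    Sales = 
        Source{[Schema = "dbo", Item = "Sales"]}[Data],
    FilteredRows = 
        Table.SelectRows(
            Sales, 
            each [Amount] > 100)
in
    FilteredRows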

Next I have a Create CSV Table action to convert the results returned by the query to CSV format:

[Screenshot: the Create CSV Table action]

And finally I take the CSV data and save it to a CSV file in OneDrive For Business with a Create File action:

[Screenshot: the Create File action]

Run the Flow and the CSV file is created:

[Screenshot: the CSV file created in OneDrive For Business]

And I can view the contents in Excel Online:

[Screenshot: the contents of the CSV file in Excel Online]

The Power Query/Flow integration is still in Preview and I found a few things that didn’t work reliably: for example, the first few times I ran my Flow I got errors saying that it couldn’t connect to the Azure SQL Database, even though it clearly could while I was designing the query; that error went away after a while. As mentioned above, it also only works with SQL Server data sources right now, and I really hope it gets enabled for all the other data sources that Power Query can connect to, especially Excel. These are just teething troubles though, and it’s clear that this is going to be revolutionary for Power Query and Flow users alike!

I really need to improve my Flow skills now…
