Power BI Video Training Now Available

If you can’t make it to one of my classroom-based courses in London next year, today I’m pleased to announce that my new Power BI video training course is now available via my friends at Project Botticelli. It covers the following topics:

  • What is Power BI? (Free)
  • Connecting to Data Sources
  • Data Import Concepts
  • Transforming Data with the Query Editor
  • Advanced Data Loading
  • Modelling Data
  • Basic DAX
  • Power BI Desktop vs Excel
  • Cloud and Desktop Power BI Dashboards and Reports
  • Building Reports in Excel
  • Data Refresh (Free)
  • Administration and Auditing (Free)

Several of the shorter videos are free to view. Full details can be found here:

https://projectbotticelli.com/power-bi-desktop-and-cloud-course-video-tutorial?pk_campaign=tt2017a

Other courses available on the site include my MDX and SSAS Multidimensional courses, plus Marco Russo’s DAX course and many others.

If you’re quick, you can use the code CHRIS2017CYBER15 to get a 15% discount on subscriptions before Tuesday November 28th 2017!

In-Person Power BI And Azure Analysis Services Training In London

I’m pleased to announce two in-person training dates in London next year.

First of all, I’ll be running a three-day “Introduction to Power BI” course from January 29th to January 31st 2018. The course is suitable for BI pros, analysts and anyone who needs to use Power BI to build reports; I’ll be covering data loading, data modelling, a bit of M, a lot of DAX, report design, publishing, security and administration. Full details and registration can be found here:

http://technitrain.com/coursedetail.php?c=84&trackingcode=CWB

Not long after, on February 22nd 2018, I’ll be teaching a full-day preconference seminar at SQLBits in London on Azure Analysis Services. The agenda can be found here:

http://sqlbits.com/information/event17/Introduction_to_Azure_Analysis_Services/trainingdetails.aspx

…but, to be honest, there’s likely to be a lot of cool new functionality released for Azure SSAS between now and then so that will all have to be fitted in too. SQLBits is, of course, the biggest SQL Server/Microsoft data platform conference in Europe, ridiculously good value for money and loads of fun. Pricing and registration details can be found here:

http://sqlbits.com/information/pricing

UPDATE: the SQLBits team wanted me to make a video to promote my precon, so I outsourced the job to my daughter. Here’s the result:

The Extension.Contents() M Function

Following on from my post last week about M functions that are only available in custom data extensions, here’s a quick explanation of one of those functions: Extension.Contents().

Basically, it allows you to access the contents of any file you include in the .mez file of your custom data connector. Say you have a text file called MyTextFile.txt:

image

If you create a new Power BI Custom Data Connector project using the SDK, you can add this file to your project in Visual Studio like any other file:

image

Next, select the file and in the Visual Studio Properties pane set the Build Action property to Compile:

image

Setting this property means that when your custom data connector is built, this file is included inside it (the .mez file is just a zip file – if you unzip it you’ll now find this file inside).

Next, in the .pq file that contains the M code for your custom data connector, you can access the contents of this file as binary data using Extension.Contents("MyTextFile.txt"). Here’s an example function for use in a custom data connector that does this:

[DataSource.Kind="ExtensionContentsDemo", 
Publish="ExtensionContentsDemo.Publish"]
shared ExtensionContentsDemo.Contents = () =>
    let
        GetFileContents = Extension.Contents("MyTextFile.txt"),
        ConvertToText = Text.FromBinary(GetFileContents)
    in
        ConvertToText;

image

In the let expression here the GetFileContents step returns the contents of the text file as binary data and the ConvertToText step calls Text.FromBinary() to turn the binary data into a text value.

When this function is called in turn, it returns the text from the text file. Here’s a screenshot of a query run from Power BI Desktop (after the custom data connector has been compiled and loaded into Power BI) that does this:

image
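As a variation on this, if the file held one value per line you could use Lines.FromBinary() instead of Text.FromBinary() to get a list of lines. Here’s a sketch, assuming the same MyTextFile.txt is included in the connector:

[DataSource.Kind="ExtensionContentsDemo"]
shared ExtensionContentsDemo.Lines = () =>
    let
        // Get the file packaged inside the .mez as binary data
        GetFileContents = Extension.Contents("MyTextFile.txt"),
        // Split the binary data into a list of text lines
        ConvertToLines = Lines.FromBinary(GetFileContents)
    in
        ConvertToLines;

This is handy if, for example, the file contains a list of column names or URLs that your connector needs at runtime.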

BI Survey 17: Power BI Highlights

Every year, in return for publicising it to my readers, I get sent a copy of the findings of the BI Survey – the largest survey of BI product users in the world. As always they make for very interesting reading indeed, and although I can’t give away all the details I have been allowed to blog about a few of the highlights for Power BI:

  • Power BI is now the third most frequently considered product in competitive evaluations, after QlikView and Tableau.
  • Indeed, based on responses from vendors and resellers, Microsoft is now the third ‘most significant competitor’ after Tableau and Qlik and is in a much stronger position than it was two years ago – clearly as a result of the impact that Power BI has made, although Excel, SSAS and SSRS contribute to this too.
  • Unsurprisingly Power BI’s exceptional price/performance ratio is the main reason that organisations purchase it. Nonetheless it gets very high satisfaction ratings too.
  • Power BI is also now the third most frequently used front-end for SSAS, after SSRS (which is in the #1 spot and quite a way out in front) and Excel (either on its own or with an AddIn).

Overall it’s a very strong showing for Microsoft. If you’re conducting a competitive evaluation of BI tools, or if you’re a BI vendor, it’s probably worth buying a copy of the full findings.

Which M Functions Are Only Available To Custom Data Connectors?

Here’s one for all you M geeks out there. If you look at the example code for custom data connectors on the Power BI custom data connectors GitHub repo it’s clear that there are several M functions that are available in custom data connectors that aren’t available in Power BI Desktop. But what are they? As I’m sure you know, you can get a list of all the functions, types and enums available in M using the #shared keyword. Well, I created a simple custom data connector that calls #shared and returns a table of all the functions, types and enums available to a custom data connector, then compared that table with what #shared returns when you run it in Power BI Desktop. This made it easy to find a list of M functions, types and enums that are only available in custom data connectors, and here are all 34:

  • CryptoAlgorithm.Type
  • CryptoAlgorithm.SHA1
  • CryptoAlgorithm.SHA256
  • Crypto.CreateHmac
  • Crypto.CreateHash
  • Web.SignForOAuth1
  • OAuth1.Type
  • OAuth1.HMACSHA1
  • OAuth1.RSASHA1
  • Extension.Module
  • Extension.CurrentCredential
  • Extension.CurrentApplication
  • Extension.CredentialError
  • Extension.LoadString
  • Extension.Contents
  • Credential.AccessDenied
  • Credential.AccessForbidden
  • Credential.EncryptionNotSupported
  • Credential.NativeQueryPermission
  • Error.Unexpected
  • Uri.Type
  • Binary.End
  • Action.Type
  • Action.Sequence
  • Action.Return
  • Action.Try
  • Action.DoNothing
  • ValueAction.Replace
  • ValueAction.NativeStatement
  • TableAction.InsertRows
  • TableAction.UpdateRows
  • TableAction.DeleteRows
  • WebAction.Request
  • Delta.Since

Some of these, like the Action functions, are documented in the Power Query function reference, and as I said others are mentioned in the Power BI custom data connectors GitHub repo, but there’s definitely some detective work to do here…
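For what it’s worth, here’s a sketch of the kind of code I used for the comparison; the function and connector names here are my own invention. Inside the custom data connector, a function like this returns the names of everything #shared exposes:

shared SharedDemo.Contents = () =>
    let
        // #shared returns a record containing every function, type
        // and enum currently in scope
        SharedRecord = #shared,
        // Turn the record into a table with Name and Value columns
        SharedTable = Record.ToTable(SharedRecord),
        // Only the names are needed for the comparison
        SharedNames = Table.SelectColumns(SharedTable, {"Name"})
    in
        SharedNames;

Running the same Record.ToTable(#shared) expression in a blank query in Power BI Desktop gives the second list, and an anti-join between the two name lists finds the names that only appear in the connector.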

Exploring The New SSRS 2017 API In Power BI

One of the new features in Reporting Services 2017 is the new REST API. The announcement is here:

https://blogs.msdn.microsoft.com/sqlrsteamblog/2017/10/02/sql-server-2017-reporting-services-now-generally-available/

And the online documentation for the API is here:

https://app.swaggerhub.com/apis/microsoft-rs/SSRS/2.0

Interestingly, the new API seems to be OData compliant – which means you can browse it in Power BI/Get&Transform/Power Query and build your own reports from it. For example in Power BI Desktop I can browse the API of the SSRS instance installed on my local machine by entering the following URL:

http://localhost/reports/api/v2.0

…into a new OData feed connection:

image

image

image

This means you can build Power BI reports on all aspects of your SSRS reports (reports on reports – how meta is that?), as well as datasets, data sources, subscriptions and so on. I guess this will be useful for any Power BI fans who also have to maintain and monitor a large number of SSRS reports.
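In M, the equivalent of the clicks above is a simple OData.Feed() call. For example, something like the following returns a table of all the reports on the instance – treat it as a sketch, since the Reports entity set name comes from the API documentation rather than anything I’ve shown here:

let
    // Connect to the SSRS 2017 REST API as an OData feed
    Source = OData.Feed("http://localhost/reports/api/v2.0"),
    // Navigate to the Reports entity set
    Reports = Source{[Name = "Reports", Signature = "navigation"]}[Data]
in
    Reports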

However, the most interesting (to me) function isn’t exposed when you browse the API in this way – it’s the /DataSets({Id})/Model.GetData function. This function returns the data from an SSRS dataset. It isn’t possible to call this function directly from M code in Power BI or Excel because it involves making a POST request to a web service, and that’s not something that Power BI or Excel support. However it is possible to call this function from a Power BI custom data extension – I built a quick PoC to prove that it works. This means that it would be possible to build a custom data extension that connects to SSRS and that allows a user to import data from any SSRS dataset.

Why do this? Well, it would turn SSRS into a kind of centralised repository for data, with the same data being shared with SSRS reports and Power BI (and eventually Excel, when Excel supports custom data extensions). SSRS dataset caching would also come in handy here, allowing you to do things like run an expensive SQL query once, cache it in SSRS, then share the cached results with multiple reports both in SSRS and Power BI. Would this really be useful? Hmm, I’m not sure, but I thought I’d post the idea here to see what you all think…
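To give an idea of what the PoC looked like, here’s a heavily simplified sketch of the kind of M you could use inside a custom data extension. The exact shape of the request body and the response is an assumption on my part, so don’t treat this as working code:

// Hypothetical: call /DataSets({Id})/Model.GetData for a given dataset id
GetDataSetData = (dataSetId as text) as table =>
    let
        Url = "http://localhost/reports/api/v2.0/DataSets("
            & dataSetId & ")/Model.GetData",
        // Supplying the Content option makes Web.Contents issue a
        // POST request - something only a custom data extension can do
        Response = Web.Contents(
            Url,
            [
                Headers = [#"Content-Type" = "application/json"],
                // The payload format here is assumed
                Content = Json.FromValue([])
            ]),
        Parsed = Json.Document(Response),
        // The shape of the JSON returned is also assumed
        Output = Table.FromRecords(Parsed[value])
    in
        Output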

Setting Data Types On Columns Containing Table Values In M

Last week I was working on a Power BI custom data connector for a customer and I came across a problem that has been bothering me for a while. The “Invoke Custom Function” button is a very powerful feature (see here for how to use it) and I was doing something very similar in my code, but after you click it and expand the table column it returns, you always have to set the data types on the new columns that appear – even if the function you have invoked returns a table with data types set on its columns. I wanted to avoid this extra step. I knew it must be possible, because some data sources such as SQL Server give you typed columns when you expand a table column; I also saw that the TripPin custom data connector tutorial aims to cover this topic… soon. Luckily it wasn’t too hard to work out.

Imagine you have the following function:

(x as number, y as number) as table =>
let
    Addition = x + y,
    Multiplication = x * y,
    Output = #table(
        type table [Operation=text, Result=number],
        {
            {"Addition", Addition},
            {"Multiplication", Multiplication}
        })
in
    Output

It takes two numbers and returns a table with two columns and two rows showing the sum and the product of the two numbers. If the function is called AddAndMult, then the following expression:

AddAndMult(5, 6)

…returns the following table:

image

Notice that the function returns a table with data types set for the columns: Operation is a text column and Result is a number column. I’ve done this by specifying a table type in the first parameter of the #table() function as described here.
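To see the difference the ascribed type makes, compare these two ways of constructing the same table – a minimal sketch:

let
    // Without a type: both columns come through as type any
    Untyped = #table(
        {"Operation", "Result"},
        {{"Addition", 11}}),
    // With a table type: Operation is text, Result is number
    Typed = #table(
        type table [Operation = text, Result = number],
        {{"Addition", 11}})
in
    Typed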

Now, here’s a query that calls the function on a table containing several rows using the Invoke Custom Function button:

let
    Source = #table(
        type table [x=number, y=number],
        {
            {4,7},
            {8,2},
            {1,9}
        }),
    #"Invoked Custom Function" = Table.AddColumn(
        Source,
        "AddAndMult",
        each AddAndMult([x], [y])),
    #"Expanded AddAndMult" = Table.ExpandTableColumn(
        #"Invoked Custom Function",
        "AddAndMult",
        {"Operation", "Result"},
        {"AddAndMult.Operation", "AddAndMult.Result"})
in
    #"Expanded AddAndMult"

This is the table returned by the Source step:

image

This is what the Invoked Custom Function step looks like:

image

And this is what the Expanded AddAndMult step looks like:

image

In the last two screenshots the ABC123 icon in the column headers shows that they are set to use the Any data type; the columns returned by calling the function have lost their data types.

The key to solving this problem is the optional fourth parameter of the Table.AddColumn() function, which allows you to set a data type for the column that the function adds to a table. Altering the Invoked Custom Function step of the previous query to do this, setting the new column to be a table type like so:

#"Invoked Custom Function" =
    Table.AddColumn(
        Source,
        "AddAndMult",
        each AddAndMult([x], [y]),
        type table [Operation=text, Result=number]
    ),

…means the Invoked Custom Function step now returns a column of type table, with the appropriate icon shown in the column header:

image

…and once the AddAndMult column is expanded, the new columns have the desired data types set:

image