Power BI DirectQuery Best Practices For Snowflake And Other Databases

Recently I collaborated with a number of people from Microsoft and Snowflake to write a blog post on best practices for using Power BI in DirectQuery mode on Snowflake. You can read it here:

https://medium.com/@jerellegainey/best-practices-for-using-power-bi-in-directquery-mode-with-snowflake-bfd1312ca7ab

It builds on what is already in the Power BI guidance documentation for DirectQuery, adding some advice specific to Snowflake. It also has a few other tips that are generally applicable to all DirectQuery sources and which aren't in the guidance docs (yet), such as the importance of setting the Maximum Connections Per Data Source property (which I also blogged about recently), the fact that you can increase it to 30 on a Premium capacity, and the importance of always testing DirectQuery performance in the Power BI Service rather than in Power BI Desktop. As a result it's worth reading if you are thinking of using Power BI in DirectQuery mode with Synapse, BigQuery or any other source.

If you are considering using DirectQuery on a project, I have one last piece of advice: think very carefully about why you need to use DirectQuery and not Import mode. Many people don't think this through and end up in trouble. In my opinion Import mode should be the default choice, simply because it will almost always perform better than DirectQuery mode, whatever back-end database you're using.

6 thoughts on "Power BI DirectQuery Best Practices For Snowflake And Other Databases"

  1. Great article! To illustrate the point on “think very carefully about why you need to use DirectQuery and not Import mode”… I’m kicking the tires on a new Snowflake database. Thinking that a table was small, I switched it to import to see if I could speed up a report. 90 minutes and 145 million records later, the table loaded. I’m shocked that it worked and there’s no way I’m keeping a 3 gigabyte Power BI Desktop file, but it opens the door to some interesting options with incremental refresh and Power BI Premium if we need the best possible performance. For now we are switching back to DirectQuery and we know why!

    1. Chris Webb (Fabric CAT team at Microsoft; blogs about Power BI, Power Query, SQL Server Analysis Services, Azure Analysis Services and Excel) says:

      145 million rows is nothing for Import mode – the upper limit for Import mode is a few billion rows. You would not want to load all that data into Desktop though, you would do something like this: https://blog.crossjoin.co.uk/2020/05/24/limit-the-amount-of-data-you-work-with-in-power-bi-desktop-using-parameters-and-deployment-pipelines/
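To make the approach in that linked post concrete, here is a minimal Power Query M sketch of limiting rows with a parameter so Desktop only loads a sample while the Service loads everything. The `RowLimit` parameter, the `SALES_DB`/`PUBLIC`/`FACTSALES` object names, and the Snowflake account and warehouse names are all hypothetical placeholders, not from the original post:

```m
// RowLimit is a Power Query parameter: set it to a small value (e.g. 1000)
// while developing in Power BI Desktop, and to 0 (meaning "no limit") in
// the Service, for example via deployment pipeline rules.
let
    // Connect to a hypothetical Snowflake account and warehouse
    Source = Snowflake.Databases("myaccount.snowflakecomputing.com", "MY_WAREHOUSE"),
    DB = Source{[Name = "SALES_DB"]}[Data],
    Schema = DB{[Name = "PUBLIC"]}[Data],
    FactSales = Schema{[Name = "FACTSALES"]}[Data],
    // Apply the row limit only when RowLimit is greater than zero
    Result = if RowLimit > 0 then Table.FirstN(FactSales, RowLimit) else FactSales
in
    Result
```

The exact navigation steps depend on how the connector exposes your databases and schemas, but the pattern is the same: branch on the parameter at the end of the query so the full table is only materialised where you actually need it.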

  2. Hey Chris, I'm a long-time lurker of your awesome articles.

    I’m curious if you’ve done anything with Progress OpenEdge Databases and if you have any suggestions/tips on composite modelling with them (I know it isn’t supported out of the box) or best practices.

    Progress’s own DirectQuery connector leaves much to be desired. I do have ODBC access (direct SQL queries work great, folded… not so much). Thanks again for the time you might give to my question, and thanks again for your awesome content!

    1. Chris Webb says:

      I have never seen or used Progress OpenEdge databases, sorry.
