When a user complains that their Power BI report is slow, it’s normal for the developer to open up DAX Studio and start tuning the measures in the semantic model, because 95% of the time the problem is the DAX. But there’s a whole class of report performance problems that have nothing to do with the semantic model or DAX, and they are much easier to troubleshoot now that Performance Analyzer is available in the browser as well as in Desktop. Today I want to look at one of them: geocoding in map visuals.
What is geocoding? Let me show you an example. I have a semantic model with data about UK real estate sales, where each row in the fact table represents the sale of some kind of property, such as a house or apartment. The model contains the address of each property sold, and the address includes a postcode (similar to a US zip code). I added an Azure Map visual to a report, dragged my Postcode data into the Location well of the visual, and got this:

Postcodes are just text values, so how does Power BI know where each postcode is on a map? It does so by geocoding: sending each postcode to a web service that returns a latitude and longitude, which allows its location to be plotted. And if, as in this case, you have thousands of postcodes to geocode, this process can be slow.
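As a rough illustration (this is not Power BI’s actual implementation), geocoding amounts to a per-postcode lookup against a web service, which is why thousands of distinct postcodes mean thousands of slow calls. In this Python sketch the service is stubbed with a dictionary of made-up coordinates, and a simple cache shows why repeated lookups of the same postcode need only hit the service once – the idea behind the caching described below:

```python
# Illustrative sketch only -- the "web service" is stubbed with a
# dictionary of made-up coordinates, not a real geocoding API.
FAKE_SERVICE = {
    "SW1A 1AA": (51.501, -0.142),  # hypothetical lat/long values
    "M1 1AE": (53.481, -2.237),
}

calls = {"count": 0}  # track how often the slow lookup actually runs

def geocode_uncached(postcode):
    """Simulates a slow call to a geocoding web service."""
    calls["count"] += 1
    return FAKE_SERVICE.get(postcode)

cache = {}

def geocode(postcode):
    """Return cached coordinates, calling the service only on a cache miss."""
    if postcode not in cache:
        cache[postcode] = geocode_uncached(postcode)
    return cache[postcode]

# Two lookups of the same postcode hit the stub service only once.
geocode("SW1A 1AA")
geocode("SW1A 1AA")
print(calls["count"])  # 1
```

With no cache (or with “Bypass geocoding cache” turned on) every postcode on every refresh would cost a round trip to the service.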
Because geocoding can be slow, Power BI caches geocoded locations, but you can turn this caching off on the Diagnostics tab of the Options dialog using the “Bypass geocoding cache” option:

I did this to get worst-case performance, refreshed my Azure Map visual with Performance Analyzer running, then exported the output of Performance Analyzer and visualised it with my Performance Analyzer custom visual. Here’s what it showed:

What does this show us?
- The visual as a whole took 107 seconds to display, as shown by the duration of the Visual Container Lifecycle event
- The underlying DAX query (which returns 30,002 rows – not all the postcodes, but the limit of what can be displayed) took a fraction of a second to return
- The visual took 6 seconds to render
- The geocoding took 106 seconds
So this is a great example of a report that is slow where the model and its measures are not the cause, and where Performance Analyzer is the only way to see what the actual cause is.
What can you do to improve performance? Well, as I said, with caching enabled (as it is by default) performance would be nowhere near this bad, but there are still some things you can do. First of all, the visual clearly displays more data than any user can make sense of – and Power BI is sampling the data in this case too – so redesigning the report so the user only sees a usable, comprehensible subset of the data on the map would be a good start. You could also geocode the data yourself during your ETL rather than waiting for Power BI to do it when it displays the report – postcodes are not going to change location – which would give you latitudes and longitudes you can pass to the Azure Map visual instead. Finally, it’s probably worth experimenting with different visuals, including other map custom visuals, to see if they perform differently for your requirements.
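If you do decide to geocode during ETL, the idea can be sketched like this in Python. The lookup here is stubbed with a dictionary of made-up coordinates; in practice you would call a geocoding service once per distinct postcode and persist the results alongside your fact data, so the map visual never has to geocode at view time:

```python
# ETL-time geocoding sketch. POSTCODE_COORDS stands in for a one-off
# batch call to a real geocoding service; the coordinates are invented.
POSTCODE_COORDS = {
    "SW1A 1AA": (51.501, -0.142),
    "M1 1AE": (53.481, -2.237),
}

def add_coordinates(rows):
    """Enrich each sale row with Latitude/Longitude columns from the
    lookup, leaving None for any postcode the service couldn't resolve."""
    for row in rows:
        lat_long = POSTCODE_COORDS.get(row["Postcode"])
        row["Latitude"], row["Longitude"] = lat_long if lat_long else (None, None)
    return rows

sales = [
    {"Postcode": "SW1A 1AA", "Price": 500000},
    {"Postcode": "M1 1AE", "Price": 250000},
]
enriched = add_coordinates(sales)
print(enriched[0]["Latitude"])  # 51.501
```

The resulting Latitude and Longitude columns can then be mapped to the corresponding wells of the Azure Map visual instead of the Postcode column.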