Continuing my series on Power BI memory errors (see part 1, part 2 and part 3), in this post I’ll look at the query memory limit, which controls the amount of memory that an individual query can consume. This is a subject I’ve talked about in a few blog posts in the past: hitting this limit is one of the reasons you’ll see the “This visual has exceeded the available resources” error which I blogged about here and here. There’s also some official documentation here which is pretty good.
As I mentioned in part 1 of this series, when you run a DAX query against a Power BI semantic model it will use memory. How much memory it uses will depend on your data, how you have modelled it, what data your query is requesting and how you have written any DAX calculations used by your query. For obvious reasons we at Microsoft can’t let queries use an infinite amount of memory in the Power BI Service so there is a limit – the query memory limit – on how much memory an individual query can use. This limit varies depending on whether you are using Shared (also known as Power BI Pro) or Premium/Fabric capacity, and by the size/SKU of the capacity you’re using. The different limits for each SKU are documented here.
While you can’t increase the query memory limit over the published thresholds, capacity admins can reduce the query memory limit in the admin portal by setting the “Query Memory Limit %” property on the capacity.

The default value of this property is 0, which means the maximum value for the capacity SKU you’re using will be applied; any other value will be interpreted as a percentage of that maximum value. Reducing this value can be a good way to stop your developers from using inefficient DAX in their reports and consuming too many resources on your capacity – it won’t directly reduce a query’s CU usage but DAX expressions that are very memory-intensive often use a lot of CUs too.
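To make the way this setting is interpreted a bit more concrete, here’s a minimal sketch in Python (my own illustration of the rule described above, not anything built into Power BI; the 10GB figure is just a placeholder, not the documented maximum for any particular SKU):

def effective_query_memory_limit_gb(sku_maximum_gb: float, query_memory_limit_pct: int) -> float:
    # 0 (the default) means the maximum for the SKU applies;
    # any other value is treated as a percentage of that maximum
    if query_memory_limit_pct == 0:
        return sku_maximum_gb
    return sku_maximum_gb * query_memory_limit_pct / 100

print(effective_query_memory_limit_gb(10, 0))   # 10.0 - default, the SKU maximum applies
print(effective_query_memory_limit_gb(10, 50))  # 5.0  - each query capped at half the maximum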
If the DAX query generated by a visual in a Power BI report hits the query memory limit you’ll see an error in your report. The first reaction many Power BI developers have when they hit the limit is to ask if there is a way to increase it (which you can only do by buying a larger capacity), but to be brutally honest, if you are hitting the query memory limit you have done something wrong and you need to fix your model, your DAX calculations or your report to reduce memory usage. Doing this is likely to improve report performance too. If you’re encountering the error in Power BI Desktop you can raise the limit there, but that only affects Desktop, so you may end up getting the error again after you publish.
Here’s the error you’ll see in Power BI Desktop if you hit the query memory limit:


This visual has exceeded the available resources. Try filtering to decrease the amount of data displayed.
If you hit the query memory limit in a published report you’ll see this error message:


In this case you not only get the “This visual has exceeded the available resources” error, but going to the details shows a more helpful message that tells you what the current query memory limit is:
Resource Governing: This query uses more memory than the configured limit. The query — or calculations referenced by it — might be too memory-intensive to run. Either simplify the query or its calculations, or if using Power BI Premium, you may reach out to your capacity administrator to see if they can increase the per-query memory limit. More details: consumed memory 1 MB, memory limit 1 MB. See https://go.microsoft.com/fwlink/?linkid=2159752 to learn more.
The error number associated with this error is 0xC13E0004 (-1052901372).
How do you know how much memory your query is using, so you can check how close you are to the query memory limit? The new Execution Metrics event in Profiler and Log Analytics will tell you. If you find the Execution Metrics event associated with the Query End event for the query (something I discussed here), the approximatePeakMemConsumptionKB metric gives the peak memory usage reached by the query while it was running, and it is this value that must not exceed the query memory limit.

{
  "timeStart": "2024-06-21T16:24:16.608Z",
  "timeEnd": "2024-06-21T16:24:18.326Z",
  "durationMs": 1719,
  "vertipaqJobCpuTimeMs": 1078,
  "queryProcessingCpuTimeMs": 531,
  "totalCpuTimeMs": 1609,
  "executionDelayMs": 0,
  "approximatePeakMemConsumptionKB": 46662,
  "commandType": 27,
  "queryDialect": 3,
  "queryResultRows": 502
}
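To show how you might use this metric, here’s a minimal Python sketch (my own illustration, not an official API; the 10GB limit used here is just a placeholder, not the documented value for any particular SKU) that takes the payload above and works out how close the query came to a given query memory limit:

import json

# JSON payload from an Execution Metrics event, trimmed to the relevant metric
event_text = '{ "approximatePeakMemConsumptionKB": 46662 }'

# Placeholder limit in GB - substitute the documented query memory limit for your SKU,
# or the value implied by your capacity's Query Memory Limit % setting
query_memory_limit_gb = 10

peak_kb = json.loads(event_text)["approximatePeakMemConsumptionKB"]
peak_gb = peak_kb / (1024 * 1024)
print(f"Peak memory used by the query: {peak_kb / 1024:.1f} MB "
      f"({peak_gb / query_memory_limit_gb:.2%} of a {query_memory_limit_gb} GB limit)")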
If you can’t use Profiler or Log Analytics there’s no direct way of knowing how much memory your query is using. You can, however, set a custom memory limit in Power BI Desktop and keep changing it until you do, or don’t, see the query memory limit error, and in that way estimate how much memory your query is using.
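This trial-and-error approach is essentially a binary search. Here’s a minimal Python sketch of the idea, purely as an illustration: query_errors_at is a hypothetical stand-in for the manual step of setting the custom limit in Desktop, refreshing the visual and noting whether the error appears.

def estimate_query_memory_mb(query_errors_at, low_mb=0, high_mb=10240, tolerance_mb=64):
    # query_errors_at(limit_mb) should return True if the visual errors when the
    # custom memory limit is set to limit_mb (a manual step in Power BI Desktop)
    while high_mb - low_mb > tolerance_mb:
        mid_mb = (low_mb + high_mb) // 2
        if query_errors_at(mid_mb):
            low_mb = mid_mb   # the query needs more than mid_mb
        else:
            high_mb = mid_mb  # the query fits within mid_mb
    return high_mb            # an upper-bound estimate of the query's memory usage

# Example: pretend the query actually needs around 1500 MB
print(estimate_query_memory_mb(lambda limit_mb: limit_mb < 1500))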
How can you reduce the amount of memory that your queries use? That’s a question that’s too big to answer in any one blog post and this one is already long enough. In my next few posts I will look at some common causes of excessive memory usage and what you can do to fix them.
Was the governor bug fixed? If not, and you set a lower memory limit and someone tries to export data from a visual that has poor DAX feeding it, it can tank the capacity. The governor will correctly end the query when the limit is hit, but the service will keep resubmitting the request over and over again until the 6(?) hour query timeout kicks in.
If the user tries to do multiple exports you very quickly hit 200-300% capacity utilisation. Even with the limit at default it can happen, but you’re much less likely to encounter DAX quite *that* bad.
I don’t know, sorry. There is no query governor in Power BI; this sounds like a problem where retrying after an error has some unexpected consequences. If you found this problem did you open a support case for it?
Hi,
Thank you for writing these insightful blog posts 😀
I have a question related to part 1 of this blog series:
“Both Import mode and Direct Lake models can page data in and out of memory as required, so the whole model may not be in memory at any given time.”
Does this apply to both Small Semantic Model Storage format and Large Semantic Model Storage format, or only the Large Semantic Model Storage format?
Best regards, Frithjof
Hi Chris, could you please point me to any articles that you wrote on 1. Import models in Power BI Desktop vs 2. a semantic model (Import mode) published to a Premium workspace and consumed via DirectQuery (previously Live connection). Which is better in terms of performance etc.?
That’s an “it depends” question and I haven’t written any blog posts on it. But Live connections are not the same thing as creating a composite model and using DirectQuery to a published semantic model.
Thanks for the clarifications. The customer wants one centralized semantic model for all reporting (semantic model developer vs Power BI report creators). The fact table is huge. The choice is between replicating the same model in each Power BI Desktop file (pbix) vs having a lightweight pbix with just the visuals… (no DirectQuery to the Azure source). All tables are imported. But in terms of performance implications I am not finding many articles (best practices and experiences), hence the question.
Using one central semantic model and building reports with Live connections seems like the best choice here. There won’t be any report performance implications for using Live connections.