Since March 2026, Power BI semantic models have started showing warnings in their Refresh History in the Service. This has alarmed a few people, but all that is actually happening is that errors which were there all along, and which don't prevent refreshes from completing, are now being flagged. Documentation on this feature can be found here, but let's look at an example of the kind of error that can cause these warnings.
Consider the following semantic model. It consists of a calculated table called Table With Error and a physical table called Sales, which has two physical columns called Product and Sales, two calculated columns called Sales Forecast and VAT Forecast, and two measures called Sales Amount and Tax Amount.

Here are the definitions of the calculated columns:
Sales Forecast = 'Sales'[Sales] * 1.1
VAT Forecast = 'Sales'[VAT] * 1.1
Here are the definitions of the measures:
Sales Amount = SUM('Sales'[Sales])
Tax Amount = SUM(Sales[Tax])
And here is the definition of the calculated table:
Table With Error = FILTER('TableThatDoesNotExist', 'TableThatDoesNotExist'[ColumnThatDoesNotExist]>1)
There are some problems here: the VAT Forecast calculated column, the Tax Amount measure and the Table With Error calculated table all return errors because they refer to tables or columns that do not exist. You can see these errors in Power BI Desktop easily, for example in the Data pane where these items have warning triangles next to them:
…or if you look at their definitions:
None of these errors stop you from refreshing or publishing but of course you can’t use any of these items in your reports.
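To make the fixes concrete, here is a sketch of what repaired definitions might look like. Note that the original model has no VAT or Tax columns at all, so the 20% rate and the Sales-based logic below are purely illustrative assumptions, not something taken from the model:

```dax
-- Illustrative fixes only: assumes the intent was to derive VAT and Tax
-- from the existing Sales column at a hypothetical 20% rate
VAT Forecast = ( 'Sales'[Sales] * 0.2 ) * 1.1
Tax Amount = SUMX ( 'Sales', 'Sales'[Sales] * 0.2 )

-- And pointing the calculated table at a table that actually exists
Table With Error = FILTER ( 'Sales', 'Sales'[Sales] > 1 )
```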
If you do publish and refresh this semantic model via the UI (this does not happen if you refresh via the XMLA Endpoint), you'll see the message "Refresh completed with warnings":
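For context, refreshing via the XMLA Endpoint, where the warning does not appear, typically means sending a TMSL refresh command, for example from SSMS or a script. A minimal sketch, with the database name as a placeholder:

```
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "MySemanticModel" }
    ]
  }
}
```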
If you click the Show link in the Details column and then the Show link in the yellow box that appears, you’ll see a dialog showing the errors for all the broken items:
If you see warnings like this you should probably go and either fix the items that are causing them or delete them. Errors like this often happen when you delete items in your semantic model that other measures, calculated columns or calculated tables depend on; plenty of other scenarios cause similar errors too.
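If you want to find the broken objects programmatically rather than clicking through the dialog, one option is a query like the following, run from DAX query view or via the XMLA Endpoint. This is a sketch that assumes your model supports the DAX INFO functions, which mirror the TMSCHEMA DMVs, and that the ErrorMessage column is populated for broken objects:

```dax
// List measures whose definitions produce an error
// (assumes INFO.MEASURES() exposes a populated ErrorMessage column)
EVALUATE
FILTER (
    INFO.MEASURES (),
    [ErrorMessage] <> ""
)

// The same idea works for calculated columns
EVALUATE
FILTER (
    INFO.COLUMNS (),
    [ErrorMessage] <> ""
)
```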
That is a welcome addition, indeed.
However, when it comes to PBI and errors, I have three little pet peeves. First, for years I have been seeing misleading messages. A weird one I saw recently is a read-rights error: an identical PQ in a semantic model and in a Dataflow Gen 1 returned very different errors. I helped a user recently and, looking at the *dataset* (from Log Analytics or from the Workspace), the error was a generic Gateway Mashup error. The dataflow, on the other hand, immediately and clearly reported that the NPA in the Gateway connection was missing read rights on the source table. If those errors could be aligned, that would help PBI admins/CoEs a lot.
My second pet peeve is that errors often surface downstream, where they first 'manifest': if T1 is the source of T2 and T2 is the source of T3, an error in T1 may only show up in T3, especially if T1 and T2 are not loaded. The error message also isn't very helpful, at least for more junior analysts. I believe lazy evaluation is to blame, and I sort of understand why it happens, but it's annoying nonetheless.
The third is PQ compatibility: bad PQ that PBI Desktop accepts may be rejected in a dataflow, since ADLS Gen2 has stricter requirements than PBI Desktop (think of structured columns, for example). But I hope the update you discuss is part of a broader effort to make troubleshooting easier.