Web.Contents(), M Functions And Dataset Refresh Errors In Power BI

One slightly frustrating feature of Power BI is that some of the cool stuff you can do in M code for loading data, and which works in Power BI Desktop (and in Power Query), causes errors when you try to refresh your dataset after it has been published to PowerBI.com. I recently learned some interesting tricks for working around these problems when you are using Web.Contents() and M custom functions, which I thought deserved a wider audience and which are the subject of this post; thanks are due to Curt Hagenlocher of Microsoft and Idan Cohen for sharing this information.

First of all, I recommend you read my previous post on using the RelativePath and Query options with Web.Contents() to get some background, not just on the M functionality I’ll be using but also on the web service I’ll be using in my examples.
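
To recap briefly: those two options let you split a url into a static base plus dynamic parts. As a minimal sketch (the endpoint shown here is just for illustration), this call:

    Web.Contents(
        "https://data.gov.uk/api/3/action",
        [
            RelativePath = "package_search",
            Query = [q = "apples"]
        ]
    )

…requests the url https://data.gov.uk/api/3/action/package_search?q=apples, while the value passed to the first parameter stays a static string.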

Let’s look at an example of where the problem occurs. The following M query uses a function to call the UK government’s open data metadata search API multiple times and then return a result into a table:

    let
        Terms = 
            #table(
                {"Term"},
                {{"apples"}, {"oranges"}, {"pears"}}
            ),
        SearchSuccessful = (Term) => 
            let
                Source = 
                    Json.Document(
                        Web.Contents(
                            "https://data.gov.uk/api/3/action/package_search?q="
                            & Term
                        )
                    ),
                Success = Source[success]
            in
                Success,
        Output = 
            Table.AddColumn(
                Terms, 
                "Search Successful", 
                each SearchSuccessful([Term])
            )
    in
        Output

Here’s the output:


This is just a variation on the widely-used M pattern for using functions to iterate over and combine data from multiple data sources; Matt Masson has a good blog describing this pattern here. In this case I’m doing the following:

  • Defining a table using #table() with three rows containing three search terms.
  • Defining a function that calls the metadata API. It takes one parameter, a search term, and returns a value indicating whether the search was successful or not from the JSON document returned. What the API actually returns isn’t relevant here, though, just the fact that I’m calling it. Note the highlighted lines in the code above that show how I’m constructing the URL passed to Web.Contents() by simply concatenating the base URL with the string passed in via the custom function’s Term parameter.
  • Adding a custom column to the table returned by the first step, and calling the function defined in the second step using the search term given in each row.

This query refreshes with no problems in Power BI Desktop. However, when you publish a report that uses this code to PowerBI.com and try to refresh the dataset, you’ll see that refresh fails and returns a rather unhelpful error message:

Data source error Unable to refresh the model (id=1264553) because it references an unsupported data source.



The problem is that when a published dataset is refreshed, Power BI does some static analysis on the code to determine what the data sources for the dataset are and whether the supplied credentials are correct. Unfortunately in some cases, such as when the definition of a data source depends on the parameters from a custom M function, that static analysis fails and therefore the dataset does not refresh.

The good news is that when, as in this case, the data source is a call to Web.Contents() then Power BI only checks the base url passed into the first parameter during static analysis – and as my previous blog post shows, by using the RelativePath and Query options with Web.Contents() you can leave the value passed to the first parameter as a static string. Therefore, the following version of the query does refresh successfully in Power BI:

    let
        Terms = 
            #table(
                {"Term"},
                {{"apples"}, {"oranges"}, {"pears"}}
            ),
        SearchSuccessful = (Term) => 
            let
                Source = 
                    Json.Document(
                        Web.Contents(
                            "https://data.gov.uk/api/3/action/package_search",
                            [Query=[q=Term]]
                        )
                    ),
                Success = Source[success]
            in
                Success,
        Output = 
            Table.AddColumn(
                Terms, 
                "Search Successful", 
                each SearchSuccessful([Term])
            )
    in
        Output

This technique will only work if the url passed to the first parameter of Web.Contents() is valid in itself, is accessible and does not return an error. But what if it isn’t? Luckily there’s another trick you can play: when you specify the Query option it can override parts of the url supplied in the first parameter. For example, take the following expression:

    Web.Contents(
        "https://data.gov.uk/api/3/action/package_search?q=apples",
        [Query=[q="oranges"]]
    )
When static analysis is carried out before dataset refresh, the url

    https://data.gov.uk/api/3/action/package_search?q=apples
…is evaluated. However when the dataset is actually refreshed, the search term in the Query option overrides the search term in the base url, so the call to the web service that is actually made, and whose data is used by the query, is:

    https://data.gov.uk/api/3/action/package_search?q=oranges
This means you can specify a base url that isn’t the one you really want to call, purely so that static analysis succeeds, and then use the Query option to construct the url you actually want to use.
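
Putting the two tricks together, the custom function from the earlier example can keep a completely static, valid url for static analysis to check, while the real search term arrives at refresh time via the Query option. A sketch along those lines:

    SearchSuccessful = (Term) =>
        let
            Source =
                Json.Document(
                    Web.Contents(
                        //static, valid url that static analysis tests
                        "https://data.gov.uk/api/3/action/package_search?q=apples",
                        //at refresh time, Term overrides the q=apples above
                        [Query = [q = Term]]
                    )
                )
        in
            Source[success]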

Of course this is all a bit of a hack and I’m sure, eventually, we’ll get to the point where any M code that works in Power BI Desktop and/or Power Query works in a published report. However it doesn’t sound as though this will be happening in the near future so it’s good to know how to work around this problem. I wonder whether there are other, similar tricks you can play with functions that access data sources apart from Web.Contents()? I need to do some testing…

UPDATE: The “Skip Test Connection” option on Power BI data sources, added in April 2019, solves some of the problems you run into where you can’t use RelativePath or Query to construct the url. See https://blog.crossjoin.co.uk/2019/04/25/skip-test-connection-power-bi-refresh-failures/

153 responses

  1. Really interesting post, as always Chris. I am actually struggling with apparently the same problem, in this case with the AzureEnterprise.Contents() function. If you want more details in order to test it, just mail me. Regards!

    • As far as I can see, I don’t think it’s possible to use this technique with AzureEnterprise.Contents() but I haven’t tested it and I may be wrong.

      • Hi Chris,

        How can i make this work? I want to refresh my data in power bi services without gateway.

        let
            Source = Web.BrowserContents("https://gist.github.com/tadast/8827699"),
            #"Extracted Table From Html" = Html.Table(Source, {{"Column1", "TABLE.js-csv-data.csv-data.js-file-line-container > * > TR > :nth-child(1)"}, {"Column2", "TABLE.js-csv-data.csv-data.js-file-line-container > * > TR > :nth-child(2)"}, {"Column3", "TABLE.js-csv-data.csv-data.js-file-line-container > * > TR > :nth-child(3)"}, {"Column4", "TABLE.js-csv-data.csv-data.js-file-line-container > * > TR > :nth-child(4)"}, {"Column5", "TABLE.js-csv-data.csv-data.js-file-line-container > * > TR > :nth-child(5)"}, {"Column6", "TABLE.js-csv-data.csv-data.js-file-line-container > * > TR > :nth-child(6)"}, {"Column7", "TABLE.js-csv-data.csv-data.js-file-line-container > * > TR > :nth-child(7)"}}, [RowSelector="TABLE.js-csv-data.csv-data.js-file-line-container > * > TR"]),
            #"Promoted Headers" = Table.PromoteHeaders(#"Extracted Table From Html", [PromoteAllScalars=true]),
            #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"", type text}, {"Country", type text}, {"Alpha-2 code", type text}, {"Alpha-3 code", type text}, {"Numeric code", Int64.Type}, {"Latitude (average)", type number}, {"Longitude (average)", type number}}),
            #"Removed Columns" = Table.RemoveColumns(#"Changed Type",{""})
        in
            #"Removed Columns"

  2. I recently stumbled across this problem when deploying a PowerQuery to our SharePoint site to run with Excel Services. I’m going to try this technique instead of passing a static url with the query embedded. It might work! Thanks for your posts!

  3. I stumbled upon this technique myself; the RelativePath and Query arguments are well hidden in the documentation, for sure.

    Iterating over many requests can help with paged API requests for large datasets; just remember Power BI is asynchronous, so the data may not come back in order.

    Merging two data sources can still be a problem on refresh. I get around this by having one query pull in the original data, then starting a second table with:

    let mergedTable = OriginalTable

    You can now merge as many tables from different API sources together without the refresh complaining.

  4. Fun fact: I added RelativePath as an option to Web.Contents solely to provide a workaround for this problem.

  5. Pingback: Dashboards, M, NFL and more... - Roundup #58 | Guy in a Cube

  6. Hello Chris, thank you for this amazing post: it’s been very helpful because I am stuck in this precise scenario (with a slight twist).
    I am trying to parametrize the access query to files inside a specific OneDrive folder.

    I can get the list of all files inside my OneDrive easily with
    src = SharePoint.Files(my_organization_url, [ApiVersion = 15])
    and then get my folder
    filtered = Table.SelectRows(src, each ([Folder Path] = "my_folder_path"))
    I then select two columns, "Folder Path" and "Name"
    select_cols = Table.SelectColumns(filtered,{"Folder Path", "Name"})
    and merge them to get my pathname list
    file_path_names = Table.CombineColumns(select_cols,{"Folder Path", "Name"},Combiner.CombineTextByDelimiter("", QuoteStyle.None),"filePathName")

    At this point, I am ready to apply my function fxExtract to [filePathName]:
    fxExtract = (filepathname as text) as table =>
    data = Excel.Workbook(Web.Contents(filepathname), null, true)
    by adding a custom column like this:
    extract_data = Table.AddColumn(file_path_names, "wb_data", each fxExtract([filePathName]))

    After reading your post everybody knows that this will work in Power BI Desktop but it will not support refresh in Power BI Service (I also verified it by myself).

    So i tried to follow your suggestion, decomposing the Web.Contents part but I encountered two blocking problems.

    You said that, in order for it to succeed, the first part of the URL must be a valid URL.
    This means I cannot use any “folder” URLs (by omitting the file name in the URL), because then authentication will fail and this in turn will make Web.Contents fail as well.

    (P.S.: I have tried with several paths in my OneDrive and it seems Power BI will not let you connect to them. If you use Web as a source, it will only accept full file path names, not folder path names.)

    Do you have any idea, in light of this, on how to setup the base URL in this case?

    SECOND PROBLEM – Parametrization using Query option
    Even if I figure out the first problem, I would still be stuck on how to pass the file names to the base URL by using the Query option.
    Let’s consider your example for clarity’s sake:
    This is obviously wrong in my case, because ‘q’ has been specified as a parameter, so the resulting URL will be:
    while I am looking for the following form:

    I actually can’t find any documentation on how to use the optional Query record, but I have a feeling this must be possible to accomplish.
    I would try with [Query=filename] but I first need to solve the first problem.

    Sorry for the lengthy comment and thanks for reading, any help would be greatly appreciated.
    I believe this is an interesting technique that could be useful to many people.

    P.P.S.: I also started to approach the same problem through Microsoft Graph API calls, but I am not sure it’s the way to go because of the ApiKey. I think it might be more of a hassle than a real solution. Nonetheless I actually got to list the files in the folder, but I still need to get the binaries:

    • Hmm, this looks like a difficult problem. I have no idea what the answer to either of these questions could be – I would need to spend quite a lot of time testing to try to find out. When I have some free time I’ll try to take a look but it will be some time before I’ll be able to do this I’m afraid.

  7. Hi Chris,
    thanks for your reply.
    It is indeed a tricky scenario, and I posted it to you almost exclusively for informative reasons.
    I would dig deeper myself, but I cannot find any additional reference other than this:

    In this MSDN reference there is no focus on parameters and I really don’t know what I am touching there.
    By the way, I tried with [Query=filename] and it doesn’t work (expected I would say).

    I would be more than happy if you could point me towards a more thorough reference or documentation, and if there isn’t any I’ll write directly to Microsoft.

    Thanks for your time,
    If I make progress I’ll report back here; this technique could be quite useful.

    • I can’t resist a challenge, and so even though there are thousands of more important things I should be doing I just spent several hours solving this problem 🙂 I will write up the solution in a day or two in a separate blog post.

      • Chris, this is amazing!
        Your reply really made my day, and I can’t wait to read your take on this issue.

        If this technique proves to be feasible, Power BI users can basically use an arbitrary OneDrive folder as a data source that supports scheduled refresh: this would be simply awesome (the M language is awesome!)

        Thanks again for the kind interest you showed for this issue.


      • Here are a few hints while you wait… Don’t try using Web.Contents(), I don’t think there’s a way it can be made to work here. Instead, because SharePoint.Files() returns a link to the content of each file (in the Content column), you just need to create a function that takes a binary, turns it into an Excel workbook and extracts the table/range/sheet you need. Then call that function in a calculated column on the table returned by SharePoint.Files(), passing the Content column to your function, and finally expand the resulting column to get at the data.

      • Hello Chris,
        following your hints I was able to develop the solution on my own with two Functions and a Query.
        The dataset now offers the option to schedule updates, this is absolutely fantastic.

        I won’t spoil anything for your readers while we wait for the post, but I have already noted down a couple of observations to make.
        I will wait for the post to appear and use that comment section.
        I will wait for the post to appear and use that comment section.

        As always, I want you to know that your work is greatly appreciated.
        Thank you very much for the help!

      • This looks like it’s the answer to my problem too.
        I’ve had a look through your posts for this but I can’t see if it ever got written up.

        I’m having this exact problem at the moment, when using Power BI desktop I’m able to pull all the relevant excel files from sharepoint but I can’t schedule refreshes when it’s published.
        It just gives me the error “Query Contains Unsupported Function. Function Name: Sharepoint.Files”

        I’ll try to implement a solution based on your additional helper comment below.

  8. Chris,
    Thanks for this excellent post – I am trying it now to get round a similar problem in PowerBI.

    In my case I have defined a separate function which I call from another table.
    I have also been using one of your other posts on monitoring performance to see how long the API calls take to run as there is some concern over the impact of my API calls on the server.
    What I’m a bit puzzled by is why PowerBI seems to call the API twice when the function is called. I have included a fragment of the trace analysis below.
    I wasn’t sure if it was because of the way I was invoking the function or how Web.Content retrieves the data.

    I was wondering if you had ever encountered something similar?

    Many thanks again for your great post and apologies if the question is out of place.

    Date Command Request Duration
    2016-09-15T10:22:28.8108993 Engine/IO/Web/Request/GetResponse RequestUri:https://some url 00:00:16.4505802
    2016-09-15T10:22:14.8284810 Engine/IO/Web/Request/GetResponse RequestUri:https://some url 00:00:13.9823147

  9. Pingback: Loading Data From Multiple Excel Workbooks Into Power BI–And Making Sure Data Refresh Works After Publishing – Chris Webb's BI Blog

  10. Wondering how this can be leveraged for a function that dynamically iterates over Facebook post ids. For example, in the code below, how do we pass PostID with the Query option like you showed in your blog?

    Source = Json.Document(Web.Contents("https://graph.facebook.com/v2.7/"
    & PostID
    & AccessToken)),

    • Try as below :
      Var = AccessToken,
      & PostID

      • Thanks @Sachin, but it didn’t work. I think, as Chris mentioned, it tries to evaluate the URL as-is and fails because it won’t be a valid URL with the “& PostID” in it. Not sure though, but it didn’t work.

      • I am wondering if anyone has managed to refresh Facebook data on the Power BI service by any chance.

  11. Thank you Chris. I was puzzled by this error while working for a customer. The error message thrown by Power BI online was not at all helpful. Thanks for posting this lovely tip.

  12. Hi guys, same issue here….

    I am pulling paginated rows from an API (roughly 500 pages, 50 rows each) via Web.Contents(). The call also needs to contain an access token.

    I wonder if someone has already figured out a workaround which could be implemented by a non-programmer?

    Basically, my steps are:
    1. I call API to see how many pages of data there are.
    2. I get the number.
    3. Based on the number, I create the exact number of the “paginated URLs”, I attach access token and I call all those URLs.
    4. I wait.
    5. Voila, my data is in Excel.

    This obviously works in Excel or PBI Desktop but does not work in PBI Online. I thought I was going to have a heart attack. I spent so many hours on the report only to learn that this kind of “data sourcing” is not supported in PBI Online. Terrible disappointment.

    Thx for any help!

    • It’s hard to say without knowing a lot more about your API, but there’s no reason why this should not be possible if you can use Web.Contents() in the way I describe in this post.

      • Does not seem to work… I basically took your sample query, made some very small changes, converted it to my scenario, published to Power BI and still got the same error… the data source cannot be refreshed. I wonder if the token part is causing it.

        The code is:
        Terms =
        {{"1"}, {"2"}, {"3"}}
        ApiCall = (Term) =>
        Source =
        & Term
        &"&fields=company_name,keywords,landing_page_url,gclid", [Headers=[Authorization="Token token=461817fcf20f88eae7c08028skfkshfhYYZSAMPLESAMPLE"]]


        #"XX – Call Pages" =
        "Search Successful",
        each ApiCall([Term])
        #"Expanded Search Successful" = Table.ExpandRecordColumn(#"XX – Call Pages", "Search Successful", {"page", "per_page", "total_pages", "total_records", "calls"}, {"page", "per_page", "total_pages", "total_records", "calls"})
        #"Expanded Search Successful"

      • Yes, this is a known issue: only the url in the first parameter is evaluated during the tests, so authentication will fail because it doesn’t pass the token in the header.

    • Hey Daniel – wanted to touch base as I started working with CallRail today as well if you were ever able to find a viable solution. Considering we’re both using v2 – I’m going to say that’s a big “NO” – but if you had any weird workarounds I would certainly be open to hearing about them if you did get it up and running.

      • Hi Alex,
        I do have a solution but it’s relatively major workaround… I pull the data from CallRail to Big Query and then from Big Query to PBI.

        Obviously, CallRail=>Big Query involved some development work outside PBI.

        I was also using a Google Apps script to pull data to Google Sheets before this solution, but I decided to go for BigQuery as storage, which is more stable (I don’t remember the exact reasons anymore, but Google Sheets were not reliable).


  13. Hey Chris,

    I have a table in Power BI that contains a single column with ids I get from a previous API call. I’ve then added a custom column that makes a separate call for each row while appending the id to the end of the path. This worked in Desktop, but gave me the headache you’ve addressed in this post. So, now I have moved the id in my function from the static path to the RelativePath option in my Web.Contents() call, and though my dataset will now refresh, the credentials cannot be verified without an id at the end of the base URL. This is similar to your example for replacing a Query, but it’s my understanding that RelativePath always gets appended to the end. Is that true, and do you know of a workaround?


    • RelativePath always gets appended to the end, yes. It sounds like you have hit one of those scenarios where data refresh can’t be made to work, unfortunately – something I’ve run into rather too often myself.

  14. Thanks for sharing this technique. I’m trying to use a table of parameters rather than a hardcoded table and I can’t seem to get it to work in Power BI Desktop without doing the “Ignore the Privacy Levels” workaround. I want the query to be automatically updated in the Power BI Service, so I can’t use that workaround. I’m using a public API for Web.Contents() and the table of parameters is a file on SharePoint. I’ve tried creating the “staging” call per good practices, but that doesn’t seem to help. Is there a way to get this technique to work in the Power BI service without hardcoding the table?

      • Thanks! I read those posts and figured out a way for it to work. I have to (a) combine the queries into a single query, thereby removing the “staging” workaround that others have used to avoid the Formula.Firewall() error, and (b) match the privacy settings for the file on SharePoint and the external website. Doing either (a) or (b) alone is not enough to avoid the error. That doesn’t seem like the right implementation to me. I would think the “staging” call should be how it works in this case and you should be allowed to have different privacy settings. Am I missing something? Is there an equivalent to Value.NativeQuery() for Web.Contents()?

      • You have no choice but to use Web.Contents(). I don’t fully understand what you’ve done from your description, though – can you explain what you mean about being allowed to have different privacy settings?

      • My understanding is the “staging” workaround allows us to have queries rely on datasets with different privacy settings. In this case, I’d like to have the SharePoint file be “Organizational” and the public website be “Public”. That doesn’t seem possible here.

  15. Even though the Power BI desktop offers no error (with privacy settings set to the Combine Data Levels…), the Power BI service failed with this error: [Unable to combine data] Section1/State Legislation Summary/AutoRemovedColumns1 is accessing data sources that have privacy levels which cannot be used together. Please rebuild this data combination.

    It looks like the workaround about setting the Data Source Settings to the same (Organizational in my case) doesn’t seem to work.

    • Have you set the data privacy levels on the data source in the Gateway? You need to do this in the Manage Gateways screen in the browser when you set up the data source; it’s in the Advanced section at the bottom of the screen on the Data Source Settings tab.

  16. I’m not sure I understand. I’m not using a Data Gateway for this dataset — I’m using the Power BI service to host. The workaround I’m speaking about is setting the Data Source Settings in Power BI Desktop. Perhaps I misunderstood how that’s supposed to work?

      • OK, got it. I’m afraid I don’t know what’s going on here – although I did hear about a similar problem today, so when I get more details on it it may turn out that you are suffering from the same problem. As a test, can you install a Personal Gateway and see if that works?

  17. Hi Chris! Thanks for your amazing post. I’ve been trying to use this method with the Twitter API. However, I can’t get it to work. I believe this may be happening because my base URL is “https://api.twitter.com”, which does not seem to be a valid one. Is this the problem? Do you know if there might be something to fix it? Thanks!!

    • It could be. However you’re going to have a lot of problems using the Twitter API anyway, for example with authentication, so I think it would be a better approach to build a separate app to get the data from Twitter and load it into a database, and then use the database as a source for Power BI.

  18. I can’t reply to the latest message in our thread, Chris, so apologies. Anyhow, I really appreciate your help here. I won’t have time to test this through a Personal Gateway so I’ll just have to deal with the behavior as-is for now.

  19. Pingback: [TFS] Visualizando Queries (WIQ) no PowerBI – ESX

  20. Pingback: Dynamic Web.Contents() and Power BI Refresh Errors – Data Inspirations

  21. Hi Chris

    Thanks for the post. I’m having the issue where I’m calling an API with a changing parameter (TestArea). It works in desktop but not Power BI Service as in your examples. The issue is that I can’t authenticate against the URL in Power BI service as it’s not valid. From some of the comments above, I’m interpreting this as something that cannot be solved. Is this correct? My example is below.

    (TestArea as text) as list =>

    Source = Json.Document(Web.Contents("https://directapi.imaginarysite.com/api/v1/gateway/",
    [RelativePath= ""&TestArea&"/userLogins",

      • Hi Chris

        Is it possible to authenticate in Power BI Service in the manner below? I’m still struggling to get what works in Power BI Desktop to work in Power BI Service with changing URL parameters coming from an API. The below is a proposed solution I need to get my head around, where you authenticate against a base URL within Power BI Desktop that then applies to the entire model. I can’t get it to work, however.

        Authenticate separately against the URL below in a generic function/somewhere in Power BI desktop:

        and then use the below URL as the required data source (without the auth part):

        where “&TestArea&” is the changing parameter coming from the API.

        Thanks for the help and your blog.

  22. Hi there,

    I had a working solution to get data into a Power BI Desktop visual and it refreshed fine, but when it was put into the Power BI service and a refresh was tried I got the same problem you speak about above.
    Unfortunately I am unable to resolve the problem.
    Code I am currently using which works fine in PowerBI Desktop:
    SearchIssue = () =>
    Source = fnAllJiraIssues("", 500, 1),
    table = Table.ExpandRecordColumn(Source, "Column1", {"id"}, {"id"}),
    Search = (id) =>
    Issue = Json.Document(Web.Contents(#"URL (3)" & "/rest/api/2/issue/" & id & "?expand=changelog"))
    Output = Table.AddColumn(table, "ChangeHistory", each Search([id]))

    Anybody know what I should change this to, to get it working In the PowerBI Service?

    I have written the following, which gives me a syntax error, only changing the Issue= line:

    Issue = json.Document(Web.Contents("https://jira.co.uk", [RelativePath="/rest/api/2/issue/, [Query=[id=id, end="?expand=changelog"]]] ))

    Or could I just explicitly enter the url as per the 1st one instead of referencing it in a parameter field?
    Issue = Json.Document(Web.Contents("https://jira.co.uk" & "/rest/api/2/issue/" & id & "?expand=changelog"))

    • Does this work?
      Issue = Json.Document(Web.Contents("https://jira.co.uk", [RelativePath="/rest/api/2/issue/" & id, Query=[expand="changelog"]]))

  23. Hi Chris, this did solve the first error message I was receiving but I am now onto the 2nd error
    ” [Unable to combine data] Section1/Change/AutoRemovedColumns1 is accessing data sources that have privacy levels which cannot be used together. Please rebuild this data combination. Table: Change.”

    I have tried changing the privacy settings to always ignore but I still get the above error. Any ideas?

  24. Hi Chris,

    I have been experimenting with Power BI for a few weeks (I am a newbie) and I am stuck with apparently the same problem. The ESI Funds open data REST API that I am connecting to provides two types of services. Lists

    (e.g.) Source = Json.Document(Web.Contents("https://opendata.itms2014.sk/v2/zonfp/zamietnute"))
    # Easy to get, can be converted to a table, refreshes fine in both Desktop and the Power BI service

    and Details. In order to get them I use the project id from the list as a parameter with the base URL https://opendata.itms2014.sk/v2/zonfp/zamietnute/{projekt id}

    (e.g.) Source = Json.Document(Web.Contents("https://opendata.itms2014.sk/v2/zonfp/zamietnute/" & #"projekt id")),
    # with this I get a refresh error in the Power BI service

    I tried to implement your solution with the [Query] optional field, but the problem is that it apparently adds by default a “?” (question mark) to the URL and therefore it does not work with the service. The service supports only the GET method so my experiment with the [Content] optional field and POST web requests did not work either.

    I would appreciate any ideas on how to get the Power BI service to refresh my details table automatically.



  25. Hi Chris,

    I think I found a solution. Instead of using the [Query] optional field I constructed the parametrized URL using the [RelativePath] optional field:

    Source = Json.Document(Web.Contents("https://opendata.itms2014.sk/v2/zonfp/zamietnute/", [RelativePath=projekt_id])),

    Fortunately the base URL is also valid in itself and now it refreshes also in the PowerBI service. I would not have solved it without your post, thank you very much!


  26. Hi Chris,

    I’m loading Dynamics 365 option sets using a function and I’m not able to refresh it in the PowerBI Service – hitting the issue you try to resolve in this blog.

    The M-code I’m currently using:

    Source = Json.Document(
    & OptionSetTable
    & "')/Attributes/Microsoft.Dynamics.CRM.PicklistAttributeMetadata?$select=LogicalName&$filter=LogicalName%20eq%20%27"
    & OptionSetColumn
    & "%27&$expand=OptionSet"

    I’ve been reading through the comments (and testing some suggestions I’ve read in the blog and comments) and I’m afraid it won’t be possible to solve my case, but I just want to double check if there is any solution.

    Thanks in advance,


    • I don’t know if there is a solution here – as I say in the post, it will depend on whether you can use the RelativePath option and still have a url that can be called without an error.

    • Hello Ilse, did you find a solution for the option sets? I’m still looking for a solid solution and was on the same path as you mentioned here.
      It would be so great to have D365 option sets better supported in PowerBI. With ODATA v4.0 it is of course possible to extend the records in M, but performance will be BAD.

    • Hi Chris, thanks for the suggestion. I will definitely check that one out. For getting data out of D365 there is a better solution now: using the CDS (Common Data Services) connector instead of the D365 connector.
      It has a better performance and ……whoohooo…… has both option value and option label in the result set.

  27. Hi Chris,

    Would you mind looking at this code? I am wondering if I am wasting time trying to get this resolved.

    I am calling an API from an accounting package and I can get a list of page url’s using the BuildPageUrls line.

    The API returns some metadata including a nextpagelink with the relevant stop and skip which maybe I could use somehow?

    The code below works fine in the service and desktop as far as refreshing goes up to the point where I want to call the pages urls and grab each page of the table data.

    Are there any changes to the code you can suggest? I am not 100% sure how I can modify the CallEachPage line to get the data.

    Here is my sample code to date

    Source = Json.Document(Web.Contents("http://localhost:8080/AccountRight/",
    [RelativePath = "e76e28f2-87a6-44f9-be7e-93a88f8799bc/GeneralLedger/JournalTransaction/?api-version=v2",
    Query = [q="api-version=v2"]
    ) ),
    PageCount = Number.RoundUp(Source[Count]/1000),
    Pages = {0.. PageCount-1 },
    ToTable = Table.FromList(Pages, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    BuildPageUrls = Table.AddColumn(ToTable, "GetPages", each if [Column1] = 0 then
    BaseUrl & CFUID & Resource & Apiversion & "$stop=1000" else
    BaseUrl & CFUID & Resource & Apiversion & "$stop=1000" & "&$skip=" & Text.From( [Column1] * 1000)),
    DataTypes = Table.TransformColumnTypes(BuildPageUrls,{{"GetPages", type text}}),
    CallEachPage = Table.AddColumn(DataTypes, "CallPages", each Json.Document ( Web.Contents ([GetPages]))) /*page call fails in service*/

    Is there a workable solution out there for this case? Or would a custom connector achieve the required result?

    I can get a list of page URLs, so could I use that data somehow to grab the data in a different manner to avoid the issue?

    Thanks for your help in this space, I can see it’s a real issue for a lot of users.
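    One untested sketch of how the CallEachPage step above might be rewritten with the technique from this post: the base URL stays static so the service’s static analysis can see it, and the per-page values move into RelativePath/Query. It reuses the CFUID and Resource names from the code above and assumes the API accepts OData-style $top/$skip paging values as query-string parameters – so it is only a starting point, not a confirmed fix.

    ```m
    // Untested sketch: static base URL, dynamic parts in RelativePath/Query
    CallEachPage = Table.AddColumn(DataTypes, "CallPages", each
        Json.Document(
            Web.Contents(
                "http://localhost:8080/AccountRight/",
                [
                    RelativePath = CFUID & Resource,
                    Query = [
                        #"api-version" = "v2",
                        #"$top" = "1000",
                        #"$skip" = Text.From([Column1] * 1000)
                    ]
                ]
            )
        ))
    ```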

  28. Thanks for the very useful tip! I was already trying out some very ugly alternatives but this was exactly what I needed!

  29. Hi Chris,

    Really helpful blog post, thanks for your insights.

    It seems to me that your example works because you know in advance what “terms” you are going to pass to the web call. But what if you don’t know these terms?

    I have a similar issue with paginating a response from Airtable’s API. I have tried to think about how I can modify my code so that it doesn’t fail Power BI’s static analysis, but I have drawn a blank.

    Here is the code I’m starting with…

    Pagination = List.Skip(List.Generate( () => [Last_Key = "init", Counter = 0], // Start Value
    each [Last_Key] <> null, // Condition under which the next execution will happen
    each [ Last_Key = try if [Counter]<1 then "" else [WebCall][Value][offset] otherwise null, // determine the LastKey for the next execution
    WebCall = try if [Counter]<1 then Json.Document(Web.Contents("https://api.airtable.com/v0//?api_key=")) else Json.Document(Web.Contents("https://api.airtable.com/v0//?api_key=&offset=" & Last_Key)), // retrieve results per call
    Counter = [Counter]+1 ], // internal counter
    each [WebCall]
    ))

    The way you iterate through the API call is to pull the first page of results and then use the offset key from the end of the results in the next call. You have to go through each page to get to the next key. These change with every call.

    The first thing I need to understand is whether the reason the Power BI service won’t refresh the dataset is the static analysis test, or something else that this code does.

    If the former, is it possible to somehow rewrite this using your technique so that the service can be tricked into running this code?

    In your example you begin by creating a table of terms to send to the web call. I wondered about trying to do something similar by creating a table of the offset keys and then passing each one to the web call. But in order to generate such a table you would need to iterate through the whole API call in order to retrieve the keys and you would end up using a similar piece of code to the above, in order to achieve this.

    Any help would be greatly appreciated.


    • Looking at your code I would say yes, it’s failing the static analysis because you’re trying to generate the entire URL dynamically. The fact that the key changes with each call is not a problem, and neither is the fact that you might be passing different parameters with each call. You should be able to rewrite your code using the technique in this post and make refresh work.
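      To illustrate what that rewrite could look like, here is a rough, untested sketch of the same List.Generate pagination pattern with a static base URL – BaseAndTable and ApiKey are hypothetical placeholders, not real values:

      ```m
      let
          // hypothetical placeholders - substitute your own app/table and key
          BaseAndTable = "appXXXXXXXX/MyTable",
          ApiKey = "keyXXXXXXXX",
          // the base URL passed to Web.Contents() is static, so static analysis
          // can see it; the changing offset only ever appears in the Query option
          GetPage = (optional Offset as nullable text) =>
              Json.Document(
                  Web.Contents(
                      "https://api.airtable.com/v0/",
                      [
                          RelativePath = BaseAndTable,
                          Query = [api_key = ApiKey] &
                              (if Offset = null then [] else [offset = Offset])
                      ]
                  )
              ),
          Pages = List.Generate(
              () => GetPage(null),
              each _ <> null,
              each if Record.HasFields(_, "offset") then GetPage([offset]) else null
          )
      in
          Pages
      ```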

  30. Hi Chris,

    Thank you very much for posting this. It is very helpful. I’m facing a similar, although somewhat different issue and would very much appreciate help.

    I’m getting data either through Web or OData Feed via the following query:


    in which element ID (*****), start date (2018-10-19T18:30:00Z) and end date (2018-10-25T00:00:00Z) are dynamic. End date is the current time, start date is current time minus 7 days and I got ca. 200 different IDs that I import from the excel file into one column in Power BI.

    I solved this by creating 3 parameters (GUID, StartTime and EndTime) and a following function:

    Source = (GUID as text, StartTime as text, EndTime as text) => let

    Source = OData.Feed("https://dummy.com/API/Dsl/v1/Attributes(ElementId=" & GUID & ",%20Name='TimeSeriesValue')/RetrieveValuesTimeSliced(Start=" & StartTime & ",End=" & EndTime & ",TimeSpan='PT10M')")


    Finally, I invoke this custom function on a table containing the actual IDs, start and end times, and get my data imported. This works like a charm in Power BI Desktop, no issues. Also refresh works in Power BI Desktop, no problem.

    However, when I publish to Power BI web service, refresh doesn’t work anymore. I read many posts online, but wasn’t able to find a solution for my problem.
    Your URL example “https://data.gov.uk/api/3/action/package_search?q=” contains a “?” that can be replaced using the Query option, but I’m not sure if and how I could do this for my example.

    Thanks a lot in advance,

      • I’ve converted this:

        Source = OData.Feed
        & GUID &
        & StartTime &
        & EndTime &


        Source = OData.Feed("https://dummy.com/API/Dsl/v1/Attributes(ElementId=", [Query = [GUID], RelativePath = ",%20Name='TimeSeriesValue')/RetrieveValuesTimeSliced(Start=", Query = [StartTime], RelativePath=",End=", Query = [EndTime], RelativePath=",TimeSpan='PT10M')"])

        But unfortunately, it doesn’t work 🙁 Do you have any suggestion on how to convert the query properly?

      • Hmm, I’m afraid not. I was looking at this myself recently, and I hadn’t realised that the Query option for OData.Feed() doesn’t work in the same way as the Query option for Web.Contents() – it can only be used for custom query options, not system query options.
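        For what it’s worth, the distinction looks something like this (an illustration only, with a made-up service URL):

        ```m
        // Custom query options (names without a leading $) can go in Query:
        Source = OData.Feed(
            "https://example.com/service/Entities",
            null,
            [Query = [myCustomOption = "abc"]]
        )
        // ...but system query options such as $filter or $select cannot be
        // passed this way; they have to be part of the URL itself.
        ```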

  31. Thanks so much Chris for the great article. I was struggling over this for some time. It has really opened a big avenue for me. Now for the first time I am able to run loops and publish the report for auto refresh. Thanks so much.

  32. Pingback: Credentials, Data Privacy Settings And Data Sources In Power Query/Power BI « Chris Webb's BI Blog

  33. Pingback: Fourteenth Blog Birthday « Chris Webb's BI Blog

  34. Does this technique mean the URL can’t be a parameter anymore?

    ) works fine

    BUT not

    Web.Contents(#”My Data Source”)

  35. Brilliant write-up, thanks. Are you aware of any changes to the behavior for online refreshes to help avoid hand-coding around this? I find myself writing Web.Contents sources a lot – and as you note, examples like loops for pagination (e.g. "http://www.anyweb.com/results/page=" & MyParmHere, where MyParmHere comes from a table, each {1..100}) fail online.
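    One untested way to make a loop like that refreshable is to keep the base URL static and move the varying page segment into RelativePath, for example:

    ```m
    // Sketch only: the dynamic part of the URL moves into RelativePath,
    // so the first argument to Web.Contents() stays static.
    Pages = List.Transform(
        {1..100},
        each Json.Document(
            Web.Contents(
                "http://www.anyweb.com",
                [RelativePath = "results/page=" & Text.From(_)]
            )
        )
    )
    ```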

  36. My base url is behaving strange…
    The full url I am trying to access is “https://www.acompany.co.il/odata/abc.ini/xyz/accounts” (where xyz is a parameter)
    I have tried this:
    Source = Json.Document(Web.Contents("https://www.acompany.co.il/odata/abc.ini/",
    In Desktop works great!
    But in the service I get DataSource error. It asks for user/password for:
    “https://www.acompany.co.il/odata/abc.ini” but doesn’t accept the credentials.
    I have tried the url directly in Chrome and this is SOOO strange:
    This works: “https://www.acompany.co.il/odata/abc.ini/” – it is accepting my credentials
    While this doesn’t: “https://www.acompany.co.il/odata/abc.ini” !!!
    All the difference is in the last SLASH!!!!
    I have tried different syntax but in the service it always asks for credentials without the slash at the end – so it fails….
    Also I notice that in the service it recognizes this url as OData (https://www.acompany.co.il/odata/abc.ini/xyz/accounts)
    while this url – as Web
    And it is always shown without a slash in the end, so I cannot log in…
    Please help!
    Thank you!

    • Strange – I don’t know the answer to this, sorry. It must be something to do with the way Power Query stores credentials and permissions (which is related to the url).

  37. Chris, I just want to make sure I understand the technique you are using here. I have, for example, a report that makes two API calls: one to recursively get a weather radar station location and then a second to retrieve forecast data from an API based on the URL returned by the first call.

    I will apologize for the sloppy M code queries below, as this is my first venture into using M within Power BI so I am still learning its capabilities.

    My queries are as follows;

    = (Latitude as text, Longitude as text) => let
    Source = Json.Document(Web.Contents("https://api.weather.gov/points/" & Latitude & "," & Longitude, [Timeout=#duration(0, 0, 2, 0), Headers=[token="xIZyacjOaGCxWLiGjTprTdAhcXzSclmT"]])),
    #"Converted to Table" = Record.ToTable(Source),
    #"Transposed Table" = Table.Transpose(#"Converted to Table"),
    #"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table", [PromoteAllScalars=true]),
    #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"@context", type any}, {"id", type text}, {"type", type text}, {"geometry", type any}, {"properties", type any}}),
    #"Expanded properties" = Table.ExpandRecordColumn(#"Changed Type", "properties", {"forecastOffice", "forecast", "county"}, {"properties.forecastOffice", "properties.forecast", "properties.county"})
    in
    #"Expanded properties"

    and then

    = (ForecastUrl as text) => let
    Source = Json.Document(Web.Contents("" & ForecastUrl)),
    properties = Source[properties],
    #"Converted to Table" = Record.ToTable(properties),
    Value = #"Converted to Table"{7}[Value],
    #"Converted to Table1" = Table.FromList(Value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table1", "Column1", {"number", "name", "startTime", "endTime", "isDaytime", "temperature", "temperatureUnit", "temperatureTrend", "windSpeed", "windDirection", "icon", "shortForecast", "detailedForecast"}, {"Column1.number", "Column1.name", "Column1.startTime", "Column1.endTime", "Column1.isDaytime", "Column1.temperature", "Column1.temperatureUnit", "Column1.temperatureTrend", "Column1.windSpeed", "Column1.windDirection", "Column1.icon", "Column1.shortForecast", "Column1.detailedForecast"})
    in
    #"Expanded Column1"

    Any suggestions on how I might alter these two M Queries to comply with your suggestion ;

    • The first thing that I suggest is that you use the RelativePath option for the Latitude and Longitude values in the Source step of the first function.
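      That suggestion might look something like this – an untested sketch of just the Source step, with everything else left unchanged:

      ```m
      // The base URL is now static; the coordinates move into RelativePath.
      Source = Json.Document(
          Web.Contents(
              "https://api.weather.gov",
              [
                  RelativePath = "points/" & Latitude & "," & Longitude,
                  Timeout = #duration(0, 0, 2, 0),
                  Headers = [token = "xIZyacjOaGCxWLiGjTprTdAhcXzSclmT"]
              ]
          )
      )
      ```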

  38. Hi Chris, this post is so tantalizingly close to solving our problem. I am helping a client work with data from Airtable to power dashboards in Power BI Service. We are both new to Power BI, but I have a fair amount of experience with web APIs and Airtable.
    You mention that Power BI will analyze the first parameter (the url) passed to Web.Contents, like https://data.gov.uk/api/3/action/package_search?q=apples. However, when we use a dataset with a query parameter like that and try to schedule a refresh, Power BI seems to choke. In our particular case, if we remove the parameter, the url returns unauthorized. Are we publishing the data set or dashboard incorrectly? The key lines look like:

    if [Counter]<1
    then Json.Document(Web.Contents("https://api.airtable.com/v0/app3jBqxbLcBLRS9R/Evaluations?api_key=****"))
    else Json.Document(Web.Contents("https://api.airtable.com/v0/app3jBqxbLcBLRS9R/Evaluations?api_key=****", [offset=[WebCall][Value][offset]])),

    Any clues might help. Or, if you do consulting work and have a 2-3 hours available, I'd love to get your expertise. Thanks, great post in any case!

  39. Hi Chris,

    I’m working on an application where I need to make API calls for detailed data, constructed from data returned by previous calls. Of course, I ran into the problem where the Power BI Service wouldn’t refresh the dataset because my data sources were dynamically constructed.

    RelativePath saved the situation – I think it would be good if you referenced it in your other post, where you resolved a similar problem using the apples/oranges query search parameters.

  40. Hello Chris,

    First I want to thank you for the many useful posts you made. I’ve read this post but I’m not able to solve my problem as it is quite different than your example. Maybe you could help if possible? In Power BI desktop my code works perfectly.

    First of all I want to fetch my access token from Zendesk with the asked credentials. As for this I’ll probably need to change my path?

    Source = (Subdomain as text, ClientID as text, ClientSecret as text, LoginName as text, Password as text) => let
    ZendeskAccessTokenContent = [
    grant_type = "password",
    client_id = ClientID,
    client_secret = ClientSecret,
    scope = "read",
    username = LoginName,
    password = Password
    ],
    ConvertZendeskAccessTokenContent = Text.ToBinary(Uri.BuildQueryString(ZendeskAccessTokenContent)),
    ZendeskAccessTokenRequest = Web.Contents("https://" & Subdomain & ".zendesk.com/oauth/tokens",
    [Headers = [#"Content-Type"="application/x-www-form-urlencoded"],
    Content = ConvertZendeskAccessTokenContent]
    ),
    ZendeskAccessTokenJSON = Json.Document(ZendeskAccessTokenRequest),
    GetZendeskAccessToken = ZendeskAccessTokenJSON[access_token]
    in
    GetZendeskAccessToken

    As for getting my actual data, I work with a parameter in which I’ll look if there’s a next page to fetch the data. I guess when working with a parameter and paging, it makes it more difficult, no?

    GetZDAccessToken = ZendeskAccessToken,
    GetURLParameter = (URL) => let
    HTTPHeader = [Headers = [#"Authorization"="Bearer " & GetZDAccessToken, #"Content-Type"="application/json"]],
    ZendeskGroupsRequest = Web.Contents(URL, HTTPHeader),
    ZendeskGroupsJSON = Json.Document(ZendeskGroupsRequest),
    ZendeskGroupsList = @ZendeskGroupsJSON[groups],
    IterationOverNextPage = try @ZendeskGroupsList & @GetURLParameter(ZendeskGroupsJSON[next_page]) otherwise @ZendeskGroupsList
    in IterationOverNextPage,
    GetAllZendeskGroups = GetURLParameter("https://blablabla.zendesk.com/api/v2/groups.json"),
    ExpandListOfRecords = Table.FromRecords(GetAllZendeskGroups)

    Is it possible to solve this, so I can automatically refresh my data in Power BI service?

    Kind regards

      • Hello Chris,

        I want to thank you because it works for the simple queries like the authentication and such, even without skipping the test connection. However, it still doesn’t work for my last query. My guess is that it has to do with my iteration and the fact that I call my URL in a function? If I remove the iteration and use the relative path for “/api/v2/groups.json” it works, but as soon as I use this iteration, it fails.

        GetZDAccessToken = ZendeskAccessToken,
        GetURLParameter = (URL) => let
        HTTPHeader = [Headers = [#"Authorization"="Bearer " & GetZDAccessToken, #"Content-Type"="application/json"]],
        ZendeskGroupsRequest = Web.Contents(URL, HTTPHeader),
        ZendeskGroupsJSON = Json.Document(ZendeskGroupsRequest),
        ZendeskGroupsList = @ZendeskGroupsJSON[groups],
        IterationOverNextPage = try @ZendeskGroupsList & @GetURLParameter(ZendeskGroupsJSON[next_page]) otherwise @ZendeskGroupsList
        in IterationOverNextPage,
        GetAllZendeskGroups = GetURLParameter("https://blabla.zendesk.com"),
        ExpandListOfRecords = Table.FromRecords(GetAllZendeskGroups),
        GetSelectedColumns = Table.SelectColumns(ExpandListOfRecords, {"id", "name", "deleted", "created_at", "updated_at"}),
        RenameSelectedColumns = Table.RenameColumns(GetSelectedColumns, {{"id", "ID"}, {"name", "Name of Group"}, {"deleted", "Is Deleted"}, {"created_at", "Created at"}, {"updated_at", "Updated at"}}),
        ChangeTypeSelectedColumns = Table.TransformColumnTypes(RenameSelectedColumns, {{"ID", type text}, {"Name of Group", type text}, {"Is Deleted", type logical}, {"Created at", type datetime}, {"Updated at", type datetime}})

        Kind regards

  41. Hello Chris,

    Is there any way to do something like this for ODBC.Query()?

    It seems like setting your DSN as a Parameter is not something the PBI Online Service is okay with.

    To prevent a ton of wasted work I just hardcoded the DSN name, but ideally I would be able to parameterize this in the future.

    Any feedback is greatly appreciated.


      • Well, it does work to an extent:

        1.) It works as a function within PBI Desktop, but I get the following error message when I push it to scheduled refresh in the service: “you can’t schedule refresh for this data set because the following data sources don’t support refresh: ODBC.Query()” – I’ve determined this to be a false error message as it refreshes just fine as long as the DSN is hard-coded.

        2.) It works as a parameterized query both in desktop and in the service, but this isn’t ideal as I have two (nearly) identical tables I need to pull from, a live table and an archive table – having a function ensures the two always match perfectly when I need to change something. I have narrowed the problem down to the DSN parameter because I have several parameters within these queries that still refresh fine as long as the DSN is hard coded.

        It is my understanding that the issue lies within PBI’s “test connection”. Essentially, when your query is a function the service takes your query as is (without your parameter injections) and gives the DB a test run; clearly, this will never work with a parameterized DSN. This gives me hope that “skip test connection” will solve my problem once it is fixed and made available (late June ETA), but I would still like to identify ways around the issue in the meantime.

        I have a hunch that “tricking” the service into testing a hard-coded “dummy” DSN is the way to go here, but I’m open to any method. I appreciate your help – your website has saved me countless times and despite your belief that your book is outdated, it has served as an incredible foundation for the work I do.

      • I think your best bet will be to wait until ‘Skip Test Connection’ is back. There may be a way to get it working before then but it might involve a lot of trial and error.

  42. Fair enough, thanks for your feedback.

    You’d think there would be an M function that allows you to pass “TRUE” to the connection attempt.

    Shame they made “Skip Test” GA and then took it away.

  43. Hi Chris,

    great post!

    It would be much appreciated to get your advice on the below query. We are struggling to pull all Project Sites risk data (they have custom columns) from a PWA:


    ProjectsList =
    {{"Implementation Project A"}, {"Infrastructure Project A"}, {"Infrastructure Project B"}}
    GetRisks = (ProjectName) =>
    Risks = Json.Document(
    Success = Risks[success]
    Output = Table.AddColumn(ProjectsList, "All Risks", each GetRisks([ProjectName]))


    The error we get is:


    DataFormat.Error: We found extra characters at the end of JSON input.

    Our idea is to get it working in Power BI Desktop and then publish it to Power BI Service to check if we can schedule refresh the report…

    Thanks in advance!

  44. Hi Chris,

    Great post and it’s from 2016…amazing. I would love your help with an issue regarding this topic.

    I need to make a call to the following api:


    Using RelativePath I’m able to use the beginning of the URL as the base URL

    Source = Json.Document(Web.Contents("https://api." & BaseURL, [RelativePath = "/" & CompanyName & "/projects", Headers=myToken]))

    The problem is that “https://api.itmplatform.com” by itself returns an error. It would work if it were “https://itmplatform.com”.

    Is it possible to add two RelativePaths? Meaning something like this:

    Source = Json.Document(Web.Contents("https://", [RelativePath = "api."], BaseURL, [RelativePath = "/" & CompanyName & "/projects", Headers=myToken]))

    Like this I think I would be able to pass the test connection.

    Thanks in advance!!

    • According to Microsoft:

      The connection strings that we construct in M will not be identified in the service at the time of static analysis. Meaning, we try to look into the M and pick out the connection strings (data source URIs). But the ones we construct only come to life when we actually execute that particular M script, which happens during execution, not when we perform static analysis. I do not think refresh will be successful in this case.

      So case closed, it does not work.

  45. Pingback: Tracking AEMO data using PowerBI – Project Controls blog

  46. Hello Chris, Need some advice if possible.

    I’m running the following query fine in Power BI desktop, however when I publish it to PBI service and attempt to refresh I get the following error

    “Unable to refresh the model (id=6884172) because it references an unsupported data source.”

    (name as text) =>

    Source = Json.Document(Web.Contents("https://goma-webgui-tmf.vodafone.com/prweb/PRRestService/api/v1/cases/" & name)),
    content = Source[content],

    I’ve attempted to implement your solution above by changing the Web.Contents line to

    Option 1.)

    Source = Json.Document(Web.Contents("https://goma-webgui-tmf.vodafone.com/prweb/PRRestService/api/v1/cases",

    Option 2.)

    Source = Json.Document(Web.Contents("https://goma-webgui-tmf.vodafone.com/prweb/PRRestService/api/v1/cases",

    Option 1.) gave me an error in Power BI Desktop. Option 2.) worked in the Desktop version but gives the following error when trying to refresh in PBI online.

    Underlying error code: -2147467259. Table: Cases List.
    Underlying error message: [Unable to combine data] Section1/Cases List/AutoRemovedColumns1 references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.

    Any help would be appreciated thanks.

    • This looks like a different problem, relating to data privacy settings (which I’ve blogged about a lot). Have you set Power BI Desktop to ignore data privacy checks?

      • Sorry to jump on this thread. I’m curious why you feel this might be a privacy issue. I’m asking because I am getting a similar error and feel I’ve ruled out Privacy Settings issues.

        The query I’m trying to run looks like this:

        Source = Table.Combine(List.Transform(list_parameters, each Xml.Tables(
        Web.Contents("https://" & Server_Name & "/esys/",
        [RelativePath="api/restxml/" & API_Name & "/",

        Server_Name and API_Name are the name of the server and the name of the API to call.

        list_parameters is a list of records that contain the Query parameters. There are two – Project and BudgetType (as in Project=11111&BudgetType=BD).

        The query works in Desktop and in a Dataflow but balks with the error the other poster listed when it is run as a non-dataflow dataset refreshed through the Service.

        However, if I remove the part that tries to parse through the parameter list, I can refresh through the Service. That query looks like this:

        Source = Xml.Tables(
        Web.Contents("https://" & EcoSys_Server & "/ecosys/",
        [RelativePath="api/restxml/" & API_Name & "/",

        where params = [Project=#"Project Number"]

        Based on that, I don’t feel that there is a privacy setting issue otherwise neither query would be refresh-able.

        Any insight would be appreciated as I’m at a dead-end. The point of the query is to be able to dynamically handle the budget types since there can be anywhere from 4-16 and new ones can be added at any time.

      • The “references other queries or steps, so it may not directly access a data source” error message is always related to data privacy and the Formula Firewall. What error message are you getting?

  47. Thanks for the response Chris. Sorry for late reply. The error message states as follows:

    [Unable to combine data]/BudgetSourceTable/Source references other queries or steps so it may not directly access a data source. Please rebuild this data combination.

    I am actually getting the error in Desktop as well. It doesn’t occur when I turn off the privacy settings; when they are on it surfaces as an OLE DB or ODBC error.

    Here is the entire query:


    API_Name = "BudgetAPI", // API that calls the budget data
    ServerName = #"Server", // partial path of the server

    #"——Get Budget Types to loop through—-" = "",
    GetListfromAliasesQuery = #"Aliases Query",
    #"Removed Other Columns" = Table.SelectColumns(GetListfromAliasesQuery,{"Project", "BudgetType"}),

    list_parameters = Table.ToRecords(#"Removed Other Columns"),

    Source = Table.Combine(List.Transform(list_parameters, each Xml.Tables(
    Web.Contents("https://" & Server & "/ecosys/",
    [RelativePath="api/restxml/" & API_Name & "/",

    This Step: GetListfromAliasesQuery = #”Aliases Query” calls a table from another query that has a field for project number and a field for budget type which gets converted to a list of records here:

    list_parameters = Table.ToRecords(#”Removed Other Columns”)

    list_parameters is referenced in the Source Step which calls the API for each record in the list.

    If I create the table of budget types manually (i.e. click Enter Data and create a table with the data) and use that as the reference in the GetListfromAliasesQuery step, the query works. If I try to use a Query that calls the data source, I get the error.

    So I need to rebuild the data combination but I cannot figure out how to do that. I need to call the source because the number of budget types can vary from 4 -16 based on the project and projects can add budget types midstream, so just typing them into a table in the query editor is not a good option.

    I have even tried creating a query to the Alias API and creating an interim query that references it and using that in the GetListfromAliasQuery step and still get the same error.

    I guess I’m really trying to understand what the statement – Source references other queries or steps so it may not directly access a data source actually means and how does a person rebuild a data combination that the query editor does not balk at?

    • FYI. I was able to finally resolve the issue with this query. However, the reasoning seems slightly counter-intuitive to me.

      The wrinkle was that I had to make an external call to the API that contained the budget type data WITHIN the same query. So the step GetListfromAliasQuery directly calls an API to create the list of parameters for the Source step rather than referring to another query to get that information.

      In some ways this makes sense. I was stumbling on the idea that I needed to combine queries, but there is no way to do that because the Source Step is dependent on the Alias Step. I just don’t understand why it all must occur in the same query when the data can be passed from another query.

      Here is the updated query:


      API_Name = "BudgetAPI", // API that calls the budget data
      ServerName = #"Server", // partial path of the server

      #"——Get Budget Types to loop through—-" = "",

      GetListfromAliasesQuery = Xml.Tables(Web.Contents("https://" & ServerName & "/ecosys/api/restxml/BudgetTypesAPI/?Project=" & #"Project Number")),

      #"Removed Other Columns" = Table.SelectColumns(GetListfromAliasesQuery,{"Project", "BudgetType"}),

      list_parameters = Table.ToRecords(#"Removed Other Columns"),

      Source = Table.Combine(List.Transform(list_parameters, each Xml.Tables(
      Web.Contents("https://" & Server & "/ecosys/",[RelativePath="api/restxml/" & API_Name & "/",Query=_]

      Live and learn, I guess.

  48. Hi Chris,

    I tried implementing the steps into this query and it did not work. The problem I have is that I need part of the URL to be dynamic, and also, part of the parameters in the web query.

    Is there a solution for me? I need to schedule these web calls every day. If it’s not possible in Power BI, what would be the next easiest solution for me (I am not a developer/programmer) to schedule web API calls and store the results automatically in a database on my local computer (to then be refreshed through the Power BI service)?

    MasterQuery = (symbol as text) => let
    Source = Json.Document(Web.Contents("https://api-v2.intrinio.com/securities/" & symbol & "/prices?" & Current_StartDate & "&end_date=" & Current_EndDate & "&frequency=daily&page_size=10000&api_key=XXX="))


    InsertedCustom = Table.AddColumn(Symbols1, "Custom", each MasterQuery([Intrinio Symbol])),

    #"Expanded Custom" = Table.ExpandRecordColumn(InsertedCustom, "Custom", {"stock_prices"}, {"stock_prices"}),
    #"Expanded stock_prices" = Table.ExpandListColumn(#"Expanded Custom", "stock_prices"),
    #"Expanded stock_prices1" = Table.ExpandRecordColumn(#"Expanded stock_prices", "stock_prices", {"date", "open", "high", "low", "close", "volume", "adj_open", "adj_high", "adj_low", "adj_close", "adj_volume"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded stock_prices1",{{"open", type number}, {"high", type number}, {"low", type number}, {"close", type number}, {"volume", type number}, {"adj_open", type number}, {"adj_high", type number}, {"adj_low", type number}, {"adj_close", type number}, {"adj_volume", type number}})
    #"Changed Type"

  49. I tried to bypass the dynamic URLs by hardcoding URL strings and then feeding them to Web.Contents() in a new custom column via:


    However this still gets flagged with “Query contains unsupported function. Function name: Web.Contents”. The URLs are static, so what am I missing? Do I need to configure a gateway or is there something simple I am missing?

  50. Thanks Chris! This really helped me out with calling a Mailchimp API.

    It’s Nov 2019 and this is still an issue from 2016? I found various links to uservoice suggestions to vote to fix this, but they are all now dead links. They must have deleted the suggestions and not fixed the issue?

    Separate complaint: I couldn’t use the Mailchimp connector as they always failed. I guess those have been in beta mode since release (in 2016?).

    Anyways, thanks!!!

  51. Chris – Great info. I am using an open source product called DHIS 2, querying via Web.Contents with Power BI. The issue I am running into when I try to set up a query with query params in a PBI table as you suggest is that DHIS 2 query parameters are not unique, so they can’t be set up as table column headers. For instance, you might have a query string such as “?dimension=pe:2018,1019&dimension=ao:&dimension=co:” etc., so the dimension parameter is used more than once. How can that be squared with what Power BI needs for a table?

  52. Pingback: Dashboards, M, NFL and more… – Roundup #58 – 360reports.io Power BI Videos

  53. Pingback: My new workstation build – 2016 (Video Editing and Rendering + Gaming) – 360reports.io Power BI Videos

  54. Hi Chris, thanks for all your work on this topic here. Since this week MS has deactivated the “Skip test connection” checkbox for web sources again. Now, I am trying to find a workaround for my sources that use JWT token auth.

    This is my code; the error is: The credentials provided for the Web source are invalid.

    contracts = let
    urllogin = "/api/login",
    GetJson = Json.Document(Web.Contents("https://testapp.com" & urllogin, [Headers=[Accept="application/json", #"Content-Type"="application/x-www-form-urlencoded;charset=UTF-8"], Content=Text.ToBinary("email=user@serviceuser.com&password=testpass&grant_type=password")])),
    data = GetJson[data],
    access_token1 = data[access_token],
    url1 = "/api/contracts",

    source = Web.Contents("https://testapp.com" & url1, [Headers=[Authorization="Bearer" & access_token1, ContentType="application/json"]]),
    Result = Json.Document(source),
    data1 = Result[data],
    data2 = data1{0},
    data3 = data2[data]

    contracts1 = contracts{0},
    data = contracts1[data],
    #"In Tabelle konvertiert" = Table.FromList(data, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Erweiterte Column1" = Table.ExpandRecordColumn(#"In Tabelle konvertiert", "Column1", {"id", "name", "contractnumber"})
    #"Erweiterte Column1"

    Thanks for your help!

  55. Hello Chris,

    Thank you, first of all, for all the articles you have written; they have been a great help.

    Recently I encountered a problem with Web.Contents in the Power BI Service. I created a function for processing Excel files into our standard format; there can be many Excel files in different folder paths on a SharePoint site. Therefore, I need to pass folder path + file name to this function to process the Excel files one by one.

    The problem is, when there is a parameter used in the URL in Web.Contents, it works totally fine in Desktop, but I cannot refresh in the Power BI Service.

    Source = Excel.Workbook(Web.Contents("https://globalappsportal.sharepoint.com/sites/mySiteName/Shared Documents/General/" & fileNameWithRelativePath), null, true),

    If I use the RelativePath option of Web.Contents, the credential check fails; I think this is because Power BI uses the site’s root to check credentials, but my ID can only access certain sites in SharePoint.

    Is there a way to solve this? Thank you so much for your time and help.
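    A pattern worth trying (a sketch only, not tested against this tenant): keep the site URL as the static first parameter of Web.Contents, and move only the variable file path into RelativePath. This helps only if credentials can be registered for the site URL itself rather than the tenant root:

    ```
    let
        // Sketch: static base URL for credential binding; the dynamic part
        // (fileNameWithRelativePath, the commenter's parameter) moves into RelativePath
        Source = Excel.Workbook(
            Web.Contents(
                "https://globalappsportal.sharepoint.com/sites/mySiteName",
                [RelativePath = "Shared Documents/General/" & fileNameWithRelativePath]
            ),
            null,
            true
        )
    in
        Source
    ```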

  56. Hi, this is helpful but I need some help.
    I am connecting to an API and I am getting an error.
    I am not sure what happened, since I used an API before and it did not return this error; I had also set up a scheduled refresh before.

    Here is my code below (App ID and key are parameterized):

    enddate = Date.ToText(Date.AddDays(Date.From(DateTime.FixedLocalNow()),-1), "yyyy-MM-dd"),
    url = "https://sampleapi.net/api/metrics/",
    query = "/dataTable?startDate=2020-03-01&endDate=" & enddate & "&userText=Text",
    Source = Json.Document(Web.Contents(url & AppID & query, [Headers=[#"x-api-key"=AppKey]]))

    I tried using variables for other components of the url but it still didn’t work.

    I want to try the relative path approach, but that wouldn’t work either; I don’t know what I’m doing wrong.
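    A hedged sketch of how this call could be restructured so that the first parameter of Web.Contents is completely static, with the app ID moved into RelativePath and the query string parameters into the Query record (names and URL are taken from the comment above, so treat them as placeholders):

    ```
    let
        enddate = Date.ToText(Date.AddDays(Date.From(DateTime.FixedLocalNow()), -1), "yyyy-MM-dd"),
        // Static base URL; everything dynamic goes into RelativePath/Query
        Source = Json.Document(
            Web.Contents(
                "https://sampleapi.net",
                [
                    RelativePath = "api/metrics/" & AppID & "/dataTable",
                    Query = [startDate = "2020-03-01", endDate = enddate, userText = "Text"],
                    Headers = [#"x-api-key" = AppKey]
                ]
            )
        )
    in
        Source
    ```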

  57. Pingback: Como obter dados de APIs e atualizar automaticamente - Aprenda Power BI

  58. Hi Chris,

    I am trying to create a connection from Power BI to a GraphQL endpoint hosted in Prisma.
    I have converted my GraphQL query into Power Query and, using a blank query, added my code in the Advanced Editor as shown below. It is not the complete code, but the header is the same. I am connecting anonymously, with Authorization left blank (I have commented it out), and with this I am able to load the data in Power BI Desktop.
    However, when I publish it to the Power BI Service, I am not able to refresh the dataset.
    As it is a live connection, how can I manage this?
    Since the source is also in the cloud, I am not sure why a gateway is required in the first place, and even with a gateway configured I am not able to refresh the data with the same settings used in Power BI Desktop.
    Please help!

    Source = Web.Contents(
    //#"Authorization"="Bearer "
    Content=Text.ToBinary("{""query"": ""query { findManyClaim(take: -10, where: { identifier: { startsWith: \""N\"" } }) { identifier effectivePeriodStart

  59. = (Date as text) => let
    Source = Json.Document(Web.Contents("https://api.exchangeratesapi.io/" & Date & "?base=USD")),
    rates = Source[rates],

    I have the above code. How can I make it work for refresh in the Power BI Service?
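    Applying the technique from the post, a sketch (untested; the API itself may have changed since this comment was written) that keeps the first parameter of Web.Contents static so the service can validate the data source:

    ```
    (Date as text) =>
    let
        // Static base URL; the date segment moves into RelativePath
        // and the base currency into the Query record
        Source = Json.Document(
            Web.Contents(
                "https://api.exchangeratesapi.io",
                [RelativePath = Date, Query = [base = "USD"]]
            )
        ),
        rates = Source[rates]
    in
        rates
    ```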

  60. Pingback: Resources to learn Power Query – Ninmonkey

  61. Hello Chris,

    I am a newbie with Power BI, and I am facing the very same situation that you described in this post. I need to go through several pages to get some info; this is the URL, which needs credentials:

    I need to go from “?id=1” to 1000.
    I set the following:

    Source = Web.Page(Web.Contents("http://webapps.tenaris.net/TLN/NonConformity/Manage?id=1", [Query=[id=IDWeb]])),

    I have set the full URL because any other root of the URL is invalid. IDWeb is an index column. Everything runs smoothly in Power BI Desktop.

    However, when I go to the Power BI Service and check the Data Source Settings, I see this: http://webapps.tenaris.net/TLN/NonConformity/Manage, which is not a valid URL. I expected to see “http://webapps.tenaris.net/TLN/NonConformity/Manage?id=1”, so I am not able to set the credential in the Power BI Service. I am not sure what I am doing wrong.
    Thanks in advance.


        Source = (VersionsRelevantSharePointListName as text, VersionsRelevantSharePointLocation as text, VersionsRelevantItemID as number) => let

        Source = Xml.Tables(Web.Contents(Text.Combine({

        VersionsRelevantSharePointListName,

        entry = Source{0}[entry],
        #"Removed Other Columns2" = Table.SelectColumns(entry,{"content"}),
        #"Expanded content" = Table.ExpandTableColumn(#"Removed Other Columns2", "content", {"http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"}, {"content"}),
        #"Expanded content1" = Table.ExpandTableColumn(#"Expanded content", "content", {"properties"}, {"properties"}),
        #"Expanded properties" = Table.ExpandTableColumn(#"Expanded content1", "properties", {"http://schemas.microsoft.com/ado/2007/08/dataservices"}, {"properties"})
        in
        #"Expanded properties"

  62. Pingback: How to get and automatically refresh APIs data - Power BI Experience

  63. Pingback: Power Query Custom Functions – Cheat Sheet – Ninmonkey

  64. Pingback: Power Query - Dynamic Data Source and Web.Contents() - Hat Full of Data

  65. Hi, I have a problem with this. I used exactly the same code as you did; I just changed the URL in Source, and I get an error like this:

    Expression.Error: The field ‘success’ of the record wasn’t found.

    Thank you in advance for any hints 🙂

  66. I have a similar use case which used to work before. But from May 2021 version of Power BI Report Server, the report refresh on service fails with the error: “This report contains dynamic datasources, which cannot be refreshed in Power BI Report Server.” Is there a way to make these work?

  67. Thanks Chris for such a good and explanatory blog!

    Here is the code I am using in Power BI Desktop, trying to fetch records from my Jira query. Everything works fine; however, when I try to schedule the refresh on app.powerbi.com it gives the error mentioned in the blog above.

    Source = Json.Document(Web.Contents(JIRA_URL, [RelativePath="/rest/api/2/search", Query=[jql=TESTCASE_QUERY]])),
    #"Converted to Table" = Record.ToTable(Source),
    #"Transposed Table" = Table.Transpose(#"Converted to Table"),
    #"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table", [PromoteAllScalars=true]),
    #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"expand", type text}, {"startAt", Int64.Type}, {"maxResults", Int64.Type}, {"total", Int64.Type}, {"issues", type any}}),
    #"Removed Other Columns" = Table.SelectColumns(#"Changed Type",{"total"}),
    #"total" = #"Removed Other Columns"{0}[total],
    #"startAt List" = List.Generate(()=>0, each _ < #"total", each _ + 100),
    #"Converted to Table1" = Table.FromList(#"startAt List", Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Renamed Columns" = Table.RenameColumns(#"Converted to Table1",{{"Column1", "startAt"}}),
    #"Added Custom" = Table.AddColumn(#"Renamed Columns", "URL", each JIRA_URL & "/rest/api/2/search?maxResults=100&jql=" & TESTCASE_QUERY & "&startAt=" & Text.From([startAt])),
    data = List.Transform(#"Added Custom"[URL], each Json.Document(Web.Contents(_))),
    #"Converted to TableQuery" = Table.FromList(data, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded ColumnIssues" = Table.ExpandRecordColumn(#"Converted to TableQuery", "Column1", {"issues"}, {"issues"}),
    #"Expanded issues" = Table.ExpandListColumn(#"Expanded ColumnIssues", "issues"),
    #"Expanded issues1" = Table.ExpandRecordColumn(#"Expanded issues", "issues", {"id", "key", "fields"}, {"id", "key", "fields"})
    in
    #"Expanded issues1"

    What am I missing here?
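    The step most likely to trip up the service is the List.Transform over dynamically concatenated URLs. A hedged sketch of how that one step could be rewritten so every call shares the same static base URL (JIRA_URL, TESTCASE_QUERY and the column names are taken from the comment above):

    ```
    // Sketch: replace the per-row URL concatenation with a static base
    // plus RelativePath/Query, so the service sees one registrable source
    data = List.Transform(
        #"Added Custom"[startAt],
        each Json.Document(
            Web.Contents(
                JIRA_URL,
                [
                    RelativePath = "rest/api/2/search",
                    Query = [
                        maxResults = "100",
                        jql = TESTCASE_QUERY,
                        startAt = Text.From(_)
                    ]
                ]
            )
        )
    )
    ```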

  68. Hi Chris,

    Thanks for the great blog, I really appreciate it!

    However, I am struggling to fix my query; here is my code.

    While trying to schedule the dataset refresh for my Power BI Jira report (I am using Power BI Desktop to build the report and publishing it to my Power BI account), I get the error below:

    You can’t schedule refresh for this dataset because the following data sources currently don’t support refresh:
    Data source for Query1

    When I checked my “Data Source Settings” I can see the warning saying Some data sources may not be listed because of hand-authored queries.

    Here is my Power BI Query:

    Source = Json.Document(Web.Contents(JIRA_URL, [RelativePath="/rest/api/2/search", Query=[jql=TESTCASE_QUERY]])),
    #"Converted to Table" = Record.ToTable(Source),
    #"Transposed Table" = Table.Transpose(#"Converted to Table"),
    #"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table", [PromoteAllScalars=true]),
    #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"expand", type text}, {"startAt", Int64.Type}, {"maxResults", Int64.Type}, {"total", Int64.Type}, {"issues", type any}}),
    #"Removed Other Columns" = Table.SelectColumns(#"Changed Type",{"total"}),
    #"total" = #"Removed Other Columns"{0}[total],
    #"startAt List" = List.Generate(()=>0, each _ < #"total", each _ + 100),
    #"Converted to Table1" = Table.FromList(#"startAt List", Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Renamed Columns" = Table.RenameColumns(#"Converted to Table1",{{"Column1", "startAt"}}),
    #"Added Custom" = Table.AddColumn(#"Renamed Columns", "URL", each JIRA_URL & "/rest/api/2/search?maxResults=100&jql=" & TESTCASE_QUERY & "&startAt=" & Text.From([startAt])),
    data = List.Transform(#"Added Custom"[URL], each Json.Document(Web.Contents(_))),
    #"Converted to TableQuery" = Table.FromList(data, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded ColumnIssues" = Table.ExpandRecordColumn(#"Converted to TableQuery", "Column1", {"issues"}, {"issues"}),
    #"Expanded issues" = Table.ExpandListColumn(#"Expanded ColumnIssues", "issues"),
    #"Expanded issues1" = Table.ExpandRecordColumn(#"Expanded issues", "issues", {"id", "key", "fields"}, {"id", "key", "fields"})
    in
    #"Expanded issues1"

    I tried using the relative path as well, but I was not able to solve the issue. Is it because I have also used another, dynamically built query inside: #"Added Custom" = Table.AddColumn(#"Renamed Columns", "URL", each JIRA_URL & "/rest/api/2/search?maxResults=100&jql=" & QUERY & "&startAt=" & Text.From([startAt])), data = List.Transform(#"Added Custom"[URL], each Json.Document(Web.Contents(_)))?

  69. Pingback: Connecting to Harvest in Power Query – JenKuntz.ca

  70. Pingback: Connecting to Harvest in Power Query - Microsoft Dynamics GP Community

  71. Pingback: Connecting to Harvest in Energy Question – Ahosti – Latest IT News Trends

  72. Pingback: powerbi Web.Contents Error 400 - Boot Panic

  73. Hi Chris! Very useful info.
    Nevertheless, I still cannot figure out why I can’t schedule the refresh from the Power BI Service. The API call implemented in my query is as follows, and I am still getting the error about dynamic data sources being incompatible with automatic refresh. Your guidance on how to fix my approach would be greatly appreciated:

    Source = let
    sitename = "RPATeamChile",
    listname = "Logs",
    baseurl = "https://teams.wal-mart.com/sites/" & sitename & "/_api/web/lists/GetByTitle('" & listname & "')/",
    itemcount = Json.Document(Web.Contents(baseurl & "ItemCount", [Headers=[Accept="application/json"]]))[value],
    skiplist = List.Numbers(0, Number.RoundUp(itemcount/5000), 5000),
    #"Converted to Table" = Table.FromList(skiplist, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Renamed Columns" = Table.RenameColumns(#"Converted to Table",{{"Column1", "Skip"}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns",{{"Skip", type text}}),
    fieldselect = "&$top=5000&$select=Title,process_id,state,retry,started,ended,message,custom_field_1,custom_field_2,custom_field_3,Attachments,AttachmentFiles,AttachmentFiles/ServerRelativeUrl&$expand=AttachmentFiles",
    Custom1 = Table.AddColumn(#"Changed Type", "Items", each Json.Document(Web.Contents(baseurl & "/items?$skipToken=Paged=TRUE%26p_ID=" & [Skip] & fieldselect, [Headers=[Accept="application/json"]]))),
    #"Expanded Items" = Table.ExpandRecordColumn(Custom1, "Items", {"value"}, {"value"}),
    #"Expanded value" = Table.ExpandListColumn(#"Expanded Items", "value")
    in
    #"Expanded value",
    #"Expanded value" = Table.ExpandRecordColumn(Source, "value", {"AttachmentFiles", "Title", "Attachments", "state", "retry", "started", "ended", "message", "custom_field_1", "custom_field_2", "custom_field_3", "process_id"}, {"value.AttachmentFiles", "value.Title", "value.Attachments", "value.state", "value.retry", "value.started", "value.ended", "value.message", "value.custom_field_1", "value.custom_field_2", "value.custom_field_3", "value.process_id"}),
    #"Added Custom" = Table.AddColumn(#"Expanded value", "screenshot", each try "https://teams.wal-mart.com/" & Record.Field([value.AttachmentFiles]{0},"ServerRelativeUrl") otherwise ""),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Skip", "value.AttachmentFiles", "value.Attachments"}),
    #"Renamed Columns" = Table.RenameColumns(#"Removed Columns",{{"value.Title", "reference"}, {"value.state", "state"}, {"value.retry", "retry"}, {"value.started", "started"}, {"value.ended", "ended"}, {"value.message", "message"}, {"value.custom_field_1", "custom_field_1"}, {"value.custom_field_2", "custom_field_2"}, {"value.custom_field_3", "custom_field_3"}, {"value.process_id", "process_id"}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns",{{"started", type datetime}, {"ended", type datetime}}),
    #"Replaced Value" = Table.ReplaceValue(#"Changed Type",null,1,Replacer.ReplaceValue,{"custom_field_1"}),
    #"Reordered Columns" = Table.ReorderColumns(#"Replaced Value",{"reference", "process_id", "state", "retry", "started", "ended", "message", "custom_field_1", "custom_field_2", "custom_field_3", "screenshot"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Reordered Columns",{{"custom_field_1", Int64.Type}})
    in
    #"Changed Type1"

    I can’t understand the code in detail.
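    The dynamic data source warning here almost certainly comes from the Web.Contents calls whose URLs are built by concatenation. A hedged sketch of how the paged call could be restructured using the options record (host, site, list and parameter names are copied from the comment above; the $skipToken value would still be built per row, inside the each of Table.AddColumn, from [Skip]):

    ```
    // Sketch: static base URL; the list path moves into RelativePath and
    // the OData parameters into the Query record
    Items = Json.Document(
        Web.Contents(
            "https://teams.wal-mart.com",
            [
                RelativePath = "sites/RPATeamChile/_api/web/lists/GetByTitle('Logs')/items",
                Query = [
                    #"$skipToken" = "Paged=TRUE&p_ID=" & [Skip],
                    #"$top" = "5000"
                ],
                Headers = [Accept = "application/json"]
            ]
        )
    )
    ```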

  75. Ok, I know it is 2022, you created this in 2016 but man… You just SAVED my life… Thank you so so much!!! You should be proud of it, congratulations! Thank you a million times. PS: this is obviously a bug… MS should fix this… And no, skip test connection didn’t work for me.
    Have I said thank you already? THANK YOU!
