Web.Contents(), M Functions And Dataset Refresh Errors In Power BI

One slightly frustrating feature of Power BI is that some of the cool stuff you can do in M code for loading data, and which works in Power BI Desktop (and in Power Query), causes errors when you try to refresh your dataset after it has been published to PowerBI.com. I recently learned some interesting tricks for working around these problems when you are using Web.Contents() and M custom functions, which I thought deserved a wider audience and which are the subject of this post; thanks are due to Curt Hagenlocher of Microsoft and Idan Cohen for sharing this information.

First of all, I recommend you read my previous post on using the RelativePath and Query options with Web.Contents() to get some background, not just on the M functionality I’ll be using but also on the web service I’ll be using in my examples.

Let’s look at an example of where the problem occurs. The following M query uses a function to call the UK government’s open data metadata search API multiple times and then return a result into a table:

let
    Terms = 
      #table(
       {"Term"},
       {{"apples"}, {"oranges"}, {"pears"}}
       ),
    SearchSuccessful = (Term) => 
    let
        Source = 
         Json.Document(
           Web.Contents(
             "https://data.gov.uk/api/3/action/package_search?q=" 
             & Term
            )
           ),
        Success = Source[success]
    in
        Success,
    Output = 
     Table.AddColumn(
       Terms, 
       "Search Successful", 
       each SearchSuccessful([Term])
      )
in
    Output

Here’s the output:

[Image: the output table, with a Search Successful column alongside each search term]

This is just a variation on the widely-used M pattern for using functions to iterate over and combine data from multiple data sources; Matt Masson has a good blog describing this pattern here. In this case I’m doing the following:

  • Defining a table using #table() with three rows containing three search terms.
  • Defining a function that calls the metadata API. It takes one parameter, a search term, and returns a value, taken from the JSON document returned, indicating whether the search was successful or not. What the API actually returns isn’t relevant here, though, just the fact that I’m calling it. Note how the URL passed to Web.Contents() is constructed by simply concatenating the base URL with the string passed in via the custom function’s Term parameter.
  • Adding a custom column to the table returned by the first step, and calling the function defined in the second step using the search term given in each row.

This query refreshes with no problems in Power BI Desktop. However, when you publish a report that uses this code to PowerBI.com and try to refresh the dataset, you’ll see that refresh fails and returns a rather unhelpful error message:

Data source error Unable to refresh the model (id=1264553) because it references an unsupported data source.

The problem is that when a published dataset is refreshed, Power BI does some static analysis on the code to determine what the data sources for the dataset are and whether the supplied credentials are correct. Unfortunately in some cases, such as when the definition of a data source depends on a parameter passed to a custom M function, that static analysis fails and therefore the dataset does not refresh.

The good news is that when, as in this case, the data source is a call to Web.Contents() then Power BI only checks the base url passed into the first parameter during static analysis – and as my previous blog post shows, by using the RelativePath and Query options with Web.Contents() you can leave the value passed to the first parameter as a static string. Therefore, the following version of the query does refresh successfully in Power BI:

let
    Terms = 
      #table(
       {"Term"},
       {{"apples"}, {"oranges"}, {"pears"}}
       ),
    SearchSuccessful = (Term) => 
    let
        Source = 
         Json.Document(
           Web.Contents(
             "https://data.gov.uk/api/3/action/package_search", 
             [Query=[q=Term]]
            )
           ),
        Success = Source[success]
    in
        Success,
    Output = 
     Table.AddColumn(
       Terms, 
       "Search Successful", 
       each SearchSuccessful([Term])
      )
in
    Output
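
The RelativePath option can be used in the same way when the dynamic part of the url is in the path rather than the query string. As a minimal sketch (the way the url is split here is purely illustrative – the important point is that the first parameter stays a static string), the call inside the custom function could be written as:

Web.Contents(
 "https://data.gov.uk/api", 
 [RelativePath="3/action/package_search", Query=[q=Term]]
)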

This technique will only work if the url passed to the first parameter of Web.Contents() is valid in itself, is accessible and does not return an error. But what if it isn’t? Luckily there’s another trick you can play: when you specify the Query option it can override parts of the url supplied in the first parameter. For example, take the following expression:

Web.Contents(
 "https://data.gov.uk/api/3/action/package_search?q=apples", 
 [Query=[q="oranges"]]
)

When static analysis is carried out before dataset refresh, the url

https://data.gov.uk/api/3/action/package_search?q=apples

…is evaluated. However, when the dataset is actually refreshed, the search term in the Query option overrides the search term in the base url, so that the call to the web service that is actually made, and whose data is used by the query, is:

https://data.gov.uk/api/3/action/package_search?q=oranges

This means you can specify a base url that isn’t really the url you want to call, purely so that static analysis succeeds, and then use the Query option to construct the url you really want to use.
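
Putting this together with the earlier example, the custom function could be rewritten as follows – just a sketch, where the ?q=apples search term in the base url is a dummy value that exists only to keep static analysis happy and gets overridden at refresh time:

SearchSuccessful = (Term) => 
let
    Source = 
     Json.Document(
       Web.Contents(
         "https://data.gov.uk/api/3/action/package_search?q=apples", 
         [Query=[q=Term]]
        )
       ),
    Success = Source[success]
in
    Success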

Of course this is all a bit of a hack and I’m sure, eventually, we’ll get to the point where any M code that works in Power BI Desktop and/or Power Query works in a published report. However it doesn’t sound as though this will be happening in the near future so it’s good to know how to work around this problem. I wonder whether there are other, similar tricks you can play with functions that access data sources apart from Web.Contents()? I need to do some testing…

UPDATE: The “Skip Test Connection” option on Power BI data sources, added in April 2019, solves some of the problems you run into where you can’t use RelativePath or Query to construct the url. See https://blog.crossjoin.co.uk/2019/04/25/skip-test-connection-power-bi-refresh-failures/

113 responses

  1. Really interesting post, as always Chris. I am actually struggling with apparently the same problem, in this case with the AzureEnterprise.Contents() function. If you want more details in order to test it, just mail me. Regards!

  2. I recently stumbled across this problem when deploying a PowerQuery to our SharePoint site to run with Excel Services. I’m going to try this technique instead of passing a static url with the query embedded. It might work! Thanks for your posts!

  3. I stumbled upon this technique myself, the relativePath and Query arguments are well hidden in the documentation for sure.

    Iterating over many requests can help with paged API requests for large datasets, just remember Power BI is asynchronous so the data may not come back in order.

    Merging two data sources can still be a problem on refresh. I get around this by having one query pull in the original data and then starting a second table with:

    let mergedTable = OriginalTable

    You can now merge as many tables from different API sources together without the refresh complaining.

  4. Fun fact: I added RelativePath as an option to Web.Contents solely to provide a workaround for this problem.

  5. Pingback: Dashboards, M, NFL and more... - Roundup #58 | Guy in a Cube

  6. Hello Chris, thank you for this amazing post: it’s been very helpful because I am stuck in this precise scenario (with a slight twist).
    I am trying to parametrize the access query to files inside a specific OneDrive folder.

    I can get the list of all files inside my OneDrive easily with
    src = SharePoint.Files(my_organization_url, [ApiVersion = 15])
    and then get my folder
    filtered = Table.SelectRows(src, each ([Folder Path] = "my_folder_path"))
    I then select two columns, "Folder Path" and "Name"
    select_cols = Table.SelectColumns(filtered, {"Folder Path", "Name"})
    and merge them to get my pathname list
    file_path_names = Table.CombineColumns(select_cols, {"Folder Path", "Name"}, Combiner.CombineTextByDelimiter("", QuoteStyle.None), "filePathName")

    At this point, I am ready to apply my function fxExtract to [filePathName]:
    let
    fxExtract = (filepathname as text) as table =>
    let
    data = Excel.Workbook(Web.Contents(filepathname), null, true)
    in
    data
    in
    fxExtract
    by adding a custom column like this:
    extract_data = Table.AddColumn(file_path_names, "wb_data", each fxExtract([filePathName]))

    After reading your post everybody knows that this will work in Power BI Desktop but it will not support refresh in Power BI Service (I also verified it by myself).

    So I tried to follow your suggestion, decomposing the Web.Contents part, but I encountered two blocking problems.

    FIRST PROBLEM – Base URL
    You said that, in order for it to succeed, the first part of the URL must be a valid URL.
    This means I cannot use any “folder” URLs (by omitting the file name in the URL), because then authentication will fail and this in turn will make Web.Contents fail as well.

    (P.S.: I have tried with several paths in my OneDrive and it seems Power BI would not let you connect to them. If you use Web as a source, it will only accept full file path names, not folder path names.)

    Do you have any idea, in light of this, on how to setup the base URL in this case?

    SECOND PROBLEM – Parametrization using Query option
    Even if I figured out the first problem, I would still be stuck on how to pass the file names to the base URL by using the Query option.
    Let’s consider your example for clarity’s sake:
               Web.Contents(
                 working_base_url,
                 [Query=[q=filename]]
                )
    This is obviously wrong in my case, because ‘q’ has been specified as a parameter, so the resulting URL will be:
    working_base_url/?q=filename
    while I am looking for the following form:
    working_base_url/filename

    I actually can’t find any documentation on how to use the optional Query record, but I have a feeling this must be possible to accomplish.
    I would try with [Query=filename] but I first need to solve the first problem.

    Sorry for the lengthy comment and thanks for reading, any help would be greatly appreciated.
    I believe this is an interesting technique that could be useful to many people.

    P.P.S.: I also started to approach the same problem through Microsoft Graph API calls, but I am not sure it’s the way to go because of the ApiKey. I think it might be more of a hassle than a real solution. Nonetheless I actually got to list the files in the folder, but I still need to get the binaries:
    https://graph.microsoft.com/beta/users(USER_ID)/drive/root/children(USERFOLDER_ID)

    • Hmm, this looks like a difficult problem. I have no idea what the answer to either of these questions could be – I would need to spend quite a lot of time testing to try to find out. When I have some free time I’ll try to take a look but it will be some time before I’ll be able to do this I’m afraid.

  7. Hi Chris,
    thanks for your reply.
    It is indeed a tricky scenario, and I posted it to you almost exclusively for informative reasons.
    I would dig deeper myself, but I cannot find any additional reference other than this:
    https://msdn.microsoft.com/en-us/library/mt260892.aspx

    In this MSDN reference there is no focus on parameters and I really don’t know what I am touching there.
    By the way, I tried with [Query=filename] and it doesn’t work (expected I would say).

    I would be more than happy If you can point me towards more thorough reference or documentation, and if there isn’t any I’ll write directly to Microsoft.

    Thanks for your time,
    If I make progress I’ll report back here, this technique could be quite useful.

    • I can’t resist a challenge, and so even though there are thousands of more important things I should be doing I just spent several hours solving this problem 🙂 I will write up the solution in a day or two in a separate blog post.

      • Chris, this is amazing!
        Your reply really made my day, and I can’t wait to read your take on this issue.

        If this technique proves to be feasible, Power BI users can basically use an arbitrary OneDrive folder as a data source that supports scheduled refresh: this would simply awesome (M language is awesome!)

        Thanks again for the kind interest you showed for this issue.

        Andrea

      • Here are a few hints while you wait… Don’t try using Web.Contents(), I don’t think there’s a way it can be made to work here. Instead, because SharePoint.Files() returns a link to the content of each file (in the Content column), you just need to create a function that takes a binary, turns it into an Excel workbook and extracts the table/range/sheet you need, then call that function in a calculated column on the table returned by SharePoint.Files(), passing the Content column to your function, then finally expand out the resulting column to get at the data.
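
        A minimal sketch of what that might look like (the sheet name "Sheet1", the site url and the folder path are all hypothetical placeholders):

        let
            // turn the binary from the Content column into an Excel workbook and pick out one sheet
            GetSheet = (FileContents as binary) as table =>
                Excel.Workbook(FileContents, null, true){[Item="Sheet1", Kind="Sheet"]}[Data],
            Source = SharePoint.Files("https://myorg-my.sharepoint.com/personal/someuser", [ApiVersion = 15]),
            Filtered = Table.SelectRows(Source, each [Folder Path] = "my_folder_path"),
            // no url is built dynamically here - the function is applied to the Content column instead
            AddData = Table.AddColumn(Filtered, "wb_data", each GetSheet([Content]))
        in
            AddData

        You would then expand the wb_data column to get at the data itself.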

      • Hello Chris,
        following your hints I was able to develop the solution on my own with two Functions and a Query.
        The dataset now offers the option to schedule updates, this is absolutely fantastic.

        I won’t spoil anything for your readers while we wait for the post, but I have already noted down a couple of observations to make.
        I will wait for the post to appear and use that comment section.

        As always, I want you to know that your work is greatly appreciated.
        Thank you very much for the help!

  8. Chris,
    Thanks for this excellent post – I am trying it now to get round a similar problem in PowerBI.

    In my case I have defined a separate function which I call from another table.
    I have also been using one of your other posts on monitoring performance to see how long the API calls take to run as there is some concern over the impact of my API calls on the server.
    What I’m a bit puzzled by is why PowerBI seems to call the API twice when the function is called. I have included a fragment of the trace analysis below.
    I wasn’t sure if it was because of the way I was invoking the function or how Web.Content retrieves the data.

    I was wondering if you had ever encountered something similar?

    Many thanks again for your great post and apologies if the question is out of place.

    Date Command Request Duration
    2016-09-15T10:22:28.8108993 Engine/IO/Web/Request/GetResponse RequestUri:https://some url 00:00:16.4505802
    2016-09-15T10:22:14.8284810 Engine/IO/Web/Request/GetResponse RequestUri:https://some url 00:00:13.9823147

  9. Pingback: Loading Data From Multiple Excel Workbooks Into Power BI–And Making Sure Data Refresh Works After Publishing – Chris Webb's BI Blog

  10. Wondering how this can be leveraged for a function that dynamically iterates over Facebook post ids. E.g. in the code below, how do we pass PostID with the Query option like you showed in your blog?

    Source = Json.Document(Web.Contents("https://graph.facebook.com/v2.7/"
    & PostID
    & "/insights/post_engaged_users?access_token="
    & AccessToken)),

    • Try as below:
      Var = AccessToken,
      Json.Document(Web.Contents("https://graph.facebook.com/v2.7/"
      & PostID
      & "/insights/post_engaged_users", [Query=[access_token=Var]]))

      • Thanks @Sachin, but it didn’t work. I think, as Chris mentioned, it tries to evaluate the URL as it is and it fails as it won’t be a valid URL with the “& PostID” in it. Not sure though, but it didn’t work.

      • I am wondering if anyone managed to refresh the Facebook data on the Power BI service by any chance.

  11. Thank You Chris. I was puzzled by this error while working for a customer. The error message thrown by Power Bi online was not at all helpful. Thanks for posting this lovely tip.

  12. Hi guys, same issue here….

    I am pulling paginated rows from an API (roughly 500 pages, 50 rows each) via Web.Contents(). The call also needs to contain an access token.

    I wonder if someone has already figured out a workaround which could be implemented by a non-programmer?

    Basically, my steps are:
    1. I call API to see how many pages of data there are.
    2. I get the number.
    3. Based on the number, I create the exact number of the “paginated URLs”, I attach access token and I call all those URLs.
    4. I wait.
    5. Voila, my data is in Excel.

    This obviously works in Excel or PBI for Desktop but does not work in PBI Online. I thought I was going to get a heart attack. I spent so many hours on the report only to learn that this kind of “data sourcing” is not supported in PBI Online… Terrible disappointment…

    Thx for any help!

    • It’s hard to say without knowing a lot more about your API, but there’s no reason why this should not be possible if you can use Web.Contents() in the way I describe in this post.

      • Does not seem to work… I basically took your sample query, made very small changes to convert it to my scenario, published to Power BI and still got the same error… Data source cannot be refreshed. I wonder if the token part is causing it.

        The code is:
        let
        Terms =
        #table(
        {"Term"},
        {{"1"}, {"2"}, {"3"}}
        ),
        ApiCall = (Term) =>
        let
        Source =
        Json.Document(
        Web.Contents(
        "https://api.callrail.com/v2/a/SOMEIDSOMEID/calls.json?date_range=all_time&per_page=250&page="
        & Term
        & "&fields=company_name,keywords,landing_page_url,gclid", [Headers=[Authorization="Token token=461817fcf20f88eae7c08028skfkshfhYYZSAMPLESAMPLE"]]

        )
        )
        in
        Source,

        #"XX - Call Pages" =
        Table.AddColumn(
        Terms,
        "Search Successful",
        each ApiCall([Term])
        ),
        #"Expanded Search Successful" = Table.ExpandRecordColumn(#"XX - Call Pages", "Search Successful", {"page", "per_page", "total_pages", "total_records", "calls"}, {"page", "per_page", "total_pages", "total_records", "calls"})
        in
        #"Expanded Search Successful"

      • Yes, this is a known issue: only the url in the first parameter is evaluated during the tests, so authentication will fail because it doesn’t pass the token in the header.

    • Hey Daniel – wanted to touch base, as I started working with CallRail today as well, to see if you were ever able to find a viable solution. Considering we’re both using v2 – I’m going to say that’s a big “NO” – but if you had any weird workarounds I would certainly be open to hearing about them if you did get it up and running.

      • Hi Alex,
        I do have a solution but it’s relatively major workaround… I pull the data from CallRail to Big Query and then from Big Query to PBI.

        Obviously, CallRail=>Big Query involved some development work outside PBI.

        I was also using a Google Apps script to pull data to Google Sheets before this solution, but I decided to go for BigQuery as storage, which is more stable (I don’t remember the exact reasons anymore, but Google Sheets were not reliable).

        DZ

  13. Hey Chris,

    I have a table in Power BI that contains a single column with ids I get from a previous API call. I’ve then added a custom column that makes a separate call for each row while appending the id to the end of the path. This worked in Desktop, but gave me the headache you’ve addressed in this post. So, now I have moved the id in my function out of the static path and into the RelativePath option of my Web.Contents call, and though my dataset will now refresh, the credentials cannot be verified without an id at the end of the base URL. This is similar to your example for replacing a Query, but it’s my understanding that RelativePath always gets appended to the end. Is that true and do you know of a workaround?

    Thanks

    • RelativePath always gets appended to the end, yes. It sounds like you have hit one of those scenarios where data refresh can’t be made to work, unfortunately – something I’ve run into rather too often myself.

  14. Thanks for sharing this technique. I’m trying to use a table of parameters rather than a hardcoded table and I can’t seem to get it to work in Power BI Desktop without doing the “Ignore the Privacy Levels” workaround. I want the query to be automatically updated in the Power BI Service, so I can’t use that workaround. I’m using a public API for Web.Contents() and the table of parameters is a file on SharePoint. I’ve tried creating the “staging” call per good practices, but that doesn’t seem to help. Is there a way to get this technique to work in the Power BI service without hardcoding the table?

      • Thanks! I read those posts and figured out a way for it to work. I have to (a) combine the queries into a single query thereby removing the “staging” workaround that others have used to avoid the Formula.Firewall() error and (b) match the privacy settings for the file on SharePoint and the external website. Doing either (a) or (b) is not enough to avoid the error. That doesn’t seem like the right implementation to me. I would think the “staging” call should be how it works in this case and you should be allowed to have different privacy settings. Am I missing something? Is there an equivalent to Value.NativeQuery() for Web.Content()?

      • You have no choice but to use Web.Contents(). I don’t fully understand what you’ve done from your description, though – can you explain what you mean about being allowed to have different privacy settings?

      • My understanding is the “staging” workaround allows us to have queries that rely on data sources with different privacy settings. In this case, I’d like to have the SharePoint file be “Organizational” and the public website be “Public”. That doesn’t seem possible here.

  15. Even though the Power BI desktop offers no error (with privacy settings set to the Combine Data Levels…), the Power BI service failed with this error: [Unable to combine data] Section1/State Legislation Summary/AutoRemovedColumns1 is accessing data sources that have privacy levels which cannot be used together. Please rebuild this data combination.

    It looks like the workaround about setting the Data Source Settings to the same (Organizational in my case) doesn’t seem to work.

    • Have you set the data privacy levels on the data source in the Gateway? You need to do this in the Manage Gateways screen in the browser when you set up the data source; it’s in the Advanced section at the bottom of the screen on the Data Source Settings tab.

  16. I’m not sure I understand. I’m not using a Data Gateway for this dataset — I’m using the Power BI service to host. The workaround I’m speaking about is setting the Data Source Settings in Power BI Desktop. Perhaps I misunderstood how that’s supposed to work?

      • OK, got it. I’m afraid I don’t know what’s going on here – although I did hear about a similar problem today, so when I get more details on it it may turn out that you are suffering from the same problem. As a test, can you install a Personal Gateway and see if that works?

  17. Hi Chris! Thanks for your amazing post. I’ve been trying to use this method with the Twitter API. However, I can’t get it to work. I believe this may be happening because my base URL is “https://api.twitter.com” which does not seem to be a valid one. Is this the problem? Do you know if there might be something to fix it? Thanks!!

    • It could be. However you’re going to have a lot of problems using the Twitter API anyway, for example with authentication, so I think it would be a better approach to build a separate app to get the data from Twitter and load it into a database, and then use the database as a source for Power BI.

  18. I can’t reply to the latest message in our thread, Chris, so apologies. Anyhow, I really appreciate your help here. I won’t have time to test this through a Personal Gateway so I’ll just have to deal with the behavior as-is for now.

  19. Pingback: [TFS] Visualizando Queries (WIQ) no PowerBI – ESX

  20. Pingback: Dynamic Web.Contents() and Power BI Refresh Errors – Data Inspirations

  21. Hi Chris

    Thanks for the post. I’m having the issue where I’m calling an API with a changing parameter (TestArea). It works in desktop but not Power BI Service as in your examples. The issue is that I can’t authenticate against the URL in Power BI service as it’s not valid. From some of the comments above, I’m interpreting this as something that cannot be solved. Is this correct? My example is below.

    (TestArea as text) as list =>

    let
    Source = Json.Document(Web.Contents("https://directapi.imaginarysite.com/api/v1/gateway/",
    [
    RelativePath= "" & TestArea & "/userLogins",
    Query=[dateStart="2017-01-01"]
    ]
    ))
    in
    Source

      • Hi Chris

        Is it possible to authenticate in the Power BI Service in the manner below? I’m still struggling to get what works in Power BI Desktop to work in the Power BI Service with changing URL parameters coming from an API. The below is a proposed solution I need to get my head around, where you authenticate against a base URL within Power BI Desktop that then applies to the entire model. I can’t get it to work however.

        Authenticate separately against the URL below in a generic function/somewhere in Power BI desktop:
        http://www.testsolution.com/auth

        and then use the below URL as the required data source (without the auth part):
        Web.Contents("www.testsolution.com",
        [RelativePath="/api/datsets/" & TestArea & "/location"])

        where “&TestArea&” is the changing parameter coming from the API.

        Thanks for the help and your blog.

  22. Hi there,

    I had a working solution to get data into a Power BI Desktop visual and it refreshed fine, but when it was put into the Power BI service and a refresh was tried I got the same problem you speak about above.
    Unfortunately I am unable to resolve the problem
    Code I am currently using which works fine in PowerBI Desktop:
    let
    SearchIssue = () =>
    let
    Source = fnAllJiraIssues("", 500, 1),
    table = Table.ExpandRecordColumn(Source, "Column1", {"id"}, {"id"}),
    Search = (id) =>
    let
    Issue = Json.Document(Web.Contents(#"URL (3)" & "/rest/api/2/issue/" & id & "?expand=changelog"))
    in
    Issue,
    Output = Table.AddColumn(table, "ChangeHistory", each Search([id]))
    in
    Output
    in
    SearchIssue

    Anybody know what I should change this to, to get it working In the PowerBI Service?

    I have written the following, which gives me a syntax error, only changing the Issue= line:

    Issue = json.Document(Web.Contents(“https://jira.co.uk”, [RelativePath=”/rest/api/2/issue/, [Query=[id=id, end=”?expand=changelog”]]] ))

    Or could I just explicitly enter the url as per the 1st one instead of referencing it in a parameter field?
    Issue = Json.Document(Web.Contents("https://jira.co.uk" & "/rest/api/2/issue/" & id & "?expand=changelog"))

  23. Hi Chris, this did solve the first error message I was receiving but I am now onto the 2nd error
    ” [Unable to combine data] Section1/Change/AutoRemovedColumns1 is accessing data sources that have privacy levels which cannot be used together. Please rebuild this data combination. Table: Change.”

    I have tried changing privacy settings to always ignore but still get the above error, any ideas?

  24. Hi Chris,

    I have been experimenting with Power BI for a few weeks (I am a newbie) and I am apparently stuck with the same problem. The ESI Funds open data REST API that I am connecting to provides two types of services. Lists

    (e.g.) Source = Json.Document(Web.Contents("https://opendata.itms2014.sk/v2/zonfp/zamietnute"))
    # Easy to get, can be converted to a table, refreshes fine in both Desktop and the Power BI service

    and Details. In order to get them I use the project id from the list as a parameter with the base URL https://opendata.itms2014.sk/v2/zonfp/zamietnute/{projekt id}

    (e.g.) Source = Json.Document(Web.Contents("https://opendata.itms2014.sk/v2/zonfp/zamietnute/" & #"projekt id")),
    # with this I get a refresh error in the Power BI service

    I tried to implement your solution with the [Query] optional field, but the problem is that it apparently adds by default a “?” (question mark) to the URL and therefore it does not work with the service. The service supports only the GET method so my experiment with the [Content] optional field and POST web requests did not work either.

    I would appreciate any idea on how to get the Power BI service to refresh my details table automatically.

    Thanks,

    Tamas

  25. Hi Chris,

    I think I found a solution. Instead of using the [Query] optional field I constructed the parametrized URL using the [RelativePath] optional field:

    Source = Json.Document(Web.Contents("https://opendata.itms2014.sk/v2/zonfp/zamietnute/", [RelativePath=projekt_id])),

    Fortunately the base URL is also valid in itself and now it refreshes also in the PowerBI service. I would not have solved it without your post, thank you very much!

    Tamas

  26. Hi Chris,

    I’m loading Dynamics 365 option sets using a function and I’m not able to refresh it in the PowerBI Service – hitting the issue you try to resolve in this blog.

    The M-code I’m currently using:

    Source = Json.Document(
    Web.Contents(
    "https://xxx.crm4.dynamics.com/api/data/v8.2/EntityDefinitions(LogicalName='"
    & OptionSetTable
    & "')/Attributes/Microsoft.Dynamics.CRM.PicklistAttributeMetadata?$select=LogicalName&$filter=LogicalName%20eq%20%27"
    & OptionSetColumn
    & "%27&$expand=OptionSet"
    )
    ),

    I’ve been reading through the comments (and testing some suggestions I’ve read in the blog and comments) and I’m afraid it won’t be possible to solve my case, but I just want to double check if there is any solution.

    Thanks in advance,

    Ilse

    • I don’t know if there is a solution here – as I say in the post, it will depend on whether you can use the RelativePath option and still have a url that can be called without an error.

    • Hello Ilse, did you find a solution for the option sets? I’m still looking for a solid solution and was on the same path as you mentioned here.
      It would be so great to have D365 option sets better supported in PowerBI. With ODATA v4.0 it is of course possible to extend the records in M, but performance will be BAD.

    • Hi Chris, thanks for the suggestion. I will definitely check that one out. For getting data out of D365 there is a better solution now: using the CDS (Common Data Service) connector instead of the D365 connector.
      It has better performance and ……whoohooo…… has both option value and option label in the result set.

  27. Hi Chris,

    Would you mind looking at this code? I am wondering if I am wasting time trying to get this resolved??

    I am calling an API from an accounting package and I can get a list of page url’s using the BuildPageUrls line.

    The API returns some metadata including a nextpagelink with the relevant stop and skip which maybe I could use somehow?

    The code below works fine in the service and Desktop as far as refreshing goes, up to the point where I want to call the page urls and grab each page of the table data.

    Are there any changes to the code you can suggest? I am not 100% sure how I can modify the CallEachPage line to get the data.

    Here is my sample code to date

    let
    Source = Json.Document(Web.Contents("http://localhost:8080/AccountRight/",
    [RelativePath = "e76e28f2-87a6-44f9-be7e-93a88f8799bc/GeneralLedger/JournalTransaction/?api-version=v2",
    Query = [q="api-version=v2"]
    ]
    ) ),
    PageCount = Number.RoundUp(Source[Count]/1000),
    Pages = {0.. PageCount-1 },
    ToTable = Table.FromList(Pages, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    BuildPageUrls = Table.AddColumn(ToTable, "GetPages", each if [Column1] = 0 then
    BaseUrl & CFUID & Resource & Apiversion & "$stop=1000" else
    BaseUrl & CFUID & Resource & Apiversion & "$stop=1000" & "&$skip=" & Text.From( [Column1] * 1000)),
    DataTypes = Table.TransformColumnTypes(BuildPageUrls,{{"GetPages", type text}}),
    CallEachPage = Table.AddColumn(DataTypes, "CallPages", each Json.Document ( Web.Contents ([GetPages]))) /*page call fails in service*/
    in
    CallEachPage

    Is there a workable solution in this case out there? or would a custom connector achieve the result required?

    I can get a list of page urls, so could I use that data somehow to grab the data in a different manner to avoid the issue?

    Thanks for your help in this space, I can see it’s a real issue for a lot of users.

  28. Hi Chris,

    Really helpful blog post, thanks for your insights.

    It seems to me that your example works because you know in advance what “terms” you are going to pass to the web call. But what if you don’t know these terms?

    I have a similar issue with paginating a response from Airtable’s API. I have tried to think about how I can modify my code so that it doesn’t fail Power BI’s static analysis, but I have drawn a blank.

    Here is the code I’m starting with…

    let
    Pagination = List.Skip(List.Generate( () => [Last_Key = "init", Counter=0], // Start Value
    each [Last_Key] <> null, // Condition under which the next execution will happen
    each [ Last_Key = try if [Counter]<1 then "" else [WebCall][Value][offset] otherwise null, // determine the LastKey for the next execution
    WebCall = try if [Counter]<1 then Json.Document(Web.Contents("https://api.airtable.com/v0//?api_key=")) else Json.Document(Web.Contents("https://api.airtable.com/v0//?api_key=&offset=" & Last_Key & "")), // retrieve results per call
    Counter = [Counter]+1 // internal counter
    ],
    each [WebCall]
    ),1),

    The way you iterate through the API call is to pull the first page of results and then use the offset key from the end of the results in the next call. You have to go through each page to get to the next key. These change with every call.

    The first thing I need to understand is whether the reason the power BI service won’t refresh the dataset is because of the static analysis test, or because of something else that this code does.

    If the former, is it possible to somehow rewrite this using your technique so that the service can be tricked into running this code?

    In your example you begin by creating a table of terms to send to the web call. I wondered about trying to do something similar by creating a table of the offset keys and then passing each one to the web call. But in order to generate such a table you would need to iterate through the whole API call in order to retrieve the keys and you would end up using a similar piece of code to the above, in order to achieve this.

    Any help would be greatly appreciated.

    Chris

    • Looking at your code I would say yes, the reason it’s failing is the static analysis, because you’re trying to generate the entire url dynamically. The fact that the key changes with each call is not a problem, and neither is the fact that you might be passing different parameters with each call. You should be able to rewrite your code using the technique in this post and make refresh work.
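
      A minimal sketch of the kind of rewrite I mean – the Airtable url, api_key and offset handling below are just placeholders based on your description, and the base url still has to respond without an error when it is called on its own:

      GetPage = (optional Offset as text) =>
          Json.Document(
              Web.Contents(
                  "https://api.airtable.com/v0/yourapp/yourtable",
                  // the first parameter stays static; everything dynamic goes into the Query option
                  [Query = if Offset = null
                           then [api_key = "yourkey"]
                           else [api_key = "yourkey", offset = Offset]]
              )
          )

      Your List.Generate() loop can then call GetPage() with the offset returned by the previous page instead of concatenating it into the url.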

  29. Hi Chris,

    Thank you very much for posting this. It is very helpful. I’m facing a similar, although somewhat different issue and would very much appreciate help.

    I’m getting data either through Web or OData Feed via the following query:

    https://dummy.com/API/Dsl/v1/Attributes(ElementId=**********,%20Name='TimeSeriesValue')/RetrieveValuesTimeSliced(Start=2018-10-19T18:30:00Z,End=2018-10-25T00:00:00Z,TimeSpan='PT10M')

    in which the element ID (*****), start date (2018-10-19T18:30:00Z) and end date (2018-10-25T00:00:00Z) are dynamic. The end date is the current time, the start date is the current time minus 7 days, and I have ca. 200 different IDs that I import from an Excel file into one column in Power BI.

    I solved this by creating 3 parameters (GUID, StartTime and EndTime) and a following function:

    let
    Source = (GUID as text, StartTime as text, EndTime as text) => let

    Source = OData.Feed("https://dummy.com/API/Dsl/v1/Attributes(ElementId=" & GUID & ",%20Name='TimeSeriesValue')/RetrieveValuesTimeSliced(Start=" & StartTime & ",End=" & EndTime & ",TimeSpan='PT10M')")

    in
    Source
    in
    Source

    Finally, I invoke this custom function to a table containing actual IDs, start and end times, and get my data imported. This works like a charm in Power BI Desktop, no issues. Also refresh works in Power BI Desktop, no problem.

    However, when I publish to Power BI web service, refresh doesn’t work anymore. I read many posts online, but wasn’t able to find a solution for my problem.
    Your url example “https://data.gov.uk/api/3/action/package_search?q=” contains “?” that can be exchanged by “Query”, but I’m not sure if and how could I do this for my example.

    Thanks a lot in advance,
    Ivana

      • I’ve converted this:

        Source = OData.Feed
        ("https://dummy.com/API/Dsl/v1/Attributes(ElementId="
        & GUID &
        ",%20Name='TimeSeriesValue')/RetrieveValuesTimeSliced(Start="
        & StartTime &
        ",End="
        & EndTime &
        ",TimeSpan='PT10M')")

        Into:

        Source = OData.Feed("https://dummy.com/API/Dsl/v1/Attributes(ElementId=", [Query = [GUID], RelativePath = ",%20Name='TimeSeriesValue')/RetrieveValuesTimeSliced(Start=", Query = [StartTime], RelativePath=",End=", Query = [EndTime], RelativePath=",TimeSpan='PT10M')"])

        But unfortunately, it doesn’t work 😦 Do you have any suggestion on how to convert the query properly?

      • Hmm, I’m afraid not. I was looking at this myself recently, and I hadn’t realised that the Query option for OData.Feed() doesn’t work in the same way as the Query option for Web.Contents() – it can only be used for custom query options, not system query options.

  30. Thanks so much Chris for the great article. I was struggling for some time over this. It has really opened a big avenue for me: now, for the first time, I am able to run loops and publish the report for auto refresh. Thanks so much.
    emudria

  31. Pingback: Credentials, Data Privacy Settings And Data Sources In Power Query/Power BI « Chris Webb's BI Blog

  32. Pingback: Fourteenth Blog Birthday « Chris Webb's BI Blog

  33. Does this technique mean the URL can’t be a parameter anymore?

    Web.Contents(
    "https://data.gov.uk/api/3/action/package_search?q=apples",
    [Query=[q="oranges"]]
    ) works fine

    BUT not

    Web.Contents(#"My Data Source")

  34. Brilliant write-up, thanks. Are you aware of any changes to the behaviour for online refreshes to help prevent hand-coding around this? I find myself writing to Web.Contents sources a lot – and as you note, examples like loops for pagination (e.g. "http://www.anyweb.com/results/page=" & MyParmHere, where MyParmHere is a table, each {1…100}) fail online.

  35. My base url is behaving strange…
    The full url I am trying to access is “https://www.acompany.co.il/odata/abc.ini/xyz/accounts” (where xyz is a parameter)
    I have tried this:
    Source = Json.Document(Web.Contents("https://www.acompany.co.il/odata/abc.ini/",
    [RelativePath="xyz/accounts"])),
    In Desktop works great!
    But in the service I get DataSource error. It asks for user/password for:
    “https://www.acompany.co.il/odata/abc.ini” but doesn’t accept the credentials.
    I have tried the url directly in Chrome and this is SOOO strange:
    This works: “https://www.acompany.co.il/odata/abc.ini/” – it is accepting my credentials
    While this doesn’t: “https://www.acompany.co.il/odata/abc.ini” !!!
    All the difference is in the last SLASH!!!!
    I have tried different syntax but in the service it always asks for credentials without the slash at the end – so it fails…
    Also I notice that in the service it recognizes this url as OData (https://www.acompany.co.il/odata/abc.ini/xyz/accounts)
    while this url – as Web
    (https://www.acompany.co.il/odata/abc.ini)
    And it is always shown without a slash in the end, so I cannot log in…
    Please help!
    Thank you!

    • Strange – I don’t know the answer to this, sorry. It must be something to do with the way Power Query stores credentials and permissions (which is related to the url).

  36. Chris, I just want to make sure I understand the technique you are using here. I have, for example, a report that makes two API calls: one to recursively get a weather radar station location and then a second to retrieve forecast data from an API based on the url returned by the first.

    I apologise for the sloppy M code queries below, as this is my first venture into using M within Power BI, so I am still somewhat learning its capabilities.

    My queries are as follows;

    = (Latitude as text, Longitude as text) => let
    Source = Json.Document(Web.Contents("https://api.weather.gov/points/" & Latitude & "," & Longitude, [Timeout=#duration(0, 0, 2, 0), Headers=[token="xIZyacjOaGCxWLiGjTprTdAhcXzSclmT"]])),
    #"Converted to Table" = Record.ToTable(Source),
    #"Transposed Table" = Table.Transpose(#"Converted to Table"),
    #"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table", [PromoteAllScalars=true]),
    #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"@context", type any}, {"id", type text}, {"type", type text}, {"geometry", type any}, {"properties", type any}}),
    #"Expanded properties" = Table.ExpandRecordColumn(#"Changed Type", "properties", {"forecastOffice", "forecast", "county"}, {"properties.forecastOffice", "properties.forecast", "properties.county"})
    in
    #"Expanded properties"

    and then

    = (ForecastUrl as text) => let
    Source = Json.Document(Web.Contents("" & ForecastUrl)),
    properties = Source[properties],
    #"Converted to Table" = Record.ToTable(properties),
    Value = #"Converted to Table"{7}[Value],
    #"Converted to Table1" = Table.FromList(Value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table1", "Column1", {"number", "name", "startTime", "endTime", "isDaytime", "temperature", "temperatureUnit", "temperatureTrend", "windSpeed", "windDirection", "icon", "shortForecast", "detailedForecast"}, {"Column1.number", "Column1.name", "Column1.startTime", "Column1.endTime", "Column1.isDaytime", "Column1.temperature", "Column1.temperatureUnit", "Column1.temperatureTrend", "Column1.windSpeed", "Column1.windDirection", "Column1.icon", "Column1.shortForecast", "Column1.detailedForecast"})
    in
    #"Expanded Column1"

    Any suggestions on how I might alter these two M queries to comply with your suggestion?

  37. Hi Chris, this post is so tantalizingly close to solving our problem. I am helping a client work with data from Airtable to power dashboards in Power BI Service. We are both new to Power BI, but I have a fair amount of experience with web APIs and Airtable.
    You mention that Power BI will analyze the first parameter (the url) passed to Web.Contents, like https://data.gov.uk/api/3/action/package_search?q=apples. However, when we use a dataset with a query parameter like that and try to schedule a refresh, Power BI seems to choke. In our particular case, if we remove the parameter, the url returns unauthorized. Are we publishing the data set or dashboard incorrectly? The key lines look like:

    if [Counter]<1
    then Json.Document(Web.Contents("https://api.airtable.com/v0/app3jBqxbLcBLRS9R/Evaluations?api_key=****"))
    else Json.Document(Web.Contents("https://api.airtable.com/v0/app3jBqxbLcBLRS9R/Evaluations?api_key=****", [offset=[WebCall][Value][offset]]])),

    Any clues might help. Or, if you do consulting work and have 2-3 hours available, I’d love to get your expertise. Thanks, great post in any case!

  38. Hi Chris,

    I’m working on an application where I need to use API calls for detailed data constructed based on data from previous calls. Of course, I’ve faced the problem where Power BI Service didn’t allow to perform such a refresh with a given dataset because my data sources were dynamically constructed.

    RelativePath saved the situation – I think it would be good if you referenced it in your other post where you resolved a similar problem using the oranges/apples query search parameters.

  39. Hello Chris,

    First I want to thank you for the many useful posts you have made. I’ve read this post but I’m not able to solve my problem as it is quite different from your example. Maybe you could help if possible? In Power BI Desktop my code works perfectly.

    First of all I want to fetch my access token from Zendesk with the required credentials. For this I’ll probably need to change my path?

    let
    Source = (Subdomain as text, ClientID as text, ClientSecret as text, LoginName as text, Password as text) =>
    let
    ZendeskAccessTokenContent = [
    grant_type = "password",
    client_id = ClientID,
    client_secret = ClientSecret,
    scope = "read",
    username = LoginName,
    password = Password
    ],
    ConvertZendeskAccessTokenContent = Text.ToBinary(Uri.BuildQueryString(ZendeskAccessTokenContent)),
    ZendeskAccessTokenRequest = Web.Contents("https://" & Subdomain & ".zendesk.com/oauth/tokens",
    [
    Headers = [#"Content-Type"="application/x-www-form-urlencoded"],
    Content = ConvertZendeskAccessTokenContent
    ]
    ),
    ZendeskAccessTokenJSON = Json.Document(ZendeskAccessTokenRequest),
    GetZendeskAccessToken = ZendeskAccessTokenJSON[access_token]
    in
    GetZendeskAccessToken
    in
    Source

    As for getting my actual data, I work with a parameter in which I’ll look if there’s a next page to fetch the data. I guess when working with a parameter and paging, it makes it more difficult, no?

    let
    GetZDAccessToken = ZendeskAccessToken,
    GetURLParameter = (URL) =>
    let
    HTTPHeader = [Headers = [#"Authorization"="Bearer " & GetZDAccessToken, #"Content-Type"="application/json"]],
    ZendeskGroupsRequest = Web.Contents(URL, HTTPHeader),
    ZendeskGroupsJSON = Json.Document(ZendeskGroupsRequest),
    ZendeskGroupsList = @ZendeskGroupsJSON[groups],
    IterationOverNextPage = try @ZendeskGroupsList & @GetURLParameter(ZendeskGroupsJSON[next_page]) otherwise @ZendeskGroupsList
    in
    IterationOverNextPage,
    GetAllZendeskGroups = GetURLParameter("https://blablabla.zendesk.com/api/v2/groups.json"),
    ExpandListOfRecords = Table.FromRecords(GetAllZendeskGroups)
    in
    ExpandListOfRecords

    Is it possible to solve this, so I can automatically refresh my data in Power BI service?

    Kind regards

      • Hello Chris,

        I want to thank you, because it works for the simple queries like the authentication and such, even without skipping the test connection. Though, it still doesn’t work for my last query. I have a guess that it has to do with my iteration and the fact that I call my URL in a function? If I remove the iteration and use the relative path for "/api/v2/groups.json" it works, but as soon as I use this iteration, it fails.

        let
        GetZDAccessToken = ZendeskAccessToken,
        GetURLParameter = (URL) =>
        let
        HTTPHeader = [Headers = [#"Authorization"="Bearer " & GetZDAccessToken, #"Content-Type"="application/json"]],
        ZendeskGroupsRequest = Web.Contents(URL, HTTPHeader),
        ZendeskGroupsJSON = Json.Document(ZendeskGroupsRequest),
        ZendeskGroupsList = @ZendeskGroupsJSON[groups],
        IterationOverNextPage = try @ZendeskGroupsList & @GetURLParameter(ZendeskGroupsJSON[next_page]) otherwise @ZendeskGroupsList
        in
        IterationOverNextPage,
        GetAllZendeskGroups = GetURLParameter("https://blabla.zendesk.com"),
        ExpandListOfRecords = Table.FromRecords(GetAllZendeskGroups),
        GetSelectedColumns = Table.SelectColumns(ExpandListOfRecords, {"id", "name", "deleted", "created_at", "updated_at"}),
        RenameSelectedColumns = Table.RenameColumns(GetSelectedColumns, {{"id", "ID"}, {"name", "Name of Group"}, {"deleted", "Is Deleted"}, {"created_at", "Created at"}, {"updated_at", "Updated at"}}),
        ChangeTypeSelectedColumns = Table.TransformColumnTypes(RenameSelectedColumns, {{"ID", type text}, {"Name of Group", type text}, {"Is Deleted", type logical}, {"Created at", type datetime}, {"Updated at", type datetime}})
        in
        ChangeTypeSelectedColumns

        Kind regards

  40. Hello Chris,

    Is there any way to do something like this for ODBC.Query()?

    It seems like setting your DSN as a Parameter is not something the PBI Online Service is okay with.

    To prevent a ton of wasted work I just hardcoded the DSN name, but ideally I would be able to parameterize this in the future.

    Any feedback is greatly appreciated.

    Thanks,
    -Joey

      • Well, it does work to an extent:

        1.) It works as a function within PBI Desktop, but I get the following error message when I push it to schedule refresh in service: you can’t schedule refresh for this data set because the following data sources don’t support refresh: ODBC.Query() – I’ve determined this to be a false error message as it refreshes just fine as long as the DSN is hard coded.

        2.) It works as a parameterized query both in desktop and in the service, but this isn’t ideal as I have two (nearly) identical tables I need to pull from, a live table and an archive table – having a function ensures the two always match perfectly when I need to change something. I have narrowed the problem down to the DSN parameter because I have several parameters within these queries that still refresh fine as long as the DSN is hard coded.

        It is my understanding that the issue lies within PBI’s “test connection”. Essentially, when your query is a function the service takes your query as is (without our parameter injections) and gives the DB a test run, clearly, this will never work with a parameterized DSN. This gives me hope that “skip test connection” will solve my problem once it is fixed and made available (late June ETA), but I would still like to identify ways around the issue in the meantime.

        I have a hunch that “tricking” the service into testing a hard-coded “dummy” DSN is the way to go here, but I’m open to any method. I appreciate your help – your website has saved me countless times and despite your belief that your book is outdated, it has served as an incredible foundation for the work I do.

      • I think your best bet will be to wait until ‘Skip Test Connection’ is back. There may be a way to get it working before then but it might involve a lot of trial and error.

  41. Fair enough, thanks for your feedback.

    You’d think there would be an M function that allows you to pass “TRUE” to the connection attempt.

    Shame they made “Skip Test” GA and then took it away.

  42. Hi Chris,

    great post!

    It would be much appreciated to get your advice on the below query. We are struggling to pull all Project Sites risk data (they have custom columns) from a PWA:

    ——————————————————————

    let
    ProjectsList =
    #table(
    {"ProjectName"},
    {{"Implementation Project A"}, {"Infrastructure Project A"}, {"Infrastructure Project B"}}
    ),
    GetRisks = (ProjectName) =>
    let
    Risks = Json.Document(
    Web.Contents("https://***.sharepoint.com/sites/pwa/",
    [
    RelativePath=ProjectName & "/_api/web/lists/GetByTitle('Risks')/Items()"
    ])),
    Success = Risks[success]
    in
    Success,
    Output = Table.AddColumn(ProjectsList, "All Risks", each GetRisks([ProjectName]))
    in
    Output

    ——————————————————————

    The error we get is:

    ——————————————————————

    DataFormat.Error: We found extra characters at the end of JSON input.
    Details:
    Value=
    Position=0
    ——————————————————————

    Our idea is to get it working in Power BI Desktop and then publish it to Power BI Service to check if we can schedule refresh the report…

    Thanks in advance!

  43. Hi Chris,

    Great post – and it’s from 2016… amazing. I would love your help with an issue regarding this topic.

    I need to make a call to the following api:

    https://api.itmplatform.com/(CompanyName)/projects

    Using RelativePath I’m able to use the beginning of the url as the base URL

    Source = Json.Document(Web.Contents("https://api." & BaseURL, [RelativePath = "/" & CompanyName & "/projects", Headers=myToken]))

    The problem is that “https://api.itmplatform.com” by itself returns an error. It would work if it was “https://itmplatform.com”.

    Is it possible to add two RelativePath? Meaning something like this

    Source = Json.Document(Web.Contents("https://", [RelativePath = "api."], BaseURL, [RelativePath = "/" & CompanyName & "/projects", Headers=myToken]))

    Like this I think I would be able to solve the test connection.

    Thanks in advance!!

    • According to Microsoft:

      The connection string that we construct in M will not be identified in the service at the time of static analysis. Meaning, we try to look into M and pick out the connection strings (data source uris). But the ones we construct will come to life only when we actually execute that particular M script, which happens only during execution, not when we perform static analysis. I do not think refresh will be successful in this case.

      So case closed, it does not work.

  44. Pingback: Tracking AEMO data using PowerBI – Project Controls blog

  45. Hello Chris, Need some advice if possible.

    I’m running the following query fine in Power BI desktop, however when I publish it to PBI service and attempt to refresh I get the following error

    “Unable to refresh the model (id=6884172) because it references an unsupported data source.”

    (name as text) =>

    let
    Source = Json.Document(Web.Contents("https://goma-webgui-tmf.vodafone.com/prweb/PRRestService/api/v1/cases/" & name)),
    content = Source[content],
    …………..

    I’ve attempted to implement your solution above by changing the Web.Contents line to

    Option 1.)

    Source = Json.Document(Web.Contents("https://goma-webgui-tmf.vodafone.com/prweb/PRRestService/api/v1/cases",
    [Query=[name]]
    )),

    Option 2.)

    Source = Json.Document(Web.Contents("https://goma-webgui-tmf.vodafone.com/prweb/PRRestService/api/v1/cases",
    [RelativePath="/" & name]
    )),

    Option 1) gave me an error in Power BI Desktop. Option 2) worked in the Desktop version but gives the following error when trying to refresh in PBI online.

    Underlying error code-2147467259 Table: Cases List.
    Underlying error message[Unable to combine data] Section1/Cases List/AutoRemovedColumns1 references other queries or steps, so it may not directly access a data source. Please rebuild this data combination.

    Any help would be appreciated thanks.

    • This looks like a different problem, relating to data privacy settings (which I’ve blogged about a lot). Have you set Power BI Desktop to ignore data privacy checks?

      • Sorry to jump on this thread. I’m curious why you feel this might be a privacy issue. I’m asking because I am getting a similar error and feel I’ve ruled out Privacy Settings issues.

        The query I’m trying to run looks like this:

        Source = Table.Combine(List.Transform(list_parameters, each Xml.Tables(
        Web.Contents("https://" & Server_Name & "/esys/",
        [RelativePath="api/restxml/" & API_Name & "/",
        Query=_
        ]
        )))
        )

        Server_Name and API_Name are the name of the server and the name of the API to call.

        list_parameters is a list of records that contain the Query parameters. There are two – Project and BudgetType (as in Project=11111&BudgetType=BD).

        The query works in Desktop and in a Dataflow but balks with the error the other poster listed when it is run as a non-dataflow dataset refreshed through the Service.

        However, if I remove the part that tries to parse through the parameter list, I can refresh through the Service. That query looks like this:

        Source = Xml.Tables(
        Web.Contents("https://" & EcoSys_Server & "/ecosys/",
        [RelativePath="api/restxml/" & API_Name & "/",
        Query=params
        ]
        )
        )

        where params = [Project=#"Project Number"]

        Based on that, I don’t feel that there is a privacy setting issue, otherwise neither query would be refreshable.

        Any insight would be appreciated as I’m at a dead-end. The point of the query is to be able to dynamically handle the budget types since there can be anywhere from 4-16 and new ones can be added at any time.
