M · Power BI · Power Query · Web Services

The “Block length does not match with its complement” Error In Power Query In Power BI And Excel

When using the Web.Contents function to call a web service in Power Query, in either Power BI or Excel, you may encounter the following error:

DataFormat.Error: Block length does not match with its complement.

Here’s an example of an M query that calls a web service and, at the time of writing, gives this error:
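A minimal sketch of such a query, assuming the Open-Meteo forecast endpoint used in the examples later in this post:

    let
        Source = Json.Document(
            Web.Contents(
                "https://api.open-meteo.com/v1/forecast?latitude=52.52&longitude=13.41&hourly=temperature_2m"
            )
        )
    in
        Source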


The problem is caused by the way some web services handle “deflate” compression; apparently the same issue exists in the equivalent native .NET functionality. There are two ways to avoid it, though.

First of all, the easy way: ask the web service to use “gzip” compression instead. You can do this by setting the “Accept-Encoding” header to “gzip” like so:

    Headers = [
      #"Accept-Encoding" = "gzip"
    ]
Having done this the error will no longer occur and you can go on to handle the data returned by the web service (which is a JSON document) as normal.
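Putting this together, the full call might look something like the following sketch, which again assumes the Open-Meteo URL used elsewhere in this post:

    let
        Source = Web.Contents(
            "https://api.open-meteo.com/v1/forecast?latitude=52.52&longitude=13.41&hourly=temperature_2m",
            [Headers = [#"Accept-Encoding" = "gzip"]]
        ),
        JSON = Json.Document(Source)
    in
        JSON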

The second workaround is to use the Web.BrowserContents function instead of Web.Contents:
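A sketch of the call, assuming the same Open-Meteo URL as above:

    Source = Web.BrowserContents(
        "https://api.open-meteo.com/v1/forecast?latitude=52.52&longitude=13.41&hourly=temperature_2m"
    )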


This is more complicated because the Web.BrowserContents function returns the HTML for a web page as viewed by a web browser. Apart from issues like handling authentication, you’ll need to parse the result to get the data you need.

It’s doable using the Html.Table function though. Here’s an example of how to handle the response for this particular web service call:

  let
    Source = Web.BrowserContents(
      "https://api.open-meteo.com/v1/forecast?latitude=52.52&longitude=13.41&hourly=temperature_2m"),
    HTMLTable = Html.Table(Source, {{"JSONPayload", "pre"}}),
    JSON = Json.Document(HTMLTable{0}[JSONPayload])
  in
    JSON

[Thanks to Thais Marshall for bringing me this problem, and to Curt Hagenlocher for explaining it and showing me the first workaround]

8 thoughts on “The “Block length does not match with its complement” Error In Power Query In Power BI And Excel”

  1. I was playing with this just yesterday. It does solve the loading part, but making sense of the data was impossible: after the transformations the data looked off, with every hour transformed into the same hour and repetitive information that made no sense. I’ll opt out of using this and go clean up the data via a Python script instead. Thank you for the solution.

    1. Chris Webb says:

      Based on what you said, I think you made the mistake of expanding the two Record columns (hourly_units and hourly) independently, which does lead to duplication of data. I didn’t include the full query to get all the data in the post because it was slightly off topic, but here it is:

      let
        Source = Web.BrowserContents("https://api.open-meteo.com/v1/forecast?latitude=52.52&longitude=13.41&hourly=temperature_2m"),
        HTMLTable = Html.Table(Source, {{"JSON", "pre"}} ),
        #"Parsed JSON" = Table.TransformColumns(HTMLTable,{},Json.Document),
        JSON = #"Parsed JSON"{0}[JSON],
        #"Converted to Table" = Record.ToTable(JSON),
        #"Transposed Table" = Table.Transpose(#"Converted to Table"),
        #"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table", [PromoteAllScalars=true]),
        #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"latitude", type number}, {"longitude", type number}, {"generationtime_ms", type number}, {"utc_offset_seconds", Int64.Type}, {"timezone", type text}, {"timezone_abbreviation", type text}, {"elevation", Int64.Type}, {"hourly_units", type any}, {"hourly", type any}}),
        #"Expanded hourly_units" = Table.ExpandRecordColumn(#"Changed Type", "hourly_units", {"time", "temperature_2m"}, {"hourly_units.time", "hourly_units.temperature_2m"}),
        #"Expanded hourly" = Table.ExpandRecordColumn(#"Expanded hourly_units", "hourly", {"time", "temperature_2m"}, {"hourly.time", "hourly.temperature_2m"}),
        #"Added Custom" = Table.AddColumn(#"Expanded hourly", "Custom", each List.Zip({[hourly.time], [hourly.temperature_2m]})),
        #"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"hourly.time", "hourly.temperature_2m"}),
        #"Expanded Custom" = Table.ExpandListColumn(#"Removed Columns", "Custom"),
        #"Extracted Values" = Table.TransformColumns(#"Expanded Custom", {"Custom", each Text.Combine(List.Transform(_, Text.From), ","), type text}),
        #"Split Column by Delimiter" = Table.SplitColumn(#"Extracted Values", "Custom", Splitter.SplitTextByDelimiter(",", QuoteStyle.Csv), {"Custom.1", "Custom.2"}),
        #"Changed Type1" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Custom.1", type datetime}, {"Custom.2", type number}}),
        #"Renamed Columns" = Table.RenameColumns(#"Changed Type1",{{"Custom.1", "Hour"}, {"Custom.2", "Temperature"}})
      in
        #"Renamed Columns"

  2. Amazing! Thanks so much Chris. Before reading your post I managed to get the table I needed from this site, but I ran into the same issue of the expanded lists causing duplicates. I fixed it by expanding each list first, adding an index column and finally merging the lists into one table, which wasn’t ideal of course. Thanks to you I now have only one query. A true time saver!
