A few weeks ago I replied to a question on Reddit from someone who was experiencing extremely slow performance when importing data from a CSV file using Power Query. The original poster worked out both the cause of the problem and the solution themselves: they noticed that removing all the date columns made their Power Query query much faster, and that using the Date.FromText function and specifying the date format explicitly solved the problem. While I couldn't reproduce the extreme slowness that was reported, I was able to reproduce a performance difference between the two approaches, and Curt Hagenlocher of the Power Query team confirmed that this is expected behaviour.
Let’s see an example. I created a CSV file with five date columns and one million rows, then created a Power Query query to import this data into Power BI Desktop using the default M code generated by the Power Query Editor:
let
    Source = Csv.Document(
        File.Contents("C:\GenerateDates.csv"),
        [
            Delimiter = ",",
            Columns = 5,
            Encoding = 65001,
            QuoteStyle = QuoteStyle.None
        ]
    ),
    #"Promoted Headers" = Table.PromoteHeaders(
        Source,
        [PromoteAllScalars = true]
    ),
    #"Changed Type" = Table.TransformColumnTypes(
        #"Promoted Headers",
        {
            {"Extra Spaces", type date},
            {"Extra Spaces - 2", type date},
            {"Extra Spaces - 3", type date},
            {"Extra Spaces - 4", type date},
            {"Extra Spaces - 5", type date}
        }
    )
in
    #"Changed Type"

The dates in the CSV file were in the following format:
02  Jan   1901
…and this is important: there are two spaces between the day and the month name and three spaces between the month name and the year.
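As a quick sanity check (my own example, not from the original thread), Date.FromText should parse this layout as long as the format string reproduces the spacing exactly:

let
    // Hypothetical sample value using the two-space/three-space layout described above
    SampleText = "02  Jan   1901",
    // The literal spaces in the format string must match the spaces in the text
    Parsed = Date.FromText(SampleText, [Format = "dd  MMM   yyyy"])
in
    Parsed
// returns #date(1901, 1, 2)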
Using SQL Server Profiler I found that this query took around 14 seconds to run.
I then created a second query that, instead of using Table.TransformColumnTypes to set the data type on the columns, used Date.FromText and the Format option:
let
    Source = Csv.Document(
        File.Contents("C:\GenerateDates.csv"),
        [
            Delimiter = ",",
            Columns = 5,
            Encoding = 65001,
            QuoteStyle = QuoteStyle.None
        ]
    ),
    #"Promoted Headers" = Table.PromoteHeaders(
        Source,
        [PromoteAllScalars = true]
    ),
    // The format string reproduces the extra spaces in the source data exactly
    DateConversionFunction = (inputDate) as date =>
        Date.FromText(
            inputDate,
            [Format = "dd  MMM   yyyy"]
        ),
    ChangeDate = Table.TransformColumns(
        #"Promoted Headers",
        {
            {"Extra Spaces", DateConversionFunction, Date.Type},
            {"Extra Spaces - 2", DateConversionFunction, Date.Type},
            {"Extra Spaces - 3", DateConversionFunction, Date.Type},
            {"Extra Spaces - 4", DateConversionFunction, Date.Type},
            {"Extra Spaces - 5", DateConversionFunction, Date.Type}
        }
    )
in
    ChangeDate
This version of the query took around 10.5 seconds to run: not a huge improvement, but a noticeable one. It's certainly not the 6-7x performance improvement seen in the Reddit post, but I'm sure different data, different date formats and different hardware might result in bigger differences.
I was told by Curt that when Power Query uses Table.TransformColumnTypes to parse date data from CSV files it tries a series of different date formats in order: first it tries ISO-8601 (for example, 9th February 2025 would be "2025-02-09"), then a long date format, then a short date format, and finally it falls back to a generic .NET date parsing function which is slower than the others. It does this to make sure date parsing "just works" as often as possible. The dates in the example above, with their extra spaces, were deliberately designed to be slow for Table.TransformColumnTypes. When I tested on CSV files that contained dates in ISO-8601 format I found that Table.TransformColumnTypes performed the same as Date.FromText.
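To make that ordering a bit more concrete, here are some illustrative examples of my own (the engine's internal logic isn't public, so these are guesses at the kinds of strings each stage would match, assuming an en-US culture):

let
    Iso8601   = Date.FromText("2025-02-09"),                 // stage 1: ISO-8601, the fastest path
    LongDate  = Date.FromText("February 9, 2025", "en-US"),  // stage 2: a long date format
    ShortDate = Date.FromText("2/9/2025", "en-US")           // stage 3: a short date format
    // Anything else, such as the spaced "02  Jan   1901" dates above,
    // falls through to the slower generic .NET parser
in
    {Iso8601, LongDate, ShortDate}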
So, to sum up, if you’re using CSV files containing date columns as a source for Power Query and you’re experiencing performance problems, try changing your M code to use Date.FromText instead of Table.TransformColumnTypes to set the data types on the date columns.
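If you have a lot of date columns, one way to avoid writing out the transformation list by hand is to build it with List.Transform. Here's a minimal sketch of my own, using an inline sample table and hypothetical column names in place of the real CSV source:

let
    // Hypothetical in-memory table standing in for the CSV source
    Source = #table(
        {"Extra Spaces", "Extra Spaces - 2"},
        {
            {"02  Jan   1901", "03  Feb   1902"},
            {"04  Mar   1903", "05  Apr   1904"}
        }
    ),
    DateConversionFunction = (inputDate) as date =>
        Date.FromText(inputDate, [Format = "dd  MMM   yyyy"]),
    // Build one {column name, function, type} entry per date column
    Transforms = List.Transform(
        Table.ColumnNames(Source),
        each {_, DateConversionFunction, Date.Type}
    ),
    ChangeDate = Table.TransformColumns(Source, Transforms)
in
    ChangeDate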
Hi Chris, awesome tip as usual!
Do you know if this can work for PDFs too? I have a 200-page PDF being imported by Power Query flawlessly, but it takes 3-4 minutes saying "connecting to data source". The PDF is a bank statement so it has dates too.
It's hard to say, but this sounds like a different issue. How large is the PDF? If this is in Desktop, try increasing the memory available to Power Query: https://learn.microsoft.com/en-us/power-bi/create-reports/desktop-evaluation-configuration#maximum-memory-used-per-simultaneous-evaluation
Although it's 200 pages, the PDF file size is only 1.3MB. The PQ Excel file is intended to be used by non-technical users, so ideally I'd optimize the M code behind it rather than increase the memory allocation.
The data source doesn’t matter; what matters is the use of the transformation Table.TransformColumnTypes with a date or datetime type (and a large enough number of rows to make a difference).
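For example (my own sketch, not from the post above), you can reproduce the slow parsing path without any CSV file at all by generating the same spaced dates in an in-memory table:

let
    // One million rows of the slow-to-parse spaced dates, with no file source involved
    Source = Table.FromColumns(
        {List.Repeat({"02  Jan   1901"}, 1000000)},
        {"Extra Spaces"}
    ),
    #"Changed Type" = Table.TransformColumnTypes(
        Source,
        {{"Extra Spaces", type date}}
    )
in
    #"Changed Type"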