I had meant to follow up my recent post on how to find the columns touched by a DAX query with one showing how to use that technique to find the size of those columns in memory, and therefore the total amount of column data that needs to be paged into memory when a DAX query runs on a Direct Lake semantic model. Before I could do that, though, my colleague Michael Kovalsky messaged me to say that not only had he taken the query from that first post and incorporated it into Semantic Link Labs, he’d also done the work to get column sizes. All that’s left for me to do, then, is give you some simple examples of how to use it.
To use Semantic Link Labs you just need to create a new Fabric notebook and install the library:
%pip install semantic-link-labs
After that you can use sempy_labs.get_dax_query_dependencies to get the columns touched by any DAX query, for example:
import sempy_labs as labs
labs.get_dax_query_dependencies(
    dataset='InsertSemanticModelName',
    workspace='InsertWorkspaceName',
    dax_string="InsertDAXQuery",
)

This returns a dataframe with one row for each column touched by the query, plus various statistics about the size of each column in memory.
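To give you an idea of what you can do with that dataframe, here’s a minimal sketch that sums a size column to get the total memory used by all the columns a query touches. Note that the column names used here (“Table Name”, “Column Name”, “Total Size”) and the sample data are my assumptions for illustration, not the library’s documented output, so check the actual dataframe returned by your version of Semantic Link Labs:

```python
import pandas as pd

# Hypothetical stand-in for the dataframe returned by
# get_dax_query_dependencies; the column names and values
# here are assumptions for illustration only
dependencies = pd.DataFrame(
    {
        "Table Name": ["Sales", "Sales", "Date"],
        "Column Name": ["Amount", "Quantity", "CalendarYear"],
        "Total Size": [1_048_576, 524_288, 65_536],  # bytes
    }
)

# Total memory used by all columns touched by the query, in bytes,
# converted to megabytes for readability
total_bytes = dependencies["Total Size"].sum()
print(f"{total_bytes / (1024 * 1024):.2f} MB")  # prints "1.56 MB"
```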
If you’re working with a Direct Lake semantic model, though, the query itself needs to have been run beforehand in order to get the correct in-memory size of each column; you can ensure this happens by setting the optional put_in_memory parameter to True:
import sempy_labs as labs
labs.get_dax_query_dependencies(
    dataset='InsertSemanticModelName',
    workspace='InsertWorkspaceName',
    dax_string="InsertDAXQuery",
    put_in_memory=True
)
Last of all, if you don’t want a dataframe but just want a single number representing the total memory needed by all columns touched by a query, you can use sempy_labs.get_dax_query_memory_size, for example like this:
import sempy_labs as labs
labs.get_dax_query_memory_size(
    dataset='InsertSemanticModelName',
    workspace='InsertWorkspaceName',
    dax_string="InsertDAXQuery"
)

Yet more evidence that, for any Power BI user, Semantic Link and Semantic Link Labs are the best reasons to flip the switch to enable Fabric. To find out more about what they are capable of, check out this user group presentation.
There’s a new update of Semantic Link Labs that allows you to pass multiple DAX queries to get_dax_query_dependencies at once: https://github.com/microsoft/semantic-link-labs/releases/tag/0.8.9