Converting a Direct Lake model to Import

One of my favorite features of Tabular Editor (both 2 and 3) is its C# scripting capability, which allows you to run code, see the impact immediately, and even undo the changes made by the script using Ctrl+Z.
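
If you've never tried it, a one-liner is enough to get the idea. The following trivial example (my own illustration, unrelated to the script below) applies a thousands-separator format string to every measure in the model; run it, watch every measure change at once, and press Ctrl+Z to roll the whole batch back:

// Format every measure in the model with a thousands separator.
// Run this in the C# script window of TE2/TE3; a single Ctrl+Z
// undoes the entire batch of changes.
foreach(var m in Model.AllMeasures)
{
    m.FormatString = "#,0.00";
}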

I have always been a huge fan of productivity tools that let users write scripts, record macros, or set up automation within the tool. Naturally, I wanted Tabular Editor to have such a feature as well. Over the years, it has become clear why this feature is so valuable.

Just yesterday, a user asked if Tabular Editor could help convert a Direct Lake model to an Import-mode model. This isn't an out-of-the-box feature, as this requirement is quite rare. However, when the need arises, performing this conversion manually would be incredibly cumbersome and tedious—especially if the model is large and contains many tables.

Enter C# scripting! Without further ado, here's the script (click the link below to see the code):

https://gist.github.com/otykier/ac53aac2a2f22838c5b2019fd45aaa98

Copy and paste this script into TE2/TE3's C# script editor and run it on a Direct Lake-mode model. It will convert all Direct Lake partitions to their equivalent Import partitions and reset the model collation to its default value (since Direct Lake models use a case-sensitive data collation, which isn't needed in Import mode).
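To give a sense of what the conversion involves, here is a simplified sketch of the core idea (the gist above is the authoritative version; in the sketch below, the MPartition.CreateNew factory, the "DatabaseQuery" shared expression name, and the "dbo" schema fallback are my assumptions). For each table, we read the Direct Lake entity partition's source table name, create an equivalent Import-mode M partition that queries the same Lakehouse table through the model's shared expression, delete the entity partition, and finally reset the collation:

// Simplified sketch only -- see the gist above for the complete, tested script.
// Assumes each Direct Lake table has a single entity partition and that the
// model contains a shared "DatabaseQuery" expression pointing to the
// Lakehouse SQL analytics endpoint.
foreach(var table in Model.Tables)
{
    foreach(var ep in table.Partitions.OfType<EntityPartition>().ToList())
    {
        var schema = string.IsNullOrEmpty(ep.SchemaName) ? "dbo" : ep.SchemaName; // "dbo" fallback is an assumption

        // Equivalent M query against the same Lakehouse table:
        var mExpression =
            "let\n" +
            "    Source = #\"DatabaseQuery\",\n" +
            "    Data = Source{[Schema=\"" + schema + "\",Item=\"" + ep.EntityName + "\"]}[Data]\n" +
            "in\n" +
            "    Data";

        // Factory name is an assumption based on Tabular Editor's usual
        // <Type>.CreateNew(...) pattern; temporary name avoids a clash.
        var import = MPartition.CreateNew(table, ep.Name + " (Import)");
        import.Expression = mExpression;
        import.Mode = ModeType.Import;

        var name = ep.Name;
        ep.Delete();
        import.Name = name;
    }
}

// Direct Lake models force a case-sensitive binary collation; clearing it
// restores the default (assuming the wrapper exposes Model.Collation).
Model.Collation = null;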

You can run the script while Tabular Editor is connected to a semantic model through the XMLA endpoint. However, you likely won't be able to save the changes back to the Power BI / Fabric workspace, as in-place conversion of Direct Lake models to Import models is not supported. Instead, you can use the Model > Deploy... option to deploy the newly converted model as a new model in a workspace.

Note: After deploying the newly converted Import-mode model, you will need to specify the credentials for accessing the Lakehouse to refresh data into the model. In the Power BI / Fabric workspace, click the ellipsis button next to the model and choose "Settings":

[Screenshot: the "Settings" option on the semantic model's context menu]

Then, expand the "Data source credentials" section and click "Edit credentials" to ensure the connection from the model to the Lakehouse is using the correct credentials.

[Screenshot: the "Data source credentials" section in the model's settings]

In my case, setting the Authentication method to "OAuth2", choosing "Public" as the Privacy level for this data source, and hitting Sign in was all that was needed.

[Screenshot: the credentials dialog with OAuth2 authentication and Public privacy level]

And lo and behold, our Import-mode model can now be refreshed:

[Screenshot: a successful refresh of the Import-mode model]
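
As a side note, if you'd rather trigger that first refresh from code than from the workspace UI, a minimal sketch using the Tabular Object Model client library (Microsoft.AnalysisServices.Tabular) over the same XMLA endpoint could look like this; the workspace and model names are placeholders, and authentication specifics (service principal, etc.) are omitted:

using Microsoft.AnalysisServices.Tabular;

// Connect to the workspace's XMLA endpoint and request a full refresh
// of the newly deployed Import-mode model.
var server = new Server();
server.Connect("Data Source=powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace");
var database = server.Databases.GetByName("MyImportModel"); // placeholder model name
database.Model.RequestRefresh(RefreshType.Full);
database.Model.SaveChanges(); // sends the refresh command to the service
server.Disconnect();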


Erik de Jong

Scalable BI Solutions | Freelance BI Consultant

12mo

Incredible! Using an ever-expanding collection of C# scripts in Tabular Editor when working on a semantic model never stops feeling magical 🧙🏼‍♂️.

Thomas LeBlanc

Microsoft Fabric Architect | Business Intelligence Architect | Microsoft Data Platform MVP | Power BI Super User | Speaker | Mentor | Technical Business Strategist | Author

12mo

incredible!!!

Chris Woolderink

Serving "a la carte" DAX dishes or data buffets from a kitchen with Power BI and Azure Data Platform.

12mo

💡👉 That's another awesome use case for Tabular Editor's C# scripting functionality. Daniel, thank you for sharing!
