Composite Models for Power BI Self-Service Developers - A Practical Guide


This topic is something I'm deeply passionate about! I'm no expert - at the end you will find credits and resources for the people and documentation I have learned from. I will never be able to explain the magic of the complex DAX engine in depth - but we should at least have a basic understanding of it.

I have tried to make this complicated topic relevant for the Power BI self-service developer through practical examples that can enable the business to achieve a higher degree of self-service and flexibility - and faster time to insight.

⚠️Not everything I show is something I would recommend! But I show it so you can make an informed decision on whether to disable this feature, or to create training or a process for how you will work with composite models in your company.

📖Hope you are in for a read and some practical examples.

📺Alternatively, watch the live stream at Analytic Endeavors

🎁 and the slides are available on my GitHub: Session Slides

Unlock the power of Composite Models in Power BI to bridge the gap between self-service and enterprise reporting. In this session, we’ll explore how to enrich local data with enterprise Power BI semantic models, giving self-service Power BI developers the best of both worlds—a hybrid solution that combines flexibility with governance.

I will provide a guided tour of how to simplify the user experience when working with composite models against a Power BI semantic model in DirectQuery - for example using perspectives and naming conventions - and the considerations around calculation groups, security and performance.

And if you are a Fabric Administrator or Enterprise Semantic Model Developer and want full control and governance, I will show you how to disable it.

Enjoy 🙏


I Love Live-Connection, but …

Where is my table view? Where are my Power Query transformations? Why can't I edit relationships, or add columns, tables or field parameters? Sometimes the business needs to explore, extend and enrich with other data. But I don't have access to add tables - or do I?

In live connection mode, you can't modify the data model itself. For example, you can't add new columns, tables, calculation groups or field parameters.

You can only create report-level measures and visual calculations - well, I have tried connecting to an old MDX cube where not even report-level measures could be added...

Splitting out the semantic model and building multiple reports or Excel pivot tables on top of it is awesome!

Add Local Report Level Measure

I prefer to have a Local Measures display folder or even a separate measure group table to hold my report-level measures.

Adding report-level measures gives the report developer the flexibility to use DAX to communicate data insights with numbers, text or icons to the specific target group of that report - and not all report-level measures should be migrated back to the semantic model, as it could quickly bloat with measures irrelevant to other report developers.
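As a minimal sketch - assuming the remote model exposes a [Revenue ACT] measure and a 'Date' table, which are my own illustrative names - a report-level measure could look like this:

```dax
-- Hypothetical report-level measure: year-over-year growth on the
-- remote model's [Revenue ACT], kept local to this report.
Revenue ACT YoY % =
VAR Act = [Revenue ACT]
VAR ActLY =
    CALCULATE ( [Revenue ACT], DATEADD ( 'Date'[Date], -1, YEAR ) )
RETURN
    DIVIDE ( Act - ActLY, ActLY )
```

The measure lives only in the report file, so it never bloats the shared semantic model.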

💡Auto-updated Model Documentation from the main semantic model

When connecting live to a semantic model, report developers are not able to see the measure DAX expressions - unless you use a Tabular Editor script to copy the DAX expression into the measure description.

With the INFO.VIEW DAX functions, the semantic model developer can include a DAX calculated table "Model Documentation" that auto-updates from the main semantic model. Now the report developer can see more properties more easily than ever - for example, double-checking that the Quantity ACT YTD measure is defined with the same logic as the main model's Revenue ACT YTD.
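A minimal sketch of such a calculated table - assuming a Power BI version with the INFO.VIEW functions; the column selection is illustrative:

```dax
-- Auto-updating documentation of all measures in the model.
Model Documentation =
SELECTCOLUMNS (
    INFO.VIEW.MEASURES (),
    "Table", [Table],
    "Measure", [Name],
    "Description", [Description],
    "Expression", [Expression]
)
```

Because it is a calculated table, it refreshes together with the model - no manual documentation step.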

If you want this DAX Script for this specific model documentation - find it here on my GitHub

Auto-updated Model Documentation

The Holy Grail for self-service BI

It's been a long time since I first tested it - Public Preview in Dec 2020, and generally available since Apr 2023.

https://www.sqlbi.com/articles/new-composite-models-in-power-bi-a-milestone-in-business-intelligence/

Composite Model (“Mixed storage mode”)

But to confuse you - or rather, what I intend to do is clarify - we have several composite models, or mixed storage modes, available - and one more was added in May 2025.

Overview of composite models

We have different engines: the VertiPaq engine in Import mode, DirectQuery to a relational database, DirectQuery to an Analysis Services engine (a Power BI semantic model), and now Direct Lake connecting to OneLake Delta-Parquet tables.

Some options create a new semantic model; others result in one semantic model with a mixed storage mode - a composite model mixing storage modes across tables, or even across rows as with hybrid tables.

So be careful on what you mean when you say you want a composite model...

One Size Fits All?

Are you tired of consultants or Power BI Specialists using "It depends"?

Well, one of the reasons for saying that is to buy more time - but Microsoft has a great overview of all the different usage scenarios for companies adopting and utilizing Power BI and Fabric - only 16 scenarios!

So delivery model, organization, licensing, maturity, competency, data culture - many things need to be considered.

  • Advanced data model management

  • Advanced data preparation

  • Customizable managed self-service BI

  • Departmental BI

  • Embed for your customers

  • Embed for your organization

  • Enterprise BI

  • Enterprise content publishing

  • Managed self-service BI

  • On-premises reporting

  • Personal BI

  • Prototyping and sharing

  • Self-service content publishing

  • Self-service data preparation

  • Self-service real-time analytics

  • Team BI

Scenario for this article

I will focus on the following scenario: my enterprise team delivers ingest, transform and load, and uses Tabular Editor with the XMLA endpoint to create a few validated semantic models. They provide Build access to report developers to create live connections in Power BI reports or Excel pivot tables.

Enterprise owns data, Business owns analysis

However, there are some super users in the business who want to add their own department logic, test field parameters or create an ad hoc analysis - but can't wait for the enterprise team to deliver it as a task in a backlog where capacity is already full.

Thus we will explore the usage scenario Customizable managed self-service BI - discipline at the core and flexibility at the edge

Self-service business users will still connect to the main semantic model, but can extend and enrich with their own local data or logic using Power Query and DAX to create their own semantic model - a composite model.

Customizable managed self-service BI

Extending the capabilities and features for Power BI Self-Service Developers in a Composite Model

I will now go through several examples of where and what you can do now as a Power BI self-service developer.

But first, we need to click on the inviting hyperlink "Make changes to this model" and click "Add a local model"

Like any other pop-up we don't read, we just click the green button, right?

And now I will select all the tables in the semantic model - more on these options later in the article.

And we are back!

New game mode unlocked 😎


Field Parameter

Let's start with one of my favorites - field parameters. I want to remove some bookmarks and create a more dynamic interaction for my end-users switching between measures or dimensions on a report page.
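Under the hood, a field parameter is stored as a calculated table built with NAMEOF. A minimal sketch, assuming two remote-model measures [Revenue ACT] and [Quantity ACT] (normally you would use the Modeling > New parameter dialog instead of typing this):

```dax
-- Hypothetical field parameter switching between two remote measures.
Measure Selector =
{
    ( "Revenue", NAMEOF ( [Revenue ACT] ), 0 ),
    ( "Quantity", NAMEOF ( [Quantity ACT] ), 1 )
}
```

Placed in a slicer, the parameter swaps which measure feeds the visual - no bookmarks needed.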

Enrich with local Budget table in import mode

I do love Power Query - a great self-service ETL tool for the business and self-service developers who are not able to write T-SQL or Python.

Let's add what all self-service analysts dream of doing - adding their own Excel data.

In this case we add a Budget table and use Power Query to transform it, create a relationship between the tables and finally a Budget measure that can be utilized together with the main semantic model's revenue.
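A minimal sketch of the local measures - 'Budget'[Amount] and the remote [Revenue ACT] are my own illustrative names:

```dax
-- Local measure on the imported Excel Budget table.
Budget = SUM ( 'Budget'[Amount] )

-- Local measure combining the local budget with the remote model's actuals.
Budget Deviation % = DIVIDE ( [Revenue ACT] - [Budget], [Budget] )
```

The second measure is where the composite model shines: one DAX expression spanning the local import table and the remote DirectQuery model.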

This is awesome - and high degree of self-service!

Rename remote model objects

Let's say I don't get the ACT abbreviation and would like to change it to Actuals - previously I would have to do that manually on each visual in my report. Now I can simply right-click or double-click the object and rename it, for example measure "Revenue ACT" to "Revenue Actuals".

You can even do it on tables! 😅

Add calculated columns to tables in DQ

We can even add calculated columns to a table that comes from the remote model - for example, adding a column "Day" containing the day number of the month, for a better visual on the x-axis.
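A minimal sketch, assuming the remote model exposes a 'Date' table with a Date column:

```dax
-- Hypothetical local calculated column on the remote Date table:
-- the day number (1-31) within the month, for a compact x-axis.
Day = DAY ( 'Date'[Date] )
```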

However, in table view I'm not able to see the table or column - but it works. (This results in a query to the remote "wholesale" model, which we can see later in DAX Studio.)

RLS & OLS

We do have Row-Level-Security in the model, but we are not able to view or manage it.

We can only add RLS or OLS to the imported tables - Budget and the field parameter.

Be extremely careful with composite models if you have RLS and OLS!

You can only add RLS on import tables

VertiPaq Analyzer in Tabular Editor

Opening Tabular Editor, here as an external tool, we can see that the measure expressions look somewhat different for the remote model measures compared to the local measures.

When collecting stats in VertiPaq Analyzer, we can see all tables and columns (metadata), but the VertiPaq engine (import) is not able to collect information on cardinality or data size in % of model for the remote tables - only for the local tables, Budget and the field parameter.

Weak and Limited Relationships

This is also why we see a limited or weak relationship between Budget and Product.

Assuming referential integrity - or verifying it, which requires checking the cardinality - is not possible when we are working with different storage modes. So instead of an inner join, the engine will create an outer join - i.e. you need to know what you are doing, and performance is not optimal.

A referential integrity violation is, for example, a product sold in your budget that isn't mapped in your product dimension. The engine will then automatically create a blank value - the blank you often see in slicer visuals in a Power BI report.


Remote & Local

From that point on, instead of a live connection we will have DirectQuery connection - meaning that queries will travel from our local model to the remote model, and data will flow back which might in turn be further transformed to be able to render the visual object.

If we head over to the model view, we can see which tables belong to each model by the color of the header. Local tables are displayed without any color.

We can also call the remote model the wholesale or enterprise model, and the local model the retail or self-service model.

Others have also used the term of multiple "islands" - later we will see why it's important to understand what happens in the engine room when we combine or "visit" the islands together in one visual and/or create relationships between them in the model.


Who loves to include a calculation group in their semantic model? ✋

https://www.linkedin.com/feed/update/urn:li:activity:7331027969357824000/

I love SELECTEDMEASURE() and how reusable calculation items based on it are across the measures on the report canvas:

  • Time Intelligence (MTD, QTD, YTD, LY, YOY %)

  • Unit conversion (show figures in abs, in k, in m)

  • Format String Expressions (like dynamic format string for measures)

  • Handling Multiple Dates (USERELATIONSHIP to switch between order and invoice date)

  • Custom Date or Week Periods as slicer in report

  • My vs. All stats

  • And more …
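For illustration, a typical time-intelligence calculation item is just one expression that works for whichever measure is on the visual (the 'Date' table name is an assumption):

```dax
-- Illustrative "YTD" calculation item: SELECTEDMEASURE () stands in
-- for any measure the report author places on the visual.
CALCULATE (
    SELECTEDMEASURE (),
    DATESYTD ( 'Date'[Date] )
)
```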

What could go wrong with calculation group in a composite model?

We get wrong numbers, as the calculation group is not able to "communicate" with our local data - our Budget.

Daily works fine, but as soon as I choose MTD / QTD / YTD, the budget will not change.

The unit conversion on Current is also fine, but in 000's and in mio will not change my budget numbers.

So my beloved calculation groups are quite confusing in composite models.

Also notice the icons of the tables - I would expect my calculation group tables to have the calculator icon in brackets.

What about creating our own local calculation group? 😎

Yes, that will actually work. Now, in one query, we can send all the filter information we want to the remote model. Why it's like this - ask the Italians or Microsoft.

I love DAX scripts in Tabular Editor 3 - you can also use the manual way in Model view, TMDL view, etc.

Let's create one ourselves
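A minimal sketch of a local unit-conversion calculation group, written as a Tabular Editor 3 DAX script - the table, column and item names are my own assumptions:

```dax
CALCULATIONGROUP 'Local Unit Conversion'[Unit]
    CALCULATIONITEM "Current" = SELECTEDMEASURE ()
    CALCULATIONITEM "in k" = SELECTEDMEASURE () / 1000
    CALCULATIONITEM "in m" = SELECTEDMEASURE () / 1000000
```

Because this calculation group lives in the local model, the engine can fold the selected filters into the query sent to the remote model - which is why it works where the remote calculation group fails.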

When it comes to calculation groups, we really need to understand that remote model calculation groups will not work as expected with local import objects.

We could consider creating perspectives that exclude calculation groups for self-service developers connecting to the model - but that would require new processes and training.


DAX Query, Direct Query to AS and Performance Analyzer in DAX Studio

This takes us into the most advanced section on composite models - and please see resources from the Italians and Microsoft to get more in-depth understanding of this complex engine and query.

The DAX query the engine needs to generate in order to render the visual can be seen in the Performance Analyzer. We can copy the query and open DAX Studio (as an external tool) to inspect it.

Normally I'm quite scared of all the DAX variables and output that Microsoft provides.

This will however be even wilder and more complex for composite models - we don't just have a DAX query, but also a Direct Query.

DAX query is for the local tables and objects.

The Direct Query is the information we send off to the remote model in the service.

That combined will bring us the values in the visual of Power BI.

I'm no expert here - this is wild territory. There are even (until recently undocumented) DAX functions I had never seen or heard of before this concept of composite models.

169 lines of DAX... split into

  • // DAX Query

  • // Direct Query

  • // Query

and new DAX Functions:

  • GROUPCROSSAPPLY

  • GROUPCROSSAPPLYTABLE

  • ALLSELECTEDREMOVE

  • FILTERCLUSTER

  • ASDQ___DS0FilterTable2 (ASDQ stands for Analysis Services Direct Query)

Do you want a challenge? Then understand this in depth. I don't have time for that.

Security, performance and complex engine

When even SQLBI and Alberto Ferrari say the engine is complex and they are uncertain how it fully works - then I'm fine with just a basic understanding. It works neatly, but don't get me wrong - we need to take this into account when designing solutions, the overall architecture and the usage scenario for the company.

Worst case, we can end up sending data values as part of the Direct Query, and any admin with logging competencies would be able to retrieve this data. In this case, from Alberto Ferrari's presentation at SQL Day Poland 2023, it's the entire customer table with customer name, ID, etc.

Do you now understand this message - the potential security risk?

Another aspect of performance on composite models: “Have you tried Visual Calculations?”

Based on the below, it looks like Visual Calculations can be a benefit - in terms of performance - when combining the two islands, remote and local data, in a composite model.

I hope Jeroen (Jay) ter Heerdt or Erik Svensen could shed more light on this claim?

  • Defined on a visual

  • "Visible context”

  • Computed at query execution

  • Can refer to visual structure


💭A dream in a dream in a dream

I don't know why you would, but extending a composite model on a composite model on a composite model - that's a nightmare!

The maximum length of a chain of models is three. Extending beyond the chain length of three isn't supported and results in errors.

Ownership?

  • On the other hand - who owns the data?

  • Who supports it?

  • What about scheduled refresh?

  • Is there a data contract?

  • Would enterprise modelers even accept integrating an Excel file into an enterprise setup?

Well then – I will export your enterprise data to Excel and continue my work to solve my business problems.

Super users will always find a way - but doesn't IT/Enterprise at least want a way to audit and monitor the usage of this "shadow IT"?

It's difficult.

Export to Excel - really? Well, if that solves the business problem today...

All right ... Enough ...

We have now seen several examples of the new features composite models in this scenario can provide to enrich (or confuse) a solution.

I know my audience and some may have skipped the above to read this section :)

⚙️I’m from IT or Enterprise team: How can I disable this?

  • [Individual model] Power BI Desktop settings before publishing

  • [Tenant] Admin Portal in Tenant Settings

What about sharing semantic models to external partners?

And if you see composite models as a great option to enrich and share data - you can even share semantic models to external partners outside your tenant 😅

Thankfully, this is not enabled by default - but could for some companies make sense - maybe enriching and sharing ESG data?

External Semantic Model Sharing – Jon Vöge:


Summary of Considerations

With all of the above in mind, please remember best practices, for example Roche's Maxim of Data Transformation: data should be transformed as far upstream as possible, and as far downstream as necessary.

If your objective is only to extend an existing tabular model with more data, whenever possible, add that data to the existing data source.

  • ⚠️ Choose the right storage mode for your need and whenever possible add to existing shared model and create live-connection*

  • Composite models are NOT for the enterprise team to create and deliver large models with relationships

  • Composite models are for the business self-service developers to provide flexibility at the edge (explore, ad-hoc, Pilot, POC, let’s go fast and solve business problems today)

  • Be aware of implications of composite models when using calculation groups and calculating measures between remote and local data source groups

  • Calculation groups with format string expressions, or measures with dynamic format strings, work only for model measures in the remote model – even in live connection

  • Be aware of security implications of a query sent to remote model can include data values (see session from Alberto Ferrari)

  • Be aware of performance, as composite models will not benefit from visual caches across reports as with live connection, and involve assumed referential integrity and multiple query types

  • You should not use composite models with relationships on columns that have 10,000 unique values or more. For example, create the relationship on year or month instead of date

  • Be aware of ownership (read access to semantic model in Power BI workspace app vs. Org App in Fabric) and avoid creating chain-on-chain-on-chain models

  • Shared expressions (parameters in Power Query), Translations and RLS are not imported from the remote model nor possible to add for remote model objects

  • Consumers of a composite model see the results of the OLS rules that were applicable to the author of the composite model when they created the model

  • Connections to a composite model can break if the semantic model or workspace is renamed (they are hard-coded by name, not ID)

  • For “I Want It All In One Page Dashboard”, check Metric Sets for more reusable assets and see Bernat Agulló Roselló's article here: I want it all in one page

  • Enabled by default, but based on your scenario you can disable it on individual semantic models or at tenant level (and additionally investigate external semantic model sharing)

  • Use a naming convention like “Local” for tables if names are duplicated, and the enterprise team semantic model owner can utilize perspectives to share subsets of the model excluding remote calculation groups

  • As with live connection, add model documentation with INFO.VIEW functions or the measure expression as the description field for self-service developers

Credit and resources

Enjoy - and look out for the upcoming newsletters for more tips and tricks 🙏


Thank you for reading my article, I hope you enjoyed it. Comments or input are much appreciated.

👋 Hi, my name is David Kofod Hanna, Senior Advisor and Consultant at twoday. My aim is to guide self-service Power BI developers toward more enterprise-manageable concepts in a consumable and practical way - because I have been on that journey for years, and still am...
