Connecting Power BI to Planning Analytics for Free
As a standalone product, Power BI has been around for about 10 years (and even longer if you count from "Denali," the SQL Server release in which the tabular model first appeared in Analysis Services) and continues to evolve rapidly within the Microsoft Fabric platform. IBM Planning Analytics has a history dating back to the 1980s, starting with its predecessor, Applix TM1. Both platforms often coexist in enterprise IT landscapes, which typically involves transferring data from the CPM system to the BI system.
It is surprising that, to date, there is no unified way to integrate these platforms that meets three basic conditions: direct (i.e., without intermediate data stores), simple (i.e., not requiring Power BI analysts to install additional components, libraries, or other external software), and, ideally, free.
All the integration solutions I am aware of meet no more than two of these conditions at once. For those just beginning to explore the integration of IBM Planning Analytics and Microsoft Power BI, I will save you some searching time and briefly describe the known interface options at the start of the article. Those already familiar with the tools on the market, and who are now looking for a native and straightforward way to exchange data between the two systems, can skip the first part and move directly to the description of the native integration method I use in my projects.
Methods of Connecting Power BI to Planning Analytics
So, if you are tasked with setting up data import from PA to Power BI, you can choose from the following options:
Paid Connectors. I am aware of at least three providers offering commercial integration solutions for these two platforms. The advantages usually include good functional coverage (for example, synchronizing user access levels between the two systems) and ease of setting up the data exchange itself. The downsides include the cost of licenses, which sometimes significantly exceeds the cost of Power BI licenses, making such solutions impractical, especially for small and medium-sized companies opting for self-service solutions.
Built-in Power BI Connector. This is not actually a "native" connector written by Microsoft or IBM, but a custom Power Query connector developed by Micatio Software that, as I understand it, has been included in Power BI Desktop almost unchanged. I tested this connector back in 2020, and I remember that after a certain trial period, the user was prompted to contact the vendor to purchase a license. Interestingly, as of the first half of 2024, the connector's behavior in Desktop has not changed. Firstly, it is still called the Micatio TM1 Connector. Secondly, when you attempt to use it, the connector directs you straight to the sales department, now without any trial period at all.
Additionally, this connector has remained in beta status in Power BI for several years, and whether it will ever be moved to production remains unclear.
Open-source tools, the best known of which is probably the tm1py package, which offers, among other things, an integration interface with Power BI. To use it, you will need to install Python, a couple of Python libraries (Pandas and Matplotlib), and the tm1py package itself. All of this software is free, but the Power BI analyst is expected to have at least basic knowledge of Python (admittedly, no longer a rare skill). The package supports reading data from TM1 cubes through MDX queries or saved cube views.
Enabling scheduled report refresh in the Power BI Service is also possible. For that, you will need to configure a Power BI Gateway on a dedicated server or machine that hosts Python with the tm1py package installed.
If installing and configuring Python and its libraries on a dedicated server is not a limiting factor, tm1py can be a suitable integration solution. It works quickly and reliably, especially if you use its common data extraction methods rather than their wrappers for Power BI.
What's Under the Hood of Integration Tools?
Interestingly, all these tools use the standard REST API of Planning Analytics to retrieve data from its models, essentially serving as a convenience layer that lets analysts and developers focus on business tasks rather than technical integration issues. The PA REST API supports the OData standard, which was originally developed by Microsoft, the very vendor of Power BI. All this suggests that there should be a straightforward way to retrieve data from TM1 into Power BI without paid connectors or third-party tools. And indeed there is.
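To make the point concrete, here is a minimal sketch of what such a REST call looks like under the hood, written in Python for readability. The endpoint shape follows the PA REST API's ExecuteMDX action with OData $expand options; the server URL and cube are hypothetical placeholders, so treat this as an illustration rather than production code.

```python
import json

# Hypothetical TM1 server details -- replace with your own instance.
BASE_URL = "https://tm1server:8010/api/v1"

def build_execute_mdx_request(mdx: str) -> tuple[str, str]:
    """Build the URL and JSON body for the PA REST API ExecuteMDX action.

    The $expand options ask the server to return the axis tuples (member
    names) and the cells (ordinal + value) of the resulting cellset in a
    single round trip.
    """
    url = (
        f"{BASE_URL}/ExecuteMDX"
        "?$expand=Axes($expand=Tuples($expand=Members($select=Name))),"
        "Cells($select=Ordinal,Value)"
    )
    body = json.dumps({"MDX": mdx})
    return url, body

url, body = build_execute_mdx_request(
    "SELECT {[Version].[Actual]} ON 0 FROM [Sales]"
)
```

The same POST request is what the paid connectors, the Micatio connector, and tm1py ultimately issue; they differ only in how much of the surrounding plumbing they hide.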
Main Challenges for Power BI Analysts to Retrieve Data from Planning Analytics
Implementing the application layer of a solution for integration between platforms, in my opinion, includes four main challenges that a BI analyst who has never worked with TM1 will need to address:
Establishing a Stable Connection and Handling Various Errors related to connections and data retrieval. In Power Query, there are two main functions that interact with REST APIs and retrieve data for processing: Web.Contents and OData.Feed. They often require fine-tuning when working with specific services, and Power Query evangelists have dedicated numerous articles to dissecting these intricacies. Microsoft has further extended the applicability of these functions by developing custom patterns, such as Wait-Retry. However, even all this may not be enough when you first try to retrieve data from Planning Analytics models: you need to interpret and handle a multitude of connection errors and "bad requests" generously generated by Planning Analytics. This can be daunting for a Power BI specialist with only a basic understanding of the data source architecture.
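The Wait-Retry idea mentioned above is language-agnostic; here is a minimal sketch of it in Python (the stub `flaky` function and all names are my own, used only to demonstrate the pattern of retrying transient failures with exponential backoff).

```python
import time

def with_retry(fetch, max_attempts=4, base_delay=1.0,
               retryable=(TimeoutError, ConnectionError)):
    """Call fetch() and retry transient errors with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except retryable:
            if attempt == max_attempts:
                raise  # give up after the last attempt
            time.sleep(base_delay * 2 ** (attempt - 1))

# Demo with a stub data source that fails twice before succeeding,
# imitating the intermittent errors a TM1 server can produce.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network error")
    return "cellset"

result = with_retry(flaky, base_delay=0.01)
```

In Power Query, the same logic is expressed with `Value.WaitFor` or a recursive function around `Web.Contents`, but the shape of the solution is identical: bounded retries with growing delays, and a final re-raise so that genuine failures still surface.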
Formulating Queries to the Planning Analytics Cubes. The second challenge for a Power BI analyst is retrieving data from TM1 cubes. The younger generation might not remember the days when Microsoft was actively developing Multidimensional SSAS. The multidimensional model differs significantly from the tabular paradigm. Besides the architectural differences, the two platforms use different query languages (MDX and DAX, respectively), and only a few Power BI specialists are familiar with the former. Data extraction thus boils down to learning a new, complex, and highly specific query language, which is hardly a reasonable ask: imagine if that had to be done for each of the hundreds of data sources to which Power BI offers pre-configured connectors. An additional challenge is composing a correct OData request, which requires a good understanding of the TM1 EDM structure (describing the relationships between entities, their properties, and navigation) and a confident grasp of TM1 metadata (cubes, dimensions, hierarchies, tuples, etc.).
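To give a feel for how different MDX is from DAX, here is a hand-written query against a hypothetical [Sales] cube (the cube, dimension, and element names are invented for illustration):

```python
# A hand-written MDX query against a hypothetical [Sales] cube.
# Unlike DAX, MDX addresses cells by placing member sets on axes;
# NON EMPTY suppresses empty cells, and the WHERE clause slices the
# dimensions not placed on an axis. TM1SubsetAll is TM1's MDX function
# returning all elements of a dimension.
mdx = """
SELECT
  NON EMPTY {[Period].[2024-Q1], [Period].[2024-Q2]} ON COLUMNS,
  NON EMPTY {TM1SubsetAll([Product])} ON ROWS
FROM [Sales]
WHERE ([Version].[Actual], [Measure].[Amount])
""".strip()
```

Even this small example shows the layered knowledge required: axis placement, set syntax, slicer tuples, and TM1-specific functions, none of which a DAX-trained analyst would recognize.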
Transforming the Retrieved Data for Use in Power BI Datasets. The third complex task for a BI analyst is interpreting and transforming the dataset obtained from PA. In general, this is JSON containing both the data and related (and sometimes not directly related) metadata. The analyst needs to spend some time understanding the structure and relationships within the received dataset and then perform the necessary Power Query transformations to bring it into the tabular data structures of Power BI.
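The trickiest part of this JSON is that the axis metadata and the cell values arrive as separate collections linked only by each cell's ordinal position. The following Python sketch (with made-up sample data, mirroring the Axes/Tuples/Cells structure the PA REST API returns) shows the mapping that the Power Query transformations have to reproduce:

```python
import json

# A trimmed example of a PA cellset response: axis 0 holds columns,
# axis 1 holds rows, and Cells are linked to tuples only via Ordinal.
# Values are invented for illustration.
raw = json.loads("""
{
  "Axes": [
    {"Tuples": [{"Members": [{"Name": "2024-Q1"}]},
                {"Members": [{"Name": "2024-Q2"}]}]},
    {"Tuples": [{"Members": [{"Name": "Bikes"}]},
                {"Members": [{"Name": "Helmets"}]}]}
  ],
  "Cells": [
    {"Ordinal": 0, "Value": 100},
    {"Ordinal": 1, "Value": 150},
    {"Ordinal": 2, "Value": 40},
    {"Ordinal": 3, "Value": 60}
  ]
}
""")

def cellset_to_rows(cellset: dict) -> list[dict]:
    """Map each cell's Ordinal back to its axis tuples and emit flat rows."""
    axes = [
        [tuple(m["Name"] for m in t["Members"]) for t in axis["Tuples"]]
        for axis in cellset["Axes"]
    ]
    # Ordinals enumerate the cartesian product of axis tuples,
    # with the column axis (axis 0) varying fastest.
    ncols = len(axes[0])
    rows = []
    for cell in cellset["Cells"]:
        o = cell["Ordinal"]
        col, row = axes[0][o % ncols], axes[1][o // ncols]
        rows.append({"row": row[0], "column": col[0], "value": cell["Value"]})
    return rows

rows = cellset_to_rows(raw)
```

The same modulo/integer-division arithmetic on the ordinal is what a Power Query implementation performs with `List.Transform` and index columns before expanding the result into a flat table.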
Query Execution Performance. Once the first three challenges are overcome and data finally starts loading into the target Power BI model, all troubles might seem to be behind you. However, the next query brings a new "black swan": when you attempt to refresh data in Power BI, the loader "freezes" for a long time, the refresh takes unreasonably long, and eventually the entire update fails with a "Query Timeout" or another critical error caused by excessive memory or CPU consumption by one of the queries. The reasons may lie in a suboptimally written MDX query that returns an excessive amount of data, or in poor Power Query code. Sometimes this occurs due to excessive use of query dependencies, where the output of one query is fed by the results of other heavy queries. All four issues are solvable with standard Power Query tools, and I will provide a solution for each.
Connecting Power BI to Planning Analytics
I covered this in my previous article, which I recommend reading before proceeding. Here, I will mention only one non-obvious issue that took me quite some time to sort out.
Typically, TM1 servers are configured to use SSL certificates to encrypt data in transit between Planning Analytics components and clients, protecting sensitive business data. If the entire system is deployed on-premises within an intranet, certificates issued by a public CA are generally unnecessary: the system already operates entirely on a trusted internal network, with a low risk of external attacks and malicious traffic interception. In that case, the TM1 server configuration often includes a default self-signed certificate.
If you try to connect from Power BI Desktop to a server configured this way, an error occurs: “DataSource.Error: The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.” No setting in the Certificate Revocation section will resolve this issue. However, there is a workaround, and it's pretty simple to configure. I won't overload the article with its description here, but I'm always ready to help and share the solution upon personal request.
Creating Queries to Planning Analytics Cubes
Let's be honest: MDX is no longer the most widespread language. Its popularity peaked between the early 2000s and the mid-2010s, when Multidimensional SSAS dominated, so a Power BI analyst probably won't want to spend much time learning a query language now used mostly in multidimensional systems like TM1.
Indeed, we could always query a saved cube view, but this method is not the most reliable: the view can be modified or even deleted by a TM1 user, may contain many empty cells (even with zero suppression enabled), and so on.
A possible solution here is a function that converts Power Query notation into MDX notation. I have prepared and published two universal functions on GitHub. Both accept lists of tuples, where each tuple specifies a dimension, its hierarchy, and a list of elements. If the list of dimension/hierarchy elements is empty, data will be returned for all elements.
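The published converters are written in Power Query M; to illustrate the underlying idea only, here is a minimal sketch of the same tuple-to-MDX logic in Python. The function name, tuple shape, and cube/dimension names are my own illustrative choices, not the published API.

```python
def tuples_to_mdx(cube: str, columns: list[tuple], rows: list[tuple]) -> str:
    """Convert lists of (dimension, hierarchy, elements) tuples into MDX.

    An empty elements list expands to all elements of the hierarchy via
    TM1SubsetAll, mirroring the behavior described in the article.
    """
    def axis_set(spec):
        dim, hier, elems = spec
        if not elems:
            return f"{{TM1SubsetAll([{dim}].[{hier}])}}"
        members = ",".join(f"[{dim}].[{hier}].[{e}]" for e in elems)
        return f"{{{members}}}"

    def axis(specs):
        # Crossjoin multiple dimension sets placed on the same axis.
        sets = [axis_set(s) for s in specs]
        expr = sets[0]
        for s in sets[1:]:
            expr = f"CROSSJOIN({expr},{s})"
        return expr

    return (
        f"SELECT NON EMPTY {axis(columns)} ON COLUMNS, "
        f"NON EMPTY {axis(rows)} ON ROWS FROM [{cube}]"
    )

mdx = tuples_to_mdx(
    "Sales",
    columns=[("Period", "Period", ["2024-Q1", "2024-Q2"])],
    rows=[("Product", "Product", [])],  # empty list -> all elements
)
```

The analyst only supplies familiar lists of dimension/element names; the function takes care of member syntax, crossjoins, and empty-cell suppression.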
This syntax is intuitively understandable to a Power BI analyst and does not require them to dive deep into the architectural peculiarities of Planning Analytics.
Transforming JSON
The dataset returned by Planning Analytics is shaped both by the MDX query and by the OData request options. In some cases, for example, the JSON simultaneously contains a set of metadata (axis tuples) and the data itself (cells), and the two sets are not directly linked to each other. Tests have shown that this variant sometimes works even faster if the mapping of tuples to cell data is performed in Power Query. The transforming functions therefore work in two modes; detailed descriptions of the modes are provided in the annotations to the functions.
I am sure that further optimization of the data transformation chain is possible, and I plan to add more modes in the future.
Conclusion
The method I propose meets the criteria set out at the beginning of the article: it is direct (the connection runs only between Power BI and Planning Analytics), simple (it does not require a Power BI analyst to install additional software or learn additional technologies), and free of charge.
However, I cannot guarantee that it will suit large corporate implementations, since Power BI itself still imposes limits on dataset volumes and the number of requests per hour (RPH). The best way to see whether the solution suits you is to try it out.