Tools to Analyze the Performance of your Power BI Solution


In this post I’ll show you several tools that can help you analyze the performance of your Power BI solutions at every step of the way. Some of them are built-in features, and others are free and open source.

The checklist below covers the tools you can use at each stage:


Query Diagnostics

This Power BI Desktop feature will help you understand what happens inside Power Query when a query is executed, which parts of the query are performed locally or remotely, and the slowdowns you may encounter during:

– Loading and transforming your data.

– Refreshing your data in Power Query.

To start Query Diagnostics:

  • Go to the ‘Tools’ tab in the Power Query Editor ribbon. You will see the following options: Diagnose Step and Start Diagnostics / Stop Diagnostics:

Start Diagnostics / Stop Diagnostics is used more often because it provides complete information about each query, while Diagnose Step analyzes the performance of a single step without showing you the diagnostics for the other steps of the query.

  • In Power Query Editor, select the query you want to analyze.
  • Go to the Tools tab in the ribbon and click “Start Diagnostics”.
  • Perform the evaluations you want to analyze, such as authoring, refreshing a preview, or a full refresh.
  • When you are finished, make sure you select “Stop Diagnostics”.

This generates two new queries with the diagnostics results:

In my opinion, the most important column is Exclusive Duration. You should convert it to seconds in the Power Query Editor and then summarize it to get an idea of where time is spent and identify how you can optimize the queries in your Power BI models:
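As a sketch of that conversion, a query like the following can be written in the Power Query Editor against the detailed diagnostics output. The query and column names (Diagnostics_Detailed, Exclusive Duration, Step Name) are assumptions based on a typical diagnostics result; adjust them to match what the feature generated for you:

```m
// Hypothetical sketch: convert Exclusive Duration to seconds and total it per step.
// Assumes Exclusive Duration is typed as a duration; change the conversion if not.
let
    Source = Diagnostics_Detailed,
    AddSeconds = Table.AddColumn(
        Source,
        "Exclusive Duration (s)",
        each Duration.TotalSeconds([Exclusive Duration]),
        type number
    ),
    PerStep = Table.Group(
        AddSeconds,
        {"Step Name"},
        {{"Total Seconds", each List.Sum([#"Exclusive Duration (s)"]), type number}}
    )
in
    PerStep
```

Sorting the grouped result by Total Seconds descending then points you straight at the most expensive steps.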


DAX Studio

– Launch DAX Studio. Select the model that you want to analyze:

– Go to the Advanced tab and click on “View Metrics”:

This will show metrics that help you quickly detect performance issues.

Cardinality: This is the number of unique values in a column. High-cardinality columns can be particularly expensive, especially when they have a string or date/time data type.

If possible, remove columns such as GUID and timestamp from a SQL Server table from your model. In addition, I recommend splitting Datetime columns into two columns, one for date and one for time.
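A minimal Power Query sketch of that split, assuming a table and a datetime column named Sales and OrderDateTime (both names are hypothetical):

```m
// Hypothetical sketch: split a datetime column into separate date and time columns,
// then drop the original high-cardinality column.
let
    Source = Sales,  // assumed table containing an OrderDateTime column
    AddDate = Table.AddColumn(Source, "OrderDate", each Date.From([OrderDateTime]), type date),
    AddTime = Table.AddColumn(AddDate, "OrderTime", each Time.From([OrderDateTime]), type time),
    RemoveDateTime = Table.RemoveColumns(AddTime, {"OrderDateTime"})
in
    RemoveDateTime
```

The two resulting columns have far lower cardinality than the original, so they compress much better in the model.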

Size information: The other important thing you can see is the space each table and column of the model takes up, for example:

In this case, the Work Order table is taking up almost 20% of the entire model, which helps you analyze where your model consumes the most memory and choose which columns to remove to optimize it.

Summary information: In the Summary tab you can see the total size of this model, in this example the analyzed PBIX is consuming 198.27 MB. You can also see the number of tables and columns:
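The size information that View Metrics displays is read from the VertiPaq dynamic management views (DMVs), which you can also query yourself from DAX Studio’s query window. As a sketch, a DMV query like the following lists columns by dictionary size (the exact rowset columns can vary by engine version, so treat the field names as assumptions to verify against your instance):

```sql
-- Sketch: list model columns by dictionary size using a VertiPaq DMV.
SELECT
    DIMENSION_NAME,     -- table name
    ATTRIBUTE_NAME,     -- column name
    DICTIONARY_SIZE     -- dictionary size in bytes
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
ORDER BY DICTIONARY_SIZE DESC
```

This is handy when you want to script the analysis rather than read it from the Metrics grid.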

Bravo for Power BI

This is a free, open-source tool for users who do not need a high level of complexity. If you need more advanced functionality, you should use tools such as DAX Studio and Tabular Editor.

After downloading Bravo, go to “Analyze Model” and connect to the data model:

You will get information about the tables and columns in your model that are consuming the most memory:

In addition, you can drill down to see more detail for any table:

To analyze the unreferenced columns, I recommend Tabular Editor, which can help you identify the reports connected to the dataset where these columns are used.


Performance Analyzer

Use Performance Analyzer to analyze the performance of your report elements, such as visuals and DAX formulas, as users interact with them, and to see which ones consume the most resources.

In the View ribbon you can find the Performance Analyzer. Open this pane and click Record to see how fast your report renders as you perform different actions in your report.

You can review the results of your performance test in the Performance Analyzer pane. To review the tasks in order of duration, longest to shortest, click the sort icon next to the Duration (ms) column header:

The log information for each visual shows the time it took (duration) to complete the following categories of tasks, which will help you detect what is causing performance delays so you can work on improving those sections.

  • DAX query – The time it took the visual to send the query, plus the time it took Analysis Services to return the results.
  • Visual display – The time it took to render the visual on the screen, including the time required to retrieve web images or geocoding.
  • Other – The time the visual spent preparing queries, waiting for other visuals to complete, or performing other background processing.
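Each DAX query entry also exposes the query text the visual generated, which you can copy and paste into DAX Studio for deeper analysis. As an illustration of what such a query typically looks like, here is a sketch with made-up table and column names:

```dax
// Illustrative shape of a query generated by a visual; 'Date', Sales,
// and their columns are hypothetical names, not from the source post.
EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Year],
    "Total Sales", SUM ( Sales[Amount] )
)
```

Running the copied query in DAX Studio lets you time it in isolation and inspect its server timings, separate from the visual’s rendering cost.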

The only real way to reduce a long Other duration is to optimize the DAX queries of the other visuals or reduce the number of visuals in the report.


Best Practice Analyzer within Tabular Editor

Best Practice Analyzer is an external tool that scans your data model and easily validates whether it complies with best practices for formatting, performance, naming conventions, and DAX expressions.

I analyzed a test data model and got the following results, which show the potential issues and improvements that can be made in the data model in each category:

This can help you save a lot of time by avoiding having to validate all these recommendations manually.

In this article 👉 Best practice rules to improve your model’s performance you can read about the rules it applies, which are segmented into categories. Some rules are more aesthetics-oriented, while others focus on performance optimization.
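Since the rules live in a JSON file, you can also define your own. A sketch of what a custom rule might look like (the property values and the Dynamic LINQ expression here are assumptions to adapt, not a rule from the article):

```json
[
  {
    "ID": "AVOID_FLOATING_POINT",
    "Name": "Do not use floating point data types",
    "Category": "Performance",
    "Description": "Double columns can compress poorly and risk rounding issues; prefer fixed decimal or whole number.",
    "Severity": 2,
    "Scope": "DataColumn",
    "Expression": "DataType = \"Double\""
  }
]
```

Dropping a file like this into Tabular Editor’s rules location makes the rule appear alongside the standard ones the next time the analyzer runs.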
