DAX Performance

I am struggling with performance, and I now think it is pretty much down to the DAX.
The model looks OK: it mostly joins on surrogate keys, and I have stripped out what I don't need.

Some of the design I am stuck with. We have a period slicer for YTD, closed YTD, closed period, open period, and about ten other date ranges.
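For illustration, the slicer is driven by a small disconnected table along these lines (the table name and values here are simplified stand-ins, not the real ones):

Period Selection =
DATATABLE (
    "Period", STRING,
    { { "YTD" }, { "Closed YTD" }, { "Closed Period" }, { "Open Period" } }
)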

Another design constraint I am stuck with: I also have two sales values (in different currencies) in the fact table for reporting.

I am struggling to get some matrix visualisations to complete within 10 seconds. When you open the report there is one matrix and some other visualisations.

The matrix contains an object such as sales territory on the rows, and the values are
current value, prior value, growth, and growth percent.

Current Value is a large SWITCH based on the period slicer:
Current Value =
SWITCH (
    SELECTEDVALUE ( 'Period Selection'[Period] ),  // "PeriodSlicer" in the original; table name per the sketch above
    "Closed Year", CALCULATE ( [Sales], 'Dim Calendar'[Closed Year] = 1 ),
    "Closed Year -1", CALCULATE ( [Sales], 'Dim Calendar'[Closed Year-1] = 1 )
)

I then do something similar to get values for prior periods.

Are there any gains here that would help me out with all of these calculations?

Is there a good step-by-step guide on how to improve the performance?

Thanks
E

Hi @ells

If you could attach the PBIX file, it would be easier for forum members to help you.

@MK3010
Unfortunately I can't attach the PBIX; it is commercially sensitive.
I can give some details.
E

Try using Calculation Groups instead of SWITCH for variations.

@AntrikshSharma
No, I don't understand that at all, or I did not explain clearly.

Sales Current Period is a measure driven by the same period slicer. So if the user chooses last month:
Sales Current Period = CALCULATE ( [Sales], 'Dim Calendar'[Last Month] = 1 )  // flag column, as in the SWITCH above

Sales Previous Period is based on the same slicer.

I can see that I could have a calculation group with all the variations in it, but then those would need to be exposed as measures, and that seems like extra work?

thanks
E

I got that: there is a base measure Sales, and all the other measures are just variations of it, right? Calculation groups are built exactly for this. There will be just the base sales measure, but when you drag in the calculation group's column, all the other variations become available to you, and you can also use the calculation group's column in a slicer.
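As a rough sketch, the calculation items could look like this (I am using DATEADD for the prior period just to illustrate; in your model the items would filter your 'Dim Calendar' flag columns instead):

Current = SELECTEDMEASURE ()

Prior =
CALCULATE ( SELECTEDMEASURE (), DATEADD ( 'Dim Calendar'[Date], -1, YEAR ) )

Growth =
VAR CurrentValue = SELECTEDMEASURE ()
VAR PriorValue =
    CALCULATE ( SELECTEDMEASURE (), DATEADD ( 'Dim Calendar'[Date], -1, YEAR ) )
RETURN
    CurrentValue - PriorValue

Growth % =
VAR CurrentValue = SELECTEDMEASURE ()
VAR PriorValue =
    CALCULATE ( SELECTEDMEASURE (), DATEADD ( 'Dim Calendar'[Date], -1, YEAR ) )
RETURN
    DIVIDE ( CurrentValue - PriorValue, PriorValue )

One calculation group like this serves every base measure ([Sales], [Qty], ...), so you maintain the variation logic once instead of once per measure.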

@AntrikshSharma
So if I create a calc group for Sales (current, prior, growth, and growth %), would this really give me a performance boost over four separate measures?

I can certainly give it a try, but I just can't see it being quicker in the report.
Thanks
E

I can't guarantee anything about that without looking into the report. If four simple measures are causing performance issues, then the problem surely isn't with the DAX; you may need to work on the data model.

Try using DAX Studio to check the size of each column, analyze the query that's generated, and, most importantly, make sure the data model is in good shape.
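For a quick per-column cardinality list, you can run this as a query in DAX Studio (COLUMNSTATISTICS is a built-in DAX function); for full size numbers, use Advanced > View Metrics, which runs VertiPaq Analyzer for you:

EVALUATE COLUMNSTATISTICS ()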


Hi @ells

It will be very challenging to know the exact reason for the slowness of simple measures unless we can look at your data model.

  1. Check whether you have a high-cardinality date/time column. If so, split the date and time apart and use a dedicated date table (see the sketch after this list).

  2. Check whether any key columns can be removed if they are not required.

  3. Untick auto date/time for the report.

  4. To be on the safe side, avoid both-direction relationships if you have any in your model; they can also produce unexpected results.

  5. Check that the data types are correct.
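For point 1, a minimal sketch of the split as calculated columns (the table and column names here are made-up examples; doing the split in Power Query before load is better still):

Order Date = DATE ( YEAR ( 'Fact'[OrderDateTime] ), MONTH ( 'Fact'[OrderDateTime] ), DAY ( 'Fact'[OrderDateTime] ) )  // date part only
Order Time = TIME ( HOUR ( 'Fact'[OrderDateTime] ), MINUTE ( 'Fact'[OrderDateTime] ), SECOND ( 'Fact'[OrderDateTime] ) )  // time part only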

The data model is in the best condition it can be.
Four measures, each with four variations (current, prior, growth, growth %), are the base items I am interested in at the moment (particularly the growth, as this seems to be the issue).

I have started on the calculation groups, so we will see what happens.
Thanks
E

No time columns.
Dates are trimmed to what I need.
Auto date/time is unchecked.
Any columns that are internal database keys and not used have been removed.
I removed the last bi-directional filter early this morning.
All data types are correct.

Every dimension has been trimmed to just the members present in the fact table (except for Dim Date), so I only have products that are in the sales table and only countries where I have sold something.

Data types are correct, in that there are no integers showing as text.


Hi @ells

Please try checking in DAX Studio to see which column is causing the slowness.

@MK3010

All four growth percentages are the time killers.

So as an example:
Qty Growth % =
VAR Diff = [Qty] - [Qty Prior]
VAR Result = DIVIDE ( Diff, [Qty Prior], BLANK () )
RETURN
    Result

[Qty] and [Qty Prior] both use SWITCH statements that calculate based on the period selected.

In my ignorance, I don't see an issue in the growth % itself.

Data Model size is 250 MB
From DAX Studio:
One fact table (Data = 349,871,608 bytes), cardinality 15 million rows
23 Dimension Tables
Sales table (Fact) is around 80% of the model size.

No local date tables
The two largest dimensions by data size are 4-5% of the database.
One dimension has in excess of 2M rows.

All joins are 1:M

Thanks
E

Hi @ells,

Can you help us with a VPAX file? It will help us understand your model metadata, such as columns and relationships. Sharing a VPAX file and then asking for model optimization guidance is a good way to get help without sharing the actual data.

I believe model metadata won't breach your privacy policy. You can refer to this video for how to export a VPAX file.

Regards,
Hemant


@ells
Echoing others here, but the measures themselves probably aren't the issue; it seems to be coming from the data model. Any chance you can upload a sample PBIX file?

@Nick_M
As above, no chance of uploading a PBIX.
I am just having a look to see if I can upload the VertiPaq Analyzer file.
E

I did share some of this info above in relation to table sizes and % of the database. That is almost as far as I can go; to share the VPAX file I would have to do a lot of renaming.

Thanks
E

I have two tables with high cardinality:
Sales (15M cardinality)
One dimension table (2.2M cardinality)

Four dim tables with cardinality between 30,000 and 90,000
Then the rest are all below 1,500

The sales table is down to measure columns and keys.
The large dim table is down to keys and the columns used in reports.

80% of the DB is in the fact table now

The summary table in VertiPaq Analyzer shows model size 579 MB, 28 tables, 270 columns.

Thanks
M

OK, so I have trimmed the massive dimension table. This has reduced its cardinality to 517,000.

This now allows me to open the report in the Desktop, and the matrix that used to take 12 seconds renders in under a second.
However, when I publish to the Power BI Service, this reverts to 10 seconds the first time you open the page in the report.

Is it unreasonable to expect the matrix to render within 2 seconds, or can I cache the DAX query when the first page of the report is opened?

Thanks
E

Can you post screenshots of the following from VertiPaq Analyzer, plus the data model itself?

  • Tables: redact the names with a screenshot tool. We need to see the info in those columns; no one cares what the table names are.
  • Columns: just the top 10 or so. Again, redact the names.
  • Relationships: the overall view is fine, but drilling down would be helpful.

Some things to keep in mind:

  1. Cardinality is definitely important, but there are other factors.

  2. Is your data model a star schema?

  3. How many columns are there in the tables? DAX/VertiPaq works much better with long, narrow tables than with short, wide tables.

  4. For the measures that are slow, can you run the Query Plan and Server Timings (see the sketch after this list)?

  5. You want as much work done in the storage engine as possible. Do you see any CallbackDataID in the timings?

  6. How many records are being generated when you run your query? The number of records generated should be as close to the final output rows as possible. If a lot of records are being generated in the query plan (called early materialization), it slows DAX down tremendously.

  7. Only results generated by the storage engine are cached, and there is no way for you to manually control that.
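For #4, here is the sort of query to paste into DAX Studio with Server Timings and Query Plan turned on, reproducing the matrix as closely as possible (the column and measure names below are placeholders; swap in your own):

EVALUATE
SUMMARIZECOLUMNS (
    'Sales Territory'[Sales Territory],
    "Current", [Sales],
    "Prior", [Sales Prior],
    "Growth %", [Qty Growth %]
)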

There are a lot of things that could be going on, and it's hard (yet understandable) that you cannot upload any files. But we are going to need some more info if possible.
