 # Average inventory and cost balances for large transaction table

Hi,

I have a fairly large transaction table (over 27 million rows and growing) with inventory transactions and their related cost transactions. To get an inventory (or cost) on-hand balance I use the standard running-total formula

`Balance Qty = CALCULATE(SUM('Item Value New'[Item Ledger Entry Quantity]),FILTER(ALL(Dates),Dates[Date]<=MAX(Dates[Date])))`

That works great. Now, however, I want to get an average of the inventory balance on hand over a date range, so that for items produced less often we still get a good idea of the carrying quantity and value of the inventory.
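For example, with the running-total measure above, the goal would be something like averaging `[Balance Qty]` over a rolling window (a sketch; the 90-day window is just an illustrative assumption):

```
-- Sketch: average of the running balance over each of the last 90 days.
-- The window length (-90, DAY) is an arbitrary example.
Avg Bal Qty 90D =
CALCULATE(
    AVERAGEX(VALUES(Dates[Date]), [Balance Qty]),
    DATESINPERIOD(Dates[Date], MAX(Dates[Date]), -90, DAY)
)
```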

To do so, I first tried creating a new table with SUMMARIZECOLUMNS that would hold the balance on hand for each date, and then taking an average of that balance:

```
Inv Summary Col =
SUMMARIZECOLUMNS(
    'Item Value New'[Item Ledger Entry Quantity],
    'Item Value New'[Posting Date],
    "Bal Qty",
        CALCULATE(
            SUM('Item Value New'[Item Ledger Entry Quantity]),
            FILTER(
                ALL('Item Value New'[Posting Date]),
                'Item Value New'[Posting Date] <= MAX('Item Value New'[Posting Date])
            )
        ),
    "Bal Cost",
        CALCULATE(
            SUM('Item Value New'[Cost Amount (Actual)]),
            FILTER(
                ALL('Item Value New'[Posting Date]),
                'Item Value New'[Posting Date] <= MAX('Item Value New'[Posting Date])
            )
        )
)
```

Then I take an average with something like
`Avg Bal Qty = AVERAGE('Inv Summary Col'[Bal Qty])`

However, while that worked fine in Power BI Desktop, every time I tried to refresh from the Power BI service I got a memory-allocation error after the refresh timed out at around two hours.

So I tried an alternate method: basically just two measures,

`Balance Qty = CALCULATE(SUM('Item Value New'[Item Ledger Entry Quantity]),FILTER(ALL(Dates),Dates[Date]<=MAX(Dates[Date])))`

followed by

`Avg Bal Qty = AVERAGEX(VALUES(Dates[Date]),[Balance Qty])`

This seems to be doing the job, but when I update the matrix (say, expand the product hierarchy, or switch to a different product category), things go REALLY slowly.

So is there a better way to approach this?

Thanks,

Hi @MathewA

If you’re not using DirectQuery mode, the slowness is unlikely to be caused by the measures themselves.
Are you using DirectQuery against SQL?

That can hamper performance.

Thanks,
Ankit

Hi @MathewA. With such a large transaction table, two other possibilities may prove useful:

1. Can you use a filter to reduce the number of records imported (to just those records you need to make a decision)? and/or

2. Can you create an aggregation table for your historical data and import it, so as not to redo “static” calculations on every refresh? (There are plenty of videos online about aggs; the first that jumps to mind is Guy in a Cube.)

Hope it helps,
Greg

Hi,

I am not using direct query.

I need all the transactions because you need to sum everything up to a date to get today’s inventory balance.

An aggregation table sounds promising; that’s what I was trying to do with SUMMARIZECOLUMNS. I will look into this, thanks.

The aggregation tables seem to mainly do simple aggregations, like group-by. I’m not sure how that would work with summing everything up to a date.
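One way to combine the two might work like this (a sketch, untested against this model; the `[Item No_]` column name is an assumption): because addition is associative, you can first collapse the 27M transactions to one net-change row per item per day, and then point the same running-total pattern at the much smaller table.

```
-- Sketch: pre-aggregate to one net-change row per item per posting date.
-- 'Item Value New'[Item No_] is an assumed column name.
Daily Net Change =
SUMMARIZECOLUMNS(
    'Item Value New'[Item No_],
    'Item Value New'[Posting Date],
    "Net Qty",  SUM('Item Value New'[Item Ledger Entry Quantity]),
    "Net Cost", SUM('Item Value New'[Cost Amount (Actual)])
)

-- Same running total over the grouped table: summing daily nets up to a
-- date equals summing every transaction up to that date. Assumes a
-- relationship from 'Daily Net Change'[Posting Date] to Dates[Date].
Balance Qty (agg) =
CALCULATE(
    SUM('Daily Net Change'[Net Qty]),
    FILTER(ALL(Dates), Dates[Date] <= MAX(Dates[Date]))
)
```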

@MathewA Can you post your model or a representation of it so we can see it in detail? It sounds like Power Query might be an option. It wouldn’t be as dynamic as a measure but should make the model faster once the data’s loaded.

Thanks for posting your question @MathewA. To receive a resolution in a timely manner please make sure that you provide all the necessary details on this thread.

Here is a potential list of additional information to include in this thread: a demo .pbix file, images of the entire scenario you are dealing with, a screenshot of the data model, details of how you want to visualize the result, and any other supporting links and details.

Including all of the above will likely enable a quick solution to your question.

OK, I think I figured it out. I’m reading through the excellent “The Definitive Guide to DAX”, and came across a section explaining (if I understand it correctly) that you get context transition when you call a measure from within a measure.

So instead what I did was use a variable so the measure call is outside the iterator.

```
Avg Inv Value =
VAR balCost = [Balance Cost]
RETURN
    AVERAGEX(VALUES(Dates[Date]), balCost)
```

It seems to be going much faster now.

Thanks,

Never mind: with the VAR, the numbers are similar but not correct.
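The reason, as I understand it from the book: a VAR is evaluated once, in the filter context where the measure starts, so AVERAGEX ends up averaging a constant instead of recomputing the balance for each date. The measure reference has to stay inside the iterator so context transition happens per date:

```
-- Wrong: balCost is evaluated once, before the iteration,
-- so every date contributes the same value to the average.
Avg Inv Value (wrong) =
VAR balCost = [Balance Cost]
RETURN
    AVERAGEX(VALUES(Dates[Date]), balCost)

-- Right: the measure reference triggers context transition on each
-- date, recomputing the running balance per row of VALUES(Dates[Date]).
Avg Inv Value =
AVERAGEX(VALUES(Dates[Date]), [Balance Cost])
```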

OK, I figured it out. The problem was I had a filter on the visual so that it would only show items with total sales quantity > 0.

For whatever reason, that was REALLY dragging the visual down. I assume it was checking every row for quantity > 0 instead of just removing items whose final total was zero from the report.

Important lesson learned here. I’m loving the performance analyzer, and the ability to inspect the actual query.

Thanks,