Hi there,
The big thing for me here is that you seem to be doing calculations and summaries in your queries. Personally I would not do this on a large data set because, as you’re experiencing, it will be very slow.
You should look to simplify what you need from your core Sales table and then do any of the calculations using DAX measures. This is the most efficient way to do things in Power BI.

I’m sure your Sales table doesn’t need to be that wide (or have that many columns). It’s just about getting the table as lean as possible, possibly doing a bit of work in SQL as you bring it into your Power BI model.
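For example (a minimal sketch, assuming hypothetical Quantity and Unit Price columns in your Sales table), instead of summing totals inside the query, you would keep just those raw columns and define a measure like this:

```
-- Hypothetical measure: column names are placeholders, adjust to your model.
-- SUMX iterates the Sales table and sums quantity * price at query time,
-- so no pre-calculated total column is needed in the query itself.
Total Sales = SUMX ( Sales, Sales[Quantity] * Sales[Unit Price] )
```

A measure like this is only calculated when a visual asks for it, so your refresh stays fast and the model stays small.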
Some background:
When a query is loaded to the report, it is always evaluated in isolation. It may also be evaluated by a different process, so loaded queries do not form a dependency tree where one query can reuse another’s results.
Power Query evaluations keep an on-disk cache of data seen by previous evaluations. So if the same source data is pulled multiple times within the same cache session, you only pay for it the first time.

This cache only applies to the raw data coming from the data source; any additional transformations still have to be performed on top of it.

Each loading session is a new cache session, since you always want the latest data when you load queries to the report.

So you only pay for the data coming from the data source once per load of all the queries, but you do pay for the transformations on top of that each time.
In the situation you have there, DirectQuery can also help with performance. With DirectQuery you always operate on the remote data source and never keep a local copy of the data.

So there is no “loading” phase; you pay for the data as the visuals need it.
Hopefully this gives you some background.
Check out this course module for many more ideas on optimizing your queries and tables.
Thanks
Sam