Recursion logic performance issue

This is in reference to one of my previous queries…
I have tried tuning the method @Melissa had mentioned, but it is still taking a long time to refresh (around 30 minutes for 20,000 rows).

If there is no other way to tune this, would it be a good idea to use dataflows for this transformation?

Yes, you could transfer this query to a Dataflow (and if you have Premium you can use the enhanced compute engine). With the Dataflow you can schedule the refresh of this table separately and ingest that ready-to-use table in any model you require.

Basically, just right-click the query name in the Queries pane and select “Copy”. Go to a workspace (note that dataflows aren’t available in the “My Workspace” environment), create a Dataflow if you don’t have one already, and, via a new Blank Query or similar, find your way to the online Power Query Editor. There you can paste the query in with CTRL+V, and all associated queries (if any) should automatically be transferred as well. So don’t copy and paste the code between Advanced Editor windows.

I hope this is helpful.

  1. Is Premium capacity needed for creating dataflows?
    I did not get this part: “(and should you have Premium you can use the enhanced compute engine)”
  2. Also, in dataflows will the transformation take the same time or less to complete?
  3. Is there any chance of tuning the existing query?
  4. Is a dataflow the best way to address this, or is there another option?
  1. No, the minimum licence requirement is Pro (dataflows are available in all workspaces except for the “My Workspace” environment).
  2. You’ll have to test that.
  3. At this time I don’t see other optimization options for this specific part of your query.
  4. You’ll be transferring the workload to the Dataflow. When you connect to the Dataflow to load this table, it’s ready for use, so your report refresh will take less time.

So don’t copy and paste the code between Advanced Editor windows.
and should you have Premium you can use the enhanced compute engine
Could you please explain these parts?

Right-click the query name in the Queries pane and select “Copy”. Go to a workspace, and in the online Power Query Editor paste it in with CTRL+V. All associated queries (if any) should then automatically be transferred as well.

Don’t worry about this if you don’t have Premium.

Hi @Anu, did the response provided by @Melissa help you solve your query? If not, how far did you get and what kind of help do you still need? If yes, kindly mark the thread as solved. Thanks!

Hi @Anu, we’ve noticed that no response has been received from you since the 21st of January. We just want to check whether you still need further help with this post. If there is no activity on it in the next few days, we’ll tag this post as Solved. If you have a follow-up question or concern related to this topic, please remove the Solution tag first by clicking the three dots beside Reply and then unticking the checkbox. Thanks!

Hi Anu,

There is something I have learned about dataflows that I would like to share with you and others from eDNA.

I ran into a problem with extracting and transforming data. It concerned several tables on which I performed various transformations, such as merges.

Every time, it went wrong when applying and closing the query steps. It literally took hours (4+), I ended up getting vague error messages, and the refresh failed. Different tables were identified as the cause of the crash each time; it kept changing.

Then a new colleague showed me how you can use dataflows to solve these kinds of problems (in some cases).

You can use several dataflows, each with its own task. In one, you only make the connection to the data source, so its core purpose is to extract the data. In another dataflow you perform the transformations. Let’s say you have to merge two large tables, for example Orders and Cases.

First you create a dataflow that only connects to the Orders table. All complex (structured) columns must be expanded in dataflows if you need that data; otherwise the online Power Query Editor removes them automatically. But other than that, no transformations. Then you close and refresh the dataflow.
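For illustration, here is a minimal sketch in Power Query M of what such a staging query could look like. The server, database, table, and column names (including the record-type Customer column) are purely placeholder assumptions; your own connector and columns will differ.

```
let
    // Connect to the source (placeholder server/database names)
    Source = Sql.Database("myserver", "SalesDB"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // Expand the structured column here, because the online Power Query Editor
    // drops complex columns that are left unexpanded when the dataflow is saved
    Expanded = Table.ExpandRecordColumn(Orders, "Customer", {"CustomerID", "CustomerName"})
in
    Expanded
```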

Then you create a dataflow that only connects to the Cases table, just as you did for the Orders table, and close and refresh that dataflow as well.

Then you create a third dataflow in which you connect to the first two dataflows and perform the merge.
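As a rough sketch of that third dataflow, the merge query could look something like the M below. It assumes the two staging dataflows are already referenced as queries named Orders and Cases (for example via the Power Platform dataflows connector or linked entities) and that OrderID is the shared key; all of these names are placeholders.

```
let
    // Orders and Cases are references to the two staging dataflows
    Merged = Table.NestedJoin(Orders, {"OrderID"}, Cases, {"OrderID"}, "CaseRows", JoinKind.LeftOuter),
    // Expand only the Cases columns that are actually needed
    Expanded = Table.ExpandTableColumn(Merged, "CaseRows", {"CaseID", "Status", "OpenedDate"})
in
    Expanded
```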

All three dataflows finished their refresh in a few minutes. After the refresh in the Power BI Service they become available in Power BI Desktop, and importing them is a matter of minutes. Instead of performing all the steps at once, you spread them across different dataflows.

In this situation dataflows were a BIG help!

By the way… several dataflows can easily live in one workspace, so it is a good idea to agree on certain naming conventions. We like to use the following:

DF_T3

Hope it helps you the way it helped me :smile:

Daniel
