Are Dataflows useful without Premium Capacity?

Hello all,

I’d like to start a discussion about how Dataflows could be useful without purchasing Premium Capacity.

I have attempted to use Dataflows a few times, but I always run into limitations that require Premium. These paywalls are difficult to work around, because any entity that references another entity in the same dataflow (a “Computed Entity”) is not allowed.

I’m not sure I fully understand the use cases for Dataflows, because most of what I’ve attempted requires transformations that I’m not allowed to perform. Dataflows are a tool I haven’t used properly, so I would be interested to hear how others have found value in them.

Weigh in if you are using Dataflows: what types of data sources do you use them for? What transforms are you using? How have you avoided the dreaded “Computed Entities”?

Here is the documentation | Also, “Computed Entities”

@CStaich,

I haven’t used dataflows, so I don’t have anything substantive to contribute to this thread, but I did remember seeing this informative post from @Melissa, which provides some information relevant to your questions:

  • Brian

@CStaich I have just switched my reporting over to dataflows, and I love it - and we do not have Premium Capacity. I am reporting out of a Dynamics NAV database, connecting the dataflow through a gateway that our IT team set up for Power BI.

I have dataflows for different types of entities, and then connect the dataflows together in Power BI to build my reporting dataset. The part I really like is that my dataflows can refresh at different rates, reducing the strain on the server.

Example: I only refresh customer tables once a day, but sales tables are refreshed every 3 hours.
The sales team has been told that a new customer’s sales won’t show until the following day, and they are good with this.

If a particular dataflow fails, there are only a few tables in that flow that I have to review for errors.

Avoiding Computed Entities is an interesting trick - the rule to keep in mind is that you cannot reference any table that you are also going to load in the final output of the dataflow. As an example: in this dataflow I have the Item Unit of Measure table, and I need to merge it into the Item table, but I also need to return it to the final dataset for transformation against another table (not part of this dataflow).

So, I loaded Item Unit of Measure and unchecked “Enable load” on that query.
I was then able to merge that table into my Item table. And rather than reload the Item Unit of Measure table, I simply referenced it as a new query, Item UoM.
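In M terms, the pattern looks roughly like this - a sketch only, shown as three separate queries in the dataflow editor, with a placeholder connection and NAV-style key columns assumed (your source will differ):

```
// Query: Item Unit of Measure (staging - "Enable load" unchecked,
// so referencing it below does NOT create a Computed Entity)
let
    Source = Sql.Database("MyServer", "NAV_DB"),  // placeholder connection
    ItemUoM = Source{[Schema = "dbo", Item = "Item Unit of Measure"]}[Data]
in
    ItemUoM

// Query: Item (loaded - merges the unloaded staging query in)
let
    Source = Sql.Database("MyServer", "NAV_DB"),
    Item = Source{[Schema = "dbo", Item = "Item"]}[Data],
    // NAV-style keys assumed: Item[No_] <-> Item Unit of Measure[Item No_]
    Merged = Table.NestedJoin(Item, {"No_"},
        #"Item Unit of Measure", {"Item No_"},
        "Item UoM", JoinKind.LeftOuter)
in
    Merged

// Query: Item UoM (loaded - just a reference to the unloaded staging
// query, so the table still reaches the dataflow output)
let
    Source = #"Item Unit of Measure"
in
    Source
```

Only the last two queries have “Enable load” checked; the staging query exists purely inside the dataflow editor, so nothing Premium-only is triggered.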

That is my work-around. :slight_smile:


Hi @CStaich, a response on this post has been tagged as “Solution”. If you have a follow-up question or concern related to this topic, please remove the Solution tag first by clicking the three dots beside Reply and then unticking the check box. Thanks!

Without Premium you can’t modify the data much. For example, referencing data doesn’t work if you don’t have Premium.

Actually, you can reference data - you just can’t reference a table that you are also going to load in the dataflow.
To work around that, I pull in the table and remove the load option - then merge it as needed into a table I am going to load.
Then I go back to the unloaded table, reference it, and make sure the new referenced table is marked to load to the dataflow.
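A minimal M sketch of that pattern (the query and column names here are made up for illustration):

```
// Query: Staging ("Enable load" unchecked - safe to reference without Premium)
let
    Source = Csv.Document(Web.Contents("https://example.com/data.csv")),  // hypothetical source
    Promoted = Table.PromoteHeaders(Source)
in
    Promoted

// Query: Final ("Enable load" checked - referencing the unloaded
// Staging query does not create a Computed Entity)
let
    Source = Staging,
    Typed = Table.TransformColumnTypes(Source, {{"Amount", type number}}),  // "Amount" is a hypothetical column
    Filtered = Table.SelectRows(Typed, each [Amount] > 0)
in
    Filtered
```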
