Deployment across multiple environments

The issue
I want to promote my objects seamlessly through

  • 3 Source environments (SQL dev / test / prod)
  • 3 Power BI Service environments

I am using

  1. Native queries against SQL
  2. Dataflows (including linked entities) and data sets
  3. A data set to provide the data for the report

Where I am at:
My dataflows contain parameters to switch to the appropriate SQL Server environment, so I am happy with this. It means there is still a manual change, but it is a single, simple change.
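
For illustration, the source step in the dataflow looks something like this - SqlServer is the environment parameter, and the database name and native query are placeholders rather than my real ones:

let
    // SqlServer is the dataflow's text parameter holding the server for the chosen environment
    // (the database name and native query below are placeholders - substitute your own)
    Source = Sql.Database(SqlServer, "SalesDB", [Query = "SELECT * FROM dbo.FactSales"])
in
    Source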

What's left to do.
I would like to be able to move the data set to a new Power BI Service environment and have as much as possible change seamlessly so it redirects to the correct dataflow in that environment.

So in the data set, when I look at the Advanced Editor I see:
let
    Source = PowerBI.Dataflows(null),
    #"…" = Source{[workspaceId="…"]}[Data],
    #"…" = #"…"{[dataflowId="…"]}[Data],
    …

Is there a way to change this to reference the name rather than the Id?

This would save a lot of grief
Cheers
M

OK,
so it looks like I can add a parameter to the data set for the environment.
It also looks like I can change the dataflow Id to the dataflow name.
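
For anyone following along, the navigation can reference the names instead of the Ids - roughly like this (the workspace and dataflow names are placeholders):

let
    Source = PowerBI.Dataflows(null),
    // navigate by name rather than by Id (placeholder names)
    Workspace = Source{[workspaceName = "Sales DEV"]}[Data],
    Dataflow = Workspace{[dataflowName = "Sales Dataflow"]}[Data]
in
    Dataflow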

That should help
Cheers
M


Is there anything else you need from this? It looks like you were able to solve the dataflow Id/dataflow name question and to add a parameter to the dataset for the environment. Were you able to have the parameter detect the correct environment, or is it still manual?

@bradsmith
Thanks. I am trying to make moving between environments as seamless as possible.

The two remaining points are

  • The data set - if that could pick up or populate the environment, that would be awesome.
  • Likewise, can the report do the same for where it looks for the data set?

Any help with those would be awesome.
Thanks
M

I’m assuming by environment you’re referring to the SQL environment, correct? Well, it’s maybe not the most elegant way of doing things, but if each environment is linked to a workspace 1-to-1, you could have the query find the workspace and then just perform a simple conditional statement. I’d need to actually test it out on one or two of the workspaces to make sure it worked appropriately, but here’s what I’d do:

Dataflows

let
    Source = PowerBI.Dataflows(null),
    #"Removed Other Columns" = Table.SelectColumns(Source,{"workspaceId", "Data", "workspaceName"}),
    #"Expanded Data" = Table.ExpandTableColumn(#"Removed Other Columns", "Data", {"dataflowId", "dataflowName"}, {"dataflowId", "dataflowName"}),
    #"Reordered Columns" = Table.ReorderColumns(#"Expanded Data",{"workspaceName", "workspaceId", "dataflowName", "dataflowId"}),
    #"Pivoted Column" = Table.Pivot(#"Reordered Columns", List.Distinct(#"Reordered Columns"[dataflowName]), "dataflowName", "dataflowId")
in
    #"Pivoted Column"

DataflowSettings

let
    Source = Dataflows,
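    // rebuild the pivoted Dataflows table as a literal #table of text values with the same
    // column names, so the lookups below can filter by workspaceName and pick a dataflow column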
    #"M Code" = #table(Text.Split(Text.Combine(Table.ColumnNames(Source),", "), ", "), {Text.Split(Text.Combine(Table.TransformColumns(Table.CombineColumns(Source,Table.ColumnNames(Source), Combiner.CombineTextByDelimiter(", ", QuoteStyle.None),"Columns"), {{"Columns", each "" & _ & "", type text}})[Columns], ", "), ", ")})
in
    #"M Code"

Workspace List

let
    Source = Dataflows[workspaceName]
in
    Source

WorkspaceParameter

null meta [IsParameterQuery=true, Type="Text", IsParameterQueryRequired=true]

Dataflow List

let
    Source = PowerBI.Dataflows(null),
    #"Removed Other Columns" = Table.SelectColumns(Source,{"workspaceId", "Data", "workspaceName"}),
    #"Expanded Data" = Table.ExpandTableColumn(#"Removed Other Columns", "Data", {"dataflowId", "dataflowName"}, {"dataflowId", "dataflowName"}),
    #"Reordered Columns" = Table.ReorderColumns(#"Expanded Data",{"workspaceName", "workspaceId", "dataflowName", "dataflowId"}),
    dataflowName = #"Reordered Columns"[dataflowName],
    #"Removed Duplicates" = List.Distinct(dataflowName)
in
    #"Removed Duplicates"

DataflowParameter

null meta [IsParameterQuery=true, ExpressionIdentifier=#"Dataflow List", Type="Text", IsParameterQueryRequired=true]

fxGetWorkspaceId

let
    Source = (pColumn as text, pWorkspace as text) =>
        let
            Source = #"DataflowSettings",
            #"Filtered Rows" = Table.SelectRows(Source, each [workspaceName] = pWorkspace),
            #"Removed Other Columns" = Table.SelectColumns(#"Filtered Rows",{pColumn}){0},
            #"Converted to Table" = Record.ToTable(#"Removed Other Columns"),
            #"Changed Type" = Table.TransformColumnTypes(#"Converted to Table",{{"Value", type text}}),
            Value = #"Changed Type"{0}[Value]
        in
            Value
in
    Source

DataflowID

let
    Source = #"DataflowSettings",
    #"Filtered Rows" = Table.SelectRows(Source, each [workspaceName] = WorkspaceParameter),
    #"Removed Other Columns" = Table.SelectColumns(#"Filtered Rows",{DataflowParameter}){0},
    #"Converted to Table" = Record.ToTable(#"Removed Other Columns"),
    #"Changed Type" = Table.TransformColumnTypes(#"Converted to Table",{{"Value", type text}}),
    Value = #"Changed Type"{0}[Value]
in
    Value
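
I haven't tested this end to end, but the idea is that the data set's own navigation step would then pick up WorkspaceParameter and the looked-up DataflowID instead of hard-coded Ids, roughly like this:

let
    Source = PowerBI.Dataflows(null),
    // use the parameter and lookup queries above rather than hard-coded Ids
    Workspace = Source{[workspaceName = WorkspaceParameter]}[Data],
    Dataflow = Workspace{[dataflowId = DataflowID]}[Data]
in
    Dataflow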

Then a standard conditional lookup would determine the environment.
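
By "conditional lookup" I mean something along these lines - the query name, workspace names and server names are just examples, so swap in your own:

SqlServerName

let
    // map the selected Power BI workspace to the matching SQL Server instance
    // (example workspace and server names only)
    Source =
        if WorkspaceParameter = "Sales PROD" then "sqlprod01"
        else if WorkspaceParameter = "Sales UAT" then "sqluat01"
        else "sqldev01"
in
    Source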


@bradsmith
Thanks. That's given me a lot to think about. When I said environment, I meant the Power BI environment.
My experience is that, unless you have a very strict organisation, the UAT environment may take data from just about any source environment.

I have 3 Power BI workspaces (PROD / DEV / TEST)

The DEV Power BI workspace will connect to SQL (either DEV SQL or UAT SQL), SharePoint (PROD), and file stores (PROD).
The UAT Power BI workspace will connect to SQL (either DEV, PROD, or UAT SQL), SharePoint (PROD), and file stores (PROD).

The PROD Power BI workspace should always go to PROD environments for all sources.

I have a data set and a report that I want to migrate from the DEV workspace to the UAT workspace to the PROD workspace. I am hoping I do not have to open the data set and report and change connections manually, as this is where there is a chance of making a mistake.

Is this doable? I am resigned to the SQL environment being a manual parameter in the dataflow.

Thanks
M


Implemented this, but I am not sure it is a step forward. I learnt a lot, and I think I may not have thought this through.

1. In Power BI Desktop I want to be able to set the workspace I use for the dataflows by name (see the sketch below).
2. When the data set gets placed in a Power BI workspace, I want it to use the dataflows from the workspace it is in.
3. When the report gets placed in a Power BI workspace, I want it to use the data set from the workspace it is in.

The data set, report and data flow names will remain constant from one workspace to another.
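
As a partial answer to points 1 and 2, I can at least drive the data set from a single WorkspaceName text parameter, since the dataflow and entity names stay constant - the names below are placeholders, not my real ones:

let
    // WorkspaceName is a text parameter - the only value that changes per workspace
    Source = PowerBI.Dataflows(null),
    Workspace = Source{[workspaceName = WorkspaceName]}[Data],
    // the dataflow and entity names are placeholders and stay the same in every workspace
    Dataflow = Workspace{[dataflowName = "Sales Dataflow"]}[Data],
    Entity = Dataflow{[entity = "Sales"]}[Data]
in
    Entity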

I think understanding whether the data set/report is in Desktop or in the Power BI Service is key.

I think retrieving the current workspace Id/name is also key.

Hope that helps getting towards a solution.
Thanks
M

