Legacy System Data Migration

Recently, my company moved to a new ERP system. We have approximately 5 years of legacy data as well as 6 months of data in the new ERP system. I’m in the planning phase of combining the two data sets into one Power BI Data Model.

While transitioning, we created mappings for our dimension tables so that dimensions such as Vendors, Manufacturers, and Customers were migrated as-is. We have mapping tables for each dimension with the old ID and the new ID.

Here’s my plan:

  • I will leave the old (static) data in its own database.
  • I will query the legacy database in Power BI, query the new ERP system, and combine the queries via append.
  • The only tables I want to bring in from the legacy system are the Fact Tables. I would like all the dimension IDs in the legacy fact tables to be updated to their new dimension IDs from the new system.
  • I will also query the dimension mapping tables, so I can map each legacy dimension ID to its new dimension ID via LOOKUPVALUE().

Essentially what I need to do is query a legacy fact table, use LOOKUPVALUE() against the dimension ID maps to pull in the new dimension ID, and remove the legacy dimension ID column. Now I have a legacy Fact record that correctly references new Dim tables.
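To make the remapping step concrete, here is a minimal Python sketch of the same logic (in Power BI this would be a Power Query merge or DAX LOOKUPVALUE). All table, column, and ID names below are hypothetical examples, not the actual schema:

```python
# Sketch of the ID-remapping step with hypothetical column names:
# look up the new dimension ID from the mapping table, then drop the
# legacy ID column from the fact record.

legacy_fact = [
    {"OrderID": 1001, "LegacyVendorID": "V-OLD-7", "Amount": 250.0},
    {"OrderID": 1002, "LegacyVendorID": "V-OLD-9", "Amount": 480.0},
]

# Mapping table: legacy dimension ID -> new dimension ID
vendor_map = {"V-OLD-7": "VN-0007", "V-OLD-9": "VN-0009"}

def remap(fact_rows, id_map, old_col, new_col):
    """Replace the legacy dimension ID column with the new one."""
    remapped = []
    for row in fact_rows:
        new_row = {k: v for k, v in row.items() if k != old_col}
        new_row[new_col] = id_map[row[old_col]]  # assumes mapping is complete
        remapped.append(new_row)
    return remapped

remapped_fact = remap(legacy_fact, vendor_map, "LegacyVendorID", "VendorID")
```

After this step each legacy fact row carries only the new VendorID, so it can join directly to the new system's dimension tables.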

So long as each legacy dimension ID has exactly one corresponding new ID, this method should work. A legacy Sales Order record will then reference existing dimensions from the new system, and I can use queries to create a single combined Fact table and throw out the legacy Dim tables.
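The "mapping tables are complete" assumption is worth verifying up front. A quick sketch of that check, again with hypothetical IDs:

```python
# Completeness check: find legacy dimension IDs that appear in the fact
# table but have no entry in the mapping table (hypothetical example data).

legacy_fact_vendor_ids = {"V-OLD-7", "V-OLD-9", "V-OLD-12"}
vendor_map = {"V-OLD-7": "VN-0007", "V-OLD-9": "VN-0009"}

# Any ID left over here would produce a fact row with no matching
# dimension after the migration.
unmapped = legacy_fact_vendor_ids - vendor_map.keys()
```

Running a check like this per dimension before the append would surface any gaps in the mapping tables early.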

Does this sound right? From the bit of thought I have put into this, it seems to be one of the simplest methods with the least moving parts, so long as my mapping tables are complete.

Let me know if you have questions or if my thoughts above are not clear.

Any suggestions or resources for this transition would be appreciated.

Hi @CStaich,

Alternatively you could:

  • Load all the legacy tables into a Workspace Dataflow.
  • Load all accompanying dimension mapping tables into the same Workspace Dataflow.
  • Perform the replacement of the legacy IDs with the new IDs in this Dataflow.
  • Shape all output tables so they match your new ERP column headers.
  • The Dataflow refresh only has to be performed once and doesn’t have to be scheduled, because this data will never change.
  • Decide whether Dataflows could be preferred over querying directly against the new ERP system.
  • (If so, create a new Dataflow for all queries against the new ERP system and schedule its refresh.)
  • In Power BI, query the legacy-system Dataflow fact tables,
  • and query the new ERP system fact and dimension tables (directly or via a Dataflow).
  • You can append all fact queries because the column headers match.
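The shape-then-append steps above can be sketched in a few lines (in Power Query this would be column renames followed by Table.Combine). The column headers here are hypothetical examples:

```python
# Rename legacy columns to match the new ERP headers, then append the two
# fact tables. All column names are hypothetical examples.

legacy_rows = [{"OldOrderNo": 1001, "VendorID": "VN-0007", "Amt": 250.0}]
new_erp_rows = [{"OrderID": 2001, "VendorID": "VN-0011", "Amount": 99.0}]

# Legacy header -> new ERP header
header_map = {"OldOrderNo": "OrderID", "Amt": "Amount"}

shaped_legacy = [
    {header_map.get(k, k): v for k, v in row.items()} for row in legacy_rows
]

# Append succeeds cleanly because the column headers now match.
combined_fact = shaped_legacy + new_erp_rows
```

Because every legacy column is renamed before the append, no mismatched columns end up half-populated in the combined fact table.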

I hope this is helpful.



Thank you for the suggestion. I have never used Dataflows, but after doing a bit of research, they seem to fill this gap very nicely. The option to refresh a Dataflow on a separate schedule from the rest of my dataset will be crucial.

I will likely follow these steps almost exactly. I’ll report back if I have any trouble.
