Updating a huge data set (over 6 million rows) without refreshing part of it

Hi all,

I have been battling this one now for some time and need a little advice on the best way to do it.

In our SAP system I have a report called MB51, which records every single goods movement of every single product in our factory since 2010. We're an aerospace company, so as you can imagine it's HUGE.

OK, so I have downloaded the MB51 report as yearly Excel files and saved them in a folder called Historic MB51, with the files named 2010, 2011, 2012, 2013 and so on up to 2019.

I then created another folder called Current Year MB51; inside it are the 2020 files, which are downloaded automatically every day.

So in the Query Editor I imported the Historic MB51 folder, made my transformations, disabled refresh (since I don't need to refresh historic data), and loaded it to the model. This took a while, but it's a one-time deal.
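For reference, the query is roughly something like this (the folder path is just a placeholder, and I'm assuming the data sits on the first sheet of each workbook, so adjust to your own setup):

```
let
    // Pull every yearly workbook from the historic folder
    // (path is a placeholder -- point it at your own Historic MB51 folder)
    Source = Folder.Files("C:\Data\Historic MB51"),
    // Keep only the Excel files in case anything else lands in the folder
    ExcelOnly = Table.SelectRows(Source, each Text.EndsWith([Extension], ".xlsx")),
    // Parse each workbook and take the first sheet from each file
    Parsed = Table.AddColumn(ExcelOnly, "Data",
        each Excel.Workbook([Content], true){0}[Data]),
    // Stack all the years into one table
    Combined = Table.Combine(Parsed[Data])
in
    Combined
```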

I then imported the Current Year MB51 folder, applied the same changes in the Advanced Editor (same pattern as the query above, just pointed at the other folder), and loaded it to the model, leaving refresh enabled since this one needs to update daily.

Now I need to UNION the two tables together in DAX, as this was the only way I could find to append the two folders while refreshing just the Current Year MB51 folder. If I do an Append in the Query Editor, it always tries to refresh the huge historic data as well.
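In case it helps anyone, the calculated table is just this (table names are whatever you called your two queries; note that UNION needs both tables to have the same columns in the same order):

```
MB51 All =
UNION (
    'MB51 Historic',      -- load-only, refresh disabled
    'MB51 Current Year'   -- refreshed daily
)
```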

This works, but as you know, when you UNION two tables you can't see the result as a table in the Query Editor. Does anyone have advice on another way of doing this? I know about incremental refresh, but I don't have a Pro license.

Thanks

Dan

I don't think there is anything besides incremental refresh. Maybe something using Dataflows, but that would also need the Pro license. It's only $10/month, well worth it.
