Got a scenario where we have around 6 columns that need to be cleaned, and they all share the same set of values to clean. Rather than individually find & replace each value in each column, I wanted to do a bulk find & replace. We have around 16 names to “clean” in each column.
It works fine when we only have 3 rows in our Find & Replace table, but it runs into trouble when we add more rows. That said, it has worked a few times with the whole Find & Replace table present, so there is no obvious cause springing to mind. The full table we are cleaning currently has around 300,000 rows.
Just wondering whether this function can’t cope with this volume, or whether there is a better way to carry this out?
let
    Source = DataTable,
    // Buffer the lookup lists so they are not re-evaluated on every recursive call
    Old = List.Buffer(FindReplaceTable[From]),
    New = List.Buffer(FindReplaceTable[Replace]),
    // Recursive helper: applies one find/replace pair per call across the given columns
    fxReplacements = (x as table, n as number, ColList as list) as table =>
        let
            Replace = Table.ReplaceValue(x, Old{n}, New{n}, Replacer.ReplaceText, ColList),
            // Stop after the last pair, otherwise recurse with the next index
            Checking = if n = List.Count(Old) - 1 then Replace else @fxReplacements(Replace, n + 1, ColList)
        in
            Checking,
    Result = fxReplacements(Source, 0, Table.ColumnNames(Source))
in
    Result
You’ll need to amend the Result step, where the function is invoked: Result = fxReplacements( Source, 0, Table.ColumnNames(Source) )
At the moment its final argument, Table.ColumnNames(Source), passes every column in the table.
You’ll need to replace that with a list of the actual column names you want cleaned, something like: { "Column1", "Column2" }
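For example, assuming the six columns to clean are called "Name1" through "Name6" (placeholder names, swap in your real column names), the amended invocation would look something like this:

// Hypothetical column names; replace with the six columns that actually need cleaning
Result = fxReplacements( Source, 0, { "Name1", "Name2", "Name3", "Name4", "Name5", "Name6" } )

If any of the remaining columns are not text, restricting the list this way also keeps Replacer.ReplaceText away from values it can’t handle, as well as cutting down the work done on each of the 300,000 rows.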