
Extract a product code from the middle of a string

I am trying to extract the product code from a string so I can merge the product description into the table. In Power Query, I tried splitting the string, but that approach isn’t going to work.

Not sure of the best way to approach this.

  1. Is there a way to extract the product code in DAX using a LIKE-style pattern such as ‘E%%%%’?
  2. If it is done in DAX, I need to be able to look up the description from a product table.
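(As a rough sketch of the intent outside of Power BI: a pattern like ‘E%%%%’ means the letter E followed by four characters, which corresponds to a small regular expression. The sample string and the one-entry product table below are made up purely for illustration; the real data lives in the attached files.)

```python
import re

# Hypothetical product lookup table (code -> description); a stand-in
# for the real Product table in the workbook.
products = {"E1234": "Blue widget"}

def extract_code(text):
    """Return the first 'E' + four-character token, or None if absent."""
    match = re.search(r"\bE\w{4}\b", text)
    return match.group(0) if match else None

def describe(text):
    """Look up the description for the extracted code."""
    return products.get(extract_code(text), "not found")
```

For example, `extract_code("Widget assembly E1234 blue")` returns `"E1234"`, and a string with no code returns `None`.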

DEMO - extract product code.pbix (21.3 KB)


This is definitely best handled in Power Query with a custom M function. If you can please provide the Excel file, I’ll be glad to take a run at that.

  • Brian

Hi Brian, I included the demo with the Excel file loaded. Thanks!

Product Lookup.xlsx (10.8 KB)

Yup, but when you go to modify the Power Query on someone else’s PBIX, it looks for the original data file again and throws an error until you repoint the source back to the Excel file.

  • Brian

Oh, that makes sense


As @BrianJ said, this is probably best handled in PQ, but one way to do this in DAX is with 2 measures, the first using FIND and MID, and the second using LOOKUPVALUE (my verbose measures below):

Extracted Product Code =
VAR _CurrentString = SELECTEDVALUE( Invoices[Invoiced Item] )
VAR _SearchString = " E"
VAR _ProductCodeStart = FIND( _SearchString, _CurrentString, 1, -1 )
VAR _Result = IF(
    _ProductCodeStart = -1,
    "<not found>",
    MID( _CurrentString, _ProductCodeStart + 1, 5 )
)
RETURN
    _Result


Matching Product =
VAR _ExtractedProductCode = [Extracted Product Code]
VAR _Result =
    LOOKUPVALUE(
        Products[description], // table/column names assumed from the sample file
        Products[product_code], _ExtractedProductCode,
        "not found"
    )
RETURN
    _Result
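(For anyone following the FIND/MID logic outside of DAX, here is the same idea as a minimal Python sketch. The ` E` search string and the 5-character code length come from the measure above; the `products` dictionary is a made-up stand-in for the LOOKUPVALUE against the Product table.)

```python
# Stand-in for LOOKUPVALUE against the Product table (made-up entry).
products = {"E1234": "Blue widget"}

def extracted_product_code(current_string):
    # Mirror DAX FIND(" E", ...): locate the code via the space before "E".
    pos = current_string.find(" E")  # -1 when not found, like the measure
    if pos == -1:
        return "<not found>"
    # Mirror MID(string, start + 1, 5): skip the space, take 5 characters.
    return current_string[pos + 1 : pos + 6]

def matching_product(current_string):
    code = extracted_product_code(current_string)
    return products.get(code, "not found")
```

So `matching_product("Widget E1234 blue")` extracts `"E1234"` and returns its description, while a string without a code falls through to `"not found"`.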


Hope this helps.
eDNA Forum - Product Lookup.pbix (18.9 KB)



The best approach would be to do this in the query editor.

Please find the attached solution in Excel.

I copied your data from Power BI to Excel.

Product Result.xlsx (21.8 KB)

Example of raw data:



To get the same result as @Greg, I split the column by delimiter and then used the Text.Middle function.

The last step was to do a lookup between the two tables.

Please see the code below:

let
    Source = Excel.CurrentWorkbook(){[Name="Table2"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Invoiced Item", type text}}),
    #"Duplicated Column" = Table.DuplicateColumn(#"Changed Type", "Invoiced Item", "Invoiced Item - Copy"),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Duplicated Column", "Invoiced Item", Splitter.SplitTextByDelimiter("E", QuoteStyle.Csv), {"Invoiced Item.1", "Invoiced Item.2"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Invoiced Item.1", type text}, {"Invoiced Item.2", type text}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type1", "Custom", each Text.Middle([Invoiced Item.2], 0, 4)),
    #"Changed Type2" = Table.TransformColumnTypes(#"Added Custom",{{"Custom", type text}}),
    #"Added Custom1" = Table.AddColumn(#"Changed Type2", "ProductCode", each "E" & [Custom]),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom1",{"Invoiced Item.1", "Invoiced Item.2", "Custom"}),
    #"Added Custom2" = Table.AddColumn(#"Removed Columns", "Description", each if [ProductCode] = null then "not found" else Product{[productcode=[ProductCode]]}[description])
in
    #"Added Custom2"
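(For readers more comfortable outside of M, the split-by-“E” idea above can be sketched in Python: split on the first “E”, keep the next four characters, re-prefix the “E”, and look the result up. The `products` dictionary is a made-up stand-in for the Product query used in the last step.)

```python
products = {"E1234": "Blue widget"}  # stand-in for the Product table

def product_code_via_split(invoiced_item):
    # Split on the first "E", like Splitter.SplitTextByDelimiter("E").
    head, sep, tail = invoiced_item.partition("E")
    if not sep:  # no "E" found in the string
        return None
    # Text.Middle([Invoiced Item.2], 0, 4), then re-prefix the "E".
    return "E" + tail[:4]

def description_for(invoiced_item):
    code = product_code_via_split(invoiced_item)
    if code is None:
        return "not found"
    return products.get(code, "not found")
```

So `"Widget E1234 blue"` splits into `"Widget "` and `"1234 blue"`, and the code is rebuilt as `"E1234"`.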

I hope that helps.

Please have a look at the file.

Thank you,



Nice one @Matty :+1:


Here’s an alternative Power Query solution. Create this Custom function.

(myString as text) as nullable text =>
let
    MyProducts = List.Buffer( Products[product_code] ),
    StringLength = 5,
    GetProduct = List.Intersect( { List.Transform( Text.Split( myString, " " ), each Text.Start( _, StringLength ) ), MyProducts }, Comparer.OrdinalIgnoreCase ){0}?
in
    GetProduct

Then invoke it in a custom column to return the matching product code.
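(In rough Python terms, the function splits the string on spaces, trims each token to 5 characters, intersects those candidates with the known product codes case-insensitively, and takes the first hit or null. The `my_products` list is a made-up stand-in for `List.Buffer( Products[product_code] )`.)

```python
my_products = ["E1234", "E9876"]  # stand-in for Products[product_code]
STRING_LENGTH = 5

def get_product(my_string):
    # Text.Split(myString, " ") then Text.Start(_, 5) on each token.
    candidates = [token[:STRING_LENGTH] for token in my_string.split(" ")]
    # List.Intersect with Comparer.OrdinalIgnoreCase, then {0}? (first or null);
    # here the match is returned in the product list's casing.
    known = {code.lower(): code for code in my_products}
    for candidate in candidates:
        if candidate.lower() in known:
            return known[candidate.lower()]
    return None  # the M version's null
```

For example, `get_product("Widget e1234 blue")` matches case-insensitively and returns `"E1234"`.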


Here’s your sample file. eDNA Forum - Product Lookup.pbix (21.4 KB)
I hope this is helpful.


@Greg, @Matty and @Melissa,

Wow, nice work! Three completely different approaches. This one would have made a great Problem of the Week. I was working on a fourth approach based on List.FindText that I’ll try to finish up and post.

@LASHBURN - thanks for a great question!

  • Brian

This looks like the best solution for me! Thanks!

Thanks for your work on this Melissa!

I appreciate your time on this; I am going with Greg’s solution. Thanks!

I just figured out how to send the response to everyone, I think.
Thank you @Greg, @Matty, @Melissa and @BrianJ !


Hi @LASHBURN, did the responses provided by the users and experts help you solve your query? If not, how far did you get, and what kind of further help do you need?

If yes, kindly mark as solution the answer that solved your query. Thanks!

All were great solutions, thanks everybody. I chose Greg’s to use in my project.