Select first result based on minimum value in group

Hi all,

I need help creating a new table from an existing table that contains duplicate values, retrieving only the first record for each value in the BC # column based on the minimum value of the Last Source Date RFP Influenced column.

To help explain, here are some visuals: Image #1 shows the current table structure, and Image #2 shows the expected output.

Thanks in advance.

Image #1

Image #2


WIP -.pbix (18.0 MB)


Hello @ysherriff,

Thank You for posting your query onto the Forum.

The reason you're getting the same results for the remaining rows is that, for each group of "BC #", there's a unique "Campaign ID" on its respective "Latest Source Date", which is why the table shows these results. Below is the screenshot -

So, since you want the results based on the minimum date for each group of "BC #", we'll replace the "Campaign ID" field in the table with a measure — one that returns the very first "Campaign ID" for each group. Below is the measure, along with a screenshot of the final results for reference -

First Campaign ID - Harsh =
// Date-window bounds (in days) between "Latest Source Date" and "RFP Received Date"
VAR _Start_Day = 0
VAR _End_Day = 548

// Keep only rows that have a Campaign Name
VAR _Filtered =
    FILTER(
        Campaigns,
        Campaigns[Campaign Name] <> BLANK()
    )

// Keep only rows whose date difference falls within the window
VAR _Date_Range_Condition =
    FILTER(
        Campaigns,
        DATEDIFF( Campaigns[Latest Source Date], Campaigns[RFP Received Date], DAY ) >= _Start_Day &&
        DATEDIFF( Campaigns[Latest Source Date], Campaigns[RFP Received Date], DAY ) <= _End_Day
    )

// Only return a value when a single "BC #" is in scope (i.e., on detail
// rows of the visual, not on totals); otherwise return BLANK()
VAR _Results =
    IF(
        ISINSCOPE( Campaigns[BC #] ),
        CALCULATE(
            FIRSTNONBLANK( Campaigns[Campaign ID], MIN( Campaigns[Campaign ID] ) ),
            _Date_Range_Condition,
            _Filtered
        ),
        BLANK()
    )

RETURN
    _Results
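A quick note on the FIRSTNONBLANK call above: FIRSTNONBLANK scans the values of its first argument in ascending order and returns the first one for which the second expression is non-blank. Since MIN( Campaigns[Campaign ID] ) is never blank inside a non-empty group, the measure effectively returns the smallest "Campaign ID" within the filtered group. Under that assumption, a simpler sketch of the same idea would be:

```dax
-- Minimal sketch (assumption: "first" means the lowest-sorting Campaign ID);
-- this should match the FIRSTNONBLANK result on non-empty groups.
Smallest Campaign ID =
CALCULATE(
    MIN( Campaigns[Campaign ID] ),
    FILTER( Campaigns, Campaigns[Campaign Name] <> BLANK() )
)
```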

I'm also attaching the working PBIX file for reference.

Hoping you find this useful and that it meets your requirements.

Thanks and Warm Regards,
Harsh

WIP - Harsh.pbix (18.0 MB)

Works perfectly Harsh. I appreciate your analysis.

Awesome, as always. I will have to review the ISINSCOPE function.
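For anyone else reviewing ISINSCOPE later: it returns TRUE when the specified column is being used to group the current row of a visual, which is how the measure above distinguishes detail rows from totals. A minimal, hypothetical illustration (reusing the same Campaigns table from this thread):

```dax
-- Returns a label only at the "BC #" grouping level of a visual,
-- and BLANK() at the grand total or other grouping levels.
BC Level Check =
IF(
    ISINSCOPE( Campaigns[BC #] ),
    "Detail row for a single BC #",
    BLANK()
)
```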