Latest Enterprise DNA Initiatives

Power BI Challenge 12 - Transport & Shipping Data

Hi All,

I trust you are all keeping safe.

It's that time again, Challenge 12!

The bar is raised with every challenge and challenge 11 provided some brilliant entries and encapsulated what the challenges are all about. The work on display was by far some of the best Power BI development being produced across the globe.

Those of you that haven't seen them, be sure to check out the reports and write-ups.

To find out more about the challenge please see the link below.

As always, if you have any suggestions or comments, we are always happy to listen. Please feel free to reach out to me or the Enterprise DNA team.


A couple of bits I just want to cover off before diving into the brief.

We have been listening to feedback and have decided that the expert panel will no longer be eligible to win.

However we believe there is massive value in the experts taking part in the challenges and the knowledge share and learning opportunities presented through their work. So going forward the experts will be in a whole category of their own.

There are a number of you who are interested in participating in the challenges but are a little unsure of how to go about preparing an entry.

Well, worry not: expert @JarrettM has created a course taking you through his process of creating a data challenge entry, step by step.

I highly encourage you all to set aside some time to go through the course. It's really powerful material that provides a basis for creating a report end to end in Power BI, not just for challenges but for any report development.

Awesome work @JarrettM :blush:.

There are also some super events lined up over the coming days, with several experts going through data challenges and more, so be sure to tune in.

Winners, prizes and categories

We're having a little shake-up this challenge, as we have managed to secure some extra prizes.

The winners in each of the categories will receive the customary Enterprise DNA training membership, but we have also managed to secure some cash prizes!!

This week's challenge has been kindly sponsored by a client who wishes to remain anonymous; below is a breakdown of the prizes on offer.

Winning Enterprise DNA member - $500 (or Amazon voucher equivalent) + Free Membership

Most innovative entry - $250 (or Amazon voucher equivalent) + Free Membership (Open to both members and non-members, and the overall winner can also claim the most innovative title)

Winning non-Enterprise DNA member - $250 (or Amazon voucher equivalent) + Free Membership (Open to everyone)

First-time participant winner - Pick from a selection of prizes as mentioned here. You must be an Enterprise DNA member to qualify.

Not only is there an opportunity to win some great prizes while learning, but you can also help shape the vision of how the client embeds Power BI.

Your work will directly impact and make a difference to the implementation of Power BI in an organisation.

The past 11 challenges have provided some of the best Power BI developments out there and the library we are developing through the challenges is second to none. So many options and inspirations to take into our own work.

I have seen several reports popping up across the web in recent weeks and for a number of them it has been evident where the inspiration is coming from.

So great work, all. We are making some serious waves and building one of the best libraries of Power BI work across the globe.

The Brief

As I mentioned earlier, this week's data set is courtesy of a client who is looking to use Power BI for reporting and analytics purposes.

They have taken an extract from the source system and dumped it into Excel.

Under normal circumstances the team would then work with this Excel file and produce a number of reports through a series of pivots, templates, and a whole lot of manual processing.

They are now looking to use Power BI and automate this.

They haven't asked for any specific requirement, but rather want to see what the art of the possible is in Power BI.

So there really is no right or wrong answer here and the ball is in your court to produce a compelling report.

As you will see in the Excel file, the number of columns is staggering and the data is far from optimal for Power BI reporting.

There is a data dictionary attached that covers some of the most important columns and what they mean.

There isn’t a requirement to model everything just what you feel is necessary.

The flat file extract to Excel is common when working with source systems where a direct integration or middleware (data warehouse) isn't available. Many companies prefer to keep reporting separate from operational systems to protect the operational system from any performance hindrance.

As with any good solution we at Enterprise DNA can’t emphasise the 4 pillars of development enough. So please bear this in mind as it will help structure your thought process.

So I now hand over the baton to you…

If you have any questions around the data or understanding it, please do feel free to reach out.


We really want to encourage you all to get involved in some way, shape, or form, even if it's just sharing this with someone who might be interested.

SUBMISSION DUE DATE - Sunday, 28th March 2021 (PST)

Please can you all submit your PBIX files to

Best of luck!

Any issues or questions please reach out.

Thanks :slight_smile:


Enterprise DNA

Challenge 12 Data.xlsx (177.2 KB)


Hi Haroon,
I am really excited for this challenge as I am sure it will give a real boost to my knowledge. I appreciate all the hard work by the Enterprise DNA team.
Now coming towards my problem:
I am a bit stuck on the container size part of the data. I tried the method @JarrettM explained in his Challenge 6 video on adding columns, but I am sure I am missing something. So my question is: can I merge or add all these container size columns, or what else can be done?
I know it's a challenge, but I need help :sweat_smile:
I am attaching screenshot as well.
Kind Regards



I imported the data from the Excel file into a SQL Server staging table so I could query the data easily with T-SQL, understand some insights, and reveal some hidden patterns.
I always believe that data discovery is very important before proceeding to modeling.

1 Like

I can understand the decision about the EDNA Experts not being eligible to win but I definitely want them to take part. Digging in and pulling apart their submissions to work out how they’ve created their respective works of art is golden!



Thanks! And not to worry - I’m literally working on the JMAP entry as we speak. :grinning:

  • Brian

Good to hear, I’m determined to make this my 1st entry…although I’ve said that before :rofl: :persevere:



Go for it! I guarantee you’ll be glad you did.

There are definitely some challenging aspects to the data set in terms of data quality, but the free-form nature of this particular challenge makes it a good one to dive into – no constraints, just "tell us something interesting about the data". That means you can make this one as simple or complex as you want.

Look forward to seeing your entry. :grinning:

  • Brian

Hi @Shumaila,

I hope you are well.

No issues at all and as we have said the challenges are all about learning and asking questions is encouraged.

So as you mentioned, there are probably a few ways you could handle this, but I always like to go by the principle of making the fact table as thin as possible.

So I would consider unpivoting these columns so that I have an attribute column for container size and one for the value.

I could then create a dimension of container sizes. This would then allow me to see which shipments used which container sizes, and even lead me towards possibly calculating utilization.

There could be a requirement that for each shipping job we want to see the containers used. Setting up the data with the unpivot would make this analysis super easy.

This is just my 2 pence, and I'm sure there are other methods, but if you follow the principle of keeping your fact tables as narrow as possible it should put you in a good place.
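To make the unpivot idea concrete, here is a minimal sketch in plain Python (the same reshaping you would do with "Unpivot Columns" in Power Query). The column names `Size20ft`/`Size40ft` and the sample rows are made up for illustration, not taken from the actual dataset:

```python
# Hypothetical wide records: one row per shipment, one column per container size.
wide_rows = [
    {"ShipmentID": "S001", "Size20ft": 2, "Size40ft": 1},
    {"ShipmentID": "S002", "Size20ft": 0, "Size40ft": 3},
]

SIZE_COLUMNS = ["Size20ft", "Size40ft"]

def unpivot(rows, id_col, value_cols):
    """Melt the size columns into (attribute, value) pairs, keeping the ID."""
    long_rows = []
    for row in rows:
        for col in value_cols:
            long_rows.append({
                id_col: row[id_col],
                "ContainerSize": col,   # attribute column
                "Quantity": row[col],   # value column
            })
    return long_rows

fact = unpivot(wide_rows, "ShipmentID", SIZE_COLUMNS)

# A container-size dimension then falls out of the distinct attribute values:
dim_container_size = sorted({r["ContainerSize"] for r in fact})
```

The fact table ends up narrow (ID, container size, quantity), and the dimension table is just the distinct container sizes, which is exactly the thin-fact/dimension split described above.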

Hope this helps.


P.S. Be sure to check out Enterprise DNA expert @Greg's video on best practices. Some powerful tips and techniques covered on data modelling. Awesome work @Greg :slight_smile:


We look forward to it @DavieJoe :slight_smile:

1 Like

Oh…the pressure haha :joy: :rofl:


Thank you so much @haroonali1000, the issue is resolved.

Good approach

Hey guys! I worked on cleaning the data using SSIS. I will share the GitHub link with you in a few days.

Find attached the Excel file.


DataCleaned.xlsx (76.6 KB)

@Shumaila @BrianJ Waiting for your feedback :wink:

1 Like

Thanks! This data requires a LOT of cleaning, so it’s extremely generous of you to share what you’ve done in the midst of the ongoing challenge. It would be helpful though to know what you cleaned, how you did it, and what assumptions may have been necessary in the midst of this cleaning. (That may be the github link you refer to in your posting).

Thanks again.

  • Brian

Isn't the data cleansing an essential part of the Power BI challenge?
Just like in real life. :slight_smile:

1 Like

Depending on the context, Power BI cannot always be considered an efficient data cleansing tool, because it simply does not have the same capabilities as other tools on the market.
I loaded the Excel file into a SQL Server staging table to be able to write queries and distinguish the fact and dimension tables.
As I stated in an earlier comment, in my humble opinion it is easier to use T-SQL queries (rather than DAX or Power Query) to understand some insights and reveal some hidden patterns.
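The staging-table workflow described here can be sketched in a few lines. This example uses Python's built-in `sqlite3` purely as a stand-in for SQL Server, and the table name, columns, and rows are hypothetical, just to show the load-then-query pattern:

```python
import sqlite3

# Rows standing in for the Excel extract; a real pipeline would load via SSIS/ETL.
rows = [
    ("S001", "20ft", 2),
    ("S001", "40ft", 1),
    ("S002", "40ft", 3),
]

# Load the raw extract into a staging table, untouched.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE staging_shipments (shipment_id TEXT, container_size TEXT, qty INTEGER)"
)
conn.executemany("INSERT INTO staging_shipments VALUES (?, ?, ?)", rows)

# Discovery queries run against the staging table, e.g. quantity per container size:
per_size = conn.execute(
    "SELECT container_size, SUM(qty) FROM staging_shipments"
    " GROUP BY container_size ORDER BY container_size"
).fetchall()
```

The point of the pattern is that exploratory aggregations and profiling happen in SQL against the staging table, and only the cleaned, understood shape of the data moves on to the Power BI model.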

1 Like

I'm kinda confused; I thought that everything needed to be done with Power BI, not other programs.

@Keith, as I found the data quality of the dataset was not as expected, I loaded it into a SQL Server table to be able to query it, since T-SQL is easier than DAX queries for understanding insights and revealing hidden patterns.
If you feel comfortable with DAX/Power Query go ahead and don’t forget to share your work!
Always remember sharing is caring :wink:

1 Like

Hi All. Having spent a large part of my consulting career in the last decade or so in database/data warehouse/ETL environments, I can certainly appreciate the enhanced capabilities of mature data cleansing tools. That being said, and while the eDNA challenges are directed at extending your own Power BI learning, IMHO it's in the best interests of the community to explore (and discuss) the strengths and weaknesses of the Power BI product.