Working with databases

Hi All,

I am working on large tables with millions of rows. I have tried to use DirectQuery, but it seems to have a lot of constraints. I have now imported some data based on a filter and built my data model. After publishing it to the service, how do I remove the filter I used so that the model gets all the data from the database? Could I please get some help with this?

Thanks
Swapnil

You would have to remove the filter within your Power BI Desktop file.

You will find that the constraints won’t go away though, so I’m not sure this will achieve what you’re looking for.

Hi,

As soon as I remove the filter, the data model tries to load the full data from the database again.

Is there any other way to connect to databases without loading Power BI Desktop with such heavy data?

Thanks
Swapnil

That’s just how it works. I’m not sure what you are really attempting to achieve with this technique. If you choose import mode (which gives you the most flexibility), then everything needs to be loaded into your model if you want to see it in the online service…that’s just the reality.

Hi,

The problem is the amount of data we have. There are more than 50 million rows in the fact table, and yes, it needs to be a detail table, not aggregated. When I change the connection, it starts loading all the rows in the database into my Power BI Desktop, which then crashes.

It would be great if you could suggest an alternative where I don’t download the data into Power BI Desktop during development and can still see the reports.

Thanks
Swapnil

The reality is that Power BI isn’t built to import 50 million rows, unfortunately.

In my experience you really get capped out at around 4–10 million rows (but this totally depends on the data and its structure).

If you truly need the detail of that many rows, then I think your only option is to use DirectQuery. Obviously, this comes with many limitations.

Thanks
Sam

@swapnil

Just some of my random thoughts that may or may not help…

First, how many columns are you importing? 50 million rows, while a lot of data, isn’t unheard of. It’s when you have far too many columns that things start getting wonky.

Second, are you using a relational database like SQL Server? If so, when you load the data you can write a custom SQL statement to pull in only the data you want; it’s in the Advanced options of the connector. If you don’t know how to write SQL, you can import as you have been, apply your filters, and then click “View Native Query” under Applied Steps. If query folding hasn’t been broken, you will see the custom SQL statement, and you can then use that to import only the data you need.
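To illustrate, here is a minimal sketch of the kind of statement you could paste into the “SQL statement” box under Advanced options (or copy out of View Native Query). The table and column names are made up for the example; substitute your own.

```sql
-- Hypothetical example: dbo.FactRegistrations and its columns are
-- assumptions, not your actual schema. A statement like this, entered
-- in the connector's Advanced options, means only the filtered rows
-- ever leave the database.
SELECT
    EventID,
    AttendeeID,
    RegistrationDate,
    AttendanceFlag
FROM dbo.FactRegistrations
WHERE RegistrationDate >= '2019-01-01';  -- same filter you applied in Power Query
```

Because the filtering happens on the server, Power BI Desktop only ever sees the reduced result set, which avoids the crash on load.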


Hi,

I appreciate your comments and suggestions in this regard. Yes, I agree it is not ideal to have 50 million rows in a table and want Power BI to read it. The other problem is that the reporting need is for a marketing application, where they are trying to understand the registrations and attendance of people. So the facts are not typical facts; we have factless fact tables, and the measures we build are mostly counts and cumulative counts.
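For a factless fact table like that, the count and cumulative-count measures can be kept very simple. A sketch in DAX, assuming a hypothetical fact table `FactRegistrations` and a date table `'Date'` (swap in your own names):

```dax
-- Hypothetical names: FactRegistrations and 'Date'[Date] are assumptions.
-- Basic count over the factless fact table:
Registrations = COUNTROWS ( FactRegistrations )

-- Standard cumulative-total pattern, running the count up to the
-- latest date in the current filter context:
Cumulative Registrations =
CALCULATE (
    [Registrations],
    FILTER (
        ALL ( 'Date'[Date] ),
        'Date'[Date] <= MAX ( 'Date'[Date] )
    )
)
```

Since both measures are computed by the engine at query time, they add nothing to the model size; the row count itself remains the import bottleneck.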

We are trying different ways to get rid of the slowness and crashing issues. I will keep you all updated on the outcome. Thanks again for the help.

Regards
Swapnil

Thanks to the contributors of this post. Due to the length of time this post has been active, we are tagging it as Solved. For further questions related to this post, please start a new thread.

Is there a real solution to this now? Thanks.