I would have suggested calculating the duration in Power Query, but I notice your data connection is DirectQuery from SQL Server. Have you considered calculating the duration in seconds, and the duration in hours, on the SQL Server side?
The issue might be with the DATETIME datatype. You might want to use a Transaction Duration (sec) column in your visual instead; a true Duration datatype would have been ideal.
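To illustrate the suggestion, here is a minimal sketch of the SQL-side calculation. The table and column names (`dbo.Transactions`, `StartTime`, `EndTime`) are assumptions; substitute your own. The key point is that `DATEDIFF` returns an integer, so durations beyond 24 hours are preserved, unlike a DATETIME-typed duration column:

```sql
-- Sketch only: dbo.Transactions, StartTime, and EndTime are assumed names.
-- DATEDIFF(SECOND, ...) returns an integer count of seconds, so a 6-day
-- duration is not truncated the way a DATETIME-typed duration would be.
SELECT
    TransactionID,
    DATEDIFF(SECOND, StartTime, EndTime)          AS [Transaction Duration (sec)],
    DATEDIFF(SECOND, StartTime, EndTime) / 3600.0 AS [Transaction Duration (hrs)]
FROM dbo.Transactions;
```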
Funny you should mention, I actually have a duration column available from the SQL table in a DateTime format, but you can see here that it’s always the same day as the transaction date:
Seems like I should be able to write a query that takes the DATE from the Transaction Date column and uses it as the BeginningOfTime variable in the “Transaction Duration (sec var)_v2” measure from my previous post:
Seems like I will need to use the Start and End columns and perform a calculation for each row, which is fine, but I'm wondering if a more optimized version is to extract the seconds from the existing Transaction Duration (sec) column that is currently in a DateTime format:
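The extraction idea can be sketched in DAX as a calculated column. This is an assumption-laden sketch: the table name `'Transactions'` and the exact column name are placeholders, and it relies on DAX storing datetimes as day counts, so the time-of-day fraction times 86,400 gives seconds:

```dax
// Sketch of a calculated column; 'Transactions'[Transaction Duration (sec)]
// is the DATETIME-typed duration column described above (names assumed).
// A DAX datetime is a number of days, so the fractional part is the
// time-of-day portion; multiplying by 86400 converts it to seconds.
Duration Seconds =
VAR DurAsDateTime = 'Transactions'[Transaction Duration (sec)]
RETURN
    ROUND ( ( DurAsDateTime - TRUNC ( DurAsDateTime ) ) * 86400, 0 )
```

Note the caveat: this recovers only the time-of-day portion, so any whole days beyond 24 hours are lost, which is exactly the limitation discussed later in this thread.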
I wonder if it's possible to allocate more cores or processing power to Power BI? All my previous measures, which were only returning values for 3-5 months at most before the DATEDIFF error appeared, are now returning ~1.5 years before the error:
No, that is a core aspect of Power BI. A quote from the book “The Definitive Guide to DAX” explains it as:
“The formula engine is the absolute core of the DAX execution. Indeed, the formula engine alone is able to understand the DAX language, though it understands MDX as well. The formula engine converts a DAX or MDX query into a query plan describing a list of physical steps to execute. The storage engine part of Tabular is not aware that its queries originated from a model supporting DAX.

Each step in the query plan corresponds to a specific operation executed by the formula engine. Typical operators of the formula engine include joins between tables, filtering with complex conditions, aggregations, and lookups. These operators typically require data from columns in the data model. In these cases, the formula engine sends a request to the storage engine, which answers by returning a datacache. A datacache is a temporary storage area created by the storage engine and read by the formula engine.

Note: Datacaches are not compressed; datacaches are plain in-memory tables stored in an uncompressed format, regardless of the storage engine they come from.

The formula engine always works with datacaches returned by the storage engine or with data structures computed by other formula engine operators. The result of a formula engine operation is not persisted in memory across different executions, even within the same session. On the other hand, datacaches are kept in memory and can be reused in following queries. The formula engine does not have a cache system to reuse results between different queries. DAX relies entirely on the cache features of the storage engine.

Finally, the formula engine is single-threaded. This means that any operation executed in the formula engine uses just one thread and one core, no matter how many cores are available. The formula engine sends requests to the storage engine sequentially, one query at a time. A certain degree of parallelism is available only within each request to the storage engine, which has a different architecture and can take advantage of multiple cores available.”
How are you importing your data? Through DirectQuery? If you've written the SQL query yourself, I'd try to leverage as many pre-calculations as possible on the SQL side. If you're using DirectQuery, @pranamg gave excellent options. The data type being used is absolutely a bottleneck here.
I'm importing using DirectQuery. There is a calculated column on the SQL side that lists the Transaction Duration in seconds, but it's in a DateTime format. The only problem is that there seems to be no way to know when the duration is longer than 24 hrs.
@pranamg, regarding efficiency, how would adding a custom column in Power Query (M) that computes Duration.TotalSeconds([EndTime] - [StartTime]) impact the model?
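For reference, the custom column in question could be sketched in M roughly as follows. `Source`, `StartTime`, and `EndTime` are assumed step and column names; subtracting two datetimes in M yields a duration value, which `Duration.TotalSeconds` converts to a plain number:

```m
// Sketch of a custom-column step in the Power Query editor.
// Source, StartTime, and EndTime are assumed names.
#"Added Duration" = Table.AddColumn(
    Source,
    "Transaction Duration (sec)",
    each Duration.TotalSeconds([EndTime] - [StartTime]),
    type number
)
```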
In DirectQuery mode, you don't need to worry about performance, as most of the heavy lifting happens at the SQL end. Also, since the calculation happens at a per-transaction level, it is highly unlikely that a duration will exceed 24 hours. Even if a duration does exceed 24 hours, it will show up as the next day plus whatever time is left over beyond 24 hours.
In Import mode, the effect on performance should be minimal if you create a custom column in Power Query, because the values are numeric and pre-calculated. Unlike DAX, where calculations are processed at run time, these values are calculated ahead of time, compressed, and stored in the VertiPaq engine; at run time they are simply aggregated according to the visual and the filters applied.
The SQL duration calculation does not roll the date forward when the duration is longer than 24 hrs. As you can see here, this transaction's duration is actually 6 days, 23 hours, 56 minutes, and 10 seconds (one of my longest transactions):
Can you check why you are getting negative numbers for September and October? Ideally, a duration should be zero or a positive number. Maybe, for those transactions, there was no End Time, hence the output turned negative. If this is indeed the issue, you have three choices:
Ignore any transaction with a blank End Time
Use today's date or the year end as the End Time and calculate the duration
Wrap the expression in ABS to mitigate the negative numbers
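The second choice, substituting an End Time, can be sketched on the SQL side. Column and table names are assumptions; `COALESCE` falls back to the current date and time when `EndTime` is NULL, so the duration stays non-negative:

```sql
-- Sketch: substitute the current datetime for a missing End Time.
-- dbo.Transactions, StartTime, and EndTime are assumed names.
SELECT
    TransactionID,
    DATEDIFF(SECOND, StartTime, COALESCE(EndTime, GETDATE())) AS DurationSec
FROM dbo.Transactions;
```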
After reviewing these negative numbers in the SQL database, I confirmed that there was one day in each month with 4 transactions whose End Time was in the year 1899. This is very strange, and I've sent the issue to my DBA. For now, I am going to exclude those transactions by TransactionID in the WHERE clause of the table's folded (query-folding) query.
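The specific TransactionID values aren't listed here, but since the bad rows all carry 1899 End Times, an alternative foldable predicate is to filter on the sentinel date itself. This is a sketch with assumed names:

```sql
-- Sketch: exclude rows carrying the 1899 sentinel End Time.
-- dbo.Transactions and EndTime are assumed names; a simple comparison
-- like this keeps the predicate foldable back to SQL Server.
SELECT *
FROM dbo.Transactions
WHERE EndTime >= '1900-01-01';
```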
Everything seems to be working as expected now, so I'm going to consider this thread solved. Thank you @Zherkezhi and @pranamg for coming along with me on this 3-day journey. I could not have come close to resolving this as quickly as we did without your insights. THANK YOU!