Power Query - cache shared nodes
Update Power Query in Excel to take advantage of caching in cases where a parent node refers to a child node that has already been refreshed (as exists in Power BI desktop today).
This issue creates significant performance problems with refresh times when creating highly interdependent financial and operational models. This is a show stopper from a usability and customer acceptance standpoint.
We are happy to announce that the feature is finally available on Production starting from July fork (build 16.0.10726.*) for Office 365 subscribers.
Thank you all for your votes and the feedback.
- Excel Team
Carlos Cortinas commented
Hi Darrell Ripkowski, can you please tell me where I can find that Store in data model checkbox?
Daniel Schmidt commented
Nor have I observed any significant improvement; Office 2019 64-bit also appears to refresh referenced queries repeatedly.
Darrell Ripkowski commented
Well, I was wondering myself why Power Query was slow, so I tried a couple of things.
I finally found that the main cause of slowness was checking the Store in data model box while creating the table.
If I just loaded to a table and unchecked the load to the data model, the refresh was drastically faster: from around 5 minutes or more (making Excel almost unusable) to less than a minute to load the new data.
The data I am refreshing comes from 12 different files on a SharePoint Online server, so it has a lot to gather.
Wally Wilinsky commented
Just so the Microsoft team is aware... I ran an Excel Power Query test, and Excel is connecting to the database and retrieving far more data than is needed.
Excel 2016 64 bit
Workbook with multiple power queries to a data model
All data sources are from the same SQL database
Wireshark installed to monitor network traffic between my PC and the SQL database.
For every query refreshed (either manually or through Refresh All), Power Query did the following:
1. Queried the database and returned a result set of every table and view I had access to
2. Queried the database and returned a result set of every stored procedure and user defined function I had access to
3. Queried the database and returned the calling parameter for the user defined functions I was calling
4. Queried the database and returned the table structure of the result set my user defined function was going to return.
5. Queried the database and returned the primary key index names of a seemingly random list of database tables.
6. Queried the database and returned the version of SQL
7. And FINALLY executed the query it was designed to execute.
My question is: WHY SO MUCH OVERHEAD? I already designed the queries. I understand gathering some of this information when I open the query builder, but this was a right-click refresh on the query list in Excel. Is this the root of all the Power Query slowness in Excel?
If the query is already designed, why can't it just execute? If I don't have permission, give me that error.
Microsoft, PLEASE HELP! This is killing my user base.
Sam - it could be your data model design.
There is no significant improvement in speeds after the update
(Ver 1809 Build 10820.20006)
@Adam - anyone with 1801. The only people that won't ultimately get it are those that purchased a perpetual Office 2016 license.
It will be in Office 2019 this fall too.
In an earlier response you indicated that this was available for users with version 1801 and beyond.
Does that only apply to Office Insiders, or anyone with that version? I am on the semi-annual channel.
Chad kukorola commented
@Neil Good, if you can use the incremental refresh capabilities that were recently released, then using Azure would let you set up a much better process for bigger models.
Check out https://docs.microsoft.com/en-us/power-bi/service-premium-incremental-refresh or related articles. I believe it’s still in preview and only for Power BI Premium.
I don’t/can’t use it currently, so am left with workarounds for my larger models.
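For readers unfamiliar with the idea: incremental refresh keeps historical partitions as-is and re-queries only recent ones. A minimal Python sketch of the concept (the date-based partitioning scheme and the `window_days` parameter are illustrative assumptions, not Power BI's actual implementation):

```python
import datetime

def incremental_refresh(partitions, fetch, today, window_days=30):
    """Re-fetch only date partitions inside the refresh window;
    reuse the stored rows for everything older."""
    cutoff = today - datetime.timedelta(days=window_days)
    refreshed = {}
    for day, rows in partitions.items():
        if day >= cutoff:
            refreshed[day] = fetch(day)   # hits the data source
        else:
            refreshed[day] = rows         # cached historical data
    return refreshed
```

With a 30-day window, a multi-year fact table only pays the query cost for its most recent partitions on each refresh, instead of re-pulling everything.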
Neil Good commented
Is there any way working with Azure can alleviate this issue? Lots of people seem to have CSV files which need to be imported/supplemented so would going via Azure sort this out?
@Ed - thanks for the response. I think PQ is an amazing tool. We have started heavily integrating it into our finance and accounting operations and would not be able to do some of the things we do without it.
I only wish MSFT had built PQ into Excel a decade ago. Keep improving it - this is where things are going!
Chad kukorola commented
I work almost exclusively with .csv files, pulling them in from File and appending.
The two algorithms, run-length encoding and dictionary encoding, consume the time on refresh. Each time the model is refreshed, they must run again. For example, I have a model with a 3-million-record fact table and a few dimensions (one using the same source as the fact table). Refresh time is about 12-13 minutes.
To be fair, the fact table has 46 fields (all needed), and the processor is very old. On my newer machines with a newer processor, it’s a little more than half that time.
The compression and mapping algorithms must run regardless (in Excel it auto-partitions at 1 million records); beyond that, it’s the hardware that matters: a fast processor, a large processor cache, and RAM as fast as you can buy.
Until we can properly partition when needed, those are your bottlenecks on refresh. Bottom line, though: WOW, what a great capability regardless!
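For anyone curious what those two algorithms actually do, here is a minimal Python sketch of dictionary encoding followed by run-length encoding on a single column. This is a conceptual illustration only, not the engine's real implementation:

```python
def dictionary_encode(column):
    """Replace each value with a small integer ID into a value dictionary."""
    dictionary = {}
    ids = []
    for value in column:
        if value not in dictionary:
            dictionary[value] = len(dictionary)
        ids.append(dictionary[value])
    return dictionary, ids

def run_length_encode(ids):
    """Collapse consecutive repeated IDs into (id, count) pairs."""
    runs = []
    for i in ids:
        if runs and runs[-1][0] == i:
            runs[-1] = (i, runs[-1][1] + 1)
        else:
            runs.append((i, 1))
    return runs
```

On low-cardinality columns with long runs of repeated values, the (id, count) pairs are far smaller than the raw column, which is why the engine rebuilds them on every refresh and why that rebuild dominates refresh time on big fact tables.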
@nickolas, it depends heavily on your data source. If it is a bunch of CSV files, then those are pulled in. If they are on SharePoint/OneDrive, that can take even longer. If it is SQL Server, then it depends on how efficient your queries are and whether they take advantage of query folding. This is one of those "it depends" cases, based on the source (of which Power Query supports several dozen) and how efficiently you are using it.
Nicolas Bransier commented
Thanks @Ed. My models fall in the latter category :(
It's disappointing and frustrating as I was hoping this change would improve the overall performance of the refresh on any models. Looks like it's not the case.
Does anyone know what is happening during the "Retrieving Data" phase, which takes a very long time in my case? It doesn't seem to be retrieving data, since the actual download starts after this phase. So what is it retrieving, and why does it take so long?
@Nicolas - not all models will see an improvement. If your model calls the same data repeatedly through queries that are referenced by other queries, this enhancement will cache the data from the first query and let the 2nd, 3rd, etc. use it without pulling the data again.
If your models don't do much/any of that, then this wouldn't show much of an improvement if any.
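Conceptually, the enhancement is memoization of the shared child query: evaluate it once per refresh and let every referencing query reuse the result. A hedged Python sketch of the difference (the query names and evaluation counter are made up for illustration; this is not Excel's internals):

```python
from functools import lru_cache

evaluations = {"source": 0}

def source_query():
    """Stand-in for the expensive child query that hits the data source."""
    evaluations["source"] += 1
    return [1, 2, 3]  # imagine rows pulled from CSV/SQL

@lru_cache(maxsize=None)
def cached_source_query():
    """With caching, the child runs once; its result is shared."""
    return tuple(source_query())

def parent_a():
    return sum(cached_source_query())

def parent_b():
    return len(cached_source_query())

# Refreshing both parents evaluates the child only once.
parent_a()
parent_b()
```

Without the cache, each parent would trigger its own full evaluation of the child, which is exactly the repeated-refresh behavior people reported before this update.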
How can Microsoft treat users of perpetual licenses this way?! You sold the product with a defect that makes it unusable, and you do not want to fix it for free. Are you waiting for a riot?
Nicolas Bransier commented
Hi Excel Team,
I have version 1803 and unfortunately do not see a change in refresh performance with my models. It seems the "Retrieving data" phase is taking the longest. Is there a way to troubleshoot what is happening during this phase? Or does the performance improvement only apply to new models?
@adam - it is in build 1801 of Excel or later for Office 365 users. Insiders, Monthly Targeted, and Monthly channel users should have it already. The Deferred channel may already have it, or be getting it soon, depending on how the IT department rolls it out.
If you have an Office 2016 perpetual license, you'll either have to purchase Office 365 or purchase an Office 2019 perpetual license when it comes out. AFAIK, it won't be pushed out to 2016 perpetual licenses.
When will we know that this is being pushed out to the general Excel audience?
I can't wait for this new improvement.