How To Maximize Power Query Performance?

  • Question

  • Hi guys,

    I am running Excel 2016 on a virtual machine. The machine is pretty powerful:

    It has an Intel Xeon E5-2697 @ 2.60 GHz and 32 GB of RAM.

    I have been updating a crazy data model with PQ on a weekly basis for months, but this week the query is failing with a "not enough memory" error.

    When I look at Task Manager while the query runs, it only uses about 2 GB of RAM, so it seems it is not using the full potential of the machine.

    I just enabled Fast Combine and increased the cache size to 40,000 MB, but I am not sure that even helps. I used to have a 20,000 MB cache with Fast Combine disabled.

    Any tips on how to make the query run again?

    Obviously I cannot share the workbook, but in the end it is roughly 900,000 rows pulled from multiple data cubes, CSVs, etc. The source data cubes have millions of rows. I run an inner join against those to "prefilter" only the necessary data.
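
    In M, the prefilter pattern looks roughly like this (FactRows, FilterKeys and the "Key" column are placeholder names, not the real queries):

    -----------

    // Keep only the rows of a large table whose key appears in a small filter table.
    let
        Joined = Table.NestedJoin(
            FactRows, {"Key"},        // large source table
            FilterKeys, {"Key"},      // small filter table
            "Matches",
            JoinKind.Inner            // rows without a match are dropped
        ),
        // the nested "Matches" column is only a by-product of the join
        Result = Table.RemoveColumns(Joined, {"Matches"})
    in
        Result

    -----------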

    DZ

    Monday, March 27, 2017 11:51 AM

Answers

  • Solution!

    For those who may encounter the same problem in the future:

    This message:

    The operation has been cancelled because there is not enough memory available for the application. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.

    may relate not only to your PC but ALSO to the server you are pulling the data from!

    So my query was failing because of the server, not because of my machine. After we restarted the server, the query ran like a breeze :)

    The only hint I have: if you run the query on your PC, the RAM is not fully used, and you still get this message, the problem could be the server itself.

    DZ

    • Marked as answer by Daniel Zrust Wednesday, March 29, 2017 8:55 AM
    Wednesday, March 29, 2017 8:55 AM

All replies

  • This looks like the 32-bit limitation. If you're not running the 64-bit version, you will not be able to use more than 2 GB of RAM.


    Imke Feldmann TheBIccountant.com

    Tuesday, March 28, 2017 6:02 AM
  • Nah, I was troubleshooting with our IT and it does use more than 2 GB in the end. I do have the 64-bit version, 100% positive about that.

    The reporting of memory usage is slightly different in virtual machines as I learned.

    DZ

    Tuesday, March 28, 2017 7:00 AM
  • Sometimes disabling "Background Refresh" in the connection properties helps.


    Imke Feldmann
    MVP Data Platform
    TheBIccountant.com

    Tuesday, March 28, 2017 7:30 AM
  • Mmm, I already do this in all files at the very beginning :(
    Tuesday, March 28, 2017 7:31 AM
  • Mmm, could be the case. My "filter table" is 500 rows. I am trying the first option (a function)...

    I just prepared my filter function... It is called testFilter.

    Then I try to plug it back into the "main query". Worth mentioning: I am pulling from "AnalysisServices.Databases".

    -----------

    let
        Source = AnalysisServices.Databases("XXX.XXX.XXX.XXX\XXXr", [TypedMeasureColumns = true]),
        CookieCampaigns = Source{[Name="YYYYYYYYY"]}[Data],
        // must chain from the previous step, CookieCampaigns
        Model1 = CookieCampaigns{[Id="Model"]}[Data],
        Model2 = testFilter(Model1{[Id="Model"]}[Data])
    in
        Model2

    -----------

    Is the "function insertion" correct? 

    The function looks like:

    let
        // "group" is the intended filter-table argument, but the body below never uses it
        testFilter = (group) =>
        let
            Source = AnalysisServices.Databases("XXXXXXXXXX\XXXXr", [TypedMeasureColumns = true]),
            CookieCampaigns = Source{[Name="YYYYYYY"]}[Data],
            // chain from the previous step, CookieCampaigns
            Model1 = CookieCampaigns{[Id="Model"]}[Data],
            Model2 = Model1{[Id="Model"]}[Data],
            #"Added Items" = Cube.Transform(Model2, {
                {Cube.AddAndExpandDimensionColumn, "[Cookie]", {"[Cookie].[Cookie].[Cookie]"}, {"Cookie.Cookie"}},
                {Cube.AddAndExpandDimensionColumn, "[Country]", {"[Country].[CountryISOCode].[CountryISOCode]"}, {"Country.CountryISOCode"}},
                {Cube.AddAndExpandDimensionColumn, "[Time]", {"[Time].[date].[date]"}, {"Time.date"}},
                {Cube.AddMeasureColumn, "Total Income", "[Measures].[Total Income]"},
                {Cube.AddMeasureColumn, "Installs", "[Measures].[Installs]"}}),
            // #"PPI Cookies" is a separate query in the workbook holding the filter keys
            #"XX - PPI Cookies Only" = Table.Buffer(
                Table.NestedJoin(#"Added Items", {"Cookie.Cookie"}, #"PPI Cookies", {"Cookie"}, "NewColumn", JoinKind.Inner))
        in
            #"XX - PPI Cookies Only"
    in
        testFilter
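
    Side note (an untested sketch, not a verified fix): as written, the function never uses its "group" parameter; the join always runs against the hard-coded #"PPI Cookies" query. If the intent is to pass the filter table in at call time, the join could reference the parameter instead, assuming the cube columns have already been expanded:

    -----------

    // Untested sketch: "model" is the table with expanded cube columns,
    // "group" supplies the filter keys instead of the hard-coded #"PPI Cookies".
    (model as table, group as table) =>
        Table.Buffer(
            Table.NestedJoin(
                model, {"Cookie.Cookie"},
                group, {"Cookie"},
                "NewColumn",
                JoinKind.Inner
            )
        )

    -----------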



    Tuesday, March 28, 2017 8:39 AM
  • Yes, folding for SSAS is different. You should check with Profiler.

    Here's how I tackle folding for SSAS: http://www.thebiccountant.com/2015/09/19/perfect-analysis-services-ssas-reports-in-excel-using-power-query/


    Imke Feldmann
    MVP Data Platform
    TheBIccountant.com

    Tuesday, March 28, 2017 8:49 AM
  • Hmmm... This is getting beyond my knowledge.

    I just discovered that my query is not even getting past an inner join where the "filter table" has only 7 values. :(

    Now I am starting to think that the server itself may have a memory issue. The exact message is:

    The operation has been cancelled because there is not enough memory available for the application. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.

    DZ




    Tuesday, March 28, 2017 9:03 AM
  • Thx for this feedback Daniel, very valuable insight!

    Imke Feldmann
    MVP Data Platform
    TheBIccountant.com

    Wednesday, March 29, 2017 9:10 AM