cube not refreshed in dashboard designer

  • Question

  • I updated data content in SQL Server Management Studio (the data structure was not changed), and the corresponding data in the cube refreshed immediately. But in PPS Dashboard Designer, the data does not refresh. I tried deploying the cube again (it reports that no changes were detected), clicking 'Refresh' in Dashboard Designer, and deleting and redefining the data source, but the analytic chart still uses the old data.  I had this problem before, and rebuilding the cube solved it, but I think there should be a way to avoid rebuilding the cube. Could anyone help?  Thanks a lot!

    Haley
    Monday, February 22, 2010 9:56 PM

All replies

  • On the Data Source there is a Cache Setting where the Interval time is set.  By default the value is 10 minutes.  You would want to change this to a value like 0 (zero) if you want your reports to reflect changes in the underlying data immediately.

    The other option that you have would be to simply republish the Data Source in Dashboard Designer and then you would see the changes in your reports.  You definitely do not need to rebuild the cube.
    Dan English's BI Blog
    • Proposed as answer by Merin Nakarmi Tuesday, March 4, 2014 8:10 PM
    Tuesday, February 23, 2010 11:39 AM
  • Hi Dan, I tried both and neither works. Are there any other settings that affect the refresh of the cube in Dashboard Designer?  Thank you!
    Tuesday, February 23, 2010 2:34 PM
  • Once you have refreshed the cube and refreshed the data source, you would then need to reload the web page (F5).  If this is not working for you, then I would need to know the steps that you have taken.  You definitely would never need to reprocess the cube a second time.

    You should just need to do the following:

    1.  In Dashboard Designer change the Interval time on the Cache Setting for the SSAS data source to 0 (zero) and publish it to the repository
    2.  Refresh your SSAS database (see the sketch just below these steps)
    3.  Load the dashboard page and/or refresh if already open
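
    For step 2, here is a minimal sketch of what "refresh your SSAS database" can look like when scripted, assuming the AMO .NET assembly (Microsoft.AnalysisServices) is installed and loaded from Python via pythonnet; the server and database names are placeholders, not anything from this thread:

        # Minimal sketch: fully process an SSAS database so PPS picks up new data.
        # Assumes pythonnet (the `clr` module) and the AMO assembly are available.
        import clr
        clr.AddReference("Microsoft.AnalysisServices")
        from Microsoft.AnalysisServices import Server, ProcessType

        srv = Server()
        srv.Connect("Data Source=localhost")        # connect to the SSAS instance
        db = srv.Databases.FindByName("MyOlapDb")   # hypothetical database name
        if db is not None:
            db.Process(ProcessType.ProcessFull)     # reprocess everything in the database
        srv.Disconnect()

    You can of course do the same processing interactively in SSMS or BIDS; the script only illustrates what "refresh" means here.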

    If these steps are not working for you, you might also want to take a look at this earlier thread and review the Server Options - Dashboard Designer cache settings.
    Dan English's BI Blog
    Tuesday, February 23, 2010 3:23 PM
  • Hi Dan,
    May I know which versions of PPS and SQL Server you use?
    Mine are PPS 2007 (3.0.3916.00), SQL Server 2005, and Visual Studio 2005.

    Thank you!
    Tuesday, February 23, 2010 3:55 PM
  • You are on PPS 2007 SP1.  There is actually a bug with setting the Interval time to 0 (zero) in SP1 that was reported here by Nick Barclay - PPS M&A SP1 DataSource Cache Bug.

    I am currently running PPS 2007 SP3 (3.0.4417.00) along with SQL Server 2008 SP1.


    PerformancePoint Server (PPS) 2007 SP3 now available

    More PerformancePoint Server 2007 Hotfixes Available

    PerformancePoint Server 2007 Hotfixes Available and Build List
     
    Dan English's BI Blog
    • Edited by Dan English Tuesday, February 23, 2010 6:21 PM modified display of blog posting reference for SP1 bug
    Tuesday, February 23, 2010 6:18 PM
  • Thank you, Dan. I also tried using an interval time of 1 minute, but it still does not refresh after 1 minute.   What do you mean by 'refresh the SSAS database'? Do I need to do something to the cube in SSAS after the source data is changed? When I change the source data, I see the data in the cube automatically updated; if I deploy the cube, it says no changes were detected.

    I will try to use the new version. Thank you very much for all the useful information.
    Wednesday, February 24, 2010 2:49 PM
  • By refresh I just meant to do your updates (reprocessing).  It sounds like you have proactive caching set up, so that the relational updates are loaded into the cube automatically or at least on some type of scheduled basis.
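
    If you want to check whether proactive caching is actually configured, a quick sketch along these lines (AMO via pythonnet again, with placeholder server and database names) lists the storage mode of each partition and whether a proactive-caching setting is present:

        import clr
        clr.AddReference("Microsoft.AnalysisServices")
        from Microsoft.AnalysisServices import Server

        srv = Server()
        srv.Connect("Data Source=localhost")
        db = srv.Databases.FindByName("MyOlapDb")   # hypothetical database name
        for cube in db.Cubes:
            for mg in cube.MeasureGroups:
                for part in mg.Partitions:
                    # ProactiveCaching is typically None unless it has been configured.
                    enabled = part.ProactiveCaching is not None
                    print(cube.Name, mg.Name, part.Name, part.StorageMode, enabled)
        srv.Disconnect()
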
    Dan English's BI Blog
    Thursday, February 25, 2010 7:50 PM
  • Yes! After I reprocess the cube in SSAS, the PPS charts pick up the changes. I didn't know about this step and had tried others like 'debug', 'build', 'deploy', etc.  Thank you very much!
    Thursday, February 25, 2010 10:30 PM
  • Ok, so you don't have proactive caching set up.  Then the process would be the following:

    1) load data in the data warehouse / dimensional model
    2) reprocess dimensions (typically just do a process update)
    3) reprocess the cube's measure groups (best practice is to break this into two steps: Process Data and then Process Indexes; see the sketch below)

    PPS should now reflect the changes based on the new data that was updated/loaded into the SSAS database.
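
    Here is a minimal sketch of steps 2 and 3 scripted with AMO from Python (pythonnet), using the same placeholder names as above; in practice this kind of processing is usually driven from an SSIS package or a SQL Agent job instead:

        import clr
        clr.AddReference("Microsoft.AnalysisServices")
        from Microsoft.AnalysisServices import Server, ProcessType

        srv = Server()
        srv.Connect("Data Source=localhost")
        db = srv.Databases.FindByName("MyOlapDb")   # hypothetical database name

        # Step 2: Process Update on every dimension picks up attribute changes
        # without requiring a full reprocess of the cube.
        for dim in db.Dimensions:
            dim.Process(ProcessType.ProcessUpdate)

        # Step 3: reprocess the measure groups in two passes - data first, then indexes.
        for cube in db.Cubes:
            for mg in cube.MeasureGroups:
                mg.Process(ProcessType.ProcessData)
        for cube in db.Cubes:
            for mg in cube.MeasureGroups:
                mg.Process(ProcessType.ProcessIndexes)

        srv.Disconnect()
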

    As you get more data in the fact tables, you might need to start looking at partitioning the measure groups.  I also have a posting about a process that you can use to do incremental updates (so you don't have to do a full process on the cube, if that is what you are doing currently) - Analysis Services (SSAS) Processing and Aggregations.

    Glad you got your issue resolved.
    Dan English's BI Blog
    Friday, February 26, 2010 1:27 AM