Partition data vs Data Source View

  • Question

  • Good day.

    I would like to ask for some assistance/advice. I do not work much with SSAS cube development, so please forgive me if I am asking stupid questions.

    I have a particular cube that I would like to partition into individual years. I have one query in the DSV. I have created a separate partition for each individual year by restricting the data in the 'WHERE' clause of the SQL query (which is the same query as in the DSV).

    I have created partitions for the years 2015, 2016, and 2017. I then restricted the data in the DSV to look at data from 2018 to current. This is where I seem to be getting stuck: is that the correct way to create the partitions? After doing this I only get calendar years 2018 and 2019 in my Date/Time dimension when pulling data into Excel, but when I browse this dimension inside BIDS I see calendars from 2009 to 2019.

    I am not sure if I am creating the partitions correctly (probably not, if I am not getting the desired results). I have a few questions, if I may, to try to help me understand a few things.

    1. Do I have to restrict my DSV query as well, to only look at data from 2018 to current, if I have already restricted these values in my partitions, or should my DSV have all data with no restrictions?
    2. Do I need to have a partition for 2018 to current (matching my DSV, with the same 'WHERE' restriction) in addition to the other partitions (2015, 2016, 2017)?

    I am using BIDS 2008.

    Any help would be greatly appreciated.

    Wednesday, November 20, 2019 6:26 AM

Answers

  • In regard to the FileStore error I was receiving, it seems to be the 4 GB limit for dimensions/attributes. I have managed to overcome that issue by following the linked article.

    That is strange. Normally if you hit the string store limit it actually says that in the error message rather than just returning a generic write error.

    I just want to check one more thing. Now that I have my cube processed and all objects are up to date, can I just process the latest partition daily (and if so, which processing option should I be using)?

    Or do I need to process the entire cube daily?

    If you have your cube split into partitions, the normal pattern is to do a ProcessUpdate on all your dimensions first to bring in any possible changes to them. Then you can do a ProcessFull on just your latest partition.
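    As a sketch of that pattern, a daily XMLA batch could look like the following. All object IDs here are placeholders; substitute your own database, dimension, cube, measure group, and partition IDs.

    ```xml
    <!-- Commands in a Batch run in order: dimensions first, then the partition -->
    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <!-- ProcessUpdate each dimension to pick up member/attribute changes -->
      <Process>
        <Object>
          <DatabaseID>MyOlapDatabase</DatabaseID>
          <DimensionID>Dim Date</DimensionID>
        </Object>
        <Type>ProcessUpdate</Type>
      </Process>
      <!-- ...repeat a Process element for each remaining dimension... -->

      <!-- ProcessFull only the latest partition -->
      <Process>
        <Object>
          <DatabaseID>MyOlapDatabase</DatabaseID>
          <CubeID>MyCube</CubeID>
          <MeasureGroupID>MyMeasureGroup</MeasureGroupID>
          <PartitionID>2018 to current</PartitionID>
        </Object>
        <Type>ProcessFull</Type>
      </Process>
    </Batch>
    ```

    You can generate a script in this shape from SSMS (right-click the object, choose Process, then use the Script button) and schedule it with a SQL Server Agent job using a SQL Server Analysis Services Command step.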


    http://darren.gosbell.com - please mark correct answers

    • Marked as answer by Brendt Wooi Wednesday, November 20, 2019 8:15 PM
    Wednesday, November 20, 2019 8:05 PM
    Moderator

All replies

  • The reason you are not getting the data is that you are restricting the data in the DSV.

    There is no need to restrict the data in the DSV.

    While creating the partitions, make sure you cover all the data by adding the right conditions to each individual partition.
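    For example, the partition source queries could share one base query and differ only in the date range, with the ranges non-overlapping and together covering every row. The table and column names below are placeholders, not taken from your cube:

    ```sql
    -- Illustrative only: dbo.FactSales and OrderDate are placeholder names.
    -- Each partition uses the same base query with a distinct date range.

    -- Partition "2015"
    SELECT * FROM dbo.FactSales
    WHERE OrderDate >= '20150101' AND OrderDate < '20160101';

    -- Partition "2016"
    SELECT * FROM dbo.FactSales
    WHERE OrderDate >= '20160101' AND OrderDate < '20170101';

    -- Partition "2017"
    SELECT * FROM dbo.FactSales
    WHERE OrderDate >= '20170101' AND OrderDate < '20180101';

    -- Partition "2018 to current": open-ended upper bound so new rows keep landing here
    SELECT * FROM dbo.FactSales
    WHERE OrderDate >= '20180101';
    ```

    If a row falls into no partition (a gap) it disappears from the cube; if it falls into two partitions (an overlap) it is double-counted.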

    If you follow all the above points you should be able to see all your data.

    If you are still not getting it even after following all the above points, then something else might be restricting the results; this can be better understood by checking the dimension and measure mappings.

    Regards

    Naveen

    Wednesday, November 20, 2019 6:36 AM
  • Thank you for your reply, Naveen.

    I have tried not restricting the data in the DSV, but when I do that I get the physical file limit error.

    I'm sorry if this is a dumb question, but what is the use of the partitions if all the data needs to be unrestricted in the DSV?

    Wednesday, November 20, 2019 8:48 AM
  • I have tried not restricting the data in the DSV, but when I do that I get the physical file limit error.

    What is the exact text of the error? Is it actually an error or is it that design warning in Visual Studio? (which is hopelessly out of date now)

    I'm sorry if this is a dumb question, but what is the use of the partitions if all the data needs to be unrestricted in the DSV?

    The partition queries define the actual data that is loaded. The DSV does not hold any data; its primary function is as a mapping layer between the cube and the relational source.


    http://darren.gosbell.com - please mark correct answers

    Wednesday, November 20, 2019 11:08 AM
    Moderator
  • Hi Darren.

    Thank you for your reply.

    I get the following error after removing any restrictions on the DSV.

    File system error: A FileStore error from WriteFile occurred. Physical file: XXX.asstore. Logical file: . .

    Errors in the OLAP storage engine: An error occurred while the '' attribute of the '' dimension from the '' database was being processed.

    Wednesday, November 20, 2019 1:29 PM
  • I get the following error after removing any restrictions on the DSV.

    File system error: A FileStore error from WriteFile occurred. Physical file: XXX.asstore. Logical file: . .

    Errors in the OLAP storage engine: An error occurred while the '' attribute of the '' dimension from the '' database was being processed.

    That is not a "normal" error that you would expect from changing a filter in the DSV. Unfortunately, that sounds like either a disk space issue or a physical fault on the drive. You should check the Windows Event Viewer for other disk-based issues and run some checks on that drive.


    http://darren.gosbell.com - please mark correct answers

    Wednesday, November 20, 2019 7:35 PM
    Moderator
  • Good day, Darren.

    Thank you to you and Naveen for all your help. I have managed to get my cube working as expected.

    In regard to the FileStore error I was receiving, it seems to be the 4 GB limit for dimensions/attributes. I have managed to overcome that issue by following the linked article.

    I just want to check one more thing. Now that I have my cube processed and all objects are up to date, can I just process the latest partition daily (and if so, which processing option should I be using)?

    Or do I need to process the entire cube daily?


    • Edited by Brendt Wooi Wednesday, November 20, 2019 8:16 PM Spelling correction
    Wednesday, November 20, 2019 7:59 PM
  • Darren, I must say a very big thank you for all your assistance. With yours and Naveen's help, advice, and assistance I have managed to resolve my issue as well as gain a better understanding.

    Thank you.

    Wednesday, November 20, 2019 8:15 PM