The size specified for a binding was too small, resulting in one or more column values being truncated.

  • Question

  • Hi All

    When processing the cubes I'm getting the following error:

    The size specified for a binding was too small, resulting in one or more column values being truncated.

    I checked the data size under the key column and the name column and compared them to the database. All fields match, but I'm still getting the error. Please advise.

    Another question I have: can the data size of the name column be greater than the key column's data size, or should the sizes be the same?

    After looking everywhere in the DSV, I found that the data types for the FACT table have a Data Size of -1. I tried many ways to edit the Data Size but couldn't find how to do it. Please help.


    SV



    Sunday, June 2, 2013 1:02 AM


All replies

  • Hi SaiVenkat,

    I used to get the same error while processing the cubes. The reason behind this error is:

    The data types of the columns in the dimension table in the database and the data types of the corresponding dimension attributes must match.

    For example, check the data type of the Model Name attribute in the dimension properties, then check the same column's data type in the database. If both data types match, there is no problem. If you are still facing the issue, one of the values in that column is longer than the data size declared on the SSAS dimension attribute. Increase the data size in the dimension, then process the dimension again and check the results. (See the sketch after this reply for a quick way to compare the sizes.)



    Maruthi...

    Sunday, June 2, 2013 10:16 AM
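A quick way to double-check the source side of this is to pull the declared column lengths straight out of SQL Server and compare them by hand with the DataSize shown on each attribute's KeyColumns/NameColumn properties. A minimal sketch, assuming a pyodbc connection; the server, database, schema, and table names below are placeholders:

```python
# Sketch: list declared character lengths for a dimension table's columns
# so they can be compared with the DataSize values shown in the SSAS
# dimension editor. Connection string and names are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=Tools2;Trusted_Connection=yes;"
)
SCHEMA, TABLE = "dbo", "DimModel"  # hypothetical dimension table

QUERY = """
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = ? AND TABLE_NAME = ?
ORDER BY ORDINAL_POSITION
"""

with pyodbc.connect(CONN_STR) as conn:
    for name, dtype, max_len in conn.execute(QUERY, SCHEMA, TABLE):
        # CHARACTER_MAXIMUM_LENGTH is -1 for (max) types and NULL for
        # non-character columns; compare these against the DataSize on
        # the attribute's KeyColumns / NameColumn properties.
        print(f"{name:30} {dtype:12} {max_len}")
```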
  • Thanks Maruthi, but all the dimension attributes have the same size. Do the KeyColumns and NameColumns need to be the same size? MS is supposed to be user friendly; the error should at least say which column it is on, and there should be an option to sync with the database columns. Oracle OBIEE has far better options here.

    SV


    Sunday, June 2, 2013 4:38 PM
  • Hi SaiVenkat,

    Please post the exact error message you are getting. Thanks in advance :)


    Maruthi...

    Sunday, June 2, 2013 8:31 PM
  • Thanks Maruthi,

    I have a total of 3 cubes, and all 3 share the same dimensions. Two of the 3 cubes process successfully. The third cube fails with the following error:

    Errors in the back-end database access module. The size specified for a binding was too small, resulting in one or more column values being truncated. Errors in the OLAP storage engine: An error occurred while processing the 'Age Of Active Case Load' partition of the 'Age Of Active Case Load' measure group for the 'CourTools Age Of Active Case Load' cube from the Tools2 database.

    Internal error: The operation terminated unsuccessfully.

    Errors in the OLAP storage engine: An error occurred while processing the 'Age Of Active Case Load' partition of the 'Age Of Active Case Load' measure group for the 'CourTools Age Of Active Case Load' cube from the Tools2 database.

    Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.

    My guess is that the problem is in the fact table of the failing cube. I tried to edit it, but I don't see an option for editing it.

    *** In the FACT table in the database there is a field CandID that has a data size of 4; in the cube the same table has a DataSize of -1 (not sure how that's possible), and I'm still figuring out a way to edit the FACT table. I deleted it and added it back again, but I still don't see any change.

    Another interesting thing is that the other two fact tables, which process successfully, also have data sizes of -1. Not sure what is going on, but the other two cubes process successfully and their data loads into the reports successfully. (A sketch for checking actual column lengths follows this reply.)

    Thanks


    SV



    Monday, June 3, 2013 1:01 AM
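When the error points at a measure-group partition like this, one way to hunt down the offending column is to measure the longest value actually stored in the partition's source table and compare it with the binding sizes the cube expects. A minimal sketch under the same pyodbc assumption; the table name is hypothetical for this thread's scenario:

```python
# Sketch: report the longest actual value in every character column of
# the failing measure group's source table. A value longer than the
# cube's binding size for that column is the likely culprit.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=Tools2;Trusted_Connection=yes;"
)
SCHEMA, TABLE = "dbo", "FactAgeOfActiveCaseLoad"  # hypothetical source

with pyodbc.connect(CONN_STR) as conn:
    cols = [
        r.COLUMN_NAME for r in conn.execute(
            "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS "
            "WHERE TABLE_SCHEMA = ? AND TABLE_NAME = ? "
            "AND DATA_TYPE IN ('char', 'nchar', 'varchar', 'nvarchar')",
            SCHEMA, TABLE,
        )
    ]
    for col in cols:
        # MAX(LEN(...)) = longest value actually stored in the column.
        row = conn.execute(
            f"SELECT MAX(LEN([{col}])) FROM [{SCHEMA}].[{TABLE}]"
        ).fetchone()
        print(f"{col:30} longest value: {row[0]}")
```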
  • Finally solved it. I removed the dimensions one at a time and processed the cube until I found out which dimension it was, then removed it, added it back, and processed again. Everything went fine. Thanks, Maruthi, for your time. (A sketch for processing dimensions individually follows this reply.)

    SV

    • Marked as answer by saivenkat77 Monday, June 3, 2013 11:58 AM
    Monday, June 3, 2013 11:58 AM
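The isolation step described above can also be done without removing dimensions: process each dimension on its own and see which one fails. Below is a small sketch that just generates one ProcessFull XMLA command per dimension, to be run one at a time in SSMS. The database and dimension IDs are placeholders, and note that object IDs can differ from the display names shown in the designer:

```python
# Sketch: emit one ProcessFull XMLA command per dimension so each can be
# run separately in SSMS to isolate the dimension that fails to process.
DATABASE_ID = "Tools2"
DIMENSION_IDS = ["Dim Case", "Dim Court", "Dim Date"]  # hypothetical IDs

TEMPLATE = """\
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>{db}</DatabaseID>
    <DimensionID>{dim}</DimensionID>
  </Object>
  <Type>ProcessFull</Type>
</Process>
"""

for dim in DIMENSION_IDS:
    print(f"XMLA for dimension: {dim}")
    print(TEMPLATE.format(db=DATABASE_ID, dim=dim))
```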
  • Good solutions and related resources here. See if you want to contribute these (or others) to the TechNet Guru Competition announced here: http://social.technet.microsoft.com/Forums/en-US/sqlanalysisservices/thread/ff49e4a4-1e28-4afa-ad04-04b82961f89d 

    Note that good articles also often tie together resources on a specific scenario (like an overview that lists the resources as pieces to the puzzle).

    Here's more on the TechNet Guru awards: http://social.technet.microsoft.com/wiki/contents/articles/17625.technet-guru-contributions-june-2013.aspx

    Thanks!


    Ed Price (a.k.a User Ed), SQL Server Customer Program Manager (Blog, Small Basic, Wiki Ninjas, Wiki)

    Answer an interesting question? Create a wiki article about it!


    Sunday, June 9, 2013 9:56 PM
    Owner
  • Hey Maruthi, 

    Thanks for your post; it solved my problem!

    Dexter

    Wednesday, February 3, 2016 4:44 PM
  • Hi,

    I got this problem after I edited the source view of one of the source tables. Altering the view caused longer varchar data to be read, but the change was not reflected in the dimension. So I just edited the dimension attribute properties, the KeyColumns and NameColumn, to reflect the new field lengths. Hope this helps someone.

    bw, Sami

    Friday, February 5, 2016 10:31 AM
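Auditing every attribute's KeyColumns/NameColumn length by hand is tedious in a large project. One low-tech shortcut is to scan the project's .dim files for every DataSize element and eyeball the values against the source. A sketch, assuming the files follow the standard Analysis Services project XML layout; the project path is a placeholder:

```python
# Sketch: list every DataSize declared in an SSAS project's .dim files,
# so stale lengths left over after a source-view change stand out.
from pathlib import Path
import xml.etree.ElementTree as ET

PROJECT_DIR = Path(r"C:\Projects\CourTools")  # hypothetical project path

for dim_file in PROJECT_DIR.glob("*.dim"):
    tree = ET.parse(dim_file)
    # '{*}' matches any XML namespace (supported since Python 3.8).
    for size in tree.iter("{*}DataSize"):
        print(f"{dim_file.name}: DataSize = {size.text}")
```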
  • BIDSHelper is a great tool and helps a lot, but I still faced this issue after its discrepancy check. Unfortunately BIDSHelper does not fix the length for nvarchar(max); the cube used a DataSize of 0. Changing the 0 to 4000 fixed it for me. Thanks everyone for pointing me to the right solution. (A sketch for finding such (max) columns follows this reply.)

    Thursday, May 26, 2016 11:49 AM
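Since INFORMATION_SCHEMA reports CHARACTER_MAXIMUM_LENGTH as -1 for varchar(max) and nvarchar(max), the source columns that end up with a DataSize of 0 in the cube, as described above, are easy to enumerate. A minimal sketch, again assuming pyodbc; the connection details are placeholders:

```python
# Sketch: enumerate the (n)varchar(max) columns in the source database --
# these are the ones whose cube binding may show DataSize 0 and need an
# explicit length set on the dimension attribute.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=Tools2;Trusted_Connection=yes;"
)

SQL = """
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE CHARACTER_MAXIMUM_LENGTH = -1  -- -1 marks (max) columns
"""

with pyodbc.connect(CONN_STR) as conn:
    for schema, table, col, dtype in conn.execute(SQL):
        print(f"{schema}.{table}.{col}: {dtype}(max)")
```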
  • JeroenvS, the same thing solved my problem too. I still have an unanswered question, though, that I cannot find a definitive answer to. My understanding was that if a dimension attribute was set to WChar with a DataSize of 0, that was the same as saying nvarchar(max); in other words, the attribute should cope with pretty much any size of string it is given. Could anyone clarify whether my understanding is correct, and if so, why we have to manually change the DataSize for some attributes and not others?

    Peter

    Monday, July 10, 2017 2:29 PM
  • This problem can also happen if a previously created dimension is left over in the cube; it needs to be taken care of. It can be corrected by deleting the stale dimensions from your cube and re-creating them.
    Wednesday, August 14, 2019 11:41 PM