cannot convert between unicode and non-unicode

    Question

  • I keep getting the error message below when attempting to import a text file (flat file) into a SQL Server Destination using SSIS. This database was recently migrated from SQL Server 2000, where I used a DTS package that worked fine. I plan to do further manipulation of the file after importing, but so far I have tried to keep to one step at a time. I have tried using smaller files and different tables but still get a similar response. I believe that the encoding of the text file is ANSI. If it is relevant, the database collation is Latin1_General_CI_AS (the same as it was on 2000). Any help anyone can provide will be greatly appreciated!

    TITLE: Package Validation Error
    ------------------------------

    Package Validation Error

    ------------------------------
    ADDITIONAL INFORMATION:

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 0" and "AccountNumber" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 1" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 2" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 3" and "Name" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 4" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 5" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 6" and "ExpiryDate" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 7" and "RateType" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 8" can't be inserted because the conversion between types DT_STR and DT_BOOL is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 9" and "FullName" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 10" and "Address" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 11" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 12" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 13" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 14" and "Occupancy" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 15" and "LoanPurpose" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 16" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 17" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 18" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 19" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 20" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 21" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 22" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 23" and "DocumentLocation" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 24" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 25" and "SecurityType" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 26" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 27" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 28" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 29" and "MortgageInsurancePolicyNumber" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 30" and "SecurityAddress" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 31" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 32" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 33" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 34" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 35" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 36" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 37" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 38" and "SecuritySuburb" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 39" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 40" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 41" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 42" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 43" and "MortgageNumber" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 44" and "TitleParticulars" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 45" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 46" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 47" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 48" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 49" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 50" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

    Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 51" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

    Error at Data Flow Task [DTS.Pipeline]: "component "SQL Server Destination" (174)" failed validation and returned validation status "VS_ISBROKEN".

    Error at Data Flow Task [DTS.Pipeline]: One or more component failed validation.

    Error at Data Flow Task: There were errors during task validation.

     (Microsoft.DataTransformationServices.VsIntegration)

    ------------------------------
    BUTTONS:

    OK
    ------------------------------

    Wednesday, January 04, 2006 7:45 AM

Answers

  • In SQL Server 2000, DTS used to do implicit conversions in many circumstances. The decision was made to eliminate that risky guessing in Integration Services, and to compel all data type conversions, truncations, etc. to be made explicitly.

    Note that if you use the Import and Export Wizard to create your "base" package, the wizard creates all these tedious conversion transformations for you.

    -Doug

     

    Thursday, January 05, 2006 3:31 AM

All replies

  • The errors are exactly as stated. You are trying to do implicit conversions that do not work. For example, in Column 1 the value is coming in as a string and you are trying to insert it into an integer column.

    You will have to open up the package and use the Data Conversion component or the Derived Column component to do explicit conversions.

    -Jamie

     

    • Proposed as answer by oeer Monday, June 22, 2009 8:09 AM
    Wednesday, January 04, 2006 8:19 AM
  • Thanks for the reply, Jamie. I have tried using both of these and still get a similar conversion error (as below, on a similar table). I have tried converting to different data types but still get the same conversion error.

    The same database had no problem using DTS on 2000.

    I must be missing something that is probably blatantly obvious, but do you have any other ideas?

    TITLE: Package Validation Error
    ------------------------------

    Package Validation Error

    ------------------------------
    ADDITIONAL INFORMATION:

    Error at Data Flow Task [SQL Server Destination [4395]]: The column "TransactionKey" can't be inserted because the conversion between types DT_NTEXT and DT_WSTR is not supported.

    Error at Data Flow Task [SQL Server Destination [4395]]: The column "AccountNumber" can't be inserted because the conversion between types DT_NTEXT and DT_WSTR is not supported.

    Error at Data Flow Task [SQL Server Destination [4395]]: Column "TransactionCode" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [4395]]: The column "Acmount" can't be inserted because the conversion between types DT_R4 and DT_CY is not supported.

    Error at Data Flow Task [SQL Server Destination [4395]]: The column "TransactionDate" can't be inserted because the conversion between types DT_DATE and DT_DBTIMESTAMP is not supported.

    Error at Data Flow Task [SQL Server Destination [4395]]: Column "Narration" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [DTS.Pipeline]: "component "SQL Server Destination" (4395)" failed validation and returned validation status "VS_ISBROKEN".

    Error at Data Flow Task [DTS.Pipeline]: One or more component failed validation.

    Error at Data Flow Task: There were errors during task validation.

     (Microsoft.DataTransformationServices.VsIntegration)

    ------------------------------
    BUTTONS:

    OK
    ------------------------------

    Thursday, January 05, 2006 2:30 AM
  • I'm getting the '...can't convert unicode...' conversion errors for strings imported from a table in Access 97.

    I see that the Import Wizard does its job and will successfully import the data into a new table.

    But if you then change the destination to a table that already exists (created by a database copy from SQL 2000), then the wizard-generated package fails with the unicode conversion error again.

    I've changed the destination column properties to match those created in the new table by the wizard - and it still fails.

    I suspect that I could get the package to copy the import into my table by doing an INSERT and then dropping the table that the wizard creates. But I'm sure there's a more direct way than this.
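    Something like this is what I have in mind, I suppose (table and column names invented for illustration):

    ```sql
    -- Copy the wizard's freshly imported rows into the existing table...
    INSERT INTO dbo.ExistingTable (AccountNumber, FullName)
    SELECT AccountNumber, FullName
    FROM dbo.WizardCreatedTable;

    -- ...then drop the staging table the wizard created.
    DROP TABLE dbo.WizardCreatedTable;
    ```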

    I suspect I'm overlooking something obvious - but it is proving difficult to manage data between Microsoft family applications.

    Any help / links appreciated.

    • Proposed as answer by oeer Monday, June 22, 2009 8:12 AM
    Saturday, January 21, 2006 8:25 AM
  • I have the same error trying to import an Access database with a memo field type.  SQL Server 2005 SSIS can't convert the memo to a varchar and gives the unicode error.  Odd.
    Monday, February 20, 2006 7:29 PM
  • I have found the same problem connecting to a Lotus Notes database via the NotesSQL 3.02g ODBC driver.

    Consider the following article on unicode.

    http://msdn.microsoft.com/library/default.asp?url=/library/en-us/acdata/ac_8_con_03_6voh.asp

    When I changed the datatypes of my destination table from varchar to nvarchar, I no longer received the "cannot convert between unicode and non-unicode" error.
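    The change was along these lines (table and column names made up for illustration); nvarchar stores unicode, so it matches the DT_WSTR data the pipeline delivers:

    ```sql
    -- Change the destination column from non-unicode to unicode.
    ALTER TABLE dbo.ImportedNotes
        ALTER COLUMN Description nvarchar(255) NULL;
    ```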

    Thursday, February 23, 2006 5:31 PM
  • To be clear:  String conversion errors can be resolved by changing char columns to nchar and varchar columns to nvarchar in SQL2005.
    Thursday, March 02, 2006 8:12 PM
  • To be really clear, conversion errors happen because we use types inconsistently. There are unfortunately some instances where the type is forced upon us, but it is easy enough to change types inside SSIS. Changing existing database structures to simplify your ETL is not a great justification, albeit if possible it may make life easier. So in summary, changing SQL Server column types will work, but so will converting the data within the pipeline; see the Data Conversion transform or the Derived Column transform.

    Thursday, March 02, 2006 8:28 PM
  • Well, I guess I'm just another in a long line of folks hitting this trip wire. I'm trying to import an Excel file into an existing SQL table which has varchar columns, not nvarchar. And, you're right, I'm not changing the table definition!

    OK, so please forgive the basic nature of the question, but you say it is easy enough to change types in SSIS. So, you have to drag one of those transforms (Data Conversion or Derived Column) onto your package?

    I was just thinking that there ought to be a more native way to do this.  But I guess this is it, then?

    For example, I right-clicked my source data file and went to the Advanced Editor. There's an Input and Output Properties tab that shows the Excel output in a tree view. When expanded, there are, in turn, External Columns and Output Columns nodes.

    I tried changing these to my desired data types, but it didn't seem to a) save my changes, much less b) make the package work. So what does one use the Input and Output Properties for?

    Thanks for any information (or if you're up to it, adding an in-depth article about this on your SQLIS.com website!). BOL does not seem to offer any meaningful information here.

     

    Thursday, March 09, 2006 12:35 PM
  • External columns are what the source actually contains. Output columns are what SSIS gives you downstream. You cannot change the types of either, because the external types are decided by the underlying Excel provider, not by you, and you cannot change the output columns because they should match the external columns. External vs. output is a bit like choosing which of the available columns to bring through into SSIS. So changing types as part of your column choice does not make sense; it would only lead to problems and failures.

    The lack of implicit conversion is annoying for most people, but you can work around it. If implicit conversions take place and they are wrong, then you are generally stuck. I firmly believe the strong typing of SSIS is better in the long run, although I do often think it would be nice the other way too. Think of the way strings can be interpreted as dates. How many times have you had problems with MD vs. DM (if you're in the UK, all the time!)? VBScript was my favourite, as it would convert dates both ways depending on what the value was, all within the same function! You had no control and no idea which way it would go. Hence my assertion that not having implicit conversions is better. On the other hand, some more help could be given in the product, such as selectively allowing similar types to be converted, STR to WSTR and I2 to I4. There will no doubt still be issues for more international organisations with multiple extended character sets or those using different code pages.

    Data Conversion or Derived Column is the way to go I'm afraid.

    Thursday, March 09, 2006 12:56 PM
  • Again thanks for the update.

    So, I'm trying a simple test at this point. Starting with a brand new package, I set up an Excel source, then dragged a Derived Column transformation onto the Data Flow designer. I selected just two columns to test with by dragging them from the Columns pane down to the Expressions field, putting a TRIM() around them, entering a custom name for each in the Derived Column Name field, selecting <add as new column>, and selecting string [DT_STR] in the Data Type field.

    Then I set up my SQL table as the destination and mapped the two derived columns to the corresponding columns in my table.

    I'm still getting task validation errors.

    "cannot convert between unicode and non-unicode string data types."


    So, this makes no sense to me. In the Derived Column transformation editor, I have the data types set to string; in my SQL table, the columns are varchars. I'm just not sure what else I should be doing...

    Any suggestions would be welcome.




    Thursday, March 09, 2006 10:26 PM
  • Although you have selected the type of the output column to be DT_STR, you haven't told SSIS how to convert the value into a DT_STR, so in essence all you have done is define another implicit conversion.

    Explicit conversions are done using type casts. The following expression will convert a column of type DT_WSTR into a DT_STR:

    (DT_STR, <length>, 1252) (<column_name>)
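    For instance, to feed a varchar(50) destination column from a unicode source column named FullName (both names hypothetical), the expression would be:

    ```
    (DT_STR, 50, 1252)FullName
    ```

    and going the other way, non-unicode to unicode: (DT_WSTR, 50)FullName. The 1252 is the code page to use for the non-unicode string.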

    Hope that helps.

    -Jamie

     

    • Proposed as answer by HKouts Thursday, March 25, 2010 2:10 PM
    Thursday, March 09, 2006 10:40 PM
  • I am experiencing a similar error using the Microsoft OLE DB Provider for Oracle as a source. All data types are defined as nvarchar2 in Oracle and exist in a SQL Server 2000 database typed as nvarchar. Since these are both unicode types, why am I getting the same error?

    I did find that the .NET Oracle provider fixed the unicode/non-unicode error; however, the size of the source field is 1.5 times larger in the destination. The package will run, but I get truncation warnings throughout execution.

    Any feedback on these is greatly appreciated!

    Thanks,

    Barbara

    Friday, March 10, 2006 5:19 PM
  • I am having the same problem with SQL Server as both the source and destination. I am using a DataReader Source and specifying a SQL query. I then pass it through Merge Join and Conditional Split transforms. My destination is an OLE DB Destination. If I pick an existing destination table that has EXACTLY the same schema as the source table, the data flow fails with the same message everyone is getting above. Somewhere in the process the package is converting the source data from varchar to nvarchar. I can only get it to work if I have it create a new table in the OLE DB Destination, in which case it gives me nvarchar columns (a schema identical to the source except that the varchars become nvarchars).

    Where in the package is it converting from varchar to nvarchar? Is this a known issue? I can tell you with certainty that the source fields are not unicode.

    Friday, March 17, 2006 5:56 PM
  • Your problem is the DataReader source. It forces everything through managed types, which means a DT_STR (varchar) becomes a System.String in the data reader, but so does a DT_WSTR (nvarchar). When it comes back out it always goes to DT_WSTR, so your table needs to be nvarchar. This is just the way it is, because the data reader assumes unicode, as it cannot tell the difference. Managed providers only give it String.

    You will need to convert the columns, see the Data Conversion Tx or the Derived Column Tx.

    For more confirmation of this System.String usage see-

    Re: DataReader Source and Column Types - MSDN Forums
    (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=278625&SiteID=1)

    Friday, March 17, 2006 6:04 PM
  • Everyone is being polite, but this whole issue is obviously a failure by Microsoft to anticipate how people actually use SSIS / DTS. It would be OK to have an option forcing strict data types, but forcing data into Unicode when many of us have no use for or interest in Unicode is clearly a bad design. And telling people to change their table data types to nvarchar etc. is bad advice, since double the storage (and increased retrieval time) is needed. Similarly, telling people to program the pipeline for tens or hundreds of fields also misses the real world of users.

    I hope Microsoft fixes this rigidity in the first Service Pack, which should come sooner rather than later.
    Sunday, April 02, 2006 1:45 PM
  • That rigidity is what stops uninformed people from ignoring Unicode types and shooting themselves in the foot. Usability could be a lot better for the DataReader Source, but allowing the pipeline to perform implicit conversions would be worse. Just think of all the fun you can have with dates in VBScript, for example: the world's best example of why variants and implicit conversions are a very bad idea. I do agree that converting tables to unicode is also a very bad idea.
    Sunday, April 02, 2006 4:09 PM
  • Using strong types would be great if I could even get them into the database. What good are they if the only error the OLE DB Destination gives me is 0xC0209029 DTS_E_INDUCEDTRANSFORMFAILUREONERROR? I'd gladly put in data conversions if that's what it takes, but I can't even see why the OLE DB Destination gets 6 good rows but will not insert them. At least give me a type conversion error; isn't that what the whole forced-typing thing was supposed to do anyway?
    Tuesday, April 04, 2006 4:51 PM
  • OK, I hate to beat a dead horse here, but I was searching for what I thought would be an easy solution, and apparently it's not. Implicit conversions are part of most industry-standard ETL tools, including Informatica and Ascential. I'm a big proponent of data quality and data validation, but there are many situations where implicit conversion is appropriate.
    Friday, May 26, 2006 12:32 AM
  • >Using strong types would be great if I could even get them into the database.  What good are they if the only error the OLE DB Destination gives me is 0xC0209029 DTS_E_INDUCEDTRANSFORMFAILUREONERROR? 

    Are you sure it is a type conversion that is causing your errors?

    The errors that SSIS gets from OLEDB providers are dependent on the providers - not all providers give us good error messages. The message we give you can only be as good as the one we get.

    I wonder what OLEDB provider you are using?

    Donald

    Friday, May 26, 2006 5:14 AM
  • >I'm a big proponent of data quality and data validation, but there are many situations where implicit conversion is appropriate.

    I think that's a very reasonable point, but can we identify what those situations are, and ensure that users who may be less aware of the issues do not find themselves in difficulty, with unexpected (or worse, undetected) problems as a result?

    When I first joined the SSIS team, I did indeed find the explicit conversions somewhat awkward, but I quickly got used to the paradigm and found them both easy and effective. In particular, I found the ability to capture errors effectively (and to process them conditionally) much better for handling tricky data issues than in other ETL tools I had used. I still find that to be the case today.

    Nevertheless, any input on when / how we should enable implicit conversions would be very useful feedback.

    Donald

    Friday, May 26, 2006 5:34 AM
  • Generally speaking, data types in SQL 2005 are a nightmare. I come from SQL 2000, where I was very familiar with DTS packages, and things that were very easy to do there are now a pain in 2005.

     

    Friday, May 26, 2006 11:20 AM
  •  enric12879 wrote:

    Generally speaking, data types in SQL 2005 are a nightmare. I come from SQL 2000, where I was very familiar with DTS packages, and things that were very easy to do there are now a pain in 2005.

     

     

    What specifically? Is it a pain because it's wrong, or a pain because it's different? SSIS is a far superior product to DTS on so many levels; the consequence of that is that there is a learning curve with it.

    -Jamie

     

    Friday, May 26, 2006 11:41 AM
  • You're right. It's a pain because after three years of creating and modifying a lot of DTS packages you acquire a good knowledge of that system, and as time goes by you get lazy:

    I liked doing it that way...

    I'm accustomed to doing it that way, so...

    Friday, May 26, 2006 12:19 PM
  • Thank you for saying it ("Everyone is being polite, but this whole issue is obviously a failure by Microsoft to anticipate how people actually use SSIS / DTS!"). It is clear to me that this is poor thinking on the part of Microsoft. I have twenty tables in a moderate-sized database. To export the data to Excel and then re-import it would require hours of playing with data conversions, which I find very awkward in any case.

    Conversion between Unicode and non-Unicode should be automated by Microsoft in a much smoother manner. For example, require that someone turn on a checkbox in a configuration screen so that simple transformations (I2 --> I4, or non-Unicode to Unicode and vice versa) become automatic.

    Friday, June 23, 2006 7:53 PM
  •  opalcomp wrote:
    Everyone is being polite, but this whole issue is obviously a failure by Microsoft to anticipate how people actually use SSIS / DTS. It would be OK to have an option forcing strict data types, but forcing data into Unicode when many of us have no use for or interest in Unicode is clearly a bad design. And telling people to change their table data types to nvarchar etc. is bad advice, since double the storage (and increased retrieval time) is needed. Similarly, telling people to program the pipeline for tens or hundreds of fields also misses the real world of users.

    I hope Microsoft fixes this rigidity in the first Service Pack, which should come sooner rather than later.
    I don't know that I agree with your statement 100% but I will say that this causes the same error message:

    (1) Choose an OLE-DB data source for SQL Server 2005 with DDL: CREATE TABLE MyTable (Mycolumn VARCHAR(128)) and insert some junk data

    (2) Create an SSIS package with the SQL Server 2005 database from Step 1 as an OLE-DB data source, and use an Excel spreadsheet as the destination

    (3) Create a transformation and tell the destination to create a new table: CREATE TABLE 'My Excel Table' (MyColumn VARCHAR(128))

    (4) Execute the package

    You'll get an error when you execute the package: 'Column "Mycolumn" cannot convert between unicode and non-unicode string data types.' (or at least that's what I get on my SQL Server 2005 SP1 single-machine test)

    People have mentioned using other transformations (Data Conversion, Derived Column, et al.), but the fundamental problem is that Steps 1 and 2 include non-unicode data types at both the source and the destination, ergo people reasonably expect the above to work.

     


    Sunday, July 30, 2006 12:26 AM
  • BTW, just to be clear... If someone had performed the same steps as in my previous post, but on a SQL Server 2000 instance using DTS, it would've worked without incident. Therein lies part of the trouble, IMO, that people have with this logic/issue.
    Sunday, July 30, 2006 12:28 AM
  •  Good Evening !

    I am a big fan of SSIS - but right now there are certain capabilities that:

    • Do not exist
    • Are not supported
    • Have changed drastically

    A tool (SSIS) is exactly what it is: it can't do everything, regardless of what MS Press promotes.

    So, for me, I use SSIS extensively, but I use T-SQL scripts for all data updates and inserts, and then it does not matter what collation exists between the source and destination environments... Once again, this may not be any help, but from what I have read within this thread it seems absolutely ridiculous to go and change the source or destination tables from char to varchar to nvarchar, or whatever! You cannot always change the table field attributes!

    I just use the DataReaderSource and, within that, a SQL Command; the SQL script itself takes care of the collations of the source and the destination in the WHERE clause.
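    As a rough sketch of what I mean (database, table, and column names invented), the script reconciles any collation difference explicitly with COLLATE:

    ```sql
    INSERT INTO DestDb.dbo.Accounts (AccountNumber, FullName)
    SELECT s.AccountNumber, s.FullName
    FROM SourceDb.dbo.Accounts AS s
    WHERE NOT EXISTS (
        SELECT 1 FROM DestDb.dbo.Accounts AS d
        -- Force a common collation so the comparison is legal
        -- even when the two databases collate differently.
        WHERE d.AccountNumber = s.AccountNumber COLLATE Latin1_General_CI_AS
    );
    ```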

    Voila!

    Best Regards !

     

    Sunday, July 30, 2006 5:00 PM
  • Whoever is having problems with the EXCEL SOURCE component should try the following:

    Change the connection string in the Excel Connection Manager component by adding an extended setting of "IMEX=1"

    For Example:

    Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\MyWorkbook.xls;Extended Properties="EXCEL 8.0;HDR=YES;IMEX=1";

    By default the parser will look at the first 8 rows of your data and decide for itself what datatype to use. Thus, if you have 6 rows with numbers in a given column and only two rows with text then the numbers will be imported as numbers and the text will show up as "NULL". If you just try to change the datatype in the EXCEL SOURCE component (on the advanced editing page) then you will get two rows of text and six NULLs where the numbers belong.

    By using the extended setting you can successfully get all variant types into your package and then do whatever conversions you want from there.

    Hope that helps. It took me a while to figure that out. But I imagine you can do a lot more with the extended settings if you have other problems along these lines.

    Ciao

    PS: I would assume that IMEX means IMPORT EXPLICIT although I don't know that for sure. But it seems likely considering that's what it allows you to do once it is set to "1" (=TRUE?)
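If you have many packages to fix, the IMEX=1 change above can be applied to a connection string programmatically. A minimal sketch, assuming the quoted Extended Properties form shown in the example:

```python
# Sketch of the IMEX=1 fix: patch the Extended Properties of a Jet/Excel
# connection string so mixed-type columns are read as text. The parsing
# is deliberately simplified for the quoted form shown above.

def add_imex(conn_str: str) -> str:
    """Append IMEX=1 inside Extended Properties if it is not already there."""
    if "IMEX=" in conn_str:
        return conn_str
    marker = 'Extended Properties="'
    start = conn_str.find(marker)
    if start == -1:
        # No Extended Properties clause yet: add one.
        return conn_str + 'Extended Properties="IMEX=1";'
    end = conn_str.find('"', start + len(marker))
    return conn_str[:end] + ";IMEX=1" + conn_str[end:]

original = ('Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\\MyWorkbook.xls;'
            'Extended Properties="EXCEL 8.0;HDR=YES";')
print(add_imex(original))
```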

     

    • Proposed as answer by Kalman Toth Thursday, January 10, 2013 12:27 PM
    Saturday, August 05, 2006 9:49 PM
  • bstabile - This was driving me nuts but IMEX=1 came through for me big time!! Phew, you saved me a lot of headache - THANKS!
    Saturday, August 12, 2006 3:25 AM
  • Please note that the use of IMEX, as well as other known issues when working with Excel in Integration Services, is documented in the Books Online topics on the Excel Source and Destination components.

    -Doug

     

    Saturday, August 12, 2006 2:53 PM
  • Something else that might be useful to others who are wrestling with the "Excel Source" in SSIS:

    TIP! Check out the registry REG_DWORD value "TypeGuessRows" under [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel]. That's the key to not letting Excel use only the first 8 rows to guess the column data types. Set this value to 0 to scan all rows. This might hurt performance.

    This was posted on http://www.connectionstrings.com

    Check it out. It has some useful information on the behaviour of drivers used with OLEDB and ODBC connections (among other things). I wish I had found that link a little earlier in the debugging cycle!
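For machines you set up repeatedly, the registry tweak above can be scripted. A Windows-only sketch (the key path is the Jet 4.0 one from the tip; nothing is written on other platforms):

```python
import sys

# Sketch: script the TypeGuessRows tweak described above.
# 0 means "scan all rows"; 8 is the Jet default.
JET_EXCEL_KEY = r"SOFTWARE\Microsoft\Jet\4.0\Engines\Excel"
VALUE_NAME = "TypeGuessRows"

def type_guess_setting(scan_all_rows: bool = True) -> tuple:
    """Return (subkey, value name, value) for the registry write."""
    return (JET_EXCEL_KEY, VALUE_NAME, 0 if scan_all_rows else 8)

if sys.platform == "win32":
    import winreg
    subkey, name, value = type_guess_setting(scan_all_rows=True)
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, subkey, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)
```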

    Tuesday, August 22, 2006 12:21 AM
  • Please note that this too (TypeGuessRows) is documented in the topic on the Excel Source in SQL Server Books Online.

    http://msdn2.microsoft.com/en-us/library/ms141683.aspx

    -Doug

     

    Tuesday, August 22, 2006 12:45 AM
  • I recently ran into this (Unicode non-Unicode) problem with Integration Services and discovered that in the Advanced Editor for a source connection you can edit the Data Type properties (in the Input and Output properties tab).  Changing these from the default Unicode data types SSIS sets solved all my problems.

     

     

    Tuesday, September 19, 2006 5:28 AM
  • The package sets the fields as nvarchar in the Advanced Editor for a source connection.  You can edit the Data Type properties (in the Input and Output properties tab) from the default nvarchar (unicode) types. 
    Tuesday, September 19, 2006 5:34 AM
  • Katrina, you are a goddess among men. For anyone out there who still has this problem, Katrina has led me to the solution. I was still a little confused when I read it, so here are the exact steps I took to change it. I was loading from a flat text file.

    1. In the edit page of the file connection manager (not the flat file source), go to the Advanced tab to see the columns. Here you can set the load type for the columns (I wanted non-unicode, so I chose String).

    2. Connect to (let's say) an OLE DB Destination. If you double-click to open the connection, you can choose "metadata" and see the types are correct.

    3. Right-click on the OLE DB Destination and choose "Show Advanced Editor". Then click on the "Input and Output Properties" tab. Expand "OLE DB Destination Input" and you will see two folders ("External Columns" and "Input Columns").

    4. If you open "Input Columns" and choose one of your columns, you will see the correct data type (i.e. DT_STR). Now, if you expand the "External Columns" folder and choose one of your columns, you will see that it thinks an incorrect data type is coming in (i.e. "Unicode string [DT_WSTR]"). This is what is causing your error. It makes the package think that it needs to implicitly convert what it THINKS is coming in (DT_WSTR) to what it wants (DT_STR), which it refuses to do.

    5. In the "External Columns" folder, change all of your data types to the correct incoming types, and all your worries will float away.

    To aid in the discussion on many forums about why this is a problem, this is most definitely a Microsoft error. When you connect the source to the destination, the connection should force the destination to update its input types. Again, thanks to Katrina for leading the way!
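In principle the External Columns fix above can also be scripted by editing the package XML directly. The real .dtsx schema varies by SSIS version, so this is only a sketch against a simplified fragment; the element and attribute names below are illustrative, not the actual package format:

```python
import re

# Sketch: rewrite external-column string types from wstr to str in a
# dtsx-LIKE fragment. The attribute name "dataType" is an assumption
# for illustration; inspect your actual package XML before trying this.

def force_non_unicode(package_xml: str) -> str:
    """Change dataType="wstr" attributes to dataType="str" in a toy fragment."""
    return re.sub(r'(dataType=")wstr(")', r"\1str\2", package_xml)

fragment = '<externalMetadataColumn name="AccountNumber" dataType="wstr" length="20"/>'
print(force_non_unicode(fragment))
```

Back up the package first; hand-editing package XML is unsupported and version-fragile.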
    Thursday, February 01, 2007 12:12 AM
  •  

    I got this solution from one of the web sites and it works great for me.

    Here's how I did it; I followed these steps:

    Between the source and the destination I added a Data Conversion transformation.

    In the Data Conversion I converted the data for the "Date of Update" column from DT_DATE to DT_DBTIMESTAMP and created a column "Date Of Update 1".

    While mapping the columns I mapped the source column "Date Of Update 1" to the destination column "DateOfUpdate".

    Similarly, I was getting the errors for DT_NTEXT and DT_WSTR, and I applied the same procedure there.

    Thursday, August 23, 2007 5:24 PM
  • I am getting the Unicode convert error while copying data from an Access file to SQL 2005.

     

    The Steps I have taken.

     

    1. To copy 350 tables' data from one SQL 2005 environment to another SQL 2005 environment, I took an export of the data from the source SQL 2005 into one Access file, so that I could import this Access file into the destination SQL 2005 environment.

     

    2. While importing this Access file into the destination SQL 2005, I am getting the 'Can't convert Unicode datatype' error.

     

    3. When I looked at my Access file I found that SSIS converted all Varchar data types of my source SQL 2005 into the memo datatype in the Access file. That's why it is giving the 'Can't convert Unicode datatype' error.

     

    4. The same approach was successfully running on SQL 2000, as DTS was not converting VARCHAR SQL 2000 fields to the memo datatype in the Access file.

     

    I have 350 tables to import and each table has at least 10 columns whose data type is VARCHAR in the SQL database.

     

    I read the full thread here, but did not understand what (clear steps) I should do in this situation.

     

    Do I need to change each and every column datatype in the SSIS package while exporting data into the Access file?

     

    Help me....please

    Thursday, September 13, 2007 3:38 AM
  •  

    mitesh433

    Did you have any luck with this issue? I am experiencing a similar problem but am unable to resolve it.

     

    best regards

     

    micheal

    Tuesday, October 30, 2007 11:46 AM
  •  

    I had a similar issue transforming data from Excel. I would recommend using a DTS 2000 Package; for me it worked like a charm.

     

    Generally I think eliminating "risky" data type conversions and truncations explicitly is a good thing. But I wish there was an easy, straightforward way to turn it off and on.

    Friday, November 02, 2007 7:36 PM
  • The philosophy of Integration Services is that the user needs to know what's being done with his or her data...so no more implicit conversions. While this makes for a little more "work" on the part of the package developer, it gives added confidence in the results.

     

    Note that in many cases, if you let the Import and Export Wizard build the basic package for you, then customize it, the Wizard does the grunt work of creating simple Data Conversion Transformations of this sort.

     

    -Doug

     

    Friday, November 02, 2007 9:18 PM
  • While I am receiving these same error messages "Column "ORDER_ID" cannot convert between unicode and non-unicode string data types.", my situation is such that I am not even dealing with unicode data.  I am pulling data from an Oracle 10g db.  The columns in the table are not unicode.  The data flow does not seem to have any reference to anything unicode either.  My DFT works on 1 machine, but it won't work on the soon-to-be production machine nor my development machine.

    Thursday, November 29, 2007 8:37 PM
  •  

    The transform in question clearly thinks the input is unicode data. If this is a source, then often the connection provider gives limited information about data types and unicode is assumed even if you think otherwise.

     

    If you open the UI, and go to the screen where you map columns between input and output then you can hover over the column in question and see the data types quite easily.

     

    The fact that it differs between machines means something is different -

    • the package itself (wrong version)

    • different database schema version

    • different driver version

    Friday, November 30, 2007 9:55 AM
  • Tim (and Katrina),

     

    I had the same problem of DT_STR switching (in error) to DT_WSTR.  Unfortunately, although SSIS allows me to change the datatype on columns from DT_WSTR to DT_STR in the External Columns folder, it doesn't save the change when I hit the OK button.  As soon as I return to the Advanced Editor GUI and observe the External Columns that I changed, they are back to showing DT_WSTR ... arrrrrgghhh.  This has been soooo frustrating.  Man, just to do a simple extract of a SQL Server table to dBase has been an eye-opener.  Can't wait until 2008 comes out. I guess for venting purposes, I should also mention that SSIS also changed the field length when switching the data types.  So if the field is showing DT_WSTR (16) and I switch to DT_STR it changes the length to a default of 50 .... now why would it do such a thing??? If it had any sense it would stay the same or be cut in half.  double arrrrrghhhh.

     

    Joe

     

    Saturday, February 16, 2008 8:57 PM
  • Anonymous,

    It seems to me that a lot of the time when you try to override the settings such as DT_WSTR to DT_STR, SSIS assumes you have not chosen wisely, and tries to change them back again on the assumption that you couldn't really have meant to do that, instead of giving you an error or warning message indicating that the choice that you made conflicts with the data type settings on the input or output properties that you just changed.  How about giving me the option to choose a global setting for my packages to use a default of non-unicode instead of a default of unicode?  I would prefer that to having the property settings constantly reverting back to unicode every time I make a change to a task that causes SSIS to resync the data type settings between tasks.  I would relate this issue to installing software and having the typical install versus custom.  We don't always need to choose the custom setting, and we shouldn't all be forced to use unicode if our business model does not require it.  My typical install would be to assume non-unicode unless explicitly set.

     

    I'm sure these comments will bring down a barrage of reasons why the problem is me and not SSIS.  I admit it, I'm not as smart as you guys who write the books and run the blogs.  However, there still seem to be a lot of people out here having the same issues trying to accomplish something that took 5 minutes in DTS and now takes hours instead of minutes in SSIS.  The same issues that were out here a year ago will still be here a year from now.  I know I could read all the books and spend thousands of dollars on an SSIS class, but I'm thinking about waiting for the next release so I can relearn it all over again with 2008, and again in 2010, 2012....  By the way, why did it take MS so long to come up with any kind of serious curriculum for SSIS development, almost 18 months after its release? 

     

    Regardless, I do appreciate all the help from you guys and gals who do understand how all this works and take the time to try to explain it to the rest of us.  There's lots of good information out here.  I just don't buy it that SSIS is progress for shops that have limited time and resources to accomplish day-to-day tasks. 

     

    Ted

    Friday, February 22, 2008 10:02 PM
  • Hi,

    First of all I entirely agree that enforcing explicit transformation is a huge step forward from implicit conversion. Basically this is enforcing short-term pain for long-term gain; I've spent many hours debugging code caused by implicit conversions.

     

    However, like many developers dealing with data sources in large organisations, I have to deal with external data sources with a very large number of fields, and most of these fields have a similar data type.

     

    Is there not a utility or command I am not aware of which generates this explicit transformation (casting) for you, in this particular case the DT_STR to DT_WSTR conversion?

     

    I understand that the VB6 to VB.NET code converter does create explicit conversion code and warns the developer where these explicit conversions took place. A report of the explicit conversions generated by the migration tool is then produced in the Output window within Visual Studio.

     

    Why not have a similar tool within SSIS which transforms all the data types from the data source into the destination storing the explicit conversion details within the Derived Column feature of SSIS? The developer could define the nature of the transformation to be applied over multiple data types.

     

    Of course I accept that not all implicit conversions can be automatically converted into explicit transformations, but I understand some can.

     

    This would potentially save me (and others) a huge amount of time.

     

    Thanks,

     

    Kieran.

     

     

    Tuesday, March 04, 2008 5:12 PM
  •  Kieran Wood wrote:

     

     

    Why not have a similar tool within SSIS which transforms all the data types from the data source into the destination storing the explicit conversion details within the Derived Column feature of SSIS? The developer could define the nature of the transformation to be applied over multiple data types.

     

    Sounds like a great idea. You should suggest it at:

    http://connect.microsoft.com/sqlserver/feedback

     

    cheers

    Jamie

    Tuesday, March 04, 2008 5:17 PM
  • Thanks for your comments Jamie,

     

    I intend to complete a small project which is a good example of where implementation of my suggestions would save a lot of  time. This would enable me to articulate better in detail what my suggestion should be, including step by step screen shots of how I think this should work. In the meantime if anybody else thinks I have a good idea please respond to this thread or e-mail me.

     

    Thanks again,

     

    Kieran.

    Tuesday, March 04, 2008 7:22 PM
  •  Kieran Wood wrote:

    First of all I entirely agree that enforcing explicit transformation is a huge step forward from implicit conversion. Basically this is enforcing short-term pain for long-term gain

     

    This is NOT short-term pain.  I need to import over 350 tables.  To explicitly convert thousands of columns from unicode to non-unicode will take weeks.  The only other option is to drop and recreate the destination tables, which will take a fair amount of time as well. 

     

    It does not make sense to force use of unicode when it isn't needed, and burn excess disc space just to avoid tens of thousands of mouse-clicks doing the explicit type conversions.

     

    This really is a productivity issue.

     

     

    Monday, March 10, 2008 10:33 PM
  •  unavailable wrote:

    This is NOT short-term pain.  I need to import over 350 tables.  To explicitly convert thousands of columns from unicode to non-unicode will take weeks. [...] It does not make sense to force use of unicode when it isn't needed, and burn excess disc space just to avoid tens of thousands of mouse-clicks doing the explicit type conversions. This really is a productivity issue.


    In the context of "short-term pain," which is going to last longer? Your development cycle, or the lifespan of your ETL application. I don't know about your BI projects, but mine are generally used for much longer than it takes to build them, and I would rather buy a little pain during development for great runtime performance.

     

    While I agree that the strong data typing enforced by the SSIS data flow can negatively impact productivity, suggesting that it will "take weeks" is either a gross exaggeration or a sign of lack of basic SQL Server knowledge. If you're importing from tables that contain unicode strings and you need them to be non-unicode strings, you can easily do something like this. I'm using the sample AdventureWorks database for the example, but with a minor effort you can modify and apply it to your own database.

     

    Take a look at this query that takes advantage of SQL Server's built-in metadata through the INFORMATION_SCHEMA views introduced in SQL Server 7.0: 

     

    Code Snippet

     

      SELECT CASE
                WHEN DATA_TYPE <> 'NVARCHAR' THEN COLUMN_NAME
                ELSE 'CAST (' + COLUMN_NAME + ' AS VARCHAR (' +
                   CAST (CHARACTER_MAXIMUM_LENGTH AS VARCHAR(10)) +
                   ')) AS [' + COLUMN_NAME + ']'
             END + ',' AS [SQL_Starter]
        FROM INFORMATION_SCHEMA.COLUMNS
       WHERE TABLE_NAME = 'Department'
             AND TABLE_SCHEMA = 'HumanResources'

     

    When you execute this simple SELECT statement you get this as a result:

     

    SQL_Starter
    ------------------------------------------------
    DepartmentID,
    CAST (Name AS VARCHAR (50)) AS [Name],
    CAST (GroupName AS VARCHAR (50)) AS [GroupName],
    ModifiedDate,

    (4 row(s) affected)

     

    And with 20 seconds cleanup you get this:

     

    Code Snippet

     

      SELECT DepartmentID,
             CAST (Name AS VARCHAR (50)) AS [Name],
             CAST (GroupName AS VARCHAR (50)) AS [GroupName],
             ModifiedDate
        FROM HumanResources.Department

     

     

    Yes, it's frustrating that SSIS doesn't automatically do this work for us, but I for one am glad that the SSIS team built a data flow engine that performs so well instead of building another "easy to use, but performs like..." tool like DTS.


    And for what it's worth, the Import/Export Wizard in SSIS 2008 does this for you.
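The "20 seconds cleanup" step itself can be scripted if you have many tables to do. A small sketch that turns the SQL_Starter rows from the metadata query into the finished SELECT (the helper name is mine, not from the thread):

```python
# Sketch: assemble the SQL_Starter rows produced by the metadata query
# above into the final SELECT statement, automating the manual cleanup
# (joining the rows and stripping the trailing comma).

def build_select(starter_rows, table):
    """Join the generated column expressions and drop each trailing comma."""
    columns = ",\n       ".join(row.rstrip(",") for row in starter_rows)
    return f"SELECT {columns}\n  FROM {table}"

rows = [
    "DepartmentID,",
    "CAST (Name AS VARCHAR (50)) AS [Name],",
    "CAST (GroupName AS VARCHAR (50)) AS [GroupName],",
    "ModifiedDate,",
]
print(build_select(rows, "HumanResources.Department"))
```

Run the metadata query once per table, feed each result set through this, and you have the explicit-cast source queries without the per-column clicking.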

     

    Tuesday, March 11, 2008 12:19 AM
  • The cast() might work in some cases, but my data source is not SQL Server, and is using an ODBC connection with an ADO.Net data reader.  I don't think this will work for my case.  It will be useful in cases where the data source is SQL Server.

     

    It would take weeks if I used a Data Conversion transform due to the number of columns; this is a legacy database that is not normalized, and tables probably have an average of 25-30 columns each - most are char.  Count the number of clicks involved in adding the columns to a Data Conversion transform, picking the output data type, the length, and then remapping manually to the data destination.  This was not an exaggeration.

     

    I am not having a problem with performance on DTS, nor am I having any problems with implicit type conversions converting data incorrectly.  For me, this is a fix for something that wasn't broken.

     

    The benefits I see are the ease of moving a package from dev to test to prod, and maintaining configuration parameters external to the package. 

     

    I just wish the choice of disabling implicit type conversions had been left to me rather than being dictated to me.

     

     

    Tuesday, March 11, 2008 10:28 PM
  •  unavailable wrote:

    I am not having a problem with performance on DTS, nor am I having any problems with implicit type conversions converting data incorrectly.  For me, this is a fix for something that wasn't broken.

    It was broken for others, sorry MS did not build for you only.

     

    The usability is "broken", as this is not an uncommon issue and is painful, but the change in design is most certainly correct.

     

     

     unavailable wrote:

    ...using an ODBC connection with an ADO.Net ....

     

    Back to Matthew's point, can we try and be constructive, deal with the issue, and automate the work? You could build on the ADO.Net Source Sample, build your own custom source adapter, and easily produce a component that always returns data types of your preferred type. There is actually some extra information in the ADO.Net GetSchemaTable call that would often allow you to distinguish between the two string types, but unfortunately MS took the simple route when building the Data Reader Source, and uses only the generic type information, not the provider-specific information that is available.

    Tuesday, March 11, 2008 11:29 PM
  • Has this issue been resolved with SP2? I am also getting the "cannot convert between unicode and non-unicode string data types" error. I cannot change the output columns either, since it does not save my changes.

    Tuesday, April 01, 2008 5:48 PM
  •  HappyOne wrote:

    Has this issue been resolved with SP2? I am also getting the cannot convert between unicode and non-unicode string data types. I cannot change the output columns either since it does not save my changes.

    Has what been fixed? I've re-read the first 2 pages of this thread (I don't have time to read all of it) and didn't see anything that suggested there is a bug.

     

    -Jamie

    Tuesday, April 01, 2008 7:07 PM
  • I found this article very helpful with this issue:

     

    http://www.mssqltips.com/tip.asp?tip=1393

    Tuesday, April 08, 2008 10:15 PM
  •  MatthewRoche wrote:

    While I agree that the strong data typing enforced by the SSIS data flow can negatively impact productivity, suggesting that it will "take weeks" is either a gross exaggeration or a sign of lack of basic SQL Server knowledge. If you're importing from tables that contain unicode strings and you need them to be non-unicode strings, you can easily do something like this. [...] And for what it's worth, the Import/Export Wizard in SSIS 2008 does this for you.


    I'm sorry but I can't agree with your comments on this, MatthewRoche; in fact you've made me quite angry, I'm sorry to say.

     

    I have a similar situation to many people on this thread where I am going to have to spend weeks, if not months, to convert my DTS packages to SSIS packages because of the implicit/explicit unicode and non-unicode issue. This, let me assure you, is in no way down to a "gross exaggeration" or "lack of basic knowledge of SQL Server". We pull data from Informix and Ingres databases on unix boxes, so your "easily do something like this" bit of SQL you posted to convert the types won't work on either of those RDBMSs, so I don't know of any other option but to go through hundreds of individual columns and convert them.

     

    I understand that SSIS wasn't written specifically for our organization and it is a great tool, but this difference in functionality has ultimately led to frustration in mine and seemingly many other organizations, as the path from DTS to SSIS won't be trodden quickly. There are a lot of people who use SSIS but do not do so regularly enough to have a well-rounded knowledge of it, because we have other responsibilities in our day-to-day work and just don't get the time. 

     

    Instead of condemning and quite frankly bullying other people's opinions, maybe you should take a walk across the world's shop floors once in a while, because this is what people are doing out in the real world of 24/7 business ops. If they weren't, then this thread wouldn't be so large, would it?

     

    So my suggestion for the next version of SSIS is: give people the choice to specify whether they would like to implicitly convert data or not, and then we'll all be happy, because in this case it sounds like people want the best of both worlds.

     

    Thank You

    Thursday, June 05, 2008 11:39 AM
  • Bobbins, I couldn’t agree with you more.

    I've been a SQL developer for 8 years so I don't think I am lacking in "basic SQL knowledge", but this has been a royal pain. I too have to convert about 50 DTS packages with 5-10 tables each and thousands of columns to click through. I tried to hack the XML code for the DTS package but with no success. All the alternatives seem painful:

    1) I can manually do a wizard for each table (because selecting multiple tables from ODBC is also not possible) then combine them.
    2) Import/Migrate the DTS packages, not taking advantage of SSIS and just delaying the unavoidable.

    Darren, thanks for the suggestion on creating your own custom data reader; that is the most constructive advice I have seen.

    One thing I read again and again from those in favor of this explicit conversion: "The decision was made". How was this decision made? Was it made by surveying the users, or just from the developers' preference? Are all these people on this thread, and on the many other threads on this forum and on the internet, a minority? Even if we are a minority, wouldn't it have been better to give a choice between implicit and explicit conversions and let us minorities have the choice? The combination of forcing explicit conversion and assuming external ODBC sources are Unicode when not enough info is known (again, a choice here would suffice) is painful.

    This isn't some philosophical debate, this is what we do, we live this, we use your product everyday. 

    Friday, July 18, 2008 2:23 PM
  • SSIS is such a POS. I can't believe they took something so intuitive and useful and completely ruined it. Ok, there's my vent for the day. Now, I've been reading all of these posts and haven't read a solution yet. Everyone seems to be stuck on why microsoft created this piece of *** tool instead of responding to the questions with a little detail. For example, if I drag a data conversion task into my pkg, then what do I do? I'm just trying to import an Excel spreadsheet for god's sake, why do I now have to take all this time to convert every column, how ridiculous!! I could've been done in minutes with DTS and now it's taking me hours and it still doesn't work. Yeah really good move, I have all the time in the world to spend on this!! (I guess I wasn't done ranting). Anyway, I've never seen some of these datatypes listed and don't have any idea what to do. There is an input column drop down, ok I get this, I select the column I want to convert, right? Then there is something called output alias. What the *** is this and what do I put in there? And then data type. Is this the data type of the spreadsheet column, or what I want to convert it to? And if it is the data type of the spreadsheet, how am I supposed to know what that is? And then there is a code page box. What the *** is this and what do I put in there? Please someone help me with this thing. I feel like they tried to make this tool for a web developer with no concern for its effect on a database person.

    Friday, August 29, 2008 6:21 PM
  • DarrenSQLIS, I've worked with DTS a long time and have never shot myself in the foot by ignoring unicode types. Can you explain what you're talking about, because you still haven't convinced me that this lousy tool is better.

     

    Thanks,

     

     

     

    Friday, August 29, 2008 6:25 PM
  • Jamie,

    What's superior about it?

     

    Friday, August 29, 2008 6:30 PM
  • How do you change the connection string? I can't find it anywhere. The Excel source only shows me the path of the file and the version of Excel.

     

    Friday, August 29, 2008 6:41 PM
  • This didn't work; I got the same error. Here is what I put in the ConnectionString:

    Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\SSMS Solutions\Toshiba\CompatabilityData.xls;Extended Properties="EXCEL 8.0;HDR=YES;IMEX=1";

     

    any suggestions would be appreciated, Thanks,

    Friday, August 29, 2008 6:49 PM
  • How do you get to the Advanced Editor for the source connection exactly??

     

    Friday, August 29, 2008 6:51 PM
  • Wow, how inappropriate and rude. I agree wholeheartedly with unavailable. You may want to allow for the possibility that you're wrong. There are many issues with SSIS and it is not necessarily better than DTS just because it works for your situation. unavailable's post is right on the money. I've been working with DTS for years and not once did I experience any problems with implicit conversion. So the problem you state that SSIS solves didn't even exist for many of us. So just because you had problems with implicit conversion doesn't necessarily justify making this change, as it obviously has caused many people problems. Anyway, you need to lighten up on people. There is a significant amount of frustration involved with this issue, and rightly so. It really was a bad choice on MS's part in my opinion, but I'm willing to admit that if it solved someone's problem with implicit conversion, there's a possibility my opinion isn't totally correct. Even though implicit conversion was never a problem for me in the 6 years I worked with 2000.

    Friday, August 29, 2008 7:23 PM
  • HEAR, HEAR!!!! GREAT POST!!!

    Friday, August 29, 2008 7:25 PM
  •  

    Hi Harvardsent,

    Please could you include the contents of the post you are replying to, especially if you are going to post so many additions.

    Also, could you condense all your comments into a single reply.

     

    Thanks

    Jamie

    Friday, August 29, 2008 7:45 PM
  • I've encountered this and similar errors trying to copy data from a Firebird database to SQL Server 2005.  It refuses to handle data of type memo, text or vartext unless I create a table with ntext columns (I'd like to use char, nchar, varchar, and nvarchar).  Nothing I've tried works: not fiddling with the metadata (for days, trying every conceivable combination, and some combinations which are not even conceivable), not inserting a data conversion task in between, not casting the original data types to other data types in the SQL query.  The only thing I've managed to get to work is using an intermediate table with ntext columns, sending the Firebird data to that, then having SSIS invoke a stored database procedure to copy the data from the intermediate table to the final table with the proper data types, thus removing data conversion from SSIS to SQL Server 2005, which appears to be far less cantankerous.  This is ugly, but better ugly than not working at all.  (As everyone has noted, the same stuff worked just fine in DTS.)

     

    Is there really no way around this SSIS failure?

     

     

    Monday, September 22, 2008 7:38 PM
  • I had already tried doing exactly what this page recommended, and it did not work at all.

     

     

    Monday, September 22, 2008 7:58 PM
  •  

    I wrote:

     

    "I had already tried doing exactly what this page recommended, and it did not work at all."

     

    Sorry -- I didn't realize the context would not automatically be provided.  This is the page I was referring to:

     

    http://www.mssqltips.com/tip.asp?tip=1393

     

    As I said, I'd already tried this, to no avail.

     

     

    Monday, September 22, 2008 8:07 PM
  • Thank you Katrina!  Saved the day. 

     

    Tuesday, October 07, 2008 3:48 PM
  • This is really bad news.

    How would you feel if you had to convert 190 fields for each table and you have 10 tables?

    Is there a way to do this programmatically using script component?

    Thanks

     

    Wednesday, December 03, 2008 11:10 AM
  • I ran into this problem also when trying to import from an Excel file.  If you didn't have your text in Unicode in your original database, then I have a simple solution I have found (at least if you have Excel 2007... I don't know if this option is there in previous versions) that works.

    What I did was:

    1. Open up the Excel data file in Excel itself.
    2. Selected "Save As >" from the funky new Office Button and Chose "Excel 97-2003 Workbook".
    3. Changed the name of the file.
    4. Clicked the "Tools" button in the lower left corner and selected "Web Options..." from the menu.
    5. Switched to the "Encoding" Tab.
    6. Selected "US - ASCII" in the "Save this document as:" drop-down menu.
    7. Saved the file.

    Then I used the new file I generated and did not encounter the conversion errors.

    • Proposed as answer by N. Huddleson Friday, December 19, 2008 7:45 PM
    Friday, December 19, 2008 7:44 PM
  • Nate,

    Thanks. Finally, a simple solution that mere mortals can understand and use.

    I didn't understand half the stuff that was said in this thread, and I shouldn't have to.

    In previous versions of SQL Server client tools that I used, I just did an import from Excel and it worked. That's how things are supposed to work: simply and intuitively. If you have to read a book to figure out how to do something simple, it's clear that the manufacturer is clueless.

    MS seems to be making the simple complex of late...their engineering team has lost some basic concept regarding software usability. It's almost as if they are trying to permanently alienate their users and don't really want our business.

     

    Saturday, January 24, 2009 3:04 AM
  • OK, my thank you was a bit early for you, Nate. Your simple solution didn't work for me. But I do appreciate the thought, my friend.

    Back to the #$@#! drawing board.

    Saturday, January 24, 2009 3:13 AM
  • My solution was to boot up an old laptop that has an old version of SQL Server Enterprise Manager.

    I imported my Excel file into SQL Server with no problems.

    Bit of a pain to have to do this every time I want to upload Excel to SQL Server, but I can't figure out how to do what I need to do in the SQL Server 2005 client tools.

    • Proposed as answer by Howard Ryan Monday, April 20, 2009 9:45 AM
    Saturday, January 24, 2009 3:27 AM
  • Hi,

    I am trying to create the fuzzy grouping transformation using VB.NET, but the output alias for the similarity score does not appear. So I tried to add the similarity output columns to the Output Columns of the fuzzy grouping transformation, but I encountered a problem with the following exception:

    Exception from HRESULT: 0xC0204019


    My code is,


    Dim fuzzyGrpInputCollection As IDTSInput90 = fuzzyGroupTrans.InputCollection(0)
    Dim fuzzyGrpInputCols As IDTSInputColumnCollection90 = fuzzyGrpInputCollection.InputColumnCollection

    Dim fuzzyGrpOutput As IDTSOutput90 = fuzzyGroupTrans.OutputCollection(0)

    For Each inputColumn As IDTSInputColumn90 In fuzzyGrpInputCols

        If (inputColumn.Name = "Employee Name") Then

            Dim fuzzyOutputCol As IDTSOutputColumn90 = _
                fuzzyGroupTransInstance.InsertOutputColumnAt(fuzzyGrpOutput.ID, 0, _
                "_Similarity_" & inputColumn.Name, "Similarity score " & inputColumn.Name)

            Dim fuzzyGrpCustPropCollec As IDTSCustomPropertyCollection90 = _
                fuzzyOutputCol.CustomPropertyCollection

            fuzzyOutputCol.CustomPropertyCollection(0).Value = 2
            fuzzyOutputCol.CustomPropertyCollection(1).Value = inputColumn.ID

            ' These properties are read-only, so assigning them raises an error:
            ' fuzzyOutputCol.DataType = Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_R4
            ' fuzzyOutputCol.CodePage = 10

            ' Here I get the Exception from HRESULT: 0xC0204019
            fuzzyOutputCol.SetDataTypeProperties(Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_R4, _
                ("_Similarity_" & inputColumn.Name).Length(), inputColumn.Precision, _
                inputColumn.Scale, inputColumn.CodePage) ' Code page: 1252

            fuzzyGroupTransInstance.SetOutputColumnProperty(fuzzyGrpOutput.ID, inputColumn.ID, _
                "SourceInputColumnLineageId", inputColumn.ID)

        End If

    Next
     


    Please help me. I am spending a lot of time on this issue.



    Thanks,


    Jaffar Rabeek

    BI Developer

    • Proposed as answer by Howard Ryan Monday, April 20, 2009 9:43 AM
    • Unproposed as answer by jwelchModerator Monday, August 03, 2009 10:27 PM
    Friday, March 13, 2009 6:59 AM
  • Hi,

    My solution, for what it's worth.......

    Instead of directly creating an Excel file - write out a delimited text file but suffix it with a file type of .xls

    If Excel is set to use the particular delimiter that you've used, then when you open the file it should automatically open in Excel with the output correctly delineated into the right columns.

    This is a horrible bodge and some data risks getting mangled, for example text strings with leading zeros - but - it sort of works.

    Hope this helps someone.

    Monday, April 20, 2009 9:48 AM
  • Did anyone get this to work? 

    http://www.mssqltips.com/tip.asp?tip=1393

    I can't get the Data Conversion task to work!!??!! I've spent a day on this and I'm still stuck.

    I've got the new input column (created by data conversion and converted to string DT_STR) mapped to the destination column but it still gives the unicode to non-unicode error conversion. If anyone got this to work could you please document the steps.

    This is horrible.....I've read this entire thread and it is obviously a common problem without a solid solution. The easy way out would be to convert the database fields to nvarchar, nchar, etc but I would really like to figure this out and do it properly.

    Thursday, May 07, 2009 9:25 PM
  • Indeed,

    I started on this issue yesterday, and after reviewing the full thread, no luck so far.  My issue is trying to read in a comma-delimited flat file (.csv), and my errors mirror the unicode errors above.  If someone could walk me through the steps of how to convert either the importing data or the dimension/fact tables that are to receive this data load, I'd be grateful. BTW, I am far from a techie as to database systems; I'm trying to use SQL Server 2008 to create cubes for analysis.  I work with SAS, and I converted a SAS dataset to a .csv file to be read into this software.
    Thursday, May 14, 2009 5:35 PM
  • I have been doing a variety of conversion jobs for 25 years and SSIS is without doubt the most flawed piece of <your favorite word here> I have ever seen.  All the defenders with their arrogant "you know nothing about database design and operation" should spend a few years working in the real world.  It should NOT BE NECESSARY to spend a year learning to do the simplest operations, the product is designed very badly, period.  It's all very well to talk about 'reading documentation' but in practical terms, quite a few packages from microsoft and other companies do not require such an investment of time (=money) to get very basic things done.  So, can this package be used?  I'm sure it can.  With an investment of time that makes it worth it?  Not a chance.

    The reason I was reading this thread is that, two years later, I need to import an Excel spreadsheet, and I see that there is still no easy way. You either put in conversions for 40 items or else find some other way to transfer the data. I've been waiting to see if MS releases a badly needed service pack (or else just unveils a PROPER conversion package that costs money).

    Tuesday, July 07, 2009 2:07 PM
  • Chris - you've taken the words out of my mouth. I agree with you 100%: the current implementation of M$ database tools is poor, flawed and, as you said above, a piece of <curse word>.

    I don't understand why it has to be so difficult to create the simple import tasks that are so common in the real world.
    Monday, August 03, 2009 10:07 PM
  • "Katrina, you are a goddess among men. For anyone out there who still has this problem, Katrina has led me to the solution. I was still a little confused when I read it, so here are the exact steps I took to change it. I was loading from a flat text file. In the edit page of the file connection manager (not flat file source), go to the advanced tabs to see the columns. Here you can set the load type for the columns (I wanted non-unicode, so I chose String). Then you connect to (let's say) an OLE DB Destination. If you double click to open the connection, you can choose "metadata" and see the types are correct. Right click on the OLE DB Destination, and choose "Show Advanced Editor". Then click on the "Input and Output Properties" tab. Expand "OLE DB Destination Input" and you will see two folders ("External Columns" and "Input Columns"). If you open "Input Columns" and choose one of your columns, you will see the correct data type (ie. DT_STR). Now, if you expand the "External Columns" folder and choose one of your columns, you will see that it thinks an incorrect data type is coming in (ie. "Unicode string [DT_WSTR]"). This is what is causing your error. This makes the package think that it needs to implicitly convert what it THINKS is coming in (DT_WSTR) to what it wants (DT_STR), which it refuses to do. In the "External Columns" folder, change all of your data types to the correct incoming types, and all your worries will float away.

    To aid in the discussion on many forums about why this is a problem, this is most definitely a Microsoft error. When you connect the source to the destination, the connection should force the destination to update its input types. Again, thanks to Katrina for leading the way!"


    Just a heads-up: sometimes you cannot save the configuration you made in the destination task. If you are facing that problem, go to the destination task's properties and change "ValidateExternalMetaData" to false.
    Sunday, August 16, 2009 10:22 AM
  • Tim Rupe's solution works perfectly!
    Friday, August 28, 2009 3:49 PM
  • Wow - a four year old thread on this is amazing. Seems to me that the problems caused by this outweigh the ones fixed and there should be an easier way to deal with this.

     Here is something that looks a bit new.

    I have SQL 2008 Enterprise on a server in our production environment and developed a package that imports 20 tables from SQL 2000, in order to format the fields for SSRS and keep historical data, since the SQL 2000 database only keeps 3 months' worth of transactions.

    It works fine.

    We built a dev environment (all virtual) and ran up a stand-alone SQL Server, again SQL 2008. I restored my database to it, and we restored the SQL 2000 database to another server.

    I copied the folder structure for the package to dev, made the appropriate changes to the configuration for server and database names and ONE of the data flow tasks gets this error now.

    The only difference I can divine is that I'm using a Full version of Visual Studio 2008 in dev and SQL Server Business Intelligence Development Studio on the prod server.
    Wednesday, January 27, 2010 3:10 PM
  •  

    It looks like Microsoft has fixed this in 2008, but NOT in 2005. 

    Go to the link below,
    vote for the thread,
    and suggest a 2005 fix as well!

    https://connect.microsoft.com/SQLServer/feedback/details/337679/ssis-cannot-convert-unicode-to-non-unicode?wa=wsignin1.0#tabs

     

    Thursday, April 01, 2010 8:17 PM
  • SSIS package that will run in BIDS but not as a scheduled job (SQL 2005 SP2).

    I have created an SSIS package that pulls two nvarchar fields from a SQL table and exports them to a .csv file. I got the "cannot convert between unicode and non-unicode data type" errors when I ran the package in BIDS. Following the threads, I cast these to varchar in my SELECT and was able to create the desired output file with no Data Conversion step between the connection and output tasks.

    I then set the package up as a scheduled job and had it fail because of the "cannot convert etc." error. I then went back to the package and added a Data Conversion step as per the threads. Again this worked fine in BIDS but failed with the "cannot convert etc." error. Does anyone have a solution for this?
    Thursday, April 08, 2010 3:30 AM
  • I just tested importing an Excel spreadsheet using the SSIS 2008 Import/Export Wizard.

    I changed the output type for FirstName & LastName from nvarchar(max) to varchar(max).

    I saved the package and run it successfully in BIDS.

    The wizard inserted Data Conversion steps for FirstName & LastName from DT_NTEXT to DT_TEXT.

    The following recent thread is related to unicode/non-unicode conversion in T-SQL:

    http://social.msdn.microsoft.com/Forums/en-US/transactsql/thread/4fbbb77a-4aec-4e10-b26e-822102ce7ae9

    The Hungarian language (see the demo following) contains the two Unicode characters Ő & Ű; the rest are non-Unicode. If you are not a database expert, just a mere mortal, you notice this when different software products (like email) convert them to O & U.

    -- The Ő & Ű UNICODE characters are converted to O and U NON-UNICODE
    DECLARE @HUAlphabet varchar(35)='AÁBCDEÉFGHIÍJKLMNOÓÖŐPQRSTUÚÜŰVWXYZ'
    DECLARE @nHUAlphabet nvarchar(35)=N'AÁBCDEÉFGHIÍJKLMNOÓÖŐPQRSTUÚÜŰVWXYZ'
    
    SELECT @HUAlphabet
    SELECT @nHUAlphabet
    -- AÁBCDEÉFGHIÍJKLMNOÓÖOPQRSTUÚÜUVWXYZ
    -- AÁBCDEÉFGHIÍJKLMNOÓÖŐPQRSTUÚÜŰVWXYZ
    
    SELECT CONVERT(varbinary(2), N'Ő'), CONVERT(varbinary(2), N'Ű')
    -- 0x5001	0x7001


    Kalman Toth SQL SERVER 2012 & BI TRAINING
    New Book: Beginner Database Design & SQL Programming Using Microsoft SQL Server 2012


    • Edited by Kalman Toth Wednesday, September 26, 2012 12:01 AM
    Thursday, April 08, 2010 6:32 AM
  • Try CASTing to NVARCHAR and NCHAR (Unicode)
    instead of VARCHAR and CHAR (ASCII).
    That seemed to be the best solution for me.

    Others have suggested adding Data Conversion tasks in SSIS, 
    but I found that to be more time-consuming.

    Besides, if you do your converts to Unicode in the T-SQL, then you can use and re-use SSIS' ability to generate data flows quickly... after layout changes, etc.

    To know what columns need this: 
    in SSIS data flows, click on the green arrow and select "Metadata" on the upper left. 
    Any column that shows datatype "DT_STR" will need to be converted. 
    After CASTing/CONVERTing to NVARCHAR, the datatype in SSIS will show as "DT_WSTR".
    (This is useful if you have "delay validation" set to true.)
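
    The CAST approach above can be sketched in T-SQL. This is a hypothetical example: the table name dbo.Transactions and the column lengths are stand-ins, though AccountNumber and Narration are column names from the errors earlier in this thread.

    ```sql
    -- Hypothetical source query for an OLE DB Source component.
    -- Casting to NVARCHAR makes the columns arrive in the data flow as
    -- DT_WSTR (Unicode), so no Data Conversion task is needed downstream.
    SELECT
        CAST(AccountNumber AS nvarchar(20))  AS AccountNumber,
        CAST(Narration     AS nvarchar(255)) AS Narration
    FROM dbo.Transactions;
    ```

    The advantage of doing the cast in the source query, as the post above notes, is that the SSIS metadata is already correct when the data flow is generated, so layout changes don't require redoing conversion components.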

    Wednesday, May 26, 2010 4:04 PM
  • I'm not sure how it was anything inconsistent we did in this case.  It seems like varchar should be a valid destination for an Access text field.  I did as suggested (thanks carlweb and mizuno) and changed the destination to nvarchar instead of varchar, and it worked.  I think of this as a destination type limitation, though, instead of an error due to being inconsistent.  I like using varchar for a trimmer db, but I can use nvarchar on the fields I need to use for Access text field destinations.

    Is SQL Server 2005 SSIS supposed to have the limitation that you can only import Access text into an nvarchar, not a varchar?  I was able to load manually OK, but when programming SSIS I had to change the varchars to nvarchar or I got the error on this thread.  It is strange that the behavior for the exact same datasets differs between a manual import and a data flow task.  For this project I will change the destination to nvarchar, but I will look into it within the flow when I have more time.

    Thank you for the help.

    • Edited by Aurelious Wednesday, June 09, 2010 4:13 PM grammar correction
    Wednesday, June 09, 2010 4:09 PM
  • Thanks for the reply Jamie, I have tried using both of these and still get a similar conversion error (as below on a similar table).  I have tried converting to different datatypes but still get the same conversion error.

    The same database had no problem using DTS on 2000.

    I must be missing something that is probably blatantly obvious, but do you have any other ideas?

    TITLE: Package Validation Error
    ------------------------------

    Package Validation Error

    ------------------------------
    ADDITIONAL INFORMATION:

    Error at Data Flow Task [SQL Server Destination [4395]]: The column "TransactionKey" can't be inserted because the conversion between types DT_NTEXT and DT_WSTR is not supported.

    Error at Data Flow Task [SQL Server Destination [4395]]: The column "AccountNumber" can't be inserted because the conversion between types DT_NTEXT and DT_WSTR is not supported.

    Error at Data Flow Task [SQL Server Destination [4395]]: Column "TransactionCode" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [SQL Server Destination [4395]]: The column "Acmount" can't be inserted because the conversion between types DT_R4 and DT_CY is not supported.

    Error at Data Flow Task [SQL Server Destination [4395]]: The column "TransactionDate" can't be inserted because the conversion between types DT_DATE and DT_DBTIMESTAMP is not supported.

    Error at Data Flow Task [SQL Server Destination [4395]]: Column "Narration" cannot convert between unicode and non-unicode string data types.

    Error at Data Flow Task [DTS.Pipeline]: "component "SQL Server Destination" (4395)" failed validation and returned validation status "VS_ISBROKEN".

    Error at Data Flow Task [DTS.Pipeline]: One or more component failed validation.

    Error at Data Flow Task: There were errors during task validation.

     (Microsoft.DataTransformationServices.VsIntegration)

    ------------------------------
    BUTTONS:

    OK
    ------------------------------

    Hi Jordan

    In your destination table, if data types are nvarchar(x), change them to varchar(x), then run your SSIS package again, it should work.
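
    If changing the destination schema is acceptable, that is a one-line ALTER per column. A hypothetical sketch (table name and lengths are placeholders, not taken from Jordan's actual schema):

    ```sql
    -- Switch destination columns from nvarchar to varchar so they match
    -- the non-Unicode (DT_STR) flat-file input.
    -- Note: varchar cannot store Unicode-only characters, so only do this
    -- if the data is genuinely single-byte (e.g. ANSI/Latin1).
    ALTER TABLE dbo.Transactions ALTER COLUMN AccountNumber varchar(20)  NOT NULL;
    ALTER TABLE dbo.Transactions ALTER COLUMN Narration     varchar(255) NULL;
    ```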

     

     

    • Proposed as answer by Mubarak_100 Thursday, July 15, 2010 8:00 PM
    Thursday, July 15, 2010 7:56 PM
  • Just a quick question: I'm having the same problem, and what I want to know is where to put that explicit conversion, "(DT_STR, <length>, 1252) (<column_name>)". I can't find where I can specify that kind of conversion. Any advice about that will be greatly appreciated.

     

     

    Wednesday, September 22, 2010 9:01 PM
  • That explicit conversion is an SSIS expression, and you can do that in a Derived Column component.  Here is an article that may help you out with a "best practice": Conversion Between Unicode and Non-Unicode Data Flow Pattern.  If your environment is permitted to use 3rd party components, this one could help you out a ton.
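
    For reference, that expression goes in the Expression column of the Derived Column component's grid. A filled-in sketch, where the length 50 is an assumed value and Narration is borrowed from the error messages earlier in the thread:

    ```
    (DT_STR, 50, 1252)(Narration)
    ```

    Code page 1252 matches a Latin1 collation such as the Latin1_General_CI_AS mentioned in the original question; substitute your own column, length, and code page.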
    Todd McDermid's Blog
    Wednesday, September 22, 2010 9:19 PM
  • This thread has been going on for years. I faced the same problem today.

    The mistake I was making was that I was not setting up the input metadata; I was using the default column definitions. When I updated those, it worked.

     

    Thanks

    Manish

    Wednesday, October 13, 2010 3:42 PM
  • You were lucky.

    I have saved text files as code page 65001, set the input to Unicode, set the output of all transformations to Unicode, and made sure the SQL Server table column is nvarchar,

    but SSIS seems to delight in telling me I am using non-unicode data.

    Mr Microsoft Moderator: after 5 years you haven't realised that the problem is not that people need to understand the best practice for converting data. It is that SSIS is bug-ridden.

    I am not using non-unicode data; SSIS is, and it is causing its own error.

     

     


    • Edited by Farid2006 Friday, November 19, 2010 11:44 AM 5 years of wasted time
    Friday, November 19, 2010 11:40 AM
  • I found this article very helpful with this issue:

     

    http://www.mssqltips.com/tip.asp?tip=1393

    THIS WORKS!!!!

    Use the Data Conversion task to change your VARCHAR DT_STR into DT_WSTR, and Bob's your uncle!

    Tuesday, November 23, 2010 5:47 PM
  • This setting can be changed as follows:

    1. Go to Data Flow

    2. Right-click on the Excel Data Source

    3. Select Input And Output Properties Tab

    4. Expand Excel Source Output

    5. Expand External Columns

    6. You should be seeing your individual column names, with Column Properties and the Data Type Properties panes on the right. As you click on each individual column name, Data Type Properties box will display the current Data Type. Select the pulldown and set the data property as desired.

    Wednesday, December 01, 2010 12:57 AM
  • Follow the steps in this article; this works. The step above still resulted in the same error - how very frustrating! Don't forget to change the column names (as the article mentions), otherwise it won't work!

    http://www.mssqltips.com/tip.asp?tip=1393

    Wednesday, December 01, 2010 10:21 PM
  • I know it is a little weird to be asked to convert columns from varchar to nvarchar. But by using the CAST function in my SQL statement for the input source, I could resolve this issue. I love SSIS, but sometimes it is a little frustrating.

     

    Wednesday, January 26, 2011 8:02 PM
  • Sathyalan's answer was perfect for me.  I have just spent a few hours working on an SSIS package, and I knew that the source was not Unicode, but SSIS seemed to think it was.  When I set "ValidateExternalMetadata" to false, it works :D

    Monday, September 12, 2011 3:44 PM
  • Four years later, but I ran into the same truncation issues, and various other issues related to using Excel as a data source, when using 64-bit SQL 2008 R2. Solved it by:

    1) Editing the TypeGuessRows value in the registry.

    2) Directly editing the input and output column datatypes in the advanced editor (right click on data source in data flow task, choose advanced editor, then edit data types for output columns on the input/output properties tab).

    3) Setting Run64BitRuntime to false (BIDS, Project, Properties, Debugging).

    My scenario is working with spreadsheets submitted by clients in various other countries, which may be thousands of rows, and sometimes include columns with multiple or ambiguous datatypes, e.g., strings of numbers, dates written in purely numeric form, strings exceeding 255 characters. 

    Big thanks to Tim Rupe and Katrina Illidge.

     

    Tuesday, September 20, 2011 8:55 AM
  • Thanks, the IMEX=1 did the job.
    Thursday, January 10, 2013 12:16 PM
  • Thank you very much, Jamie. I hit the same problem in SSIS 2008 and used your suggestion to convert the data type in a Derived Column, and it works! Actually I had used the DT_STR function before, but I made a stupid mistake: I wrote my expression as (DT_STR, <length>, 1252) <column_name>, not as yours, (DT_STR, <length>, 1252) (<column_name>). The error message was still there, so I thought the function could not work in this situation. Almost desperate, I saw your reply and realized I had left out the parentheses, and now it works very well. Thanks again!

         -Jeff 

    Thursday, January 17, 2013 2:29 AM