BCP command

  • Question

  • Hello. I have executed the bcp command on my source database and stored the files on the source server. I now want to execute the bcp import (bcp in) as fast as possible. Is there a place where I can copy my files on my Azure server, so that I can execute the bcp command on the Azure server?

    bleetmaa

    Tuesday, January 27, 2015 1:54 PM


All replies

  • Hi bleetmaa, I'd love to help, but I don't quite understand your question. Can you please elaborate? One of the fastest ways to BCP data from your SQL Database is to co-locate a server/VM in the same region.
    Tuesday, January 27, 2015 3:34 PM
  • Hi bleetmaa,

    From your description, do you want to import data from local data files into a SQL Azure database, or export data from a SQL Azure database? In either case, you can place the data files in a folder on your local server and run a bcp command such as one of the following at the Windows command prompt (use in to import and out to export):

    bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in C:\Users\user\Documents\MoveDataToSQLAzure.txt -c -U username@servername -S tcp:servername.database.windows.net -P password
    bcp AdventureWorksLTAZ2008R2.SalesLT.Customer out C:\Users\user\Documents\MoveDataToSQLAzure.txt -c -U username@servername -S tcp:servername.database.windows.net -P password

    For more details, please review the following blog.
    BCP and SQL Azure
    http://azure.microsoft.com/blog/2010/05/21/bcp-and-sql-azure/

    Thanks,
    Lydia Zhang
    TechNet Community Support

    Wednesday, January 28, 2015 6:07 AM
  • Yes, but my thought is to make the insert faster. I want to transfer the .dat files to a place that makes my bcp insert command much faster. But I only have an Azure SQL server, not a virtual machine with storage. Is the only way to create a file storage close to my SQL server? In that case, how do I create a file storage close to my SQL server?

    bleetmaa

    Thursday, January 29, 2015 9:42 AM
  • Correct. To achieve the highest potential throughput to your database, and to negate as many network effects as possible, it is best to co-locate the source data and the database in the same region (e.g., West US). Please also note that each Azure SQL Database's throughput is a function of its performance level, i.e., its Database Throughput Units (DTUs). For more information, please see Azure SQL Database Service Tiers and Performance Levels. A Standard S2 database will achieve higher throughput than a Basic database, for example. Depending on the specifics of your scenario, it may be advantageous to provision a Premium P3 database for the duration of your bulk insert task and then scale the database back down to a more appropriate cost/performance level after the insert is done. Because Azure SQL Database bills by the hour, the cost of such an insert can be kept low. (A sketch of the scale-up/scale-down commands follows after this reply.)
    Thursday, January 29, 2015 3:26 PM
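
  For illustration, the scale-up/scale-down step can be scripted with sqlcmd from the same Windows command prompt used for bcp. This is a minimal sketch, not from the thread itself; the database name MyAzureDb and the credentials are placeholders, and the scale operation completes asynchronously, so verify it has finished before starting the insert:

    REM Scale up to Premium P3 before the bulk insert (run against the master database)
    sqlcmd -S tcp:servername.database.windows.net -d master -U username@servername -P password -Q "ALTER DATABASE [MyAzureDb] MODIFY (EDITION = 'Premium', SERVICE_OBJECTIVE = 'P3');"

    REM ...run the bcp import here...

    REM Scale back down to Standard S2 once the insert is done
    sqlcmd -S tcp:servername.database.windows.net -d master -U username@servername -P password -Q "ALTER DATABASE [MyAzureDb] MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S2');"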
  • OK, thanks. I have a web page running in the same region as the database. I will transfer the files there and keep an eye on the DTUs on my server. I wonder whether bcp supports network folders? Then it wouldn't matter where I execute the bcp command. (See the note on UNC paths below.)

    bleetmaa

    • Marked as answer by bleetmaa Friday, January 30, 2015 11:49 AM
    Friday, January 30, 2015 5:53 AM
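
  On the question of network folders: bcp accepts an ordinary file path, so a UNC path to a network share works as long as the account running bcp has read access to it. A minimal sketch, with hypothetical server and share names and the same placeholder credentials as above:

    bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in \\fileserver\share\MoveDataToSQLAzure.txt -c -U username@servername -S tcp:servername.database.windows.net -P password

  Note that the file is still read by the machine where bcp runs, so pointing at a share in a distant region reintroduces the network hop the rest of the thread is trying to avoid.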