Bulk copy data to SQL; if a row exists, write over it

  • Question

  • Hi, I wrote a script to read data from a CSV into a table, then bulk copy it to a SQL table. At the time I did not realize that I might need to write over existing rows, and SqlBulkCopy does not support that. I am just not sure how to modify the script to write over an existing row if the data already exists. Here is what I have. Any help would be appreciated. Thanks!

    $CurrentDate = Get-Date
    $CurrentDate = $CurrentDate.ToString('MM-dd-yyyy_hh-mm-ss')
    
     # Database variables
    $sqlserver = "db-sqlent1-prd\sql2008"
    $database = "Employees"
    $table = "dbo.tblPersonal"
    
    # CSV variables;  
    $csvfile = $return[2]
    $csvdelimiter = ","
    $firstrowcolumnnames = $true
    $empidcolname = 'SSN'
    $empidvalue = $return[1]
     
    
    $SqlConnection = New-Object System.Data.SqlClient.SqlConnection
    $SqlConnection.ConnectionString = "Server=db-ent3-dev\sql2008;Database=Employees;Integrated Security=True"
    $SqlConnection.Open()
    $SqlCmd = New-Object System.Data.SqlClient.SqlCommand
    
    $SqlCmd.CommandText = "select * from tblpersonal where SSN='0'"
    
    
    $SqlCmd.Connection = $SqlConnection
    $OpenSSN = $SqlCmd.ExecuteScalar()
    $SqlConnection.Close()
    
    
    
    if ($OpenSSN) {
        # TODO: overwrite the existing row here -- this is the part I don't know how to do
    } else {
    
    Try
     {
     
    $elapsed = [System.Diagnostics.Stopwatch]::StartNew()
     
    
    $batchsize = 100000
     
    # Build the sqlbulkcopy connection, and set the timeout to infinite
    $connectionstring = "Data Source=$sqlserver;Integrated Security=true;Initial Catalog=$database;"
    
    $bulkcopy = new-object ("Data.SqlClient.Sqlbulkcopy") $connectionstring
    $bulkcopy.DestinationTableName = $table
    $bulkcopy.bulkcopyTimeout = 0
    $bulkcopy.batchsize = $batchsize
    
    
    
    #$bulkcopy.EnableStreaming = 1
     
    # Create the datatable, and autogenerate the columns.
    $datatable = New-Object "System.Data.DataTable"
     
    # Open the text file from disk
    $reader = new-object System.IO.StreamReader($csvfile)
    $line = $reader.ReadLine()
    $columns =  $line.Split($csvdelimiter)
     
        if ($firstrowcolumnnames -eq $false) {
            foreach ($column in $columns) {
                $null = $datatable.Columns.Add()
                }
            # start reader over
            $reader.DiscardBufferedData();  
            $reader.BaseStream.Position = 0;
            }
        else {
            foreach ($column in $columns) {
                $null = $datatable.Columns.Add($column)
            } $null = $datatable.Columns.Add($empidcolname)
        }
     
     
    
    $ColumnMap1 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping("$empidcolname",'SSN')
    $ColumnMap2 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('LastName', 'LastName')
    $ColumnMap3 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('FirstName','FirstName')
    $ColumnMap4 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('PreferredName','PreferredName')
    $ColumnMap5 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('WorkPhone','WorkPhone')
    $ColumnMap6 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('Email','Email')
    $ColumnMap7 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('EmpType','EmpType')
    $ColumnMap8 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('HireDate', 'HireDate')
    $ColumnMap9 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('ACID','ACID')
    $ColumnMap10 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('OPID','OPID')
    $ColumnMap11 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('Floor','Floor')
    $ColumnMap12 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('Department','Department')
    $ColumnMap13 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('Division','Division')
    $ColumnMap14 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping('JobCode','JobCode')
    
    $BulkCopy.ColumnMappings.Add($ColumnMap1)
    $BulkCopy.ColumnMappings.Add($ColumnMap2)
    $BulkCopy.ColumnMappings.Add($ColumnMap3)
    $BulkCopy.ColumnMappings.Add($ColumnMap4)
    $BulkCopy.ColumnMappings.Add($ColumnMap5)
    $BulkCopy.ColumnMappings.Add($ColumnMap6)
    $BulkCopy.ColumnMappings.Add($ColumnMap7)
    $BulkCopy.ColumnMappings.Add($ColumnMap8)
    $BulkCopy.ColumnMappings.Add($ColumnMap9)
    $BulkCopy.ColumnMappings.Add($ColumnMap10)
    $BulkCopy.ColumnMappings.Add($ColumnMap11)
    $BulkCopy.ColumnMappings.Add($ColumnMap12)
    $BulkCopy.ColumnMappings.Add($ColumnMap13)
    $BulkCopy.ColumnMappings.Add($ColumnMap14)
    
    
     # Read in the data, line by line
        while (($line = $reader.ReadLine()) -ne $null)  {
            $row = $datatable.NewRow()
            $row.ItemArray = $line.Split($csvdelimiter)
            $row[$empidcolname] = $empidvalue
            $datatable.Rows.Add($row)
            # Once you reach your batch size, write to the db,  
            # then clear the datatable from memory
            $i++; if (($i % $batchsize) -eq 0) {
            $bulkcopy.WriteToServer($datatable)
            $datatable.Clear()
            }
        }
     
    # Close the CSV file
    $reader.Close()
     
        # Add in all the remaining rows since the last clear
        if($datatable.Rows.Count -gt 0) {
            $bulkcopy.WriteToServer($datatable)
            $datatable.Clear()
        }
     
    # Garbage collector
    [System.GC]::Collect()
     }
     Catch
     {
        Write-Host "Caught an exception while trying to write to dbo.tblPersonal:" -ForegroundColor Red
        Write-Host "Exception Type: $($_.Exception.GetType().FullName)" -ForegroundColor Red
        Write-Host "Exception Message: $($_.Exception.Message)" -ForegroundColor Red
        Write-Output "$($_.Exception.Message)" | Out-File "c:\temp\CIDB_insert $CurrentDate.log" -Append
        Write-Host "dbo.tblPersonal insert failed!" -ForegroundColor Red
     }
    }


      

    Wednesday, June 3, 2015 1:25 PM

Answers

  • You can do it with a SQL update batch or with a stored procedure. Both are callable from PowerShell via ADO.Net.
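
    The stored-procedure route could look roughly like this sketch. Note the procedure name `dbo.usp_UpsertPersonal` and its parameters are invented for illustration; substitute whatever you actually create on the server:

    ```powershell
    # Sketch: calling an upsert stored procedure from PowerShell via ADO.Net.
    # The procedure and parameter names below are assumptions, not real objects.
    $conn = New-Object System.Data.SqlClient.SqlConnection
    $conn.ConnectionString = "Server=db-sqlent1-prd\sql2008;Database=Employees;Integrated Security=True"
    $conn.Open()

    $cmd = New-Object System.Data.SqlClient.SqlCommand
    $cmd.Connection = $conn
    $cmd.CommandType = [System.Data.CommandType]::StoredProcedure
    $cmd.CommandText = "dbo.usp_UpsertPersonal"

    # Parameterized values avoid quoting problems and SQL injection
    $null = $cmd.Parameters.AddWithValue("@SSN", $empidvalue)
    $null = $cmd.Parameters.AddWithValue("@LastName", $lastname)

    $rowsAffected = $cmd.ExecuteNonQuery()
    $conn.Close()
    ```

    The procedure itself would contain the UPDATE-if-exists / INSERT-otherwise logic, so the script stays simple.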


    \_(ツ)_/

    • Marked as answer by glacket Thursday, June 4, 2015 7:57 PM
    Thursday, June 4, 2015 7:29 PM

All replies

  • Here is what you are asking respondents to do.

    1. Reproduce your table schema using a SQL server. (Assuming they have a SQL server to play with.)

    2. Copy and paste your long script.

    3. Try to understand and debug the script for you.

    That is not really the purpose of this forum. For the best help, you need to post a very short, self-contained example that demonstrates your specific problem.

    I would also note that simply wrapping your database logic in a PowerShell script doesn't make this a scripting question. For questions about how SQL Server works, you need to ask in a more appropriate forum.


    -- Bill Stewart [Bill_Stewart]

    Wednesday, June 3, 2015 2:13 PM
    Moderator
  • Bill is correct. This is a database use issue.

    In the db world we would bulk copy to a temp table and then execute an update query that merges in the new data. You have to design that update query. It can be executed from PowerShell.
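
    As a sketch of that pattern (the staging table `#staging` is invented, and the table/column names are taken from the script above -- extend the column lists to match your real schema): bulk copy into the staging table, then run a MERGE (available in SQL 2008+) to upsert into the real table.

    ```powershell
    # Sketch: bulk copy into a temp staging table, then MERGE into the target.
    $connectionstring = "Data Source=db-sqlent1-prd\sql2008;Integrated Security=true;Initial Catalog=Employees;"
    $conn = New-Object System.Data.SqlClient.SqlConnection $connectionstring
    $conn.Open()

    # 1. Create an empty staging table shaped like the target
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = "SELECT * INTO #staging FROM dbo.tblPersonal WHERE 1 = 0"
    $null = $cmd.ExecuteNonQuery()

    # 2. Bulk copy the DataTable into staging over the SAME connection
    #    (a local temp table is only visible to the connection that created it)
    $bulkcopy = New-Object System.Data.SqlClient.SqlBulkCopy $conn
    $bulkcopy.DestinationTableName = "#staging"
    $bulkcopy.WriteToServer($datatable)

    # 3. MERGE staging into the target: update matches, insert the rest
    $cmd.CommandText = @"
    MERGE dbo.tblPersonal AS target
    USING #staging AS source ON target.SSN = source.SSN
    WHEN MATCHED THEN
        UPDATE SET target.LastName  = source.LastName,
                   target.FirstName = source.FirstName
    WHEN NOT MATCHED THEN
        INSERT (SSN, LastName, FirstName)
        VALUES (source.SSN, source.LastName, source.FirstName);
    "@
    $null = $cmd.ExecuteNonQuery()
    $conn.Close()
    ```

    The temp table disappears when the connection closes, so there is nothing to clean up afterwards.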


    \_(ツ)_/

    Wednesday, June 3, 2015 3:03 PM
  • I think you're referring to a stored procedure, correct? If so, can it be called from PowerShell?
    Thursday, June 4, 2015 7:28 PM
  • You can do it with a SQL update batch or with a stored procedure. Both are callable from PowerShell via ADO.Net.


    \_(ツ)_/

    • Marked as answer by glacket Thursday, June 4, 2015 7:57 PM
    Thursday, June 4, 2015 7:29 PM
  • Cool. I will check that out. Thanks for your help.
    • Marked as answer by glacket Thursday, June 4, 2015 7:57 PM
    • Unmarked as answer by glacket Thursday, June 4, 2015 7:57 PM
    Thursday, June 4, 2015 7:57 PM
  • Great - you are on your way.  Post back if you get stuck.

    \_(ツ)_/

    Thursday, June 4, 2015 8:07 PM