Pass Custom Parameter to Function or Jobs

  • Question

  • Hi All

    I'm trying to pass a custom PS object to a script block, but somehow the mapping is not working for me.

    I'm reading an Excel file with multiple rows and five columns.

    Each row represents a set of parameters independent of the others, and I've written a custom script block that I want to execute in parallel for all the records.

    Reference: https://gallery.technet.microsoft.com/scriptcenter/Multi-threading-Powershell-d2c0e2e5

    Below is the sample code

    ------------

    # Script block that receives one row and outputs its five columns
    $COPY = {
        Param ($output)
        $output.col1
        $output.col2
        $output.col3
        $output.col4
        $output.col5
    }

    Clear-Host

    $InputList = Import-Xls 'C:\Names.xlsx'

    # Start one background job per row
    Foreach ($input in $InputList)
    {
        Start-Job -ScriptBlock $COPY -ArgumentList $input
    }

    Get-Job | Wait-Job
    $out = Get-Job | Receive-Job
    $out

    -------------

    I've tried multiple things, except splitting the row into its five values and passing them independently. I'm trying to pass the complete row to the script block as a single variable. Below is the structure of this variable when I print it in the PS console:

    Col1 : Value1
    Col2 : Value2
    Col3 : Value3
    Col4 : Value4
    Col5 : Value5

    For each record, the structure is the same but with different values.
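    For reference, the object I'm trying to pass as a single argument behaves like a PSCustomObject. The sketch below uses placeholder values and a made-up variable name, but mirrors the structure printed above:

    ------------

    # Placeholder values only - this mirrors the per-row structure shown above
    $row = [PSCustomObject]@{
        Col1 = 'Value1'
        Col2 = 'Value2'
        Col3 = 'Value3'
        Col4 = 'Value4'
        Col5 = 'Value5'
    }

    # Passed to the script block as one object
    Start-Job -ScriptBlock $COPY -ArgumentList $row

    -------------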

    Any thoughts?



    Thanks, Rohit ~ All glory comes from daring to begin. ~ "If you find my answer helpful, please mark it as Answer."


    Thursday, September 6, 2018 10:32 AM

All replies

  • Hi,

    Thanks for your question.

    Background jobs are good for long-running tasks. For multi-threading short, trivial tasks, runspaces or a workflow would probably be a better choice.

    You can try using a workflow; refer to the link below.

    https://docs.microsoft.com/en-us/system-center/sma/overview-powershell-workflows?view=sc-sma-1807
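    As a rough illustration only (the workflow name is made up, and the Col1-Col5 properties are taken from your post), a workflow using foreach -parallel might look like this:

    ------------

    # Sketch of the workflow approach; adjust the per-row work to your real task
    workflow Invoke-RowCopy {
        Param ([psobject[]] $Rows)

        foreach -parallel ($row in $Rows) {
            # Workflows restrict what can run directly, so the per-row work
            # is wrapped in an InlineScript block
            InlineScript {
                $r = $Using:row
                $r.Col1; $r.Col2; $r.Col3; $r.Col4; $r.Col5
            }
        }
    }

    Invoke-RowCopy -Rows $InputList

    -------------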

    I'm sorry, it's not entirely clear what is going wrong with your handling of the Excel file; please describe it in more detail.

    Best Regards,

    Lee



    Just do it.

    Friday, September 7, 2018 2:18 AM
  • Hi Lee,

    The import of the CSV is working for me, but now I'm facing a different issue.

    The script is running fine in its current form (i.e., background jobs), but a few jobs are intermittently failing with "Your Azure credentials have not been set up or have expired, please run Connect-AzureRmAccount to set up your Azure credentials".

    In one case I'm restarting 100-odd Azure web services, and for a few jobs I see this error while the rest complete successfully.

    In another case I'm deleting 50 Azure SQL databases; it deletes some and errors out for others. I'm not sure why the connection is expiring in the middle of the execution.



    Thanks, Rohit ~ All glory comes from daring to begin. ~ "If you find my answer helpful, please mark it as Answer."

    Monday, September 17, 2018 11:33 AM
    Your original question has been answered. If you have new issues, please open a new topic.

    I will also note that running jobs as a way of managing many separate operations on Azure will run into issues, as each job will require a new login and connection. This will cause problems because Azure will block you when you open too many connections.
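    (For illustration only, and assuming the AzureRM module named in your error message: one common pattern is to authenticate once, save the context to disk with Save-AzureRmContext, and re-import it inside each job. The path, resource group, and app name below are hypothetical.)

    ------------

    # Hypothetical sketch: persist the Azure context once, then give each
    # background job its own authenticated session by re-importing it
    Connect-AzureRmAccount
    Save-AzureRmContext -Path 'C:\Temp\azcontext.json' -Force

    Start-Job -ScriptBlock {
        Param ($ContextPath, $ResourceGroup, $AppName)
        Import-AzureRmContext -Path $ContextPath
        Restart-AzureRmWebApp -ResourceGroupName $ResourceGroup -Name $AppName
    } -ArgumentList 'C:\Temp\azcontext.json', 'MyResourceGroup', 'MyWebApp'

    -------------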

    To do multiple operations in Azure you will need to post in the Azure forum for assistance.

    Your original code also makes no sense and has nothing to do with Azure. Why would you use separate jobs to output each line of a CSV?

    An Excel object is an active COM object and cannot easily be passed to a new process. You should pass the individual columns as single arguments.
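    A minimal sketch of passing the columns individually (the parameter names here are made up; adapt them to your script block):

    ------------

    # Sketch only: five separate arguments instead of one object
    $COPY = {
        Param ($c1, $c2, $c3, $c4, $c5)
        $c1; $c2; $c3; $c4; $c5
    }

    foreach ($row in $InputList) {
        Start-Job -ScriptBlock $COPY -ArgumentList $row.Col1, $row.Col2, $row.Col3, $row.Col4, $row.Col5
    }

    -------------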


    ¯\_(ツ)_/¯



    Monday, September 17, 2018 7:36 PM