Windows Task Scheduler and PowerShell

  • Question

  • Hey guys,

    I had a question I was hoping someone could help me with.

    Overview:

    I have two scripts that I use in an automated process for creating AD accounts. The first connects to a SQL HR database and extracts new-hire information, and the second provisions the accounts.

    The data extraction script initiates the provisioning script after it has created a file to iterate through from the HR database.

    My Windows Task Scheduler task initiates only the data extraction script, since the provisioning script gets initiated as a sub-script.

    Problem:

    I found that with Windows Task Scheduler jobs, if I don't prevent the data extraction script from closing (after initiating the provisioning script), the scheduler stops the job and closes all of my PowerShell instances. To fix this, I added a powershell -noexit line to keep the data extraction script open while the provisioning script runs. The data extraction script passes the process ID of its PowerShell instance to the provisioning script, so when all of the AD accounts are created I run Stop-Process on the data extraction instance, which effectively completes the job in the Task Scheduler.

    This all works fine, except that when I start the sub-script a 'phantom' PowerShell instance gets opened for some reason, and when everything is done that phantom instance stays open. For example, when the data extraction script initiates the provisioning script, two PS instances open. I have tried the following with no luck:

    Using Start-Job and Start-Process (both open a phantom instance). I also tried calling the provisioning script directly by its file path (e.g. $provisioningScriptPath $passedVariables), but in the middle of the provisioning script PowerShell times out and closes all of my instances. With a small set of accounts this works, but larger jobs time out.

    Any ideas? Do you guys have any suggestions on how to start a sub-script in a Windows Task Scheduler job? The weird thing is that if I run the scripts manually from the PS shell it works fine - it only creates the phantom instance when run from the Task Scheduler.

    Wednesday, August 20, 2014 1:39 AM

Answers

  • To begin with there is nothing that is timing out in anything you have posted.  The issue is bad design.  What you are trying to do cannot be done as a scheduled task because it is designed wrong.

    Create one single script that does everything and you will have eliminated most of the issues.  If you do not know how to do this then you will need to contact a consultant to help you.

    We can answer  specific questions.  You do not have a specific question.

    If you can produce a simple script that demonstrates the issue then post it and we can try to help.  We cannot help with the vague information and partial scripts and will not undertake analysis of a complex scenario.


    ¯\_(ツ)_/¯

    • Marked as answer by BPurchell Thursday, August 21, 2014 8:34 PM
    Thursday, August 21, 2014 8:02 PM

All replies

  • Start by posting the shortest possible script (and associated task scheduler settings) that reproduce the problem.

    -- Bill Stewart [Bill_Stewart]

    Wednesday, August 20, 2014 2:01 AM
    Moderator
    It is very hard to understand what you are talking about. What is a "sub-script"?

    You should not be using Jobs or Start-Process in a task.

    You have a linear, two step process.  It would be best to do it all in one script.  If, for some reason, you must have two scripts then have a third script call the two scripts in order.

    # Wrapper script: call the two scripts in order
    $results = C:\scripts\get-data.ps1
    if ($results -eq 'good') {
        C:\scripts\process-data.ps1
    }

    That will not cause an issue as long as the two scripts do not use jobs or Start-Process.
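    The scheduled task would then point at the wrapper alone. As a sketch (the task name, schedule, and paths are hypothetical, and the ScheduledTasks cmdlets require Windows 8 / Server 2012 or later - on older systems use schtasks.exe instead):

    ```powershell
    # Register the wrapper as the single scheduled-task action.
    $action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
        -Argument '-NoProfile -File C:\scripts\run-all.ps1'
    $trigger = New-ScheduledTaskTrigger -Daily -At 6am
    Register-ScheduledTask -TaskName 'ProvisionAccounts' `
        -Action $action -Trigger $trigger
    ```

    Because the wrapper runs both steps sequentially in one process, the task stays "Running" until the last step finishes and then completes on its own.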

    Of course, as Bill has noted, it is impossible to understand your issue with no example script.


    ¯\_(ツ)_/¯

    Wednesday, August 20, 2014 4:56 PM
  • That's because both Start-Job and Start-Process will spin off on their own instance.  They're not phantom, that's the nature of Start-Job -- to start a background job or process.  If the job you're starting doesn't finish out cleanly then it could sit perpetually.
    Wednesday, August 20, 2014 5:21 PM
  • That's because both Start-Job and Start-Process will spin off on their own instance.  They're not phantom, that's the nature of Start-Job -- to start a background job or process.  If the job you're starting doesn't finish out cleanly then it could sit perpetually.

    Correct - Given the data flow, using a job or start-process makes no sense.  It is not needed.


    ¯\_(ツ)_/¯

    Wednesday, August 20, 2014 5:24 PM
  • _/¯(ツ)¯\_

    Wednesday, August 20, 2014 5:47 PM
  • The reason I have 2 scripts is because the provisioning script is doing more than just one function. So I could have multiple jobs running the same script but with different parameters (e.g. one parameter creates accounts and another updates accounts) that get passed to it from the data extraction script.

    Here is a snippet of my code that calls the provisioning script (what I'm calling a sub script):

    ###Run the Provisioning Script
    IF ($execProvisioning -eq "Y")
    {
    	###Initiate the provisioning script
    	TRY
    	{
    		$date = Get-Date
    		#Get the process ID of this script. The provisioning script will kill this instance once it finishes processing
    		$scriptProcessID = ([System.Diagnostics.Process]::GetCurrentProcess()).Id
    		
    		Log-Results "$date`tSystem`tProcessing Started`tStarting IAMUPS for the following execution type: $execType using file: $outputFilePath`t"
    		
    		#***PRD:
    		Start-Process powershell.exe -ArgumentList "$acctProvScriptFilePath", "$execType", "$outputFilePath", "$scriptProcessID"	
    		
    	}
    	CATCH
    	{
    		$errorMessage = $_.Exception.Message
    		Log-Results "$date`tSystem`tProcessing Failed`tUnable to start IAMUPS`t$errorMessage"
    		exit
    	}
    	
    	#This script will not terminate until the provisioning script completes
    	powershell -noexit
    }

    When the data extraction script starts the provisioning script, it passes the current process ID of its PowerShell instance so that the provisioning script can kill it when it's done. I had to do this because otherwise the Windows Task Scheduler would stop the job as soon as the data extraction script finished.

    However, when I use Start-Process it opens up two PowerShell instances - one of which is the provisioning script and the other is the 'phantom' instance I'm talking about.

    Here is the code in the provisioning script that kills off the instances so the job can close out:

    #Kill the PS instance of the data extraction script
    IF ($null -eq $scriptProcessID -or $scriptProcessID -eq "" -or $scriptProcessID -eq 0)
    {
    	$date = Get-Date
    	Log-Results "$date`tSystem`tNo Action`tNo process to stop - scriptProcessID is null"
    }
    ELSE
    {
    	Stop-Process -Id $scriptProcessID -Force -ErrorAction Stop
    }
    
    $date = Get-Date
    Log-Results "$date`tSystem`tProcessing Completed`tIAMUPS completed the following execution type: $execType using file: $inputFile`t"
    #Kill the PS Instance of this script
    Stop-Process ([System.Diagnostics.Process]::GetCurrentProcess()).Id

    Putting Start-Process and Start-Job aside: if I call the provisioning script by just listing out the file path of the script (e.g. c:\provisioningscript.ps1), everything works fine (it doesn't open two extra PowerShell instances), but the script times out and causes my job to fail, and I'm not sure how to keep it from timing out.

    Let me know if this was enough information or not. Thanks all for your feedback and assistance - this has been driving me nuts.

    Thursday, August 21, 2014 6:46 PM
  • As we have been saying - you cannot do this:

    Start-Process powershell.exe -ArgumentList "$acctProvScriptFilePath", "$execType", "$outputFilePath", "$scriptProcessID"

    This line starts the external script and returns immediately. Just run the script without Start-Process and it will wait. If you are using jobs you will have the same issue.
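    For illustration (paths here are hypothetical), the difference looks like this - a direct invocation runs the child script synchronously in the same process, while Start-Process needs -Wait to block:

    ```powershell
    # Direct invocation: runs in the SAME PowerShell process and the
    # caller blocks until the script returns.
    & 'C:\scripts\provision-accounts.ps1' $execType $outputFilePath

    # If a separate process is genuinely needed, -Wait makes the caller
    # block until the child exits; -File tells powershell.exe the first
    # argument is a script path rather than a command string.
    Start-Process powershell.exe -Wait -ArgumentList @(
        '-NoProfile', '-File', 'C:\scripts\provision-accounts.ps1',
        $execType, $outputFilePath
    )
    ```

    Either way the Task Scheduler job stays alive for the full run, without needing powershell -noexit or Stop-Process.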

    Mostly this is all due to a lack of understanding of process flow and data flow design.  These need to be addressed in order to run this as a task.

    I recommend contacting a programmer or consultant who can work with you to design a process that will do what you need.  This forum is not available for consulting and design.

    I would consider the Stop-Process to be a good sign that you are lost as to how to design a complex process. You should never need to do this in a correctly designed task set.

    You should also consider using the pipeline to do sequential processes, or a workflow.
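    A minimal sketch of the pipeline idea, using hypothetical stand-in functions for the poster's two scripts - the extraction step emits one object per new hire, and the provisioning step consumes them from the pipeline, so both steps run sequentially in one process:

    ```powershell
    function Get-NewHire {
        # Stand-in for the HR SQL extract: emit sample records.
        [pscustomobject]@{ Name = 'jdoe';   Action = 'Create' }
        [pscustomobject]@{ Name = 'asmith'; Action = 'Update' }
    }

    function Invoke-Provisioning {
        param(
            [Parameter(ValueFromPipeline)] $Hire
        )
        process {
            # Stand-in for creating/updating the AD account.
            "Provisioned $($Hire.Name) ($($Hire.Action))"
        }
    }

    Get-NewHire | Invoke-Provisioning
    ```

    Run as one script in one PowerShell process, this leaves no extra instances behind and exits when the last record is provisioned.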


    ¯\_(ツ)_/¯

    Thursday, August 21, 2014 6:55 PM
    You could add -Wait to the Start-Process, but your design would then find new issues due to its structure.

    ¯\_(ツ)_/¯

    Thursday, August 21, 2014 6:56 PM
    Right, I know it will return as soon as it issues the Start-Process, which is why I have the powershell -noexit line to pause the script until the other one finishes (and then I kill off both instances).

    I tried calling the provisioning script without the Start-Process but Powershell timed out on me after 9 minutes and I got this error in my log:

    Exception calling "Send" with "1" argument(s): "Service not available, closing transmission channel. The server response was: 4.4.1 Connection timed out"

    That's why I was wondering if there is a way to prevent PowerShell from timing out when calling another script in this manner. If so, I wouldn't need powershell -noexit and I wouldn't need Stop-Process.

    Thursday, August 21, 2014 7:11 PM
    When we use Start-Process with -Wait it hangs on that call until the script completes.  You cannot use jobs in the script.  PowerShell will not run the way you are trying to get it to run.  All of your issues are due to a lack of process-flow design.  For this you need a trained programmer to help you set up the steps.  I recommend a workflow, but it could be done with a pipeline.

    Again.  Design and consulting are beyond the scope of this forum.


    ¯\_(ツ)_/¯

    Thursday, August 21, 2014 7:18 PM
    I understand, and I appreciate your feedback. I've been doing a lot of research on my own and have searched Google endlessly for a solution, but I don't have anyone in my organization who is a trained expert in PowerShell. I thought these forums were for help and advice. All I would like to know, if you don't mind, is: how would you suggest starting a script from within a script without it timing out? I personally don't like using Start-Process, but if I just call the other script by listing its file path and the parameters I'm trying to pass, it times out and causes my job to fail.
    Thursday, August 21, 2014 7:53 PM
  • To begin with there is nothing that is timing out in anything you have posted.  The issue is bad design.  What you are trying to do cannot be done as a scheduled task because it is designed wrong.

    Create one single script that does everything and you will have eliminated most of the issues.  If you do not know how to do this then you will need to contact a consultant to help you.

    We can answer  specific questions.  You do not have a specific question.

    If you can produce a simple script that demonstrates the issue then post it and we can try to help.  We cannot help with the vague information and partial scripts and will not undertake analysis of a complex scenario.


    ¯\_(ツ)_/¯

    • Marked as answer by BPurchell Thursday, August 21, 2014 8:34 PM
    Thursday, August 21, 2014 8:02 PM
  • Thanks for your help - I just combined the scripts into one. I guess I was over thinking it.
    Thursday, August 21, 2014 8:34 PM
  • Thanks for your help - I just combined the scripts into one. I guess I was over thinking it.

    Yes - my original advice was to simplify as much as possible.  Once you get going you can add bells and whistles.
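    The combined structure can be sketched like this, with hypothetical stand-in functions for the real extraction and provisioning logic:

    ```powershell
    # Illustrative stand-ins for the SQL extract and AD provisioning.
    function Get-NewHireData { @('jdoe', 'asmith') }
    function New-ProvisionedAccount($Name) { "created $Name" }

    # One script, one scheduled task: the steps run sequentially in
    # one process, so nothing needs -NoExit or Stop-Process to keep
    # the task alive.
    foreach ($hire in Get-NewHireData) {
        New-ProvisionedAccount $hire
    }
    ```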

    ¯\_(ツ)_/¯

    Thursday, August 21, 2014 11:29 PM