Infinite PowerShell loop using too much memory

  • Question

  • I created the following PowerShell script with an infinite loop to provide a log of when we are experiencing an NSlookup failure.

    $arrDomains = @("westpac.com.au","pccasegear.com.au","google.com.au","google.com","ato.gov.au","commsec.com.au","facebook.com","en.wikipedia.org","microsoft.com")
    While($true){
        ForEach ($varDomain in $arrDomains){
            if (-Not (Resolve-DnsName $varDomain -ErrorAction SilentlyContinue)){
                $varOutput = "NSlookup failed on " + (Get-Date).DateTime + " for $varDomain"
                $varOutput | Out-File -filepath C:\Scripts\NSlookupFailureLog.txt -Append
            }
        }
    }

    The script works great and the resulting log file is exactly what we need to give to the hardware firewall supplier to prove the hardware firewall is causing the intermittent webpage connection failures. The only problem is that the memory usage grows until the system slows down and other users complain. Overnight the memory usage grew to 30 GB. After searching online, I have been unable to find a solution. What needs to be changed in the script to stop the large memory usage? Thanks for your help with this.

    Friday, January 3, 2020 4:54 AM

All replies

  • The following will reduce the unnecessary variable creation and prevent the error stack from filling memory.

    $domains = @(
        'westpac.com.au',
        'pccasegear.com.au',
        'google.com.au',
        'google.com',
        'ato.gov.au',
        'commsec.com.au',
        'facebook.com',
        'en.wikipedia.org',
        'microsoft.com'
    )
    
    While($true){
        ForEach ($domain in $domains){
            Try{
                Resolve-DnsName $domain -ErrorAction Stop
            }
            Catch{
                $error.Clear() # keep error stack empty
                ('NSlookup failed on {0} for {1}' -f [datetime]::Now,$varDomain) |
                    Out-File C:\Scripts\NSlookupFailureLog.txt -Append
            }
        }
    }

    The code posted cannot be running out of memory, or it would cause an error. The page file should absorb any overflow and would raise an alert if it were filling up.

    Memory in Windows is virtual. Many of the numbers only tell us that some memory has been reserved. When Windows needs memory it will trim the working set. This may never happen on systems with large amounts of memory. If a program does not throw memory errors, you should assume that it is OK.

    The garbage collector will tag memory that is not being used, but it is the system that deallocates the memory, and that only happens when the system needs memory for some other process.
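
    If you want to see this for yourself, a minimal sketch (an illustration only, not something the loop needs) is to request a collection pass explicitly and note that the working set does not necessarily shrink until Windows wants the pages back:

    # illustration: ask .NET for a garbage collection pass
    [System.GC]::Collect()
    [System.GC]::WaitForPendingFinalizers()      # wait for finalizers so the pass is complete
    (Get-Process -Id $PID).WorkingSet64 / 1MB    # this shell's working set in MB; may not drop immediately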

    As Bob Marley liked to say "No error no cry".


    \_(ツ)_/

    • Edited by jrv Friday, January 3, 2020 6:42 AM
    Friday, January 3, 2020 6:38 AM
  • First you need to identify which process is consuming the most of the system's physical memory. You can use Task Manager to check this while memory utilization is high.
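
    A quick way to get that list from PowerShell itself might look like this (a sketch; WorkingSet64 is the standard property Get-Process exposes for the working set):

    # list the ten processes holding the most physical memory, largest first
    Get-Process |
        Sort-Object WorkingSet64 -Descending |
        Select-Object -First 10 Name, Id, @{ n = 'WS(MB)'; e = { [int]($_.WorkingSet64 / 1MB) } }
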
    Friday, January 3, 2020 11:26 AM
  • "As expected, the ForEach statement, which allocates everything to memory before processing, is the faster of the two methods. ForEach-Object is much slower. Of course, the larger the amount of data, the more risk you have of running out of memory before you are able to process all of the items. So be sure to take that into consideration."

    https://devblogs.microsoft.com/scripting/getting-to-know-foreach-and-foreach-object/

    That is absolute nonsense and has been floating around for a long time. The main members of the PowerShell team particularly object to that explanation.

    ForEach-Object is faster overall when run in a pipeline. The request is not about speed, though. Since errors are being thrown to get the results, the whole loop will spend 90% of its time processing exceptions. Using the CmdLet for DNS is also very inefficient.

    A pipeline with file IO will be faster with a ForEach-Object due to the suspension that will occur when the output is written. 

    Also that blog post is over 5 years old and PS1 and 2 had noticeable issues with that CmdLet which have since been addressed.  PS 7 now has a parallel version of the CmdLet for added performance.

    Yes - you can find a simple example where foreach() is faster, but not by much, and the pipeline version makes up for it in usability. In this case the pipeline uses less memory, which is what the OP asked about.
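
    For reference, a pipeline version of the loop might look like this (a sketch only, reusing the $domains array and log path from the earlier reply):

    While ($true) {
        $domains | ForEach-Object {
            # stream each domain through the pipeline instead of building anything up in memory
            if (-not (Resolve-DnsName $_ -ErrorAction SilentlyContinue)) {
                ('NSlookup failed on {0} for {1}' -f [datetime]::Now, $_) |
                    Out-File C:\Scripts\NSlookupFailureLog.txt -Append
            }
        }
    }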


    \_(ツ)_/

    Friday, January 3, 2020 1:14 PM
  • So the old argument is now that ForEach-Object consumes more memory which is what makes it slow.... Hmmm... That's a new one on me. The article posted above only mentions time being slower and, as we now know, that is trivial compared to PS1 and 2.

    The original code posted by the OP will not hold a large amount of memory if we just add $error.Clear() in the loop. This can be driven by a loop counter and the stack cleared every thousand iterations to improve loop performance, although the gain will be trivial compared to the cost of the DNS call.
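
    A minimal sketch of that counter approach, applied to the OP's original loop (an illustration only):

    $count = 0
    While ($true) {
        ForEach ($varDomain in $arrDomains) {
            if (-Not (Resolve-DnsName $varDomain -ErrorAction SilentlyContinue)) {
                "NSlookup failed on $((Get-Date).DateTime) for $varDomain" |
                    Out-File -FilePath C:\Scripts\NSlookupFailureLog.txt -Append
            }
        }
        $count++
        if ($count -ge 1000) {
            $error.Clear()   # drop the accumulated error records every thousand passes
            $count = 0
        }
    }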

    Also we need to keep in mind that releasing memory does not actually remove the memory from the working set until the system needs it. This is called "working set trimming" and is done when needed by Windows, and periodically if the system or process is not busy.
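
    If you want to watch the trimming happen, a small sketch is to log the shell's own numbers from inside the loop (WorkingSet64 and PrivateMemorySize64 are standard System.Diagnostics.Process properties):

    # snapshot of this shell's working set and private bytes, in MB
    $me = Get-Process -Id $PID
    '{0:u}  WS {1:N0} MB  Private {2:N0} MB' -f (Get-Date), ($me.WorkingSet64 / 1MB), ($me.PrivateMemorySize64 / 1MB)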

    At about Windows Server 2003, MS implemented a new memory management scheme that adjusted how Windows memory was allocated and managed. Later NT systems adjusted this further and actually did less management, in favor of a less disruptive, as-needed approach.

    The current guidance for PowerShell is to let PS and the system manage the memory. I have not found any case where this was not reliable advice.

    There are some posts on PS memory and garbage collection. The old ones should be ignored but I don't have a link to the most current description.

    I just did a search and only old articles show up. Most newer articles only discuss limiting PS memory for restricted and custom-use shells, or how to limit total memory for all shells. This is only needed on aggressive systems with many PowerShell users, or instances where we don't want PowerShell to consume too many resources.

    For almost all normal usage the current settings are more than sufficient as long as we avoid allocating objects in a loop.  Throwing exceptions in a loop creates persistent objects.

    Re-assigning a variable in a loop effectively releases the old contents. There is no need for an explicit Remove-Variable. Objects (and types) are just pointers to memory. Once the memory is no longer pointed to, PowerShell is free to release it; PS does this in its GC pass when needed. One gotcha is that some objects can persist even with no assigned variable, and these must be explicitly destroyed. Image objects are one case, where the object holds an open file. The file can stay locked for a very long time, and if you need the file a second time you will get an exception.
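
    A short sketch of the image case (the path is hypothetical; System.Drawing is assumed to be available):

    Add-Type -AssemblyName System.Drawing
    $image = [System.Drawing.Image]::FromFile('C:\Scripts\example.png')   # hypothetical file
    # ... work with $image ...
    $image.Dispose()          # release the handle; without this the file stays locked
    Remove-Variable image     # optional; the Dispose call is what actually frees the file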

    The other killer is recursive functions. When a function recurses, it pushes its context onto a stack. This will use up the stack until the recursion reaches an end point, and then the stack is unwound. PowerShell limits the recursion depth, so an exception will be thrown if the limit is exceeded. The stack can never be over-allocated.
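
    A tiny sketch of that stack behaviour (the guard gives the recursion an end point so the stack unwinds cleanly; without it the engine's depth limit would eventually be hit):

    function Invoke-Depth {
        param([int]$Depth = 1)
        if ($Depth -ge 100) { return $Depth }   # end point: unwind from here
        Invoke-Depth -Depth ($Depth + 1)        # each call pushes another scope onto the stack
    }
    Invoke-Depth   # returns 100 after 100 nested calls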

    The current default settings for PS memory are as follows:

    PS WSMan:\localhost\Shell> dir WSMan:\localhost\Shell
    
    
       WSManConfig: Microsoft.WSMan.Management\WSMan::localhost\Shell
    
    Type            Name                           SourceOfValue   Value
    ----            ----                           -------------   -----
    System.String   AllowRemoteShellAccess                         true
    System.String   IdleTimeout                                    7200000
    System.String   MaxConcurrentUsers                             2147483647
    System.String   MaxShellRunTime                                2147483647
    System.String   MaxProcessesPerShell                           2147483647
    System.String   MaxMemoryPerShellMB                            2147483647
    System.String   MaxShellsPerUser                               2147483647
    
    
    The MaxMemoryPerShellMB value of 2147483647 is the per-shell memory quota in MB, which should be enough for anything.
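
    If you ever did need to change that quota (it applies to remoting sessions, not a local console), a sketch would be the following, run from an elevated prompt with WinRM configured:

    Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 4096   # per-remote-shell quota in MB
    Restart-Service WinRM                                      # pick up the new setting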


    \_(ツ)_/

    Friday, January 3, 2020 2:58 PM
    Thank you for your reply. The memory usage takes longer to fill up, but it still grows continuously until access to the system's resources becomes noticeably slow for users. This version of the script does not run out of memory and error out, but neither did mine. I have run these tests on multiple servers to confirm the problem. I have also looked into limiting the memory usage and running the script for a limited amount of time and then restarting it. None of these options has produced a satisfactory resolution so far.

    Do you have any further suggestions for changing the script to stop the memory usage growth?

    NB: The posted code has a typo: the last variable in the script is $varDomain instead of $domain.


    Tuesday, February 4, 2020 10:40 PM
  • First you have to run perfmon to see which process is consuming system resources. The working set by itself does not have an effect. The one thing that can account for this is a lack of page file space and a highly fragmented paging drive. Also, consumption of the non-paged pool will cause a total slowdown of all processes. You need to identify which of the limited resources is being consumed. That can only be done by monitoring with perfmon.
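
    A sketch of collecting the relevant counters from PowerShell itself (counter names assume an English-language OS and that the script runs under powershell.exe):

    # sample private working set, non-paged pool and available memory once a minute for an hour
    Get-Counter -Counter '\Process(powershell*)\Working Set - Private',
                         '\Memory\Pool Nonpaged Bytes',
                         '\Memory\Available MBytes' -SampleInterval 60 -MaxSamples 60 |
        Export-Counter -Path C:\Scripts\MemoryTrace.blg -FileFormat blg -Force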

    Until you can identify which resource is slowing down the system there is no way to identify the cause.  Assuming that it is the script or PowerShell is just a bad assumption.  You need proof of cause.  If there is a bug in PowerShell this is what you will have to report to the bugmeisters at UserVoice.

    You can also use a forced crash dump to analyze the actual resource usage and to look at the contents of the over-allocated memory, which can tell you what command is causing the resources to be held, assuming that you can prove that the cause is PowerShell.


    \_(ツ)_/

    Tuesday, February 4, 2020 11:07 PM