Can a file be too large to be read with Get-Content?

  • Question

  • Is it possible that a file can be too large to be successfully read into a PowerShell array using the Get-Content cmdlet?

     

    Wednesday, January 4, 2012 6:33 PM

Answers

  • The size of any single object in .NET is limited to 2 GB - http://msdn.microsoft.com/en-us/library/ms241064(VS.80).aspx

    As with 32-bit Windows operating systems, there is a 2GB limit on the size of an object you can create while running a 64-bit managed application on a 64-bit Windows operating system.
    

    Wednesday, January 4, 2012 6:58 PM
  • If it's larger than your available memory/swap file can accommodate.


    [string](0..33|%{[char][int](46+("686552495351636652556262185355647068516270555358646562655775 0645570").substring(($_*2),2))})-replace " "
    • Marked as answer by Larry Weiss Thursday, January 5, 2012 4:36 AM
    Wednesday, January 4, 2012 6:36 PM

All replies

  • If it's larger than your available memory/swap file can accommodate.


    [string](0..33|%{[char][int](46+("686552495351636652556262185355647068516270555358646562655775 0645570").substring(($_*2),2))})-replace " "
    • Marked as answer by Larry Weiss Thursday, January 5, 2012 4:36 AM
    Wednesday, January 4, 2012 6:36 PM
  • What PowerShell code can compute that number (the amount of available
    memory/swap file)?
     
    .
    On 1/4/2012 12:36 PM, mjolinor wrote:
    > If it's larger than your available memory/swap file can accommodate.
    >
     
     
    Wednesday, January 4, 2012 6:40 PM
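The figure the question asks about can be approximated from WMI. A minimal sketch, assuming the Win32_OperatingSystem class is available (it reports these counters in kilobytes, so dividing by 1MB yields gigabytes):

```powershell
# Free physical memory and free virtual memory (physical + page file),
# as reported by Win32_OperatingSystem in kilobytes.
$os = Get-WmiObject Win32_OperatingSystem
"Free physical memory : {0:N2} GB" -f ($os.FreePhysicalMemory / 1MB)
"Free virtual memory  : {0:N2} GB" -f ($os.FreeVirtualMemory / 1MB)
```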
  • The size of any single object in .NET is limited to 2 GB - http://msdn.microsoft.com/en-us/library/ms241064(VS.80).aspx

    As with 32-bit Windows operating systems, there is a 2GB limit on the size of an object you can create while running a 64-bit managed application on a 64-bit Windows operating system.
    

    Wednesday, January 4, 2012 6:58 PM
  • Thanks.  I went ahead and tried a Get-Content on a file larger than 2GB,
    and the powershell.exe process consumed as much as 7GB on my 8GB RAM
    Windows 7 64-bit system (at least that's what Task Manager told me).
    Eventually the entire system ran very sluggishly (and not just the
    powershell.exe process), but no errors were reported in the
    PowerShell.exe host console, and I had to kill that instance of the
    powershell.exe process.
     
    So, that's another defensive programming strategy to entertain in a
    PowerShell script (that is, to test a file for a reasonable size before
    using Get-Content to read it).
     
     
    Wednesday, January 4, 2012 9:37 PM
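The defensive check described above can be sketched as follows; the path and the 100MB threshold are hypothetical, chosen only for illustration:

```powershell
$path     = 'C:\logs\big.log'   # hypothetical file
$maxBytes = 100MB               # arbitrary threshold for this sketch

$file = Get-Item $path
if ($file.Length -gt $maxBytes) {
    Write-Warning ("File is {0:N0} MB; streaming instead of loading it whole." -f ($file.Length / 1MB))
    # Leaving lines in the pipeline avoids building one huge array:
    Get-Content $path | ForEach-Object { <# process one line in $_ #> }
}
else {
    $lines = Get-Content $path   # small enough to hold as an array
}
```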
  • It would be interesting to run some tests as this article did to determine where things break down: Why is Get-ChildItem so Slow?
    Thursday, January 5, 2012 9:26 AM
  • It might not yet have failed due to needing a >2GB object. Unless there is an optimisation inside the Get-Content cmdlet (remembering Get-Content works for multiple providers, not just filesystem) it is likely to read content in a loop expanding the container each time.

    This tends to create lots of increasingly large garbage objects in the large object heap which will trigger swapping. This slows everything down well before trying to create a single object >2GB.


    Richard J Cox
    Thursday, January 5, 2012 12:10 PM
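The difference between growing one container and streaming can be sketched like this (file paths hypothetical). Assigning Get-Content output to a variable forces PowerShell to collect everything into a single array, while leaving the lines in the pipeline processes them one at a time:

```powershell
# Collects a single large array - memory scales with file size:
# $all = Get-Content 'C:\logs\big.log'

# Streams one line at a time - memory stays roughly flat:
Get-Content 'C:\logs\big.log' |
    Where-Object { $_ -match 'ERROR' } |
    Set-Content 'C:\logs\errors.log'

# -ReadCount emits batches of lines, reducing per-object pipeline overhead
# (-match applied to an array returns only the matching elements):
Get-Content 'C:\logs\big.log' -ReadCount 1000 |
    ForEach-Object { $_ -match 'ERROR' } |
    Set-Content 'C:\logs\errors.log'
```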
  • That is an interesting article that describes how PowerShell is
    dependent on the limitations of the implementation of the .NET
    framework.  At least PowerShell allows workarounds when you need to code
    around those limitations (like using cmd.exe based utilities in a pinch).
     
    Thursday, January 5, 2012 2:06 PM
  • I've noticed that reading a 220MB txt file will hit the 2GB limit.  I'm guessing Get-Content loads the log file into memory, then creates additional objects for each line.
    Monday, July 16, 2012 4:23 AM
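When a file is too large for Get-Content, one workaround along the lines discussed in this thread is to drop down to a .NET stream, which never holds more than one line in memory at a time. A sketch (path hypothetical):

```powershell
# Read a large file line by line without building an array of all lines.
$reader = New-Object System.IO.StreamReader 'C:\logs\big.log'
try {
    while (($line = $reader.ReadLine()) -ne $null) {
        if ($line -match 'ERROR') { $line }   # emit matching lines only
    }
}
finally {
    $reader.Dispose()   # always release the file handle
}
```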