Combine info to make new account names and export to CSV

  • Question

  • So my scripting isn't all that great, and I'm having a bit of trouble after searching and searching. I'm trying to take the names of users and make a new user name for an ADMT script that I'm working on. It would be adm-nuser, where n is the first letter of the first name and user is the user's last name. So far it works except for the export to CSV. It's writing the new information 3 times, and it's putting quotes around Target and UPN, both for the header and the output. Can someone please explain what I'm doing wrong?

    ###version 5
    $users = Get-ADUser -Filter {sAMAccountName -like "ahsx*"} | select givenname,surname,samaccountname
    
        foreach($X in $users)
         {
            $usersam = $X.samaccountname
            $Userlast = $X.surname
            $UserFirst = $X.givenname
    
            $UserIntfirst = $UserFirst.Substring(0,1)
    
            $adm = "adm-"
            $adm = $adm + $Userfirst + $Userlast
            $userprin = $adm + "@healthcare.org"
    
            $admnames | out-file 'usernames.txt' -Append
            
                $AllObjects = @()
    
                $final | ForEach-Object {
                    $AllObjects += [pscustomobject]@{
                        source = $x.samaccountname
                        target = $adm
                        upn = $userprin
    
                    }
                }
    
                $AllObjects | Export-Csv -Path ".\outfile.csv" -NoTypeInformation -Append -Encoding UTF8
        }
    


    If it answered your question, remember to “Mark as Answer”.

    If you found this post helpful, please “Vote as Helpful”.

    Postings are provided “AS IS” with no warranties, and confers no rights.

    Thursday, October 12, 2017 11:19 AM

All replies

  • Move the Export-Csv line to be outside of your ForEach loop, and that should stop the repeated records. As for the quotes, I believe that is the default behavior of Export-Csv, as some data in a CSV file must be quoted.

    You can see by doing a simple test that once you export to CSV, everything is quoted:

    Get-ADUser <username> -Prop * | Export-Csv C:\Temp\Testing.csv -NoTypeInformation

    If the double quotes are causing a problem, then you can do something like below, which I found in one of my old scripts:

    # Export-Csv adds double quotes around all data, so instead use ConvertTo-Csv
    # to convert the data, strip out the double quotes, and then write the result to the CSV file
    $results | Sort-Object -Property ID | ConvertTo-Csv -NoTypeInformation -Delimiter "`t" | ForEach-Object {
         $_ -replace '"', ""} | Out-File $outPutFile -Force -ErrorAction SilentlyContinue -ErrorVariable onError -Encoding ASCII
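    Putting the two suggestions together - collect the objects inside the loop and export once at the end - a rough, untested sketch (reusing the $users collection and the first-initial naming rule from the original post) would be:

    $allObjects = foreach ($x in $users) {
        # Build adm-<first initial><last name> as described in the question
        $adm = "adm-$($x.GivenName.Substring(0,1))$($x.SurName)"
        [pscustomobject]@{
            Source = $x.sAMAccountName
            Target = $adm
            UPN    = "$adm@healthcare.org"
        }
    }

    # Single export after the loop: one header row, no repeated records
    $allObjects | Export-Csv -Path .\outfile.csv -NoTypeInformation -Encoding UTF8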


    If you find that my post has answered your question, please mark it as the answer. If you find my post to be helpful in anyway, please click vote as helpful. (99,108,97,121,109,97,110,50,64,110,121,99,97,112,46,114,114,46,99,111,109|%{[char]$_})-join''



    Thursday, October 12, 2017 12:24 PM
  • Moving the export outside the loop only gets the last user; that's why I put it inside the ForEach loop.


    Thursday, October 12, 2017 12:55 PM
  • That makes no sense, since $AllObjects looks to be an array that you are adding to on each iteration. But also, re-looking at the code you have:

    $final | Foreach {} - This is where $AllObjects is being populated, but where does $final come from?



    Thursday, October 12, 2017 1:14 PM
  • AHH!! You are creating $AllObjects over and over again - that is why :-)

    Try something like below. It's not the entire script, as I see you are creating more variables that are never used.

    $allObjects = @()
    
    Get-ADUser -Filter "sAmAccountName -like 'ahsx*'" | Foreach {
      $adm = "adm-$($_.GivenName)$($_.SurName)"
      $AllObjects += [Management.Automation.PSObject]@{
        source = $_.sAmAccountName
        target = $adm
        upn = "$adm@healthcare.org"
      }
    }
    
    $AllObjects | Export-Csv -Path ".\OutFile.csv" -NoTypeInformation -Encoding UTF8



    Thursday, October 12, 2017 1:21 PM
  • The problem with that script is that it doesn't export the data; it just exports garbage. Exporting to CSV is something I always have trouble with. Here is what it outputs:

    IsReadOnly,"IsFixedSize","IsSynchronized","Keys","Values","SyncRoot","Count"
    False,"False","False","System.Collections.Hashtable+KeyCollection","System.Collections.Hashtable+ValueCollection","System.Object","3"
    False,"False","False","System.Collections.Hashtable+KeyCollection","System.Collections.Hashtable+ValueCollection","System.Object","3"
    False,"False","False","System.Collections.Hashtable+KeyCollection","System.Collections.Hashtable+ValueCollection","System.Object","3"



    Thursday, October 12, 2017 2:52 PM
  • Yeah, I was looking at that (sorry, a bit busy right now) and trying to figure it out. If you just output $allObjects without piping it to Export-Csv, then it displays the data correctly.


    Thursday, October 12, 2017 3:13 PM
  • Yeah, for the most part. I played around with it and sent the output to a text file instead; it didn't format it right, but it exported the data and at least it was readable. No worries, I appreciate the help!


    Thursday, October 12, 2017 3:24 PM
  • Try it this way

    $allObjects = @()
    
    Get-ADUser -Filter "sAmAccountName -like 'clay*'" | Foreach {
      $adm = "adm-$($_.GivenName)$($_.SurName)"
      $sAmAccountName = $_.sAmAccountName
      
      $obj = New-Object PSObject
      $obj | Add-Member -MemberType NoteProperty -Name "Source" -Value $sAmAccountName
      $obj | Add-Member -MemberType NoteProperty -Name "Target" -Value $adm
      $obj | Add-Member -MemberType NoteProperty -Name "UPN" -Value "$adm@healthcare.org"
      
      $allObjects += $obj
    }
    
    $allObjects | Export-Csv C:\Temp\OutFile.csv -NoTypeInformation -Encoding UTF8
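    As an aside on why the earlier attempt printed Hashtable metadata: casting a hashtable to [Management.Automation.PSObject] merely wraps it - the underlying object is still a Hashtable, hence the IsReadOnly/Keys/Values columns seen above - while a literal [pscustomobject] cast (as in the OP's own script) does turn the keys into note properties. So the Add-Member block above could equivalently be written as:

    # [pscustomobject] on a hashtable literal creates real note properties,
    # so Export-Csv emits Source/Target/UPN instead of Hashtable metadata
    $obj = [pscustomobject]@{
        Source = $sAmAccountName
        Target = $adm
        UPN    = "$adm@healthcare.org"
    }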



    Thursday, October 12, 2017 4:26 PM
  • How to build a customized output with PowerShell.

    Get-ADUser -Filter "sAmAccountName -like 'clay*'" -Properties GivenName,SurName | 
    	Select SamAccountName, @{n='Target';e={ "adm-$($_.GivenName)$($_.SurName)"}},
    			@{n='';e={ "adm-$($_.GivenName)$($_.SurName)@healthcare.org"}} |
    	Export-Csv C:\Temp\OutFile.csv -NoType


    \_(ツ)_/


    Thursday, October 12, 2017 9:35 PM
  • I always knew there was an easier way

    Friday, October 13, 2017 11:35 AM
  • I really wonder why people consider something that's much slower and much harder to read "easier"... Is it just because you're afraid of multiline commands?

    A much faster and imho easier to read solution would be this:

    $formattedUsers = foreach ($user in (Get-ADUser -Filter * -Properties GivenName, SurName)) {
        $formattedUser = [Ordered] @{}
        $formattedUser.Source = $user.sAMAccountName
        $formattedUser.Target = "adm-$($user.GivenName[0])$($user.SurName)"
        $formattedUser.UPN = "$($formattedUser.Target)@healthcare.org"

        [PSCustomObject] $formattedUser
    }

    $formattedUsers | Export-Csv -Path C:\Temp\test.csv -NoTypeInformation

    Also, a couple of small things with jrv's script are wrong (and fixed in my variation):

    -The object properties should be called "Source", "Target" and "UPN", not "SamAccountName", "Target" and ""

    -The OP said Target should be adm-<FirstLetterOfGivenName><SurName>, as opposed to the full GivenName.

    Ultimately my script does the exact same thing as jrv's (except for the above items, which have been "fixed"); I just don't rely on the pipeline as much as jrv does, especially not for constructing PSObjects, where it's incredibly slow and the expressions become very difficult to read.

    One small detail should be noted, though: in the script above, if the user has no GivenName, an error will be thrown because you can't index into a null array. I won't go into how to fix that, as it's incredibly easy and outside the scope of this question, but it's worth pointing out.
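    For illustration, one possible guard for that case (a sketch, not part of the original script):

    # Fall back to an empty initial when GivenName is not set,
    # so we never index into $null
    $initial = if ($user.GivenName) { $user.GivenName[0] } else { '' }
    $formattedUser.Target = "adm-$initial$($user.SurName)"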

    Friday, October 13, 2017 1:20 PM
  • Not slower. If you have been reading the PowerShell Team blog or the new book "PowerShell in Action" you would know why that is a bit of popular nonsense. A pipeline is faster for most things like this.

    Also, the "declarative" approach is much easier to build and works the way humans think. Let the PS engine worry about the details.

    The method you are showing will have memory issues because you are saving all results in memory before you export. On large result sets this can slow things down. Remember that adding an element to an array causes the array to be reallocated each time. A pipeline does not save any elements; each one is completed and passed on, so there should only be one element in the pipeline at a time. Yes, there is overhead, but combined with all of the other steps it nulls out. For small collections the pipeline will be slower, but the larger the number of items, the more the pipeline shows its efficiency.

    If you just prefer an old style of programming, that is OK if it gives you what you want. The newer style is more descriptive and can do most things.


    \_(ツ)_/

    Friday, October 13, 2017 1:30 PM
  • I'm not redimming an array at all in my solution, not directly anyway. Whether PowerShell internally does this is another thing entirely, but I honestly do not believe that something like...

    $p = Get-Service

    ...where, as is well known, Get-Service returns one object at a time ("yield return"), causes $p to be redimmed every time a new object is returned. I think PowerShell is smart enough to only create the array once all objects have been received, preventing redims. That being said, I haven't looked at the code, so I won't argue it either way.

    As for the pipeline, while it can at times be faster (if dealing with a huge number of objects that use up a lot of memory), more often than not it's actually much slower. Realistically, you only start seeing advantages of the pipeline when you're running out of memory and/or having to use the paging file.

    My test AD system has about 200 accounts. I ran "Get-ADUser -Filter * -Properties GivenName, SurName" 100x, appending the results of each query to a List<Object>. The rest of the code remained the same as above, with the exception that inside the foreach() I went through the objects in my List<Object>, effectively cycling through 20k users. This means I had 20k users stored in memory and also 20k PSCustomObjects stored in memory before outputting them to the Csv.

    Still, this was faster than running your method 100x, despite the fact that it only keeps one object in memory at any given point in time. And yet the results are the same.

    Were you to pull all properties out of AD, and assuming you weren't running this on a DC, your method would have probably been faster - it's Friday evening, not really going to test it now. But this is where the importance of filtering at source comes in. And that applies regardless of whether you use the pipeline or not.

    As for the "new coding approach": please don't tell me that it's easier to write and read

    Select SamAccountName, @{n='Target';e={ "adm-$($_.GivenName)$($_.SurName)"}},
    			@{n='';e={ "adm-$($_.GivenName)$($_.SurName)@healthcare.org"}}
     than it is to write/read 

    [PSCustomObject] @{
        Source = $user.sAMAccountName
        Target = "adm-$($user.GivenName[0])$($user.SurName)"
        UPN = "adm-$($user.GivenName[0])$($user.SurName)@healthcare.org"
    }

    First of all, n= and e= are abbreviations and shouldn't really be used. Second, there are too many things to remember to do properly there: the hashtable curly brackets, the n= and e= (even if you don't abbreviate them), the scriptblock for the expression, the ; separator, dealing with $_ which a lot of people still struggle with...
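    (For reference, n and e are short for Name and Expression - Label also works in place of Name - so the fully spelled-out form of one of those calculated properties is:)

    # The same calculated property with unabbreviated keys
    @{ Name = 'Target'; Expression = { "adm-$($_.GivenName)$($_.SurName)" } }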

    And while technically possible, if that expression were any harder the whole thing would be completely unreadable. (For example, I said that using GivenName[0] throws an exception if GivenName is null; if you wanted to put an if in there to account for that it's perfectly doable - it accepts a scriptblock, after all - but it would be even more unreadable.)


    Friday, October 13, 2017 2:36 PM
  • An array that is constantly being added to is being re-dimensioned.

    The way I usually do this for clarity is this:

    $properties = @(
    	'SamAccountName', 
    	@{ n = 'Target'; e = { "adm-$($_.GivenName)$($_.SurName)" } },
    	@{ n = ''; e = { "adm-$($_.GivenName)$($_.SurName)@healthcare.org" } }
    )
    
    Get-ADUser -Filter "sAmAccountName -like 'clay*'" -Properties GivenName,SurName | 
    	Select $properties |
    	Export-Csv C:\Temp\OutFile.csv -NoType

    Read some of the many good blogs and articles on functional programming, which is an implementation of the declarative programming paradigm.

    While PS is not a complete functional language, it allows for easy declarative modeling of problems. This is a powerful tool and approach to thinking about solutions, and it produces better, less convoluted code. I have been using this method for decades and it has always produced less code and fewer problems.


    \_(ツ)_/

    Friday, October 13, 2017 7:31 PM
  • So here is what I ended up going with...

    ###version 5
    $users = Get-ADUser -Filter {sAMAccountName -like "ahsx*"} | select givenname,surname,samaccountname
    
        foreach($X in $users)
         {
            $usersam = $X.samaccountname
            $Userlast = $X.surname
            $UserFirst = $X.givenname
    
            $UserIntfirst = $UserFirst.Substring(0,1)
    
            $adm = "adm-"
            $adm = $adm + $Userfirst + $Userlast
            $userprin = $adm + "@healthcare.org"
    
            $adm | out-file 'usernames.txt' -Append
              
    
                $AllObjects = @()
    
                $final | ForEach-Object {
                    $AllObjects += [pscustomobject]@{
                        sourcename = $x.samaccountname
                        targetsam = $adm
                        targetupn = $userprin
    
                    }
                    $AllObjects | Export-Csv -Path ".\outfile.csv" -NoTypeInformation -Append -Encoding ascii
    
                  
        
                }
        }
      (Get-Content -Path .\outfile.csv).Replace('"','') | Set-Content -Path .\newdata.csv



    Monday, October 16, 2017 7:34 AM
  • Well... I spoke too soon. The script above works, but I just realized that it should only convert the user names that contain ahsx and that are in the G_ServerAdmins group. I really suck at PS, especially when it comes to things like this. How do I go about it? It also needs to pick only the accounts with ahsx in them, because there are other accounts in the group, like service accounts. It would be nice if it could pick accounts that match either ahsx* or vend*.

    Sorry for the ignorance folks!

    If you could, give me a little insight into how you do it because I'm still trying to learn PS as best as I can...





    • Edited by Kelly Bush Monday, October 16, 2017 9:29 AM
    Monday, October 16, 2017 9:28 AM
  • And where do you see an array being redimensioned in my code?
    Monday, October 16, 2017 11:31 AM
  • Create an array either explicitly or implicitly and add elements. Each time you add an element, the array is recreated with the new element.

    $a = Get-Process

    $a is an array that is incrementally built from the pipeline output.

    Get-Process | Export-Csv ...

    does not create an array; it just writes each element to the file.


    \_(ツ)_/

    Monday, October 16, 2017 11:54 AM
  • That script isn't going to work. You cannot get properties like surname and givenname without referencing them.

    \_(ツ)_/

    Monday, October 16, 2017 12:07 PM
  • The above worked for what I was doing at the time.


    Monday, October 16, 2017 12:45 PM
  • I'm trying to do something like this now.

    $members = Get-ADGroupMember -Identity $group -Recursive | Select samaccountname | Export-Csv .\members.csv -NoTypeInformation
    
    $users = Import-Csv .\members.csv |%{Get-AdUser -filter "samaccountname -eq '$($_.samaccountname)'"} | Select SamAccountName
    
    
    ForEach ($user in $users) 
        {
        If ($user.samaccountname -like "ahsx*") 
            {
            $user.samaccountname | out-file ".\userlist.txt" -append
            } 
        }
    
    ###
    $users = $(Get-Content .\userlist.txt)
    
        foreach($x in $users){ 
         {$x = Get-ADUser $user -Properties samaccountname,surname,givenname}
    
            $usersam = $x.samaccountname
            $Userlast = $x.surname
            $UserFirst = $x.givenname
    
            $UserIntfirst = $UserFirst.Substring(0,1)
    
            $adm = "adm-"
            $adm = $adm + $Userintfirst + $Userlast
            $userprin = $adm + "@healthcare.org"
    
            $adm | out-file 'ahsx-usernames.txt' -Append
              
    
                $AllObjects = @()
    
                $final | ForEach-Object {
                    $AllObjects += [pscustomobject]@{
                        sourcename = $x.samaccountname
                        targetsam = $adm
                        targetupn = $userprin
    
                    }
                    $AllObjects | Export-Csv -Path ".\outfile.csv" -NoTypeInformation -Append -Encoding ascii
    
                  
          (Get-Content -Path .\outfile.csv).Replace('"','') | Set-Content -Path .\newdata.csv
                }
     }
    



    Monday, October 16, 2017 12:46 PM
  • But that is different from what you posted previously.


    \_(ツ)_/

    Monday, October 16, 2017 12:52 PM
  • I know, I realized that I can't do it the way I had started out with.  Different teams are going to have different prefixes on their new admin accounts.  So that's why I need to pull just the users in a group with AHSX.  That way I can convert just those users.  Server_Admins get pulled and converted to ADM- and Desktop_Admins get pulled and converted to DSK-.

    So that's my issue... 



    Monday, October 16, 2017 1:05 PM
  • $pre = 'AHSX'
    Get-AdUser -Filter "SamAccountName -like '$pre*'"


    \_(ツ)_/
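    Extending that idea to the per-group prefixes described above - group names, prefixes, and the ahsx filter are taken from this thread, so treat this as an untested sketch:

    # Map each admin group to the prefix its new accounts should get
    $groupPrefix = @{
        'G_ServerAdmins' = 'ADM-'
        'Desktop_Admins' = 'DSK-'
    }

    $results = foreach ($group in $groupPrefix.Keys) {
        $prefix = $groupPrefix[$group]
        Get-ADGroupMember -Identity $group -Recursive |
            Where-Object { $_.SamAccountName -like 'ahsx*' } |
            Get-ADUser -Properties GivenName, Surname |
            ForEach-Object {
                $new = "$prefix$($_.GivenName.Substring(0,1))$($_.Surname)"
                [pscustomobject]@{
                    Source = $_.SamAccountName
                    Target = $new
                    UPN    = "$new@healthcare.org"
                }
            }
    }

    $results | Export-Csv .\outfile.csv -NoTypeInformation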



    • Edited by jrv Monday, October 16, 2017 1:36 PM
    Monday, October 16, 2017 1:35 PM
  • This is what I finally got to work

    $members = Get-ADGroupMember -Identity $group -Recursive | Select samaccountname | Export-Csv .\members.csv -NoTypeInformation
    
    $users = Import-Csv .\members.csv |%{Get-AdUser -filter "samaccountname -eq '$($_.samaccountname)'"} | Select SamAccountName
    
    
    ForEach ($user in $users) 
        {
        If ($user.samaccountname -like "ahsx*") 
            {
            $user.samaccountname | out-file ".\userlist.txt" -append
            } 
        }
    
    ###
    $z = Get-Content .\userlist.txt 
    
        $results = foreach($x in $z) { 
        
      
        $array = Get-ADUser -identity $x -Properties * | select samaccountname,surname,givenname
        
            
            foreach($y in $array){
            $usersam = $y.samaccountname
            $Userlast = $y.surname
            $UserFirst = $y.givenname
    
            $UserIntfirst = $UserFirst.Substring(0,1)
    
            $adm = "adm-"
            $adm = $adm + $Userintfirst + $Userlast
            $userprin = $adm + "@healthcare.org"
    
            $adm | out-file 'ahsx-usernames.txt' -Append}
              
    
                $AllObjects = @()
    
                $final | ForEach-Object {
                    $AllObjects += [pscustomobject]@{
                        sourcename = $y.samaccountname
                        targetsam = $adm
                        targetupn = $userprin
    
                    }
                    $AllObjects | Export-Csv -Path ".\outfile.csv" -NoTypeInformation -Append -Encoding ascii
    
                  
          (Get-Content -Path .\outfile.csv).Replace('"','') | Set-Content -Path .\newdata.csv
                }
     
     }



    Monday, October 16, 2017 1:47 PM
  • Sorry, but I cannot make any sense out of your code. It is clear you do not understand what we have been trying to show you. Please take the time to learn PowerShell basics and the AD cmdlets. You cannot just keep guessing, and we cannot keep trying to unwind your guesses.

    Start simple and test each cmdlet at a prompt until you understand how it works. Once you understand the commands, look again at what was posted above and work out how it works.


    \_(ツ)_/

    Monday, October 16, 2017 1:54 PM
  • Well, I agree I need to learn it. However, in the meantime I have to do what I can with what I have. It's not like you can just plug into the Matrix and download this stuff. It works for what I need it to do, so I guess it is what it is. I appreciate the help you provided.


    Monday, October 16, 2017 1:56 PM
  • Ahh... no, $a is not incrementally built from the pipeline output. And here, I can prove it:

    $a = $null

    Measure-Command { $a = foreach ($i in 1..50000) {$i}}

    This takes 39 milliseconds to run on my computer

    $a = [Object[]]::new(0)
    Measure-Command {foreach ($i in 1..50000) {$a += $i}}
    

    Whereas this takes ... 40 seconds to run on the same computer.

    Hell, I first had the first command set to 10 million and it only took 7.5 seconds to run on my computer. All because the array did not need to be redimensioned - unlike what you said.

    But hey, let's not just base our results on "random timings something takes to run". Let's do some proper science and look under the hood:

    $a = $null
    Set-PSDebug -Trace 2
    $a = foreach ($i in 1..5) {$i}
    Set-PSDebug -Trace 0

    Here's the output:

    DEBUG:    1+ $a = foreach ($i in  >>>> 1..5) {$i}
    DEBUG:     ! CALL function '<ScriptBlock>'
    DEBUG:    1+  >>>> $a = foreach ($i in 1..5) {$i}
    DEBUG:     ! SET $foreach = 'IEnumerator'.
    DEBUG:    1+ $a = foreach ( >>>> $i in 1..5) {$i}
    DEBUG:     ! SET $i = '1'.
    DEBUG:    1+ $a = foreach ($i in 1..5) { >>>> $i}
    DEBUG:    1+ $a = foreach ( >>>> $i in 1..5) {$i}
    DEBUG:     ! SET $i = '2'.
    DEBUG:    1+ $a = foreach ($i in 1..5) { >>>> $i}
    DEBUG:    1+ $a = foreach ( >>>> $i in 1..5) {$i}
    DEBUG:     ! SET $i = '3'.
    DEBUG:    1+ $a = foreach ($i in 1..5) { >>>> $i}
    DEBUG:    1+ $a = foreach ( >>>> $i in 1..5) {$i}
    DEBUG:     ! SET $i = '4'.
    DEBUG:    1+ $a = foreach ($i in 1..5) { >>>> $i}
    DEBUG:    1+ $a = foreach ( >>>> $i in 1..5) {$i}
    DEBUG:     ! SET $i = '5'.
    DEBUG:    1+ $a = foreach ($i in 1..5) { >>>> $i}
    DEBUG:    1+ $a = foreach ( >>>> $i in 1..5) {$i}
    DEBUG:     ! SET $foreach = ''.
    DEBUG:     ! SET $a = '1 2 3 4 5'.

    Notice that $a gets set once.

    Now let's compare it with what happens when we actually redimension an array...

    $a = [object[]]::new(0)
    Set-PSDebug -Trace 2
    foreach ($i in 1..5) {$a += $i}
    Set-PSDebug -Trace 0

    Here's the output:

    DEBUG:    1+ foreach ($i in  >>>> 1..5) {$a += $i}
    DEBUG:     ! CALL function '<ScriptBlock>'
    DEBUG:     ! SET $foreach = 'IEnumerator'.
    DEBUG:    1+ foreach ( >>>> $i in 1..5) {$a += $i}
    DEBUG:     ! SET $i = '1'.
    DEBUG:    1+ foreach ($i in 1..5) { >>>> $a += $i}
    DEBUG:     ! SET $a = '1'.
    DEBUG:    1+ foreach ( >>>> $i in 1..5) {$a += $i}
    DEBUG:     ! SET $i = '2'.
    DEBUG:    1+ foreach ($i in 1..5) { >>>> $a += $i}
    DEBUG:     ! SET $a = '1 2'.
    DEBUG:    1+ foreach ( >>>> $i in 1..5) {$a += $i}
    DEBUG:     ! SET $i = '3'.
    DEBUG:    1+ foreach ($i in 1..5) { >>>> $a += $i}
    DEBUG:     ! SET $a = '1 2 3'.
    DEBUG:    1+ foreach ( >>>> $i in 1..5) {$a += $i}
    DEBUG:     ! SET $i = '4'.
    DEBUG:    1+ foreach ($i in 1..5) { >>>> $a += $i}
    DEBUG:     ! SET $a = '1 2 3 4'.
    DEBUG:    1+ foreach ( >>>> $i in 1..5) {$a += $i}
    DEBUG:     ! SET $i = '5'.
    DEBUG:    1+ foreach ($i in 1..5) { >>>> $a += $i}
    DEBUG:     ! SET $a = '1 2 3 4 5'.
    DEBUG:    1+ foreach ( >>>> $i in 1..5) {$a += $i}
    DEBUG:     ! SET $foreach = ''.

    I rest my case.
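    Side note: if you genuinely need to append inside a loop, a generic List avoids the reallocation that += causes on every iteration. A quick sketch:

    $list = [System.Collections.Generic.List[int]]::new()
    foreach ($i in 1..5) { $list.Add($i) }   # Add() grows the list in place - no new array per iteration
    $a = $list.ToArray()

    Same result as the engine-collected foreach, without rebuilding the array on every pass.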

    Monday, October 16, 2017 8:21 PM
  • This:

    Measure-Command { $a = foreach ($i in 1..50000) {$i}}

    proves the pipeline is faster.  The construct creates a pipeline where the others do not.  Collecting inside the loop always creates a standard array, whereas the pipeline creates a results array of the output type and then returns it, either as that type or cast to object[].

    Build the same code in C# using the host and it will be easier to see how that works.


    \_(ツ)_/

    Monday, October 16, 2017 8:29 PM
  • This:

    Measure-Command { $a = foreach ($i in 1..50000) {$i}}

    proves the pipeline is faster.  The construct creates a pipeline where the others do not.  Collecting inside the loop always creates a standard array, whereas the pipeline creates a results array of the output type and then returns it, either as that type or cast to object[].

    Build the same code in C# using the host and it will be easier to see how that works.


    \_(ツ)_/

    You're fricking mad. There's no pipeline being used there at all!

    But you want to see the results with a pipeline? Sure, let's do it:

    Measure-Command { $a = foreach ($i in 1..1000000) {$i}}

    That takes 760 milliseconds to run on my computer.

    Measure-Command { $a = 1..1000000 | % {$_}}

    That takes 4.7 seconds on the exact same computer.
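    For anyone who wants to reproduce this, here's the full three-way comparison, += included (note the deliberately smaller count on the += line - it gets painful quickly):

    Measure-Command { $a = foreach ($i in 1..100000) { $i } }            # collected by the engine
    Measure-Command { $a = 1..100000 | ForEach-Object { $_ } }           # per-object pipeline overhead
    Measure-Command { $a = @(); foreach ($i in 1..10000) { $a += $i } }  # reallocates the array on every iteration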

    This is where I stop arguing; there's just no point talking to you, mate. Even in the face of irrefutable evidence you refuse to admit you made a mistake. Instead it's easier to just keep slowly changing your version of events, right?

    First you said doing $a = Get-Service caused the array to be redimensioned for every service returned; now you change tack completely and start rambling about this proving the pipeline is faster, even though no pipeline is being used at all. Next you're going to tell me I'm holding it wrong, right?

    Monday, October 16, 2017 10:07 PM
  • All commands that produce output are considered to be pipelined.  The default pipeline is:

    <command> | Out-Default

    If you do not specify an output converter then Out-Default is used.  At least this is what Bruce Payette points out in his latest book.  Just running to the console is pipelined output.

    The results from a command in C# look like this:

    PSDataCollection<PSObject>

    or this:

    Collection<PSObject> PSOutput = PowerShellInstance.Invoke();

    A PS instance has a default pipeline.


    \_(ツ)_/




    • Edited by jrv Monday, October 16, 2017 10:17 PM
    Monday, October 16, 2017 10:10 PM
  • If I do this:

    $x = {
         Get-Process
         Get-Service
    }

    $results = $x.Invoke()

    all output will be returned because the execution sends all output to the default pipeline.


    \_(ツ)_/

    Monday, October 16, 2017 10:20 PM
  • All commands that produce output are considered to be pipelined.  The default pipeline is:

    <command> | Out-Default

    If you do not specify an output converter then Out-Default is used.  At least this is what Bruce Payette points out in his latest book.  Just running to the console is pipelined output.

    The results from a command in C# look like this:

    PSDataCollection<PSObject>

    or this:

    Collection<PSObject> PSOutput = PowerShellInstance.Invoke();

    A PS instance has a default pipeline.


    \_(ツ)_/




    Have you even tested Out-Default?

    It's not quite the same; here's an example to prove it:

    $a = 1..10
    
    $b = 1..10 | Out-Default
    
    $c = foreach($i in 1..10) {$i}
    
    $d = foreach($i in 1..10) {Out-Default -InputObject $i}

    Now check the values of variables $a, $b, $c and $d ...
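    To save anyone the trouble, this is what you should find:

    $a    # 1 2 3 4 5 6 7 8 9 10
    $b    # empty - Out-Default sent the numbers straight to the host, so nothing was assigned
    $c    # 1 2 3 4 5 6 7 8 9 10 - collected by the foreach statement
    $d    # empty - same reason as $b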

    Out-Default takes whatever objects are in the pipeline and sends the output to the default formatter.

    It is well known that things like Get-Service | Out-Host are useless, because Out-Default is the default command at the end of the pipeline.

    This means that after an instruction is executed, if there is still something in the pipeline it gets sent to Out-Default, which in turn just sends it to the console. But in the case of $a = Get-Service there is nothing left in the pipeline, whatever was left in the pipeline was assigned to $a, so Out-Default is never invoked.

    Do not confuse this "pipeline" with the | pipeline. The pipeline we're talking about here is similar to a yield return, only PowerShell will internally manage the objects and either pass them one at a time to the next instruction (if using the | pipeline), collect them all and cast them to an object[] at the very end in a situation where we assign the value of the instruction to a variable or just internally send it to Out-Default if none of the above were true.
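    You can watch the one-at-a-time behaviour directly. A quick illustration (Get-Numbers is just a made-up producer for the demo):

    function Get-Numbers { 1..3 | ForEach-Object { Write-Host "producing $_"; $_ } }
    Get-Numbers | ForEach-Object { Write-Host "consuming $_" }    # producing/consuming lines interleave
    foreach ($n in Get-Numbers) { Write-Host "consuming $n" }     # all "producing" lines first, then all "consuming"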

    However, the truth of the matter is that the | pipeline has a massive overhead. It's great in theory for objects that are incredibly large in memory where one might run out of memory or for situations where an instruction might take forever to run and you want to start processing the results before obtaining them all.

    An example: imagine Get-Service took 1 second to retrieve each service. If a computer has 40 services, it would take 40 seconds to complete Get-Service | Stop-Service (assuming Stop-Service takes no time at all and ignoring the overhead introduced by the | pipeline). But it would be stopping one service a second, immediately after each individual service is retrieved. If you assume Stop-Service actually outputs something, like the name of the service that was stopped, that means the user would see a new line coming up every second, giving them a very good indication that the command is running, what it is doing at a particular point in time and, in this particular case, how close it is to completing, since services are returned in alphabetical order.

    On the other hand, if you were to do foreach ($service in Get-Service) {Stop-Service -InputObject $service} the overhead would be much smaller because of the lack of the | pipeline dependency but... it wouldn't start cycling through the results of Get-Service until that instruction had fully completed. In other words, again assuming that Stop-Service was instant, it would just "sit there" for 40 seconds "doing nothing" and then instantly output that all services had been stopped - all at the same time. This can give the sense that the command is slower, or even stuck, because there is no visibility that anything is actually happening.
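    The difference in perceived responsiveness is easy to simulate with a deliberately slow producer (Start-Sleep standing in for the imaginary 1-second-per-service Get-Service):

    function Get-SlowItem { 1..5 | ForEach-Object { Start-Sleep -Seconds 1; $_ } }
    Get-SlowItem | ForEach-Object { "handled $_" }     # one line appears per second
    foreach ($i in Get-SlowItem) { "handled $i" }      # silent for ~5 seconds, then every line at once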

    However, do not confuse this with the pipeline being faster, it's not. Even on trivial things *not* using the pipeline is faster. In most cases the difference is negligible but in certain situations it starts to add up quite a bit.

    Monday, October 16, 2017 11:33 PM
  • If I do this:

    $x = {
         Get-Process
         Get-Service
    }

    $results = $x.Invoke()

    all output will be returned because the execution sends all output to the default pipeline.


    \_(ツ)_/

    Again, you're capturing the result, so Out-Default is not called. Whatever is inside the scriptblock is executed first, and whatever it returns is assigned to $results in one fell swoop. There is no "default pipeline".
    Monday, October 16, 2017 11:36 PM
    Yes, but as you just said, it is still a pipeline.


    \_(ツ)_/

    Monday, October 16, 2017 11:48 PM
    Yes, but as you just said, it is still a pipeline.


    \_(ツ)_/

    Clearly there had to be a "but" in there. Let's focus on the nomenclature being right, even though literally everything else you said was wrong. Let's forget that most people actually refer to this as "the pipe" and to | as "the pipeline", exactly to avoid this confusion, and because they are literally completely different things.

    But hey, at least you got the nomenclature right, so congratz for that!

    Tuesday, October 17, 2017 7:48 AM
  • You girls done now?

    If it answered your question, remember to “Mark as Answer”.

    If you found this post helpful, please “Vote as Helpful”.

    Postings are provided “AS IS” with no warranties, and confers no rights.

    Tuesday, October 17, 2017 3:35 PM
  • Hi,

    I'm checking in on how the issue is going - was your issue resolved?

    If the replies above are helpful, we would appreciate it if you would mark them as answers, and if you resolved it with your own solution, please share your experience and solution here. It will be very helpful to others who have the same question.

    We appreciate your feedback.

    Best Regards,
    Albert Ling

    Please remember to mark the replies as an answers if they help and unmark them if they provide no help.
    If you have feedback for TechNet Subscriber Support, contact tnmff@microsoft.com.

    Thursday, October 19, 2017 9:17 AM