Script to copy newest version of particular files in a list

  • Question

  • I'm trying to write a script that copies files between two remote servers. The files are all in one directory, but there are many similarly named files going back two weeks, and I only want the newest version of each file. There are 51 files I need, so I have to put them in a list.

    For example, here's what the directory would look like:

    • SERVER_Active_Full_201401081244.safe  (1/8/14)  (I only want this one)
    • SERVER_Active_Full_201401071244.safe  (1/7/14)
    • SERVER_Active_Full_201401061244.safe  (1/6/14)
    • SERVER_Unicode_Full_201401081247.safe  (1/8/14)  (I only want this one)
    • SERVER_Unicode_Full_201401071247.safe  (1/7/14)
    • SERVER_Unicode_Full_201401061247.safe  (1/6/14) 

    So, as with Active and Unicode above, there are 51 name varieties that I care about copying. However, there are others in the folder that I don't want to copy; that's why I'm creating a full list of all 51 files.

    I have a working script, but it takes files from all dates, not just the newest version. Basically, I'm looking for a way to incorporate this check into the script below so that it only grabs the newest of the files in the list:

    If ($item.LastWriteTime -gt ((Get-Date).AddDays(-1)))

    --------------------------------------------------------------------------

    # source and destination directory
    # (trailing backslashes so that $src_dir$file forms a valid path)
    $src_dir = "\\SERVER1\Test\"
    $dst_dir = "\\SERVER2\Test\"

    # list of files from source directory that I want to copy to destination folder
    # unconditionally
    $file_list = "SERVER_Active_Full*",
        "SERVER_Unicode_Full*",
        "SERVER_Test_Full*",
        <etc, etc>

    # Copy each file pattern unconditionally (regardless of whether or not the file is already there)
    foreach ($file in $file_list)
    {
        Copy-Item $src_dir$file $dst_dir
    }


    Thursday, January 9, 2014 3:04 PM


All replies

  • An approach you could try would be to list the files in $src_dir and sort by LastWriteTime. It sounds like you want the latest, which in this case would be the first result:

    Get-ChildItem -Path ($src_dir + "SERVER_Active_Full*") | Sort-Object -Property LastWriteTime -Descending

    If you toss the results in a variable, the first item [0] contains the most recent file. There is likely a nicer way to do this, but this is quick and dirty.

    $files = Get-ChildItem -Path ($src_dir + "SERVER_Active_Full*") | Sort-Object -Property LastWriteTime -Descending
    Copy-Item -Path $files[0].FullName -Destination $dst_dir
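    One such nicer way is to let Select-Object take the first result directly, so the whole thing is a single pipeline. A minimal sketch, assuming the same $src_dir and $dst_dir paths from the question:

    # Sort newest-first and keep only the first (most recent) match
    $newest = Get-ChildItem -Path ($src_dir + "SERVER_Active_Full*") |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -First 1

    # Copy just that newest file to the destination
    Copy-Item -Path $newest.FullName -Destination $dst_dir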



    Thursday, January 9, 2014 4:05 PM
  • I really need to list the files though. Of the newest files I only want 51, but there were 100 or so written today. These are DB backups that take place every day. I need to copy them off to a remote server, but I only care about the really critical DBs (the 51 I mentioned), and the latest versions of those 51.

    The beginning of the name is the same each day, and a date/time stamp is appended to the end of the file name (example: SERVER_DBNAME_Full_<timedate>.safe). That's why I need to determine the last write time and copy only the newest of the files that I want.
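    Incidentally, since that timestamp suffix appears to be in yyyyMMddHHmm form, it sorts lexicographically in date order, so sorting on Name would also find the newest file even if LastWriteTime were ever off. A sketch, assuming the naming holds for every database:

    # Newest file for one pattern, by the sortable timestamp in the name
    $newest = Get-ChildItem -Path "\\SERVER1\Test\SERVER_Active_Full*" |
        Sort-Object -Property Name -Descending |
        Select-Object -First 1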




    • Edited by agroda Thursday, January 9, 2014 4:14 PM
    Thursday, January 9, 2014 4:12 PM
  • So if I understand you, you want to list all the files, but only copy the files from today?

    If you list out the results of Get-ChildItem (alias "dir") it will show all the "SERVER_Active_Full*" files in the folder.

    Do the old files already exist in the destination? If you're making a "backup" or copy of the folders, a tool like Robocopy or SyncToy may work. They can be set to only copy the latest files, as well as display all the existing files.
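    For what it's worth, Robocopy's /MAXAGE switch can restrict a copy to recently modified files, and it accepts multiple filename patterns in one call. A rough sketch, assuming a one-day window and the sample patterns from the question:

    # Copy only files modified within the last day that match the patterns;
    # PowerShell expands the array to one argument per pattern
    $patterns = "SERVER_Active_Full*", "SERVER_Unicode_Full*"
    robocopy "\\SERVER1\Test" "\\SERVER2\Test" $patterns /MAXAGE:1

    Because the patterns live in an array, the command stays short even with 51 entries.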


    Thursday, January 9, 2014 4:18 PM
  • If I do this command, it will return ALL files written today. However, I don't want all of them; I only want 51 out of the 100 or so.

    Get-ChildItem -Path \\server1\test\ |
        where {$_.lastWriteTime -gt (Get-Date).addDays(-1)}

    This script allows me to copy all of the newest files and works correctly, but I don't want all of them.

    $uncA = "\\server1\test\"
    $uncB = "\\server2\test\"
    foreach ($item in (Get-ChildItem $uncA)) {
        If ($item.LastWriteTime -gt ((Get-Date).AddDays(-1))) {
            Copy-Item $item.FullName $uncB
        }
    }

    And the script I posted in the original post correctly gets just the files I want, but for all days. I guess I'm looking for a way to combine the two. I looked into using Robocopy, but because I need to feed it a list of files, I couldn't find a good way to do it without a huge command.

    Also, it probably wasn't clear in the first post, but the Unicode and Active are "sample" DB names.  There are really 100 of these, but again, only 51 that I want to copy...

    Thursday, January 9, 2014 4:24 PM
  • OK, so I'm confused.

    Here's how I understand your original script:

    • You have a list of wildcard filenames ($file_list). These are the database names without the date portion (the portion handled by the * wildcard).

    • For each database name, you copy DATABASENAME* from the source to the destination (i.e. you copy all the DATABASENAME files).

    • You have a piece of code to select only the files modified in the last day: If ($item.LastWriteTime -gt ((Get-Date).AddDays(-1)))

    • You want to incorporate this line into your existing script so that, instead of copying all files, it copies just the files this check selects.

    In this case you need to first retrieve the files so you can check the LastWriteTime before you copy the ones that match your selection.  Similar to what I suggested, you'll need to use Get-ChildItem:

    # source and destination directory
    $src_dir = "\\SERVER1\Test\"
    $dst_dir = "\\SERVER2\Test\"
    
    # list of files from source directory that I want to copy to destination folder
    $file_list = "SERVER_Active_Full*",
    	"SERVER_Unicode_Full*" ,
    	"SERVER_Test_Full*" ,
    	<etc, etc>
    	
    foreach ($file in $file_list) {
    	
    	$items = Get-ChildItem -Path $src_dir$file
    	
    	foreach ($item in $items) {
    
    		if ($item.LastWriteTime -gt ((Get-Date).AddDays(-1))) {
    		
    			Copy-Item $item.FullName $dst_dir
    		
    		} # if lastwritetime within the last day
    
    	} # foreach file
    
    } #foreach databasename
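
    If a backup ever slips outside the one-day window, a variant of the loop that always takes the single newest file per pattern (along the lines of the Sort-Object approach suggested earlier in the thread) may be more robust. A sketch:

    foreach ($file in $file_list) {
    	
    	# take the single newest match for this pattern, regardless of age
    	$newest = Get-ChildItem -Path $src_dir$file |
    		Sort-Object -Property LastWriteTime -Descending |
    		Select-Object -First 1
    	
    	if ($newest) {
    		Copy-Item $newest.FullName $dst_dir
    	}
    	
    } # foreach databasename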



    • Marked as answer by agroda Thursday, January 9, 2014 5:06 PM
    Thursday, January 9, 2014 4:40 PM
  • Yep, you got it. The basic reason we need to do this is that there are DBs which are much more critical to us than others in a full disaster situation. These are getting copied to a remote DR location, and I have to save as much bandwidth as possible.

    Your code is spot on, looks like it's working! Thanks so much!!  I kept trying to incorporate it into the foreach statement like that, but couldn't get it right.


    Thursday, January 9, 2014 5:06 PM
  • Ahh, that explains it. Glad it's working!

    Thursday, January 9, 2014 7:19 PM