VB Script - Find newest create date

  • Question

  • I have a script that checks for the newest file creation date.

    	compliant = 0

    	objAppendFile.WriteLine("Checking for newest File")
    	For Each objFile in objFSO.GetFoldeR(fullPath).Files
    		If DateDiff(d, Now, objFile.DateCreated) < 2 Then
    			compliant = 1
    			break
    		End If
    	Next

    	If compliant = 1 Then objAppendFile.Write(fullPath & " passes")

    It works just fine.  But since all I really want to know is whether there is a file created on the day the script runs, or one day older, is there a faster method?  Going through 50k or so files on each iteration takes a long time and chews up resources.

    Friday, August 12, 2011 9:30 PM

Answers

  • By far the fastest way is to run the console command

    dir  /od  "Path\File.ext"

    then grab the last file name the command reports and use your script to check its date.

    Note also that the line
        If DateDiff(d, Now, objFile.DateCreated) < 2 Then
    won't work. It should read
        If DateDiff("d", Now, objFile.DateCreated) < 2 Then

    Friday, August 12, 2011 9:55 PM

All replies

  • If you're prepared to switch to PowerShell, this is how easy it is (and fast, too):

     

    cd c:\somedir
    $lastwritten = 0
    foreach ($file in (gci -ea silentlycontinue)) {if ($file.lastwritetime -gt $lastwritten) {$newestfile = $file.name; $lastwritten = $file.lastwritetime}}
    "The newest file is $newestfile, written on $lastwritten" 
    

     Note that this can easily be modified to include subdirectories, simply by using (gci -r).  I ran this from the root of my C: drive, and it took less than 30 seconds to show me the newest file on the entire drive!


    • Edited by Bigteddy Saturday, August 13, 2011 9:00 AM
    Saturday, August 13, 2011 5:08 AM
  • Here is a demo of some new and very useful features of PowerShell.

    Run this and try to explain to yourself what is happening.

    GCI c:\ -r -ea silentlycontinue| ForEach-Object `
      -begin{
        $VerbosePreference='continue'
        Write-Verbose "Begin processig"
        $newestfile=''; 
        $lastwritten =0 
       } `
      -process{
        if($_.lastwritetime -gt $lastwritten){
         Write-Verbose "Newer file found $($_.name)"
         $newestfile = $_.name 
         $lastwritten = $_.lastwritetime
        }
       } `
      -end{
        "The newest file is $newestfile, written on $lastwritten"
        $VerbosePreference='silentlycontinue'
      }
    
    
    

    Some CmdLets are really just empty shells.  When we use ForEach-Object we are really calling an empty CmdLet and giving it a custom 'process' block.  Where-Object {} is the same as Where -process {}.  We can also add begin and end blocks.

    In long iterative processes like this it is more efficient to use the pipeline instead of foreach ($x in $y), especially on multi-processor systems.  I suspect the next version of PowerShell will be even more efficient at parallelism in the pipeline.  (This is not the best example of this.)

    Type: help foreach -full

    It is likely that your C: drive scan was very fast due to the indexer (MSSearch).  For file information such as names and dates, the indexer accelerates the search by many magnitudes.  Turn off the indexer and rerun to see the difference.

    You can add custom attributes to the indexer's indexes.

    jv
    Saturday, August 13, 2011 7:54 AM
  • In long iterative processes like this it is more efficient to use the pipeline instead of foreach ($x in $y), especially on multi-processor systems.  I suspect the next version of PowerShell will be even more efficient at parallelism in the pipeline.  (This is not the best example of this.)

     


    jv


    I haven't studied the code yet, but just quickly trying it against mine, yours was consistently slower (31 secs) vs. mine (26 seconds). I have a quad-core processor. (Indexing is on)  (I didn't actually time it before, I just guessed, but I was a bit keen with 10 seconds!)

    Will study interesting code and come back to you.



    Saturday, August 13, 2011 8:46 AM

  • I haven't studied the code yet, but just quickly trying it against mine, yours was consistently slower (31 secs) vs. mine (26 seconds). I have a quad-core processor. (Indexing is on)  (I didn't actually time it before, I just guessed, but I was a bit keen with 10 seconds!)

    Will study interesting code and come back to you.



    Out of curiosity: How long does it take for the same folder to run the command

    dir /od /b

    while in a Command Prompt?

    Saturday, August 13, 2011 9:16 AM
  • Out of curiosity: How long does it take for the same folder to run the command

    dir /od /b

    while in a Command Prompt?


    I ran dir /od /b /s, which is the equivalent.  It took 1 minute 15 seconds to complete. (?)
    Saturday, August 13, 2011 9:27 AM
  • I haven't studied the code yet, but just quickly trying it against mine, yours was consistently slower (31 secs) vs. mine (26 seconds). I have a quad-core processor. (Indexing is on)  (I didn't actually time it before, I just guessed, but I was a bit keen with 10 seconds!)


    I will check that.  In some cases the pipeline definitely does not allow for parallelism.  It depends on whether the segment uses IO or not, I believe.  It is clear that this version of PowerShell will block more often, as support for multiple threads is apparently not yet implemented on all CmdLets.  My example will certainly block and likely be slower.

    I am still trying to track down all of the rules of this.

    The code was posted as an interesting and potentially helpful approach to writing searches.  Not the pipeline as much as the use of the ForEach shell to synthesize an Advanced Function.

    If you are worried about a little performance then it will still work the way you like.

    foreach ($file in (gci -r -ea silentlycontinue)) | ForEach-Object `
     -begin{
      $VerbosePreference='continue'
      Write-Verbose "Begin processig"
      $newestfile=''; 
      $lastwritten =0 
      } `
     -process{
      if($_.lastwritetime -gt $lastwritten){
       Write-Verbose "Newer file found $($_.name)"
       $newestfile = $_.name 
       $lastwritten = $_.lastwritetime
      }
      } `
     -end{
      "The newest file is $newestfile, written on $lastwritten"
      $VerbosePreference='silentlycontinue'
     }
    
    
    


     


    jv
    Saturday, August 13, 2011 9:56 AM
  • foreach ($file in (gci -r -ea silentlycontinue)) | ForEach-Object `
     -begin{
      $VerbosePreference='continue'
      Write-Verbose "Begin processing"
      $newestfile=''
      $lastwritten = 0
     } `
     -process{
      if($_.lastwritetime -gt $lastwritten){
       Write-Verbose "Newer file found $($_.name)"
       $newestfile = $_.name
       $lastwritten = $_.lastwritetime
      }
     } `
     -end{
      "The newest file is $newestfile, written on $lastwritten"
      $VerbosePreference='silentlycontinue'
     }

     


    jv

    I don't know what to say.  This code doesn't work.  The foreach construct is wrong.  jv, ?
    Saturday, August 13, 2011 10:43 AM
  • I ran dir /od /b /s, which is the equivalent.  It took 1 minute 15 seconds to complete. (?)

    Teddy - it is not the same command.  You are asking it to sort the results with /od.  The PowerShell version just scans.

    You also need to suppress the output or it will take forever.

    Measure-Command {cmd /c dir c:\ /s > nul:}

    Mine takes 16 seconds.

    This in PowerShell using your 'faster' method takes 37 seconds.

    measure-command {
        foreach($file in (gci c:\ -r -ea silentlycontinue)){
             if ($file.lastwritetime -gt $lastwritten){
                 $newestfile = $file.name
                 $lastwritten = $file.lastwritetime
             }
        }
       "The newest file is $newestfile, written on $lastwritten"
    }

    The DOS dir command will always be faster as it goes directly to the API and has been optimized for years.  I am pretty sure it uses the indexer when it can.  PowerShell has to go through the Framework to get to and from the API.  The Framework introduces a specific performance penalty.  The CLR maintains its isolation from unmanaged code, and crossing that boundary is a bit expensive.

    jv
    Saturday, August 13, 2011 10:57 AM
  • On further testing, there isn't much to choose between them.  What is interesting is that if I modify my code slightly to write out each time the If statement is true, the difference in the behaviour of the two programs becomes apparent.  With this code:

     

    cd c:\
    $lastwritten = 0
    foreach ($file in (gci -r -ea silentlycontinue)) {if ($file.lastwritetime -gt $lastwritten) {$newestfile = $file.name; $lastwritten = $file.lastwritetime; `
    Write-Host "Found newer file $($file.name)"}}
    "The newest file is $newestfile, written on $lastwritten" 
    

    ...there is a long pause while the (gci -r) executes, and no output, until the whole lot is in memory, then the names get spat out.  With this code:

     

     

    GCI c:\ -r -ea silentlycontinue| ForEach-Object `
     -begin{
     $VerbosePreference='continue'
     Write-Verbose "Begin processig"
     $newestfile=''; 
     $lastwritten =0 
     } `
     -process{
     if($_.lastwritetime -gt $lastwritten){
      Write-Verbose "Newer file found $($_.name)"
      $newestfile = $_.name 
      $lastwritten = $_.lastwritetime
     }
     } `
     -end{
     "The newest file is $newestfile, written on $lastwritten"
     $VerbosePreference='silentlycontinue'
     }
    
    
    ...the names of newly found files are printed out immediately, because you are piping gci straight to the customized ForEach-Object cmdlet.

     


    Saturday, August 13, 2011 10:58 AM
  • I don't know what to say.  This code doesn't work.  The foreach construct is wrong.  jv, ?

    Yup - that is what I get for typing without thinking first.

    I was going to do something else, but too late now.

     


    jv
    Saturday, August 13, 2011 11:00 AM
  • The DOS dir command will always be faster as it goes directly to the API and has been optimized for years.  I am pretty sure it uses the indexer when it can.  PowerShell has to go through the Framework to get to and from the API.  The Framework introduces a specific performance penalty.  The CLR maintains its isolation from unmanaged code, and crossing that boundary is a bit expensive.

     


    jv

    Point taken, but the DOS dir command doesn't exactly answer the OP's question, does it now?
    Saturday, August 13, 2011 11:08 AM
  • On further testing, there isn't much to choose from between them.  What is interesting is that if I modify my code slightly to write out each time the If statement is true, the difference in the behaviour of the two programs becomes apparent.  With this code:

     

    Just comment out the $VerbosePreference settings and it will run the same and should be a bit faster.  Comment out all output statements.

    I have tried it on three machines and mine always runs an average of 500ms to 1.5 seconds faster.

    I will try against a much larger machine later.  I say the pipeline is faster for most things on a multiprocessor machine.

    Back to the topic though. 

    Using the pipeline and ForEach-Object or Where-Object adds many features to the processing that can help, since these CmdLets are really just shells that you can customize.  ForEach-Object always returns an object or nothing, and Where-Object forces the output depending on what you return: returning null or $false will cause the object in the pipeline to be thrown away, and returning $true will cause the object to be passed.

    Direct use of the Windows Search index would be the fastest, assuming that the files are indexed on the OP's system.  They are not by default on most servers, but the service can easily be enabled on one folder set.

    Windows 7 most often has the indexer installed and running.

    50,000 files are a lot of files to scan.  It can take a very long time on a busy server.  Using a WMI event would be fastest.  The event would just record the creation time of the last file that met the naming criteria.  The data would always be up to date.  Put it in the registry and just query the registry to find out.
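
    A rough sketch of that idea in VBScript (the registry key and value names here are only placeholders):

        ' Sketch: record each new file's creation time in the registry.
        Set oShell = CreateObject("WScript.Shell")
        Set wmi = GetObject("winmgmts:\\.\root\cimv2")
        query = "SELECT * FROM __InstanceCreationEvent WITHIN 5 " _
            & "WHERE TargetInstance ISA 'Cim_Datafile' " _
            & "AND TargetInstance.Path = '\\Scripts\\' AND TargetInstance.Drive = 'C:'"
        Set oEvents = wmi.ExecNotificationQuery(query)
        Do
            Set evnt = oEvents.NextEvent()
            ' "HKLM\Software\MyCompany\LastFileCreated" is a placeholder value name
            oShell.RegWrite "HKLM\Software\MyCompany\LastFileCreated", _
                evnt.TargetInstance.CreationDate, "REG_SZ"
        Loop

    The daily check script then just reads that value back with oShell.RegRead and compares it to Now, with no folder scan at all.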

     

     


    jv
    Saturday, August 13, 2011 11:15 AM
  • It works just fine.  But since all I really want to know is whether there is a file created on the day the script runs, or one day older, is there a faster method?  Going through 50k or so files on each iteration takes a long time and chews up resources.

     

    Is there a naming convention that uses the date or is it some kind of serial number?  If there is a name that has a date in it then it narrows down the search.

    "file20110811.zip" - it could be more complex but this will work.

    dir file201108*

    This would select a very small subset of all files.  This is fast even when the folder has 50000 files, as it uses only the directory and doesn't really need to use the creation date (also in the directory).

    Unfortunately you cannot use wildcards with the FSO.  You can use PowerShell or the Shell Folders namespace.  I recommend PowerShell as it is easier.
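
    If you must stay with the FSO, a sketch of a workaround is to fake the wildcard by testing the name prefix yourself (the prefix format is assumed from the example above; note this still enumerates every file, so it only saves the date logic, not the scan):

        ' Sketch: emulate "dir file201108*" with the FSO by testing the name prefix.
        Set objFSO = CreateObject("Scripting.FileSystemObject")
        prefix = "file" & Year(Date) & Right("0" & Month(Date), 2)   ' e.g. "file201108"
        For Each objFile In objFSO.GetFolder("C:\SomeFolder").Files   ' placeholder path
            If LCase(Left(objFile.Name, Len(prefix))) = LCase(prefix) Then
                WScript.Echo objFile.Name & " created " & objFile.DateCreated
            End If
        Next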

     

     


    jv
    Saturday, August 13, 2011 11:24 AM
  • Point taken, but the DOS dir command doesn't exactly answer the OP's question, does it now?


    The DOS dir command was not my idea, was it?

    Do this:

    $results= cmd /c dir file* /od

    If the OP chooses the wildcard filter to narrow the list of files to only those from the current day or two, then $results[-1] will hold the newest file and it will be extremely fast.

    PowerShell and DOS have a huge time penalty when writing to the screen.  The priority of text written to the screen is extremely low on the IO priority list.  Everything slows it down.  Since this is synchronous, we are actually measuring, in part, the IO load when we measure either DOS or PowerShell with screen output.  Assigning results to a variable is much faster.  Assigning to the bit bucket is even faster.  It all depends what you want to measure.

     


    jv
    Saturday, August 13, 2011 11:34 AM
  • I think the OP must come back.  This has gone a bit flat for me.
    Saturday, August 13, 2011 11:37 AM
  • This is the fastest technique so far.

    I searched through more than 50,000 files in less than 3 seconds.

    GCI \\server1\public -r -filter 'daily*' |
         Where-Object{$_.CreationTime -gt (Get-Date).AddDays(-1)} |
         Sort-Object CreationTime |
         Select-Object -last 1

    Searches and returns only files created within the previous 24 hours that match the search pattern.  The file I retrieved was buried very deep in the hierarchy.

    Which one of these techniques should be used depends on the structure of the file names.  The Where-Object method would be faster than matching every date.

     


    jv
    Saturday, August 13, 2011 12:03 PM
  • ..., what about the FileSystemWatcher?  I found this code, and it works, but I want to modify it to notify on all file change events, not just created:


    Now you are cooking with onions and garlic.

     Let us throw in some jalapenos.

    http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx

    This is one of the techniques I mentioned above.  It parks a watcher on a folder and alerts whenever something happens.  The big drawback is that it has to be kept running.  The solution is to park it on a scheduled task that runs once on system startup and uses an account that can receive events then send the result to a file, email or the registry.

    I prefer the WMI event system as it gets installed once. Runs under WMI and can execute script, send mail or write to the registry.  I'll pull one of my MOFs and clean it up for you.

     

     


    jv

    • Edited by jrv Saturday, August 13, 2011 3:41 PM
    Saturday, August 13, 2011 3:40 PM
  • Bump, see my modified code...
    Saturday, August 13, 2011 3:41 PM
  • This is one of the techniques I mentioned above.  It parks a watcher on a folder and alerts whenever something happens.  The big drawback is that it has to be kept running.  The solution is to park it on a scheduled task that runs once on system startup and uses an account that can receive events then send the result to a file, email or the registry.

     


    jv

    I'm quite pleased with this solution, because although it runs all the time, so what?  So do all your services.  All it does, surely, is register event handlers and use the file system API to monitor events and react to them.  So it's only doing work when an event occurs, really.

    I find it works really well.

    I bet you couldn't do that with VBS!


    Saturday, August 13, 2011 3:51 PM
  • gci c:\ | sort lastwritetime | select -Last 1
    

    Keep it simple.

    Rich Prescott | MCITP, MCTS, MCP

    Blog | Twitter: @Arposh | Powershell Client System Administration tool
    Saturday, August 13, 2011 4:01 PM
    Moderator
  • gci c:\ | sort lastwritetime | select -Last 1
    

    Keep it simple.

    Rich Prescott | MCITP, MCTS, MCP

    Blog | Twitter: @Arposh | Powershell Client System Administration tool


    Hi Rich,

    That's putting us back on planet earth!  I like it.  But I also like my FileSystemWatcher solution, as this seems more complete.  But the OP hasn't exactly explained what he really wants, so we're kind of groping around in the dark here, and maybe having a bit of fun on the side!

    What really suits will be what answers the OP's question, and I'm not sure if he wants to simply check just once for the last write date, like your code, or to continuously monitor a set of folders and log changes (my code).


    Saturday, August 13, 2011 4:12 PM
  • I bet you couldn't do that with VBS!

    Here is the VBS solution.

    strComputer = "."
    Set wmi=GetObject("winmgmts:\\" & strComputer & "\root\cimv2")
    
    query="Select * From __InstanceCreationEvent Within 2" _
       & "Where TargetInstance Isa 'Cim_Datafile' " _
       & "And TargetInstance.Path = '\\Scripts\\'"
    
    Set oEvents=wmi.ExecNotificationQuery(query)
    
    Do
       Set evnt = oEvents.NextEvent()
       WScript.Echo evnt.TargetInstance.Name
    Loop
    

    This will list each new file that is created in the scripts folder.


    jv
    Saturday, August 13, 2011 4:58 PM
  • gci c:\ | sort lastwritetime | select -Last 1

    Keep it simple.

    Rich Prescott | MCITP, MCTS, MCP

    Blog | Twitter: @Arposh | Powershell Client System Administration tool


    Been there done that.  See my earlier post on using the sort.

    Problem: the OP says 50000+ files.  I tested that on a drive with about 40000 files and it took forever.

    Add -recurse on a large drive and try it.  The sort can be a killer.

    The fastest is this variation (also posted earlier):

    gci c:\ -r -include 'file2011*' | sort CreationTime | select -last 1

    By adding a filter that narrows the search in a way that the NTFS directory can be used, we can accelerate the search by a factor of 10 to 100.  Run timings on very large drives with high directory fragmentation and I think you will see what I mean.

    In the past I have always used either WMI event MOFs or the FileSystemWatcher to do this, as it is very efficient and silent.  No need for a script once it is installed (MOF).

    The FileSystemWatcher has been around since Microsoft first added eventing to Windows NT.  In the early days we could only access it via C/C++.  AT&T has a very complex data loader that runs under a custom FSW that I designed back in 1998.  It detects files FTP'ed into a very large data warehouse system; they are loaded into the data warehouse and into an ESSBASE cube that reports on call completion statistics for an AT&T CS call center.

    I don't know if they still use it.  They were, last I talked to someone over there.  No one remembers where it came from or how it works.  It just works.

    Now we can do this with a MOF, which is much easier.  The AT&T system built it on a service that used the FSW eventing (actually it's SENS).  The event log monitor also runs on SENS.  SENS has been part of NT since W2K.


    jv
    Saturday, August 13, 2011 5:12 PM
  • If you're really bored, and you want to see what your computer does while you stare blankly at it, run this script:

    $folder = 'c:\'
    $filter = '*.*'
    $fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{IncludeSubdirectories = $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
    $onCreated = Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
        $name = $Event.SourceEventArgs.Name
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        Write-Host "The file '$name' was $changeType at $timeStamp" -fore green}
    $onDeleted = Register-ObjectEvent $fsw Deleted -SourceIdentifier FileDeleted -Action {
        $name = $Event.SourceEventArgs.Name
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        Write-Host "The file '$name' was $changeType at $timeStamp" -fore red}
    $onChanged = Register-ObjectEvent $fsw Changed -SourceIdentifier FileChanged -Action {
        $name = $Event.SourceEventArgs.Name
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        Write-Host "The file '$name' was $changeType at $timeStamp" -fore blue}
    
    

    To stop all this output, type:

    Unregister-Event Filechanged
    Unregister-Event Filecreated
    Unregister-Event Filedeleted
    


    Saturday, August 13, 2011 5:27 PM
  • Teddy - can we move this to a discussion?  I am sorry.  It's my fault because I posted slightly off-topic posts and should have moved it back then.

    Let's move it to a discussion.  It is a good discussion, and your explorations are discovering a lot of things that many people learning Windows and PowerShell would love to get in their mail as you add items.

    I have started a discussion on the WMI CodeCreator utility from Microsoft and WMI tools.  I posted a link you might find interesting.

     


    jv
    Saturday, August 13, 2011 5:43 PM
  • I saw that, we can take it from there.  Cheers, jv.
    Saturday, August 13, 2011 5:57 PM
  • Thanks for all the help, guys.  I unfortunately can't use PowerShell at this time.  Yes, I know I need to move in that direction, but at this time the full script will be running on a bunch of W2k3 boxes without PowerShell installed.  Also, it needs to be a run-once-a-day type of scheduled task.  If I ran it as a stream checking for new files I'd quickly become overwhelmed.  3-4 times a day, at random intervals, the directory is written to with about 60-100 files per task.  I just need a snapshot in time.

     

    I don't even care what the file is, just that it has been created sometime in the last 24 hours.

     

    Monday, August 15, 2011 1:10 PM
  • So, to clarify your need, you want a script that

    1. Will search through a directory (or directories?) for files periodically,

    2. Report on all files that have been created in the last 24 hours.

    3. You don't want to know about changed files, or deleted files, just new files.


    Monday, August 15, 2011 1:31 PM
  • Given that you have so many files to search through, the general consensus was that iterating through them would not be the most efficient way of doing it.

    Following is jrv's vbs code that uses WMI eventing to report on files that are created:

    strComputer = "."
    Set wmi=GetObject("winmgmts:\\" & strComputer & "\root\cimv2")
    
    query="Select * From __InstanceCreationEvent Within 2" _
     & "Where TargetInstance Isa 'Cim_Datafile' " _
     & "And TargetInstance.Path = '\\Scripts\\'"
    
    Set oEvents=wmi.ExecNotificationQuery(query)
    
    Do
     Set evnt = oEvents.NextEvent()
     WScript.Echo evnt.TargetInstance.Name
    Loop
    
    

    Monday, August 15, 2011 1:45 PM
  • I don't even want that much.  I just want to know if there is a file that was created within the last 24 hours. 
    Monday, August 15, 2011 1:57 PM
  • Send the output to a file and you will always have the latest file date at the end of the file.

    strComputer = "."
    Set wmi=GetObject("winmgmts:\\" & strComputer & "\root\cimv2")
    
    query="SELECT * FROM __InstanceCreationEvent WITHIN 5" _
       & "WHERE TargetInstance Isa 'Cim_Datafile' " _
       & "AND TargetInstance.Path = '\\Scripts\\' " _
       & "AND TargetInstance.Drive='C:'"
    
    Set events=wmi.ExecNotificationQuery(query)
    
    Do
       Set evnt = events.NextEvent()
       WScript.Echo evnt.TargetInstance.Name, evnt.TargetInstance.CreationDate
    Loop
    
    
    

    I have added the file date at the end.

    What you are asking for is kind of vague, and you have not provided an example of what you are trying to do.

    Please try to understand that this is a scripting forum.  It is for people who are trying to learn scripting or who have an issue with a script that they have created.  It is not a scripts-on-demand forum.

    This thread has offered and discussed many approaches to your problem.  You need to try each suggestion, or provide more information and a copy of the code you are trying to use.

     

     


    jv
    Monday, August 15, 2011 2:21 PM
  • I agree with Bigteddy's approach.  I would modify his sample a little to open a flag file to contain the time the file event happened, and then have your test script read that time when you want to check it.  You can also extend the sample time in his script to test less often, say every minute (60 seconds) or ten minutes (600 seconds), something like this ...

    strComputer = "."
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    Set wmi=GetObject("winmgmts:\\" & strComputer & "\root\cimv2")

    query="Select * From __InstanceCreationEvent Within 60 " _
    & "Where TargetInstance Isa 'Cim_Datafile' " _
    & "And TargetInstance.Path = '\\Scripts\\'"

    Set oEvents=wmi.ExecNotificationQuery(query)

    Do
       Set evnt = oEvents.NextEvent()
       objFSO.OpenTextFile("D:\someplace\flag.txt", 2,true).writeline Now
    Loop

    Then your test script could look something like this ...

    bCompliant = False
    objAppendFile.WriteLine("Checking for newest File")
    If objFSO.FileExists("D:\someplace\flag.txt") Then
      strDate = objFSO.OpenTextFile("D:\someplace\flag.txt", 1).ReadLine
      bCompliant = (DateDiff("d", CDate(strDate), Now) < 2)
    End If

    If bCompliant Then objAppendFile.Write(fullPath & " passes")

    I have assumed the existence of the objFSO and the other file object you showed earlier.

    The first script sits idle until it tests for an event and runs forever.  I would expect it to use very little CPU time.  So, I think it best meets your requirements.

    HTH,


    Tom Lavedas
    Monday, August 15, 2011 2:30 PM
    Moderator
  • Perhaps we should start afresh. You claimed that your code

    compliant = 0
    objAppendFile.WriteLine("Checking for newest File")
    For Each objFile in objFSO.GetFoldeR(fullPath).Files
       If DateDiff(d, Now, objFile.DateCreated) < 2 Then 
    	 compliant = 1
    	 break
       End If
    Next
    

    'works'. Pegasus spotted the "d" and there is no "break" in VBScript. Did you test the code with "d" and "Exit For"? Is that still too slow?

    Staying within VBScript, your strategy (looking at each file in principle, but breaking on the first one meeting the filter/search criteria) seems optimal to me; sorting or monitoring are of no use in your scenario.

    Did you test Pegasus' proposal to use dir /o:d?
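
    For reference, here is the loop with both corrections applied:

        compliant = 0
        objAppendFile.WriteLine("Checking for newest File")
        For Each objFile In objFSO.GetFolder(fullPath).Files
            If DateDiff("d", Now, objFile.DateCreated) < 2 Then
                compliant = 1
                Exit For
            End If
        Next

        If compliant = 1 Then objAppendFile.Write(fullPath & " passes")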

    Monday, August 15, 2011 2:30 PM
  • I agree with Bigteddy's approach.  I would modify his sample a little to open a flag file to contain the time the file event happened, and then have your test script read that time when you want to check it.  You can also extend the sample time in his script to test less often, say every minute (60 seconds) or ten minutes (600 seconds), something like this ...

    strComputer = "."
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    Set wmi=GetObject("winmgmts:\\" & strComputer & "\root\cimv2")

    query="Select * From __InstanceCreationEvent Within 60 " _
    & "Where TargetInstance Isa 'Cim_Datafile' " _
    & "And TargetInstance.Path = '\\Scripts\\'"

    Set oEvents=wmi.ExecNotificationQuery(query)

    Do
       Set evnt = oEvents.NextEvent()
       objFSO.OpenTextFile("D:\someplace\flag.txt", 2,true).writeline Now
    Loop

    Then your test script could look something like this ...

    bCompliant = False
    objAppendFile.WriteLine("Checking for newest File")
    If objFSO.FileExists("D:\someplace\flag.txt") Then
      strDate = objFSO.OpenTextFile("D:\someplace\flag.txt", 1).ReadLine
      bCompliant = (DateDiff("d", CDate(strDate), Now) < 2)
    End If

    If bCompliant Then objAppendFile.Write(fullPath & " passes")

    I have assumed the existence of the objFSO and the other file object you showed earlier.

    The first script sits idle until it tests for an event and runs forever.  I would expect it to use very little CPU time.  So, I think it best meets your requirements.

    HTH,


    Tom Lavedas


    Put the script in a file named 'filedetection.vbs' and run it like this:

    filedetection.vbs > lastfile.log

    The date on the file will be the time of the last detection.

    You can also use my amended script (just posted) like this:

    filedetection.vbs >> lastfile.log

    This will create a running log of all files created, and the timestamp on the file will reflect the create time of the LAST file.

     

     


    jv
    Monday, August 15, 2011 2:47 PM
  • Well it worked until I decided not to look at every file and decide which was the newest.  :) 

    I'll try it again with the fixes in the current code and with Pegasus' proposal.

     

    Thanks a lot guys.

    Monday, August 15, 2011 2:47 PM
  • OK, the time could be off by a minute.  Using the CreationDate would make it more accurate.

    Using an ECHO to a redirection might cause a 'file locked' conflict, though I haven't tested it.  Writing directly to the file should make that a remote possibility.


    Tom Lavedas
    Monday, August 15, 2011 3:16 PM
    Moderator
  • This simple code will do what you asked for:

     

    nonew = 0
    
    'Change as needed
    strFolder = "C:\scripts"
    
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set fld = fso.GetFolder(strFolder)
    
    For Each fil In fld.Files
     If DateDiff("h", fil.datecreated, Now) < 24 Then
      MsgBox ("A new file was created today or yesterday: " & fil.Name)
      wscript.Quit
     Else
      nonew = 1
     End If
    Next
    
    If (nonew = 1) Then
     MsgBox ("No new files in the last 24 hours")
    End If
    If (nonew = 0) Then
     MsgBox ("No files were found in that folder")
    End If
    

     


    Monday, August 15, 2011 4:12 PM
  • I did some tests on a folder containing 10,000 (very old) files, the last of which I modified.  Before and after the modification, a loop over all 10,000 files took about 50 secs.  Then I explored Pegasus' proposal (always a good idea) and came up with this POC code:

     

    ' Assumptions carried over from the poster's environment:
    '   goFS - a global Scripting.FileSystemObject
    '   qq() - a helper that wraps its argument in double quotes
    Const cnWshRunning = 0   ' WshScriptExec.Status while the child process is still running

    Function getInterestingFileName()
      getInterestingFileName = "not found"
      Dim sDir    : sDir    = goFS.GetAbsolutePathName( "..\testdata\copymissing\gifs" )
      Dim dtLimit : dtLimit = Now - 2
      Dim sCmd    : sCmd    = "%comspec% /c dir /o:-d /t:a " & qq( sDir )
      Dim oWSH    : Set oWSH  = CreateObject( "WScript.Shell" )
      Dim oExec   : Set oExec = oWSH.Exec( sCmd )
      Dim sLine
      Do
        Select Case True
          Case oExec.Status <> cnWshRunning
            Exit Do
          Case oExec.Stdout.AtEndOfStream
            Exit Do
          Case Else
            sLine = oExec.Stdout.ReadLine()
            If "." = Mid( sLine, 3, 1 ) Then ' first line that looks like "dd.mm.yyyy ..." is a file entry
              WScript.Echo sLine
              oExec.Terminate
              Exit Do
            End If
        End Select
      Loop
      If IsDate( Left( sLine, 10 ) ) Then
        If CDate( Left( sLine, 10 ) ) > dtLimit Then
          getInterestingFileName = sLine
        End If
      End If
    End Function
    

    This took about 3 secs for succeeding or failing searches.  So the idea of using .Exec to get the first line of a /o:-d sorted folder listing, avoiding the output of the uninteresting rest, seems promising.  I'm not happy about the missing error handling, and the date check needs testing (my XP has locale problems).

    The date check should be:

    ' 21.09.2010 21:34         1 9999.gif
     Const cnCutOff = 37 ' for this version of os/dir/locale
     WScript.Echo goFS.BuildPath( sDir, Mid( sLine, cnCutOff ) )
     If goFS.GetFile( goFS.BuildPath( sDir, Mid( sLine, cnCutOff ) ) ).DateLastModified > dtLimit Then
       getInterestingFileName = sLine
     End If
    

    to avoid problems caused by locales and the reluctance of the .Exec & dir combination to honor /t:a.

     

     


    Monday, August 15, 2011 4:14 PM