FileSystemObject GetFolder/Files method on a URL?

  • Question

  • I have done some searching but haven't found what I'm looking for, so I figured I was either asking the question wrong, or the answer didn't exist.
    Hopefully someone has a definitive answer here.

    Basic scenario:

VBScript that will look for specific files in a directory and download/copy a specific file based on some logic.

For a local directory it is very easy: GetFolder works on c:\TestDir with no problem.

    I wanted to know if I can use the same type of method on a URL which points to the same directory and return the list of files so I could “loop” through them.

    I want to do basically the following code: (modified from scripting guys example to make it easier)

    Set objFSO = CreateObject("Scripting.FileSystemObject")

    objStartFolder = "http://somewebsite/somefolder"

    Set objFolder = objFSO.GetFolder(objStartFolder)

    Set colFiles = objFolder.Files

    For Each objFile in colFiles

        Wscript.Echo objFile.Name

    Next

This currently gives "path not found".
The http://somewebsite/somefolder would obviously have directory listing enabled, and the listing does display correctly in IE.

     

    Is this possible? Or some other code that accomplishes the same thing.


    Thanks!

    Monday, August 3, 2009 3:26 PM

Answers

• I don't believe there is any way to do what you want unless you have listing access to the target folder.  That is, if you point a browser at that address, does it return a listing of all of the files and folders contained under that location?  If it instead gives a 404 error or opens an index page, then you can't get a file listing, AFAIK.  That's a characteristic of HTTP: it is designed that way to give a site's builder control over how the content is presented.

If, however, it does give a listing, then that listing can be fetched with this function ...

Function GetXml(sURL)
 ' Create an XMLHTTP object and fetch the page synchronously:
 With CreateObject("Microsoft.XMLHTTP")
   .open "GET", sURL, False
   .send
   GetXml = .responseText
 End With
End Function

The response text would then need to be parsed to extract the file names.  That parsing is likely to be site-dependent, because different servers construct the file listing differently.
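As one hedged sketch of that parsing step: for an IIS-style listing, the page wraps each entry in an `<a href="...">` tag, so a regular expression can pull out the targets.  The pattern below is an assumption about the markup and would need adjusting for the actual HTML your server emits (it uses the GetXml function above, and the URL is a placeholder).

```vbscript
' Sketch: extract href targets from a directory-listing page.
' Assumes each entry appears as <a href="...">; adjust the
' pattern to match the real markup of the target server.
Dim sHtml, oRE, oMatch
sHtml = GetXml("http://somewebsite/somefolder")

Set oRE = New RegExp
oRE.Global = True
oRE.IgnoreCase = True
oRE.Pattern = "<a href=""([^""]+)"">"

For Each oMatch In oRE.Execute(sHtml)
  ' SubMatches(0) is the captured href value (the file name/path)
  WScript.Echo oMatch.SubMatches(0)
Next
```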

    The contents of the individual files could then be accessed in a similar manner, depending on exactly what form you want the result to be in - HTML/XML/Text/Binary.

The options are .responseText for HTML/text, .responseXML for XML, and .responseBody for binary results.  Writing binaries to disk requires special handling, something like this ...

    Sub DownBinFile(FilePath, sURL)
      const adTypeBinary = 1
      const adModeReadWrite = 3
      const adSaveCreateOverwrite = 2
     ' Create an xmlhttp object:
      set oXML = CreateObject("MSXML2.XMLHTTP")
      oXML.open "GET", sURL, False
      oXML.send
      With CreateObject("ADODB.Stream")
        .type = adTypeBinary
        .mode = adModeReadWrite
        .open
        .write oXML.responseBody
        .savetofile FilePath, adSaveCreateOverwrite
      End With
    End Sub
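A call might look like this, with placeholder paths (both the local path and the URL are hypothetical examples, not values from your environment):

```vbscript
' Hypothetical usage: download one listed file to a local path.
DownBinFile "C:\TestDir\somefile.zip", _
            "http://somewebsite/somefolder/somefile.zip"
```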


    Tom Lavedas
    • Marked as answer by KevinVernon Monday, August 3, 2009 6:29 PM
    Monday, August 3, 2009 6:18 PM

All replies

  • Hi Niveknonrev01,

I don't have an answer for you, but as you noticed, you can't use the FileSystemObject to do this, as it only supports local and UNC paths.

    Bill
    Monday, August 3, 2009 4:39 PM
  • Thanks for the answer!

So is there another VBScript function that can accomplish the same thing: given a URL as input that points to a directory, list the files?

    I want to read the list of files contained in the given URL and choose one to download.
    Because the file names can change I can't just hardcode the file name.

    Any other methods to accomplish this?
    Monday, August 3, 2009 4:49 PM
• If you are open to PowerShell, you can access URLs.

    Here is an example script

    http://bsonposh.com/archives/680
    Brandon Shell [MVP]
    Monday, August 3, 2009 4:50 PM
• Thanks, that works, BUT only if PowerShell is installed.
  I ran with that first, and then it failed on several systems because PowerShell was not installed.
  Without the ability to roll out PowerShell to all servers anytime soon (not my call), I wanted something more universal, and thought VBScript would be the way to go.

    Or if there is any vbscript that can at least dump the results of that URL into a local text file, then I could loop through the text file.
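Something like this minimal sketch is what I have in mind (the URL and local path are placeholders):

```vbscript
' Sketch: fetch the listing page and dump the raw HTML to a
' local text file, so a second pass can loop through it.
Dim sHtml
With CreateObject("MSXML2.XMLHTTP")
  .open "GET", "http://somewebsite/somefolder", False
  .send
  sHtml = .responseText
End With

' Overwrite (True) any existing file at the placeholder path.
With CreateObject("Scripting.FileSystemObject") _
       .CreateTextFile("C:\TestDir\listing.txt", True)
  .Write sHtml
  .Close
End With
```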

    Make sense?
    Monday, August 3, 2009 5:29 PM
• You could use PowerShell on a central server to download the files to a UNC path. Then you could use VBScript on the servers.
    Brandon Shell [MVP]
    • Proposed as answer by Bill_Stewart Monday, August 3, 2009 6:14 PM
    Monday, August 3, 2009 6:02 PM
  • Excellent!
    I don't know why but I was thinking I couldn't get that stream of text because it wasn't an actual HTML file.  But sometimes the obvious escapes me!  The response is HTML and it does list the file names, so a few loops and arrays later I will have my file listing!!!

    It didn't even occur to me to process the response stream, so thank you for the push in the right direction!

    Much appreciated.
    Thanks
    Kevin
    Monday, August 3, 2009 6:32 PM