Search in a huge directory structure for filenames listed in a .csv file

  • Question

  • Hi Scripting Guys!

    I need to verify whether thousands of pictures on a remote disk exist somewhere on my laptop.

    So basically I need to compare two very different directory structures.

    I think this is a little bit complex, since I have to import the .csv into PowerShell and, for every single filename, make a recursive search on my laptop (I have put a rough sketch of this idea below my .csv sample).

    On the external disk I have approx. 100,000 pictures, but the directory structure on this disk is very different from the structure on my computer.

    I have extracted all the filenames from this external disk into a .csv file.

    For every single filename listed in the .csv file, I need to know whether the file exists somewhere on my laptop, and in which directory it is located.

    It's necessary for me to know if there are any files that are missing on my laptop.

    This is the structure of my .csv file:

    IMG_4761.JPG
    IMG_4762.JPG
    IMG_4763.JPG
    IMG_4764.JPG
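
    To illustrate, here is a minimal sketch of the approach I have in mind (the paths are placeholders, and because my .csv has no header row I add a "Name" column while importing):

    # Rough sketch only - adjust the paths to your own folders.
    $wanted = Import-Csv -Path 'C:\Temp\filenames.csv' -Header 'Name'

    # Index every file name found on the laptop for fast lookups.
    $onLaptop = @{}
    Get-ChildItem -Path 'C:\Users\Soeren\Pictures' -Recurse -File |
        ForEach-Object { $onLaptop[$_.Name] = $true }

    # Print every name from the .csv that is missing on the laptop.
    $wanted | Where-Object { -not $onLaptop.ContainsKey($_.Name) } |
        Select-Object -ExpandProperty Name

    The idea is that this would list every filename from the .csv that has no match anywhere under my pictures folder, no matter how differently the two directory structures are organised.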

    Best regards Soeren Jensen

    Saturday, January 4, 2020 9:09 AM

All replies

  • Sorry, but this is not the correct forum for your request. This forum is for technicians who have questions about a script they have written.

    Note that the current version of Windows 10 has a built-in capability to detect duplicate image files. There are also numerous free programs that can do this. No one with computer experience today would try to do this with a script.

    Please carefully review the following links to set your expectations for posting in technical forums.


    ¯\_(ツ)_/¯

    Saturday, January 4, 2020 9:47 AM
  • Hi jrv,

    Thank you for your answer. I understand your response, but I would like to clarify one thing you might have misunderstood: my question is not about searching for duplicate files; my intention is to compare two very large and different directories for missing files.

    I have spent a lot of time testing different solutions using PowerShell, but I've been unable to make any of them work in my situation, where I have two very different directory structures.

    I have already created a solution in Excel, where I compare two columns using VLOOKUP, but that solution is not easy to use. I would like to implement an easier test using PowerShell.

    Please tell me - do you have any suggestions regarding where to ask?

    Best regards Soeren Jensen
    Saturday, January 4, 2020 10:57 AM
  • We still won't write a script for you. If you want a script, then start writing it and ask specific questions about the script when you have issues.

    You can also look for utilities that will compare folder structures and tell you if they are equal.

    Learning to script properly with PowerShell


    ¯\_(ツ)_/¯

    Saturday, January 4, 2020 11:13 AM
  • Hi soerenjensen4

    Just think about this:

    • Gather the list of image files on your laptop into a variable (using Get-ChildItem -Recurse -File -Filter "*.jpg")
    • Gather the list of image files on the remote disk into a variable in the same way.
    • And now, let's do the magic: use the Compare-Object cmdlet ... it's a very fast way (see the rough sketch below this list).
    • After that, it is easy to copy the missing items.
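
    For example, a rough sketch of those steps (the folder paths are just examples - point them at your own laptop folder and the remote disk):

    $localFiles  = Get-ChildItem -Path 'C:\Users\Soeren\Pictures' -Recurse -File -Filter '*.jpg'
    $remoteFiles = Get-ChildItem -Path 'E:\Pictures' -Recurse -File -Filter '*.jpg'

    # "=>" marks names that exist only on the remote disk, i.e. files missing on the laptop.
    Compare-Object -ReferenceObject $localFiles -DifferenceObject $remoteFiles -Property Name |
        Where-Object { $_.SideIndicator -eq '=>' } |
        Select-Object -ExpandProperty Name

    Because the comparison uses only the Name property, the two directory structures do not need to match at all.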

    regards

    Olivier

    P.S.: There is an alternative way. If an image file has been renamed but its content is the same, comparing names will miss it; in that case, use Get-FileHash on all files and compare the resulting hashes. It is also a quick approach.
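
    A rough sketch of that idea (again, the paths are only examples):

    $localHashes  = Get-ChildItem -Path 'C:\Users\Soeren\Pictures' -Recurse -File -Filter '*.jpg' |
        ForEach-Object { Get-FileHash -LiteralPath $_.FullName -Algorithm SHA256 }
    $remoteHashes = Get-ChildItem -Path 'E:\Pictures' -Recurse -File -Filter '*.jpg' |
        ForEach-Object { Get-FileHash -LiteralPath $_.FullName -Algorithm SHA256 }

    # -PassThru keeps the original hash objects, so the Path of each missing file is available.
    # "=>" marks content that exists only on the remote disk, whatever the file is named.
    Compare-Object -ReferenceObject $localHashes -DifferenceObject $remoteHashes -Property Hash -PassThru |
        Where-Object { $_.SideIndicator -eq '=>' } |
        Select-Object -ExpandProperty Path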

    Saturday, January 4, 2020 2:10 PM