DFS Replication - some files not copied.


  • Hi all.

    I have a 'source' file server holding over 294GB across 620,000 files, and I have set up DFS Replication to transfer them to a new 'destination' server. Replication is configured in both directions; the original 'source' folder is on an old Server 2003 R2 machine, and the empty 'target' is on a Server 2008 R2 machine.

    After a full week or so it has copied the vast majority of files to the S2008 machine, but for some reason around 300 media and Office files have failed to copy, mostly JPGs. I'm using WinMerge to run a directory comparison between the source and destination machines, ignoring the contents of the DfsrPrivate folders.

    I've removed the file filter, so it should be copying absolutely everything. I've increased the staging area to 200GB to accommodate some very large files. I've checked permissions and they look normal - there are no restrictive file or folder permissions that might prevent the DFS Replication service from seeing them.

    When I run 'Create Diagnostic Report' in the DFS Management snap-in, all I get is a warning about available disk space on the D: drive - but I've got over 600GB free on the destination machine and 60GB free on the source. I also see some standard sharing violations for locked files, but none of those are the files that failed to copy. It's as if DFS-R has simply failed to identify them at all!

    Can anybody please point me towards some resources that might help me further identify what's going wrong? 

    Many thanks for your help.

    Thursday, April 11, 2013 8:44 AM

All replies

    Despite your staging area being quite large, I'd check that it is actually large enough - it needs to be at least as big as the combined size of the 32 largest files in the replicated folder.  This useful blog post explains how to determine that size using PowerShell:

    You could also use the dfsrdiag backlog command to see which files have not yet replicated, and check that no replication schedule is restricting replication to certain hours.
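    As a rough sketch, the calculation is along these lines in PowerShell (replace the path with your actual replicated folder; assumes PowerShell v2 or later for Select-Object -First):

    ```powershell
    # Sum the sizes of the 32 largest files in the replicated folder
    $top32 = Get-ChildItem D:\ReplicatedFolder -Recurse |
        Where-Object { -not $_.PSIsContainer } |
        Sort-Object Length -Descending |
        Select-Object -First 32
    "{0:N2} GB" -f (($top32 | Measure-Object -Property Length -Sum).Sum / 1GB)
    ```

    If the figure that comes back is bigger than your staging quota, that's your culprit.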

    Thursday, April 11, 2013 9:38 AM
  • Thanks Frank.

    I've tried running that script, but it keeps complaining that the path and file name are too long (over 260 characters), which is probably to be expected given the depth of the folder structure -

    Get-ChildItem : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.

    At first I thought it was because I was running it on a 32-bit system with PowerShell v1, but I've since tried it in x64 PowerShell on Server 2008 R2 and it comes up with the same error.

    It's still running through, though - it may take some time; it's been over 10 minutes so far!
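    One workaround I may try is shortening the effective path by mapping a drive letter part-way down the tree with subst and re-running the script against that - untested on my end, and the path below is just a placeholder:

    ```cmd
    rem Map X: to a deep folder to shorten the effective path
    subst X: "D:\Data\Very\Deep\Folder"
    rem ...run the PowerShell script against X:\ instead...
    rem Afterwards, remove the mapping:
    subst X: /D
    ```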

    Thursday, April 11, 2013 2:11 PM
  • Okay so the script completed (although excluding those really deep folders).

    I've just got a result back equating to 23.45GB, which is way lower than the 200GB that I estimated!

    So it's not that, unless those really deep folders contain massive files.

    • Edited by s.d.smith Thursday, April 11, 2013 2:22 PM edit
    Thursday, April 11, 2013 2:20 PM
  • I just tried running dfsrdiag backlog on the relevant replication group, but I got back -

    no backlog - member <a> is in sync with partner <b>.
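    For reference, the command I used was of this form (replication group, folder, and server names are placeholders here):

    ```cmd
    dfsrdiag backlog /rgname:"My Replication Group" /rfname:"My Replicated Folder" /smem:OLDSERVER /rmem:NEWSERVER
    ```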

    Any other areas we can look at?

    Thanks for your efforts.

    Thursday, April 11, 2013 2:30 PM
    Did you check the backlog using dfsrdiag?  If the source server is Windows 2003 then maybe it can't replicate files in folders whose paths are too long.  This article explains the limitations, although I don't believe the same constraints exist in Windows 2008 R2:

    • Edited by Dai Webb Thursday, April 11, 2013 2:55 PM
    Thursday, April 11, 2013 2:52 PM
  • Yes I checked the backlog, it's empty.

    Those limitations are interesting, as the 'source' server holding all the files is S2003 R2, which may go some way towards explaining it.

    It's odd, though, because it has successfully replicated the vast majority, including the really deep folder structure - definitely beyond 260 characters - and most of the files within it. All that's missing are around 200 random files dotted all over the place.

    I read elsewhere that somebody suggested you should robocopy all the data from source to destination first, and THEN enable replication via DFS. As a kind of fix, can I switch off replication, run a robocopy update on the data, and then re-enable replication?
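    Something like this is what I had in mind - paths and server names below are placeholders, and I'd exclude the DfsrPrivate folders:

    ```cmd
    rem /E copies subfolders including empty ones; /COPYALL preserves all
    rem file info (needs backup rights); /XD skips the DFSR private folders
    robocopy \\OLDSERVER\Share \\NEWSERVER\Share /E /COPYALL /XD DfsrPrivate /R:1 /W:1 /LOG:C:\robocopy.log
    ```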

    Many thanks Frank.

    • Edited by s.d.smith Friday, April 12, 2013 11:10 AM clarification
    Friday, April 12, 2013 11:09 AM
    I believe that's possible, but I haven't done it myself, so I can't speak from experience.

    Saturday, April 13, 2013 8:01 AM