Document Library in WSS 3.0 has 60 000 files

  • Question

  • We have a document library that has grown to 60 000 files in WSS 3.0. This is having an effect on the stability and performance of the site and document library. I know that you can create folders to try to reduce this number, but doing this manually for 60 000 files is kind of risky, and the document library grows on a daily basis. I don't want admins to be continuously creating folders to keep up with the growth.

    The dilemma I'm having is archiving versus creating folders. If I break the library down into 30 folders of 2,000 documents each, is this really going to solve the issue in the long run? Or is it better just to archive the older files to a different location and not worry about creating these folders?

    Any advice would be helpful.

     

    Thanks

    • Moved by Mike Walsh FIN Friday, June 10, 2011 2:29 PM This is not a customization question. (From:SharePoint - Design and Customization (pre-SharePoint 2010))
    Friday, June 10, 2011 2:05 PM

Answers

All replies

  • According to http://technet.microsoft.com/en-us/library/cc263028.aspx#section4:

     

    • The performance of views of content degrades when the number of items viewed exceeds 2,000 items. Remedies for this limitation are to organize the content in the library into folders each containing 2,000 or fewer items, or to create views that take advantage of indexed columns to return sets of 2,000 or fewer items.

    • For folders, 2,000 items per folder is the recommended limit.

     

    However, these are not hard limitations, just recommendations for performance. I would suggest that document library views returning fewer than 2,000 items are the best option for your scenario.
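
    As a rough illustration of that indexed-view approach, the sketch below returns a slice of 2,000 items or fewer from a large library by filtering on an indexed column. It is only a sketch against the WSS 3.0 server object model; the site URL, the library name "Documents" and the cutoff date are placeholders, and it assumes the Modified column has been marked as indexed (it is not by default).

    using System;
    using Microsoft.SharePoint;

    // Sketch: return at most 2,000 items from a large library by filtering
    // on an indexed column (Modified here), so SQL can seek rather than scan.
    class IndexedSlice
    {
        static void Main()
        {
            using (SPSite site = new SPSite("http://server/sites/team"))   // placeholder URL
            using (SPWeb web = site.OpenWeb())
            {
                SPList library = web.Lists["Documents"];                   // placeholder library name

                SPQuery query = new SPQuery();
                query.RowLimit = 2000;                          // stay at or below the recommended view size
                query.ViewAttributes = "Scope=\"Recursive\"";   // include items inside folders
                query.Query =
                    "<Where><Geq><FieldRef Name='Modified'/>" +
                    "<Value Type='DateTime'>2011-01-01T00:00:00Z</Value></Geq></Where>";

                SPListItemCollection items = library.GetItems(query);
                Console.WriteLine("Returned {0} items", items.Count);
            }
        }
    }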

     

    Thanks,


    Veera Reddy Kolan
    SharePoint Consultant
    Blog: http://veerareddykolan.blogspot.com/
    Friday, June 10, 2011 4:03 PM
  • With that many files you probably need to do both folders and archiving. I would create workflows that move documents into folders based upon metadata. You can also use filtered or indexed views to help performance. In any case, you should do something sooner rather than later; you can probably get rid of a lot of files by simply archiving the older ones.
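
    A minimal sketch of that kind of metadata-driven move, using the WSS 3.0 server object model, might look like the following. The "Department" column, site URL and library name are assumptions for illustration only; a production version would page through the items with SPQuery batches rather than loading them all, and should be tried against a test library first.

    using System;
    using Microsoft.SharePoint;

    // Sketch: file each document into a subfolder named after a metadata
    // column ("Department" is a placeholder), creating the folder on first use.
    class MoveIntoFolders
    {
        static void Main()
        {
            using (SPSite site = new SPSite("http://server/sites/team"))   // placeholder URL
            using (SPWeb web = site.OpenWeb())
            {
                SPList library = web.Lists["Documents"];                   // placeholder library name
                SPFolder root = library.RootFolder;

                // Snapshot the item IDs first: moving files while enumerating
                // the live collection can skip items.
                SPListItemCollection items = library.Items;
                int[] ids = new int[items.Count];
                for (int i = 0; i < items.Count; i++) ids[i] = items[i].ID;

                foreach (int id in ids)
                {
                    SPListItem item = library.GetItemById(id);
                    if (item.File == null) continue;                        // skip folder items

                    string name = Convert.ToString(item["Department"]);
                    if (string.IsNullOrEmpty(name)) name = "Unfiled";

                    SPFolder target = web.GetFolder(root.ServerRelativeUrl + "/" + name);
                    if (!target.Exists)
                    {
                        SPListItem folderItem = library.Items.Add(
                            root.ServerRelativeUrl, SPFileSystemObjectType.Folder, name);
                        folderItem.Update();
                        target = folderItem.Folder;
                    }

                    if (item.File.ParentFolder.ServerRelativeUrl == target.ServerRelativeUrl)
                        continue;                                           // already in the right folder

                    item.File.MoveTo(target.ServerRelativeUrl + "/" + item.File.Name, true);
                }
            }
        }
    }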
    Hope that helps,
    SharePointNinja
    Friday, June 10, 2011 4:44 PM
  • Manually moving the files to folders seems like a painful task. Do you think this could be scripted, or done using 3rd-party applications, to organize and move the files? It would also be good to create new folders automatically when new files are uploaded.
    Friday, June 10, 2011 5:08 PM
  • Hi,

    The number of files you have is really huge. However, to mitigate the performance issue I suggest you create folders or additional document libraries and then copy the contents there. Rather than carrying out this activity manually, I suggest you write a piece of code and execute it on the server.

    I hope this will help you out.

    Thanks,

    Rahul Rashu

    Friday, June 10, 2011 6:58 PM
  • Would changing the view to limit the files displayed be an alternative solution?
    Friday, June 10, 2011 7:42 PM
  • I have successfully put 1,000,000 documents into a single document library within MOSS 2007, so certainly do not fear such large numbers. I see that you are using WSS 3.0, but it is more or less identical in its handling of files.

    It is important to have a folder structure, not least because, if you are using search, the crawler reads an entire folder into memory and will struggle with 10,000+ items per folder on a 4 GB 32-bit machine.

    It is probably worthwhile creating a folder structure and/or writing an event receiver that moves the file into the folder structure when the user has saved it, so users are not forced to work out which folder to put something in (if required).
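
    A minimal sketch of such a receiver is below. It files each newly added document into a folder named after the upload year; the year-per-folder scheme is only an assumed example, and the receiver still has to be bound to the library (via a feature or a one-off console app).

    using System;
    using Microsoft.SharePoint;

    // Sketch: ItemAdded receiver that moves each new document into a folder
    // named after the current year, so users never pick a folder themselves.
    public class AutoFolderReceiver : SPItemEventReceiver
    {
        public override void ItemAdded(SPItemEventProperties properties)
        {
            SPListItem item = properties.ListItem;
            if (item == null || item.File == null) return;      // ignore folders

            SPList library = item.ParentList;
            SPFolder root = library.RootFolder;
            string folderName = DateTime.Now.Year.ToString();   // assumed naming scheme

            SPFolder target = item.Web.GetFolder(root.ServerRelativeUrl + "/" + folderName);
            if (!target.Exists)
            {
                SPListItem folderItem = library.Items.Add(
                    root.ServerRelativeUrl, SPFileSystemObjectType.Folder, folderName);
                folderItem.Update();
                target = folderItem.Folder;
            }

            DisableEventFiring();                               // avoid re-triggering events
            try
            {
                item.File.MoveTo(target.ServerRelativeUrl + "/" + item.File.Name, true);
            }
            finally
            {
                EnableEventFiring();
            }
        }
    }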

    As regards performance, you will need to ensure that you create indexes on the appropriate columns if you are finding documents via filtering on views or filtered views (or any web part that uses CAML queries, such as Bamboo List Search); otherwise these elements will be very slow. Better still, use search to find documents.
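
    Marking a column as indexed can be done from the list settings page (Indexed columns) or with a few lines of code. A minimal sketch, assuming the views filter on Modified and the library is called "Documents" (both placeholders):

    using Microsoft.SharePoint;

    // Sketch: mark a column as indexed so filtered views and CAML queries
    // against it stay fast on a large library. Note that a list can have
    // only a limited number of indexed columns.
    class IndexColumn
    {
        static void Main()
        {
            using (SPSite site = new SPSite("http://server/sites/team"))   // placeholder URL
            using (SPWeb web = site.OpenWeb())
            {
                SPList library = web.Lists["Documents"];                   // placeholder library name
                SPField field = library.Fields["Modified"];                // placeholder filter column
                if (!field.Indexed)
                {
                    field.Indexed = true;
                    field.Update();
                }
            }
        }
    }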

    Also ensure that you are running a full maintenance plan (or similar) on the back-end SQL Server, since auto-statistics does not work well on complex SQL structures; if the SQL query optimiser's statistics become out of date, the queries that SharePoint runs against the very large table that contains all documents will slow down and severely affect farm performance.

    In short, this shouldn't be a problem, but take some care around:

    1] SQL performance

    2] Folder size (10,000 max)

    3] Document library indexes

     

     


    Monday, June 13, 2011 11:38 AM
  • Hi,

     

    Based on your description, I have found a good blog about working with large files; for more information, please refer to:

    http://www.lcbridge.nl/vision/2008/largefiles.htm

    I hope it can help you a little.

     

    Best Regards
    David Hu
    • Marked as answer by Emir Liu Friday, June 17, 2011 1:36 AM
    Tuesday, June 14, 2011 1:31 AM