MOSS 2007 + ContentDB size ~500GB = Performance Issues. Any suggestions? :)

  • Question

  • Hi!

    Have an old MOSS 2007 "mammoth" on SQL Server 2008 that has performance issues. I have no access today, but I think it will be granted tomorrow. All the information I have right now is that there are a few thousand users, the database is ~460 GB, and there are big performance issues. Not much info, I know :) I want to improve performance, maybe not by 200%, but at least partly.

    The first idea I have: the database size exceeds the recommended 100 GB roughly five times over. The official guidance says:
     
    In most circumstances, to enhance the performance of Office SharePoint Server 2007, we discourage the use of content databases larger than 100 GB. If your design requires a database larger than 100 GB, follow the guidance below:
     - Use a single site collection for the data.
     - Use a differential backup solution, such as SQL Server 2005 or Microsoft System Center Data Protection Manager, rather than the built-in backup and recovery tools.
     - Test the server running SQL Server 2005 and the I/O subsystem before moving to a solution that depends on a 100 GB content database.
    Whenever possible, we strongly advise that you split content from a site collection that is approaching 100 GB into a new site collection in a separate content database to avoid performance or manageability issues. Limit content databases that contain multiple site collections to approximately 100 GB.
    So I'm looking for ways to split it. As I understand it, there is
    STSADM -o mergecontentdbs (https://technet.microsoft.com/en-us/library/cc288557.aspx)
    which can move a site collection to another content database. But as I understand it, there is only one site collection at the root. I'll get access to the server tomorrow and will provide more information. Have you met such a case, or do you have any advice? It's an interesting situation for me; I think I need to start by documenting this farm. I can use SQL Profiler and Performance Monitor to monitor activity and can post screenshots here.
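    Just to sketch how I think mergecontentdbs would be used once there is something to split (the database names below are placeholders, operation 1 only analyzes and moves nothing, and the destination content database has to be attached to the same web application first):

    REM Dump the site collections so a sitelist file can be built and edited by hand
    stsadm -o enumsites -url http://sp > c:\sitelist.xml

    REM Operation 1 = analyze only, nothing is moved yet
    stsadm -o mergecontentdbs -url http://sp -sourcedatabasename WSS_Content_Old -destinationdatabasename WSS_Content_New -operation 1

    REM Operation 3 = move only the site collections listed in the edited sitelist file
    stsadm -o mergecontentdbs -url http://sp -sourcedatabasename WSS_Content_Old -destinationdatabasename WSS_Content_New -operation 3 -filename c:\sitelist.xml

    REM An iisreset on the web servers is recommended after the merge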
    Tuesday, April 5, 2016 7:12 PM

All replies

  • Splitting a site collection is a real pain in the neck.

    The primary issue with large DBs is backup and recovery. However, it's more than likely that you won't get any detectable performance improvement just by moving data around. What performance issues are you trying to fix?

    Tuesday, April 5, 2016 8:33 PM
    As per the description, I understand you have only one site collection, so a content database split may not be possible, because a split only works across different site collections.

    You can achieve this with the following procedure:

    1. Create a new site collection in a different content database; see https://blogs.msdn.microsoft.com/mcsnoiwb/2007/08/20/how-to-create-site-collection-in-a-specific-content-database/
    2. Move a few subsites from your current site collection to the new site collection using the stsadm export and import commands (sketched below); see https://social.technet.microsoft.com/Forums/office/en-US/2efef942-0ad6-4340-9c9a-2e272d36754d/how-can-i-move-a-subsites-to-a-new-site-collection?forum=sharepointadminlegacy
    3. The subsites will then be stored in the new content database.
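    For example (all URLs, accounts, and database names below are just placeholders, so verify in a test environment first):

    REM Step 1: create the new site collection directly in a new content database
    stsadm -o createsiteinnewdb -url http://sp/sites/archive -ownerlogin DOMAIN\spadmin -owneremail admin@contoso.local -sitetemplate STS#0 -title "Archive" -databasename WSS_Content_Archive

    REM Step 2: export a subsite from the old site collection and import it into the new one
    stsadm -o export -url http://sp/blabla/subsite1 -filename c:\export\subsite1.cmp -includeusersecurity -versions 4
    stsadm -o import -url http://sp/sites/archive/subsite1 -filename c:\export\subsite1.cmp -includeusersecurity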

    Note: SharePoint user groups are scoped to a site collection, so users of the old site may not be able to access the new site collection. Perform proper testing before migration.

    karim 


    Please remember to mark your question as answered, if this solves your problem.

    Wednesday, April 6, 2016 8:34 AM
  • Hi guys!

    Thank you very much for the responses! I'll get more information and provide more details here. Yes, I think simply removing data from the Recycle Bins will do nothing on its own; a database shrink would be needed to compact the database, and that is a dangerous operation for the DB.. Now I'm going to get some more specific feedback from users about the issues.


    Independent SharePoint Developer, freelancer http://blog.sharepoint-develop.com/

    Wednesday, April 6, 2016 10:28 AM
  • Shrinking a database is normally the wrong thing to do. You should only do it once, after first checking your database using DBCC, then backing it up, then shrinking, then rechecking with DBCC.

    Before you go near it, you should be able to tell us how much free space there is in the database. If you can't do that, then you shouldn't run the shrink command.
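    Roughly that sequence with sqlcmd, if it helps (server and database names are placeholders; sp_spaceused reports the unallocated space I'm asking about):

    REM How much free space is actually in the database?
    sqlcmd -S SQLSRV01 -E -d WSS_Content_Portal -Q "EXEC sp_spaceused"

    REM 1. Consistency check first
    sqlcmd -S SQLSRV01 -E -Q "DBCC CHECKDB ('WSS_Content_Portal') WITH NO_INFOMSGS"

    REM 2. Full backup before touching anything
    sqlcmd -S SQLSRV01 -E -Q "BACKUP DATABASE [WSS_Content_Portal] TO DISK = 'E:\Backup\WSS_Content_Portal.bak' WITH INIT"

    REM 3. One-off shrink leaving 10 percent free space, then 4. re-check
    sqlcmd -S SQLSRV01 -E -Q "DBCC SHRINKDATABASE ('WSS_Content_Portal', 10)"
    sqlcmd -S SQLSRV01 -E -Q "DBCC CHECKDB ('WSS_Content_Portal') WITH NO_INFOMSGS"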

    Wednesday, April 6, 2016 10:34 AM
    I'm not going to shrink it at all; at this size I think it could take weeks (and farm downtime).

    Or months..

    >>checking your database using DBCC, then backing it up, then shrinking, then rechecking with DBCC.

    ..and then also rebuild the indexes, as I understand.. :-D
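    Something like this, I guess (server, database, and table names are placeholders):

    REM Show the most fragmented indexes in the content database
    sqlcmd -S SQLSRV01 -E -d WSS_Content_Portal -Q "SELECT OBJECT_NAME(object_id) AS table_name, index_id, avg_fragmentation_in_percent FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') WHERE avg_fragmentation_in_percent > 30 ORDER BY avg_fragmentation_in_percent DESC"

    REM Rebuild all indexes on one heavily fragmented table (repeat per table, during off-hours)
    sqlcmd -S SQLSRV01 -E -d WSS_Content_Portal -Q "ALTER INDEX ALL ON dbo.SomeLargeTable REBUILD"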


    Independent SharePoint Developer, freelancer http://blog.sharepoint-develop.com/

    Wednesday, April 6, 2016 10:40 AM
  • Sorry guys, still no details..

    Independent SharePoint Developer, freelancer http://blog.sharepoint-develop.com/

    Wednesday, April 20, 2016 9:52 AM
  • Update: Hi guys!

    I got access to the farm these days. It is really slow :) And IIS often crashes and gives an "Unknown error".

    It is a two-tier farm with the database on one server and WFE + App server on the other. The hardware is the same on both:

    [Task Manager screenshot: 8 GB RAM on each server]

    About 1,800 users are registered, and the content DB is about 470 GB. The databases are well separated according to best practices (they are stored on different LUNs, and the MDF and log files are separated too).

    There are 4 web applications: Central Administration, personal sites, Shared Services administration, and the root web application, e.g. http://sp. This root web application contains about 10 site collections (http://sp/blabla/another_site_collection1,2,3..), and most of them have a few webs and workspaces (not more than 150 in total).

    Search is configured in the Shared Services Provider, and there are about 1,270,000 items crawled.

    I used the command

    %stsadm.exe% -o enumsites -url "http://sp" >> c:\site_collections_size.txt

    and found that the root site collection and one other each take more than 100 GB:

    <Sites Count="43">
      <Site Url="http://sp" .. StorageUsedMB="112567,5" StorageWarningMB="0" StorageMaxMB="0" />
      ..
      <Site Url="http://sp/blabla/blabla" .. StorageUsedMB="124295,8" StorageWarningMB="180500" StorageMaxMB="200500" />
      ..

      - I think 8 GB for the WFE could be OK, but 8 GB for the database server is extremely small (a quick sqlcmd check of the SQL memory cap is sketched after these points).

      - And the second problem, I think, is Search: the incremental crawl runs for 49 hours oO. There is a full crawl configured every week, plus an incremental crawl "for 24 hours, every 8 hours" oO (sorry, the system is not in English).

    So I think Search is configured wrong and it bogs the system down. I think I need to turn off the weekly full crawl (it is only really needed the first time, as far as I know), and I'll move the incremental crawl to night time. What do you think?
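    For the memory point above, this is how I plan to check (and, if needed, cap) what SQL Server is allowed to take; the server name and the 6 GB value are just placeholders for this box:

    REM Show the current max server memory setting
    sqlcmd -S SQLSRV01 -E -Q "EXEC sp_configure 'show advanced options', 1; RECONFIGURE; EXEC sp_configure 'max server memory (MB)'"

    REM Example only: cap SQL at ~6 GB so the OS keeps some headroom on an 8 GB box
    sqlcmd -S SQLSRV01 -E -Q "EXEC sp_configure 'max server memory (MB)', 6144; RECONFIGURE"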


    Independent SharePoint Developer, freelancer http://blog.sharepoint-develop.com/


    Tuesday, April 26, 2016 9:08 PM
    As per Microsoft, in a SharePoint environment the hardware of the SQL Server should always be roughly double that of the WFE server.

    You can configure Perfmon to monitor the behavior of all the servers very closely, and depending on the results you can increase the hardware if required, or you can add a new WFE, App, or DB server to the farm.
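    For example, a simple counter log can be created from the command line on each server (the counter set name and output path are just examples):

    REM Create and start a counter log sampling every 15 seconds
    logman create counter SPBaseline -c "\Processor(_Total)\% Processor Time" "\Memory\Available MBytes" "\PhysicalDisk(_Total)\Avg. Disk sec/Read" "\PhysicalDisk(_Total)\Avg. Disk sec/Write" -si 00:00:15 -o C:\PerfLogs\SPBaseline
    logman start SPBaseline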


    Karim... Please remember to mark your question as answered, if this solves your problem.


    Wednesday, April 27, 2016 3:44 AM
  • Hi Rockie

    If it's possible, you should first be looking at increasing hardware resources on both the WFE and SQL. With the load from a few thousand users and WFE + App on one server, 8 GB is going to be a real bottleneck. You should add at least 8 GB more RAM (RAM is very cheap these days). The same goes for the SQL Server: you can easily add 16 GB of RAM and it will be consumed. What about disk I/O on SQL? Is it a SAN or local drives?

    It should definitely reduce crawl time further as well. A full crawl on weekends is good, and the incremental crawl can be set to run at night; I can't argue against that for now, as it reduces the load on the servers during the daytime.


    Thanks
    Mohit


    Wednesday, April 27, 2016 4:22 AM
  • Hi Rockie

    Do you have any updates to share on this issue?


    Thanks
    Mohit

    Monday, May 16, 2016 4:27 AM
  • Rockie,

    The Recycle Bin plays a part here too. I had the same scenario a few years back and realized the Recycle Bin contained a huge amount of content; flushing it out released considerable space for me. So check that as well.

    Monday, May 16, 2016 7:49 AM
    Thank you for the responses!

    I made some small improvements, like increasing the SQL autogrowth from 1 MB to 50 MB (sketched below), adding an IIS warm-up, and scheduling IIS recycling at night.
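    The autogrowth change was along these lines (the server and logical file names here are placeholders; the real logical names come from sys.master_files):

    REM Bump autogrowth from 1 MB to 50 MB for the data and log files
    sqlcmd -S SQLSRV01 -E -Q "ALTER DATABASE [WSS_Content_Portal] MODIFY FILE (NAME = 'WSS_Content_Portal', FILEGROWTH = 50MB)"
    sqlcmd -S SQLSRV01 -E -Q "ALTER DATABASE [WSS_Content_Portal] MODIFY FILE (NAME = 'WSS_Content_Portal_log', FILEGROWTH = 50MB)"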

    I proposed increasing the RAM, but it is difficult to do right away (it is an old non-virtual machine and no slots are available).

    I also tried to repair the search index (started a full crawl), but with no result (the full crawl took 72 hours to finish, and the incremental crawl runs for 24 hours). I think there is an error in the search database. If I stop the crawls altogether, SharePoint works better, but RAM usage on the database server is still about 99%.

    >>What about disk I/O on SQL? Is it SAN or local drives?

    The database is OK, I think. The files are on local drives, but the databases are separated according to best practices (the SharePoint database files are separated from TempDB, and the *.MDF files are separated from the *.LDF files too).
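    The layout can be double-checked with something like this (server name is a placeholder):

    REM List every database file and where it physically lives
    sqlcmd -S SQLSRV01 -E -Q "SELECT DB_NAME(database_id) AS db, type_desc, physical_name FROM sys.master_files ORDER BY db"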

    IMHO, RAM and the corrupted search index are the problem :(



    Independent SharePoint Developer, freelancer http://blog.sharepoint-develop.com/


    Monday, May 16, 2016 12:31 PM
  • Hi Rockie

    There is no need to recycle IIS at night; all application pools are auto-recycled during the night by default anyway. You may try resetting the search index (on a weekend) to see if it helps.


    Thanks
    Mohit

    Monday, May 16, 2016 12:44 PM