SharePoint 2007 - Site Collection Size

  • Question

  • Hi, 

    I have a SharePoint 2007 farm with one site collection. The site collection's content database is 205 GB, but when I run the command "stsadm -o enumsites -url http://intranet" I get a site collection size of 58 GB. I also ran a tool (SharePoint Space Monitor v3.0) and got the same result. The details are listed below:

    

    Question 1: Why is there a difference between the site collection size (58 GB) and the content database size (205 GB)?

    Question 2 (important): Does the Microsoft recommendation (100 GB per site collection) refer to the size of the site collection (in which case mine is fine at 58 GB), or to the size of the content database (205 GB, in which case my site collection needs to be divided into several site collections)?
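    For anyone comparing these numbers across many sites, the XML that enumsites prints can be summed with a short script. This is only a sketch: it assumes each `<Site>` element carries a `StorageUsedMB` attribute (attribute names can vary between builds, so check your own output first).

```python
# Sketch: sum per-site storage from "stsadm -o enumsites" XML output.
# Assumption: each <Site> element carries a StorageUsedMB attribute
# (verify against your own stsadm output before relying on this).
import xml.etree.ElementTree as ET

def total_site_storage_gb(enumsites_xml):
    """Return the total StorageUsedMB across all site collections, in GB."""
    root = ET.fromstring(enumsites_xml)
    total_mb = sum(float(site.get("StorageUsedMB", 0))
                   for site in root.iter("Site"))
    return total_mb / 1024.0

# Example with made-up numbers (one 58 GB site collection):
sample = '<Sites Count="1"><Site Url="http://intranet" StorageUsedMB="59392" /></Sites>'
print(total_site_storage_gb(sample))  # 58.0
```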

    Thanks

    Everton

    Friday, April 5, 2013 1:41 AM


All replies

  • Hi Everton,

    We strongly recommend limiting the size of content databases to 200 GB, except when the circumstances described below apply. If you are using Remote BLOB Storage (RBS), the total volume of remote BLOB storage and metadata in the content database must not exceed this limit.

    Content databases of up to 4 TB are supported when the following requirements are met:

    • Disk sub-system performance of 0.25 IOPS per GB; 2 IOPS per GB is recommended for optimal performance.

    • You must have developed plans for high availability, disaster recovery, future capacity, and performance testing.

    As a rule, we pre-size the databases to 100 GB and try not to go beyond the software boundaries listed above. However, I have clients with 500 GB databases, and we will be reducing their size during the next migration cycle.
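    As a quick sanity check, the sizing rules above reduce to simple arithmetic. The figures below are illustrative only, using the 205 GB database from the question as the planned largest content DB:

```python
# Back-of-the-envelope sizing from the rules above (illustrative only).

def min_iops(db_size_gb, iops_per_gb=0.25):
    """Minimum disk IOPS for a content DB of the given size."""
    return db_size_gb * iops_per_gb

def tempdb_presize_gb(largest_content_db_gb):
    """Pre-size tempdb to 25% of the planned largest content DB."""
    return largest_content_db_gb * 0.25

print(min_iops(205))           # 51.25 (minimum); 410 at the 2 IOPS/GB optimum
print(tempdb_presize_gb(205))  # 51.25 GB of tempdb
```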

    Questions?

    1. Do you have more than one site collection in your content database?
    2. What is the size of your backup for the content database?

    Reference: Software Boundaries and Limits for SharePoint http://technet.microsoft.com/en-us/library/cc262787.aspx

     

    -Ivan


    Ivan Sanders My LinkedIn , My Blog, @iasanders, BI in SP2013, SP2013 Content Packs.

    Friday, April 5, 2013 3:54 AM
  • Hi Ivan,

    Does SharePoint 2007 also support content databases of 4 TB, or just SharePoint 2010?

    Questions?

    1 - No, the content database stores data from only one site collection.

    2 - Where do I get this information?  

    Thanks!

    Everton


    Friday, April 5, 2013 4:20 PM
  • Hi Everton,

    Great question. Yes, but it's all about IOPS, HA, and DR, along with your ability to maintain and restore the DBs. The issue is that "just because you can doesn't mean you should," as Spence would say. It's generally not a good idea to have a content DB that large: the problem is record locking and the amount of wait time that a large number of records in a table can introduce.

    So the answer is: if you have the IOPS, hardware, and operations, then you could have a content DB that large. You should always aim to stay within known best practices: pre-size and pre-place your DBs, create NDF files on the DBs that need them, move tempdb to a separate drive and pre-size it to 25% of your planned largest content DB, test your backups before you need them, and monitor the server and applications.

    Reference: Software Boundaries and Limits for SharePoint http://technet.microsoft.com/en-us/library/cc262787.aspx

    -Ivan


    Ivan Sanders My LinkedIn , My Blog, @iasanders, BI in SP2013, SP2013 Content Packs.

    Friday, April 5, 2013 7:32 PM
  • Hi Ivan, thanks!

    I think I discovered the difference between the site collection size and the content database size. Look at this, please:

    The table dbo.AuditData is 128,502,784 KB (~124 GB). So now I need to delete the contents of that table. Have you ever used the following command?

    stsadm -o trimauditlog -databasename wss_content_sharepoint -url http://intranet -date yyyy/mm/dd

    What are the dangers if I delete all records from this table?
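    If you want to script the trim, the cutoff date (entries older than it are removed) can be computed ahead of time. This is only a sketch: it keeps the yyyy/mm/dd form used above and the parameters shown in the question, so verify the exact date format and switches that your stsadm build expects before running anything.

```python
# Sketch: build a trimauditlog command that keeps the last N days of audit
# entries. The yyyy/mm/dd date format and the parameter list mirror the
# command in the question -- check "stsadm -o trimauditlog" help on your farm.
from datetime import date, timedelta

def trim_command(database, url, keep_days=90, today=None):
    """Return a trimauditlog command whose cutoff keeps the last keep_days days."""
    today = today or date.today()
    cutoff = today - timedelta(days=keep_days)
    return ('stsadm -o trimauditlog -databasename {db} '
            '-url {url} -date {d:%Y/%m/%d}').format(db=database, url=url, d=cutoff)

print(trim_command("wss_content_sharepoint", "http://intranet",
                   keep_days=90, today=date(2013, 4, 7)))
# stsadm -o trimauditlog -databasename wss_content_sharepoint -url http://intranet -date 2013/01/07
```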

    Thanks

    Everton


    Sunday, April 7, 2013 12:12 AM
  • Hi Everton,

    Whenever we advise clients on where to use auditing, the first thing we say is to audit only where necessary. There are no adverse effects from using the stsadm -o trimauditlog command. Do not edit the table directly, as you know that would be unsupported.

    -Ivan


    Ivan Sanders My LinkedIn , My Blog, @iasanders, BI in SP2013, SP2013 Content Packs.

    Thursday, April 11, 2013 6:06 AM
  • Hi Ivan

    Thanks!!!

    Everton

    Thursday, April 11, 2013 12:58 PM