How to back up the real size of the content database files, not the initial size?

  • Question

  • Hi,

    I am trying to back up a content database that has its MDF and NDF files on two separate drives on the same server. When I check free space with a query, it shows that the initial size has been set to around 775 GB, with about 465 GB free. I expected to back up roughly 310 GB, but when I run the backup, the total size shows 775 GB. How can I back up only the real size, i.e. the 310 GB? What do I need to do in this case?
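
    For reference, a query along these lines reads the allocated versus used space per data file (a sketch using Invoke-Sqlcmd from the SqlServer PowerShell module; the instance and database names are placeholders):

    # Sketch: allocated vs. used space per data file, in MB (names are placeholders)
    $query = "SELECT name, size/128 AS AllocatedMB, " +
             "CAST(FILEPROPERTY(name, 'SpaceUsed') AS int)/128 AS UsedMB " +
             "FROM sys.database_files WHERE type_desc = 'ROWS'"
    Invoke-Sqlcmd -ServerInstance "SQLSERVER01" -Database "WSS_Content" -Query $query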

    Thanks in advance!

    Wednesday, February 21, 2018 5:28 PM

Answers

  • You can either migrate the site to a new database (Move-SPSite) or use backup compression on the database so that the resulting backup file eliminates the white space.

    You can also shrink the database, but this isn't recommended as it has performance impacts.
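
    As a rough sketch of both options (the site URL, instance, database, and path below are placeholders; Backup-SqlDatabase comes from the SqlServer PowerShell module, which may need to be installed separately):

    # Option 1: move the site collection into a fresh content database
    # (the destination database must already be attached to the same web application)
    Move-SPSite "http://site" -DestinationDatabase "WSS_Content_New"

    # Option 2: take a compressed SQL backup of the existing database
    Backup-SqlDatabase -ServerInstance "SQLSERVER01" -Database "WSS_Content" `
        -BackupFile "D:\Backups\WSS_Content.bak" -CompressionOption On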


    Trevor Seward

    Office Servers and Services MVP



    Author, Deploying SharePoint 2016

    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.

    • Proposed as answer by Dean_Wang Thursday, February 22, 2018 7:15 AM
    • Marked as answer by rakib1 Monday, February 26, 2018 3:25 PM
    Wednesday, February 21, 2018 6:19 PM
  • Hi rakib1,

    Sometimes content databases contain millions of audit log records, which can add hundreds of extra gigabytes to the content database size. I came across an issue like this before: we had a 300 GB content database and were able to reduce it to only 80 GB, because we found that audit logging had been activated on the portal. We resolved the issue by backing up only the actual content of the site collections, without any of the extra, useless data. We did the following:
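
    (As an aside, if audit data turns out to be the culprit, old entries can be trimmed through the object model; a sketch using SPAudit.DeleteEntries, with a placeholder URL and cutoff date:)

    # Sketch: delete audit entries recorded before a cutoff date for one site collection
    $site = Get-SPSite "http://site"
    $site.Audit.DeleteEntries([datetime]"2018-01-01")   # removes entries before this date
    $site.Dispose()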

    1- Run the following commands in your environment to get the site collection sizes:

    # Load the SharePoint snap-in if it is not already loaded
    Add-PSSnapin Microsoft.SharePoint.Powershell -ErrorAction SilentlyContinue

    # List every site collection with its size and quota levels, largest first
    # (-Limit All avoids the default cap on the number of sites returned)
    Get-SPSite -Limit All | Select-Object Url,
        @{n="SizeInMB";e={($_.Usage).Storage/1MB}},
        @{n="MaxSizeInMB";e={($_.Quota).StorageMaximumLevel/1MB}},
        @{n="WarningSizeInMB";e={($_.Quota).StorageWarningLevel/1MB}} |
        Sort-Object -Descending -Property "SizeInMB" | Format-Table


    2- Back up your site collection, then restore it where needed:

    # Back up a single site collection to a file
    Backup-SPSite -Identity "http://site/test" -Path "C:\backup\test.bak"

    # Restore it at the target URL from the backup file
    Restore-SPSite -Identity "http://site/test" -Path "C:\backup\test.bak" -Force
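
    To cover every site collection in the database, a loop along these lines could be used (a sketch; the content database name and backup folder are placeholders):

    # Sketch: back up every site collection in one content database
    Get-SPSite -ContentDatabase "WSS_Content" -Limit All | ForEach-Object {
        $file = ($_.Url -replace "[:/]+", "_") + ".bak"   # derive a safe file name from the URL
        Backup-SPSite -Identity $_.Url -Path (Join-Path "C:\backup" $file)
    }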


    This way, you back up only the site collection data, without any of the extra content.

    I hope this helps.


    Best Regards, Ahmed Madany MCTS @twitter http://twitter.com/ahmed_madany @Blog http://ahmedmadany.wordpress.com @LinkedIn http://eg.linkedin.com/pub/ahmed-madany/35/80/2b6

    • Marked as answer by rakib1 Tuesday, February 27, 2018 7:12 PM
    Wednesday, February 21, 2018 6:59 PM

All replies

  • I think the Backup-SPSite method only works if you stay within the same environment, but when upgrading from 2010 to 2013 you need to do a content database backup. Am I right?

    Thanks

    Friday, February 23, 2018 3:40 PM
  • Hey Trevor,

    Sorry, I forgot to mention that I am upgrading from 2010 to 2013. In that case I need to do a content database backup, since Backup-SPSite will not work. Do you think backup compression will cause a performance issue? Any idea how much the size will shrink from 775 GB? I have already backed up the full 775 GB, but it is taking a long time to attach. Is there any way to apply backup compression while restoring the content database? Sorry for so many questions.
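
    (For reference, compression is chosen at backup time, not restore time; a compressed backup restores with the ordinary restore command. A sketch with placeholder names, assuming the SqlServer PowerShell module:)

    # Sketch: restore a (possibly compressed) content database backup
    Restore-SqlDatabase -ServerInstance "SQLSERVER01" -Database "WSS_Content" `
        -BackupFile "D:\Backups\WSS_Content.bak"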



    • Edited by rakib1 Friday, February 23, 2018 3:52 PM
    Friday, February 23, 2018 3:45 PM