Space Occupied by Database in 1 Year is 2835 GB

    Question

  •  We have estimated that the space occupied by the database in 1 year is 2835 GB. It is going to hold data for 24 years.

    Could you please suggest what the physical memory configuration should be when building the new database server, given the estimated table sizes below.

     
    TableName        Expected Monthly Record Count   Average Row Length (bytes)   Records Per Page
    CostCategory     625050                          358.854                      22.8282254064
    JobCost          300000                          132.079                      62.023485944
    DataLoadLog      30                              1677                         4.8849135361
    CountOperations  30                              48.5                         168.9072164948
    LoadIdentity     30                              70                           117.0285714286
    LoadHistory      30                              31                           264.2580645161

    Upon calculation it has been observed that the data will grow by up to about 3 GB per year (a sketch reproducing this figure follows the question).

    Could you please suggest the memory required initially, based on the above calculation.

    • Edited by SubhadipRoy Friday, October 04, 2013 7:11 AM
    Thursday, October 03, 2013 3:46 PM
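
    A minimal, purely illustrative Python sketch of how the roughly 3 GB per year figure can be reproduced from the table above, assuming 8 KB SQL Server data pages and ignoring indexes, fill factor, and other overhead:

    # Reproduce the yearly data-growth estimate from the question's table.
    # Assumes 8 KB SQL Server data pages; indexes and free space are ignored.
    PAGE_BYTES = 8 * 1024

    # (expected monthly record count, records per page) taken from the question
    tables = {
        "CostCategory":    (625050, 22.8282254064),
        "JobCost":         (300000, 62.023485944),
        "DataLoadLog":     (30,     4.8849135361),
        "CountOperations": (30,     168.9072164948),
        "LoadIdentity":    (30,     117.0285714286),
        "LoadHistory":     (30,     264.2580645161),
    }

    yearly_bytes = 0
    for name, (rows_per_month, rows_per_page) in tables.items():
        pages_per_year = rows_per_month * 12 / rows_per_page
        yearly_bytes += pages_per_year * PAGE_BYTES

    print(f"Estimated data growth: {yearly_bytes / 1024**3:.2f} GB per year")
    # Prints roughly 2.95 GB per year, in line with the ~3 GB estimate above.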

Answers

  • Hi Subhadip,

    This doesn't seem to be OLTP; it must be for a DWH (data warehouse). In that case you should look at the amount of activity on the database. For a DWH like this, 64 GB would be a very good amount of memory. You also need to consider other options as well, such as partitioning the database into files by year (a sketch of that follows this reply).

    But if you are running very heavy operations simultaneously, you should definitely go with more memory.


    Thanks Saurabh Sinha

    http://saurabhsinhainblogs.blogspot.in/

    Please click the Mark as answer button and vote as helpful if this reply solves your problem


    Thursday, October 03, 2013 5:09 PM
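
    As a hedged illustration of the "partition by year" suggestion above, the snippet below (Python, generating T-SQL) builds a yearly partition function and scheme. The object names, the date column, and the year range are hypothetical, every partition is mapped to [PRIMARY] for brevity (in practice you would map each year to its own filegroup/file), and table partitioning requires an edition of SQL Server that supports it:

    # Generate T-SQL for a yearly partition function/scheme (illustrative only).
    def yearly_partition_ddl(first_year: int, last_year: int) -> str:
        # RANGE RIGHT boundaries at each January 1st after the first year
        boundaries = ", ".join(f"'{y}-01-01'" for y in range(first_year + 1, last_year + 1))
        return (
            "CREATE PARTITION FUNCTION pf_ByYear (datetime)\n"
            f"AS RANGE RIGHT FOR VALUES ({boundaries});\n"
            "\n"
            "CREATE PARTITION SCHEME ps_ByYear\n"
            "AS PARTITION pf_ByYear ALL TO ([PRIMARY]);  -- or map each year to its own filegroup\n"
            "\n"
            "-- e.g. create the largest table on the scheme, keyed on a (hypothetical) date column:\n"
            "-- CREATE TABLE dbo.CostCategory (...) ON ps_ByYear (EntryDate);"
        )

    # 24 years of data, per the question (2013 through 2036)
    print(yearly_partition_ddl(2013, 2036))
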
  • The physical RAM size has nothing to do with the physical size of the database. RAM is used for caching and other operations. The RAM size is determined by the number of connections, queries, data cache, and so on.

    Thursday, October 03, 2013 5:45 PM
  • Hi,

    To estimate the amount of RAM needed for a database server, you have to take into consideration the amount of data that your database/application touches during normal and peak activity (the buffer cache). You also have to consider the procedure cache, which holds cached plans for procedures and queries.

    I really doubt that any user of your application will run a report that includes data from all 24 years. Users are probably running last-quarter or last-year reports, and if this is an OLTP system those will be the biggest consumers of the buffer cache.

    If your system is currently running, check whether you have memory pressure now (Memory: Available MBytes, Buffer Manager: Buffer cache hit ratio, Buffer Manager: Page reads/sec and Page writes/sec). If you don't have memory pressure, an increase in the data stored in the tables is not likely to create one, meaning you don't have to increase the amount of memory on the server. This is a valid statement only if your queries are well optimized; if you have unnecessary scans, those scans will require more memory as the data grows. (A sketch of such a counter check follows this reply.)


    HTH, Regards, Dean Savović, www.comminus.hr/en/

    Thursday, October 03, 2013 5:55 PM
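
    A minimal sketch of the memory-pressure check described above, assuming the pyodbc package and a reachable instance (the connection string is hypothetical). It reads the buffer-manager counters from sys.dm_os_performance_counters; note that Memory: Available MBytes is an operating-system counter read from Performance Monitor rather than from this DMV, and that Page reads/sec and Page writes/sec are cumulative in the DMV, so they need to be sampled twice to get per-second rates:

    # Check SQL Server buffer-manager counters for signs of memory pressure.
    import pyodbc

    CONN_STR = (  # hypothetical connection details
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;DATABASE=master;Trusted_Connection=yes;"
    )

    # Buffer cache hit ratio must be divided by its "base" counter.
    QUERY = """
    SELECT
        (SELECT cntr_value FROM sys.dm_os_performance_counters
          WHERE counter_name = 'Buffer cache hit ratio'
            AND object_name LIKE '%Buffer Manager%') * 100.0 /
        (SELECT cntr_value FROM sys.dm_os_performance_counters
          WHERE counter_name = 'Buffer cache hit ratio base'
            AND object_name LIKE '%Buffer Manager%') AS buffer_cache_hit_ratio,
        (SELECT cntr_value FROM sys.dm_os_performance_counters
          WHERE counter_name = 'Page reads/sec'
            AND object_name LIKE '%Buffer Manager%') AS page_reads_total,
        (SELECT cntr_value FROM sys.dm_os_performance_counters
          WHERE counter_name = 'Page writes/sec'
            AND object_name LIKE '%Buffer Manager%') AS page_writes_total
    """

    with pyodbc.connect(CONN_STR) as conn:
        row = conn.cursor().execute(QUERY).fetchone()
        print(f"Buffer cache hit ratio: {row.buffer_cache_hit_ratio:.2f} %")
        print(f"Page reads  (cumulative since start): {row.page_reads_total}")
        print(f"Page writes (cumulative since start): {row.page_writes_total}")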
