Installing SQL Server with tables containing millions of rows

  • Question


    Hello,

    We are currently implementing a SQL Server database where a couple of tables will have millions of rows (about 98 million, and growing), plus an ASP.NET site that will retrieve and sort the data.

     

    What is the best practice for installing the database in a situation like this one?

    Do we need a clustered server? Does indexing need to be done in a special way?

    Thanks in advance.

    Thursday, October 11, 2007 6:17 PM

Answers

  • Clustered servers are about AVAILABILITY, not about quantity.

     

    I really, really hope you don't mean that 98 million rows will be transferred to a web server and then sorted on the web server.

     

    Ninety-eight million rows is NOT a very large table. You should not have any problems. You may need to increase your storage capabilities (SAN, RAID 10, etc.).

     

    Indexing needs to be done in a way that facilitates the queries being served. You may wish to review the Database Engine Tuning Advisor; check Books Online.
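    As a rough sketch of "indexing for the queries being served" (the table and column names here are hypothetical, just for illustration):

    ```sql
    -- Hypothetical table: dbo.Orders(OrderID, CustomerID, OrderDate, Total)
    -- A nonclustered index matching a frequent filter + sort pattern:
    CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_OrderDate
        ON dbo.Orders (CustomerID, OrderDate DESC)
        INCLUDE (Total);  -- covering column so the query never touches the base table

    -- This query can then be served by an index seek instead of a full scan:
    SELECT OrderID, OrderDate, Total
    FROM dbo.Orders
    WHERE CustomerID = 42
    ORDER BY OrderDate DESC;
    ```

    The point is to design indexes around the actual WHERE and ORDER BY clauses your site issues, which is exactly what the tuning advisor suggests from a captured workload.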

     

     

     

    Thursday, October 11, 2007 9:14 PM
    Moderator
  • I have several clients with much larger databases; that is not a problem.

     

    One thing to keep in mind: the larger the amount of data transferred to the web server, the longer it takes to render the web page, often by orders of magnitude.

     

    Often, page response times can be greatly improved by carefully requesting the smallest amount of data required to render the page. Data silos, client-side caches, and XML streams will all increase the page render time.
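    For example, on SQL Server 2005 a common way to request only one page of rows at a time is ROW_NUMBER()-based paging (again, the table and column names are hypothetical):

    ```sql
    -- Return rows 51-100 of the sorted result set instead of shipping
    -- the whole result to the web server (hypothetical dbo.Orders table).
    WITH NumberedOrders AS (
        SELECT OrderID, OrderDate, Total,
               ROW_NUMBER() OVER (ORDER BY OrderDate DESC) AS RowNum
        FROM dbo.Orders
        WHERE CustomerID = 42
    )
    SELECT OrderID, OrderDate, Total
    FROM NumberedOrders
    WHERE RowNum BETWEEN 51 AND 100;
    ```

    The web page then renders 50 rows per request rather than pulling tens of thousands of records across the wire.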

     

     

     

    Thursday, October 11, 2007 9:50 PM
    Moderator

All replies

  • lol.. no, we will not have 98 million rows displayed on the web server. But we may certainly return about 50K records to the web pages. So you are saying a regular installation will be fine with good storage (SAN, RAID 10, etc.)?
    Thursday, October 11, 2007 9:39 PM