Thursday, December 06, 2012 5:11 PM
Dear All,
Hi! I recently worked on a data warehouse server that has performance issues. When I dug into the database's tables, I found that the frequently used tables have more than 5 billion records.
Now, what steps should I take to optimize fetching records from such tables? All of these tables already have indexes, and the statistics are updated on a daily basis.
Monday, December 10, 2012 8:36 AM
Partition the table based on the queries the users run most frequently. Partition the table on a hash or key range. An index will help only if the result set is very small.
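As a sketch of the key-range option: SQL Server partitions natively on ranges (hash partitioning would have to be simulated, e.g. with a computed column). The table and column names below (FactSales, OrderDate) are assumptions for illustration only, not the poster's schema.

-- 1) Partition function: defines the boundary values (monthly ranges here).
CREATE PARTITION FUNCTION pfMonthly (datetime)
AS RANGE RIGHT FOR VALUES ('2012-01-01', '2012-02-01', '2012-03-01');

-- 2) Partition scheme: maps each partition to a filegroup (all to PRIMARY here).
CREATE PARTITION SCHEME psMonthly
AS PARTITION pfMonthly ALL TO ([PRIMARY]);

-- 3) Create (or rebuild) the table on the partition scheme, with the
--    partitioning column in the clustered index key.
CREATE TABLE dbo.FactSales
(
    SaleID    bigint   NOT NULL,
    OrderDate datetime NOT NULL,
    Amount    money    NOT NULL,
    CONSTRAINT PK_FactSales PRIMARY KEY CLUSTERED (OrderDate, SaleID)
)
ON psMonthly (OrderDate);

With this in place, queries that filter on OrderDate can touch only the relevant partitions instead of scanning the whole table.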
Database Consultants NY
Monday, December 10, 2012 8:49 AM
Please find below the available options for query performance:
1) If the database is SQL Server 2012, use a nonclustered columnstore index; query performance can be 10-15% faster (see the sketch after this list).
2) Partitioning the data warehouse table is another option.
3) Reorganize the indexes.
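A minimal sketch of options 1) and 3); the table, column, and index names are hypothetical:

-- 1) SQL Server 2012: nonclustered columnstore index on the large fact table.
--    Note: in SQL Server 2012 this makes the table read-only until the index
--    is disabled or dropped, so it suits data warehouse (load-then-query) tables.
CREATE NONCLUSTERED COLUMNSTORE INDEX NCCI_FactSales
ON dbo.FactSales (OrderDate, ProductID, Amount);

-- 3) Reorganize a fragmented rowstore index...
ALTER INDEX IX_FactSales_OrderDate ON dbo.FactSales REORGANIZE;

-- ...or all indexes on the table.
ALTER INDEX ALL ON dbo.FactSales REORGANIZE;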
- Marked as answer by Eileen Zhao, Microsoft Contingent Staff, Moderator, Tuesday, December 18, 2012 7:12 AM
Tuesday, December 11, 2012 2:43 PM
In the long run, you will have to do partitioning along with other performance tuning efforts. But in the short run, there is no single answer to your question; it depends on each query. I would suggest you identify the most critical parts of the application that are suffering and address them one by one. In parallel, build the partitioning solution.
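One common way to find those critical queries is the sys.dm_exec_query_stats DMV; the sketch below lists the top statements by total CPU time since the last restart (this is a general-purpose pattern, not something specific to the poster's system).

SELECT TOP (10)
    qs.execution_count,
    qs.total_worker_time / qs.execution_count  AS avg_cpu_time,
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;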
The blog links below will help you with the partitioning implementation. Please comment on the blog if you have any questions.