Best configuration

    Question

  • Hello

    We have 60 users in the company and a Dell 410 server (16 processors, 20 GB of RAM, six 15k SAS disks).

    The disks are split into hardware RAID arrays (handled by the controller): 2 drives in RAID 1 for the OS and 4 drives in RAID 10 for SQL Server.

    In the SQL Server memory configuration, 15 GB is assigned to SQL Server and the rest, that is 5 GB, is left for the OS.

    The server is physical, running Windows 2008 R2 Enterprise with SQL Server 2012. Only SQL Server 2012 is installed on it; there is no other application, i.e. it is a server dedicated to SQL Server.

    The database is 30 GB and the system is an intranet application; IIS runs on a different server that hosts only IIS, so that IIS server is completely isolated and dedicated.

    The problem :

    Certain system screens are slow, very slow. The slowness shows up on specific screens with certain data, and if I run the following commands:

    DBCC DROPCLEANBUFFERS, DBCC FREEPROCCACHE, DBCC FREESYSTEMCACHE

    the system returns to normal on those specific screens, but some time later it becomes slow again.
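
    The exact statements, as a sketch (note that DBCC FREESYSTEMCACHE requires a cache name; 'ALL' is shown here as one common form). All of them act server-wide, so every query pays plan-compilation and physical-read costs again afterwards:

    -- Cache-clearing commands mentioned above; all act server-wide.
    DBCC DROPCLEANBUFFERS;          -- empties the buffer pool, so data must be re-read from disk
    DBCC FREEPROCCACHE;             -- removes all cached execution plans
    DBCC FREESYSTEMCACHE ('ALL');   -- clears the other system caches
    GO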

    The disagreement:

    The development team argues that we need to set a minimum memory of 12 GB, increase the maximum memory to 16 GB, and also set the Auto Update Statistics option to false.
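
    For reference, this is roughly how the proposed memory settings would be applied (a sketch only, using the values suggested above; whether they are appropriate is exactly what is in question):

    -- Proposed settings: 12 GB minimum, 16 GB maximum (sp_configure takes MB).
    -- On a 20 GB server this would leave roughly 4 GB for the OS.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'min server memory (MB)', 12288;
    EXEC sp_configure 'max server memory (MB)', 16384;
    RECONFIGURE;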

    However, I have read several articles saying that it is not necessary to set a minimum memory because the server is dedicated, that is, only SQL Server is installed on it and no other application.

    Another point is that it would not be advisable to set Auto Update Statistics to false, because that would degrade performance.

    So I would like your view of this scenario: should I really configure a minimum memory?

    And are the articles right in saying not to set Auto Update Statistics to false?

    In my opinion it is a database problem and not an environment problem.

    What is your opinion ?

    Thank you.
    Friday, September 20, 2013 12:30 AM

Answers

  • Hi,

    Reading this, it sounds like you have a parameter sniffing issue, so I would have to agree that it looks like a database problem and not an environment problem.

    In response to your main concerns: you don't need a minimum memory setting if you have a dedicated SQL Server, BUT it's good practice to set one anyway, so it won't hurt to put one in place.  You never know...  something may change in the future.

    And the articles you have read are correct...  do NOT turn off Auto Update Statistics...  this is a very valuable feature to have and it will help your queries run as well as they can.
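
    If you want to confirm the current setting, a quick check (sketch; the database name is a placeholder):

    -- 1 = auto update statistics is enabled for the database
    SELECT name, is_auto_update_stats_on
    FROM   sys.databases
    WHERE  name = N'YourIntranetDb';   -- hypothetical name, replace with your database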

    What you should really do is run Profiler (or use Extended Events) to track the duration / reads / writes / CPU of your queries; that way you will be able to see if a specific procedure suddenly goes from being very fast to very slow, which is a good indication of which procedure is being affected.
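
    As a sketch of the Extended Events route (the session name and the one-second threshold are just examples):

    -- Captures completed batches and RPC calls with duration, CPU, reads and writes.
    CREATE EVENT SESSION TrackSlowQueries ON SERVER
    ADD EVENT sqlserver.rpc_completed (
        ACTION (sqlserver.sql_text, sqlserver.database_name)
        WHERE duration > 1000000),                 -- longer than 1 second (microseconds)
    ADD EVENT sqlserver.sql_batch_completed (
        ACTION (sqlserver.sql_text, sqlserver.database_name)
        WHERE duration > 1000000)
    ADD TARGET package0.event_file (SET filename = N'TrackSlowQueries');
    GO
    ALTER EVENT SESSION TrackSlowQueries ON SERVER STATE = START;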

    Basically, you have a stored procedure which runs, and SQL Server caches the execution plan...  when the stored procedure runs a second time SQL Server will simply reuse the cached plan rather than waste resources creating a whole new one.  This is great a lot of the time; however, in some circumstances it can cause issues.

    For example, say you have a Customer table with 1,000,000 records...  10,000 begin with the letter A and 990,000 begin with S.  Now you run a stored procedure to return customer names and addresses, passing in a value of "A" as a parameter.  The stored procedure would likely get a plan which uses a nested loop and run very quickly.  However, if you then run the same procedure with a parameter of "S", a nested loop would be a BAD idea and the procedure would run for a LONG time.
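
    To make that concrete, a hypothetical sketch (table, column and procedure names are invented):

    -- The plan cached for the first parameter value is reused for every later call.
    CREATE PROCEDURE dbo.GetCustomersByInitial
        @Initial char(1)
    AS
    BEGIN
        SELECT Name, Address
        FROM   dbo.Customer
        WHERE  Name LIKE @Initial + '%';
    END
    GO

    EXEC dbo.GetCustomersByInitial @Initial = 'A';  -- ~10,000 rows: a plan tuned for few rows is cached
    EXEC dbo.GetCustomersByInitial @Initial = 'S';  -- ~990,000 rows: reuses that plan and runs slowly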

    This is what I suspect you're seeing, as it seems to recover when you clear the plan cache.  In my example above you could include an OPTION (RECOMPILE) hint in the procedure to force SQL Server to build a new execution plan each time.  This would stop the issue from happening.
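
    Applied to the sketch above, that fix would look something like this:

    -- Force a fresh plan for the actual parameter value on every execution.
    ALTER PROCEDURE dbo.GetCustomersByInitial
        @Initial char(1)
    AS
    BEGIN
        SELECT Name, Address
        FROM   dbo.Customer
        WHERE  Name LIKE @Initial + '%'
        OPTION (RECOMPILE);
    END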

    There are numerous ways to deal with parameter sniffing (have a look online)...  this is just one suggestion.

    I hope this helps.

    Thanks,

    Kevin

    Friday, September 20, 2013 7:37 AM
  • You should first have a complete understanding of your hardware usage before making any changes.

    Use PerfMon to capture the memory-related counters:

    Page Reads/sec

    Page Faults/sec

    Page Life Expectancy

    Memory Grants Pending

    and see whether memory is the issue or some other piece of hardware (a SQL-side sketch for a few of these counters follows below).

    5 GB is enough for the OS, and 15 GB for SQL Server also looks OK.
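
    A sketch of the SQL Server-side view of some of those counters (Page Faults/sec is an OS counter and is only available in PerfMon; the */sec values in this DMV are cumulative, so take two samples to compute a rate):

    SELECT [object_name], counter_name, cntr_value
    FROM   sys.dm_os_performance_counters
    WHERE  counter_name IN (N'Page life expectancy',
                            N'Page reads/sec',
                            N'Memory Grants Pending');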



    ~DBA and BI Developer~ MCSA 2012 Please vote for this Post if found useful

    Saturday, September 21, 2013 4:10 AM

All replies

  • For the scenario I presented, what minimum memory should I set?
    Would a minimum of 12 GB and a maximum of 16 GB cause any harm?
    How much memory would be left for the OS?
    Saturday, September 21, 2013 1:56 AM