How can I maximize performance for my MSSQL database?
I'm a DBA with a database of around 280 tables and a total data size of about 1.5 GB. I would like to tune the database to make it perform better.
What do you do to keep your SQL databases performant? How much index tuning, statistics maintenance, and defragmentation do you do?
What are the biggest performance killers/improvements, and how do you go about figuring out where to optimize?
Edit: This is a database from a third-party CRM system, so I have no control over the code. They have added lots of indexes (in sensible places), but I would like to know how I can keep the server fast.
Every night I run
EXEC sp_MSforeachtable "dbcc dbreindex('?', '', 90)"
to rebuild the indexes and (hopefully) update the statistics, using a maintenance plan. The same plan also executes a "Shrink Database Task".
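For context, on SQL Server 2005 and later the same nightly rebuild can be expressed with ALTER INDEX, which is the supported replacement for the deprecated DBCC DBREINDEX; a sketch of an equivalent job step (90 is the same fill factor as above):

```sql
-- Rebuild every index on every user table with a 90% fill factor.
-- ALTER INDEX replaces the deprecated DBCC DBREINDEX on SQL 2005+.
EXEC sp_MSforeachtable
    'ALTER INDEX ALL ON ? REBUILD WITH (FILLFACTOR = 90)';
```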
What other nightly/weekly maintenance tasks, or one-time optimizations, could be done?
- Do not run the "Shrink Database Task"
- Turn off the "Auto Shrink" property
- Run EXEC sp_MSforeachtable "dbcc dbreindex('?', '', 90)"
- Then run EXEC sp_updatestats
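Put together, the steps above might look like this as a sketch of a nightly script (the database name is a placeholder for yours):

```sql
-- One-time: turn off the Auto Shrink database property.
-- (MyCrmDb is a placeholder name.)
ALTER DATABASE MyCrmDb SET AUTO_SHRINK OFF;

-- Nightly: rebuild all indexes with a 90% fill factor...
EXEC sp_MSforeachtable 'dbcc dbreindex(''?'', '''', 90)';

-- ...then refresh the optimizer's statistics.
EXEC sp_updatestats;
```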
Make sure your SQL Server isn't doing anything other than serving SQL: no file shares, no web servers. A database that size shouldn't have any performance problems, and running Profiler will help zero in on any slow spots.
Make sure you pre-allocate enough space so the database isn't auto-growing constantly.
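Pre-allocating can be done once with ALTER DATABASE; a sketch, where the file and database names are placeholders (check yours with sp_helpfile) and the sizes are illustrative, not recommendations:

```sql
-- Grow the data and log files up front so routine inserts
-- don't trigger auto-grow events during the day.
ALTER DATABASE MyCrmDb
    MODIFY FILE (NAME = MyCrmDb_Data, SIZE = 4096MB, FILEGROWTH = 512MB);
ALTER DATABASE MyCrmDb
    MODIFY FILE (NAME = MyCrmDb_Log, SIZE = 1024MB, FILEGROWTH = 256MB);
```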
If you have SP2 of SQL 2005, you can (and since you are a DBA, probably already did) install the Performance Dashboard Reports plugin. After restarting the server, I check the performance dashboard for helpful indexes; it provides a wealth of information. Over time, I've found that it accumulates too much stuff (but that may be because I have a database setup like FogBugz: many customer databases, all with the same structure, on the same server, and SQL Server treats the needed indexes in each of them as distinct, which leads to a number of effectively duplicate entries in the report). Anyway, I spend a lot of time looking at the missing-index reports and adding/modifying indexes as a result.
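Those dashboard reports are driven by the missing-index DMVs, which you can also query directly; a sketch (the column selection is my own, not the report's exact query):

```sql
-- List suggested indexes, roughly highest estimated benefit first.
SELECT TOP (25)
       d.statement        AS table_name,
       d.equality_columns,
       d.inequality_columns,
       d.included_columns,
       s.user_seeks,
       s.avg_user_impact
FROM sys.dm_db_missing_index_details AS d
JOIN sys.dm_db_missing_index_groups AS g
     ON g.index_handle = d.index_handle
JOIN sys.dm_db_missing_index_group_stats AS s
     ON s.group_handle = g.index_group_handle
ORDER BY s.user_seeks * s.avg_user_impact DESC;
```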
Best regards, Don
I got orders-of-magnitude performance improvement just by indexing (although I didn't really tune it much).
After you take care of the obvious (suitable hardware, indexing, and defragmenting), your best resource for performance is looking at your own code.
All hardware solutions, sorry, but I would suggest:
- 64-bit Windows 2008
- Lots of memory (16 GB or so, I would suggest; it is cheap at the moment)
- Logs on an SSD, data on a mirrored set of 15krpm SCSI/SAS disks
- Ideally a pair of Xeon 55xx-series chips
This will all cost money, obviously, but it will guarantee substantial performance increases quickly. You can make your DB better, but that takes time (which also costs, obviously) and isn't guaranteed to make it faster.
You should remove the "Shrink Database" part of your maintenance plan. "Shrink Database" will fragment your indexes! Here is a great post by Paul S. Randal which explains this in detail.
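You can see the fragmentation a shrink leaves behind with sys.dm_db_index_physical_stats (SQL 2005+); a sketch, run before and after the shrink to compare:

```sql
-- Show logical fragmentation per index in the current database,
-- worst offenders first (30% is a common rebuild threshold).
SELECT OBJECT_NAME(ps.object_id) AS table_name,
       i.name                    AS index_name,
       ps.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i
     ON i.object_id = ps.object_id
    AND i.index_id  = ps.index_id
WHERE ps.avg_fragmentation_in_percent > 30
ORDER BY ps.avg_fragmentation_in_percent DESC;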
Always update your statistics after your database reindex (e.g. with sp_updatestats). Here is a post by Colin Stasiuk which explains this best practice.
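A minimal sketch of that sequence; note sp_updatestats uses default sampling, so for a heavily queried table a full scan can be forced instead (dbo.Customers is a placeholder name):

```sql
-- Database-wide statistics refresh with default sampling,
-- run right after the nightly reindex.
EXEC sp_updatestats;

-- Optionally, full-scan statistics for one important table:
UPDATE STATISTICS dbo.Customers WITH FULLSCAN;
```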
Test-run your SQL/stored procs in SQL Server Management Studio. When you run them, turn on viewing the execution plan and client statistics. Look for table scans or other operations that take up large percentages of the execution time. If you are running SQL 2008, it will suggest indexes that will make your code run faster.
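Alongside the graphical plan, Management Studio can print per-query I/O and timing numbers to the Messages tab; a sketch of how you might wrap a suspect query (the table and columns are placeholders):

```sql
-- Report logical reads and CPU/elapsed time per statement.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- The query under test:
SELECT c.CustomerId, c.Name
FROM dbo.Customers AS c
WHERE c.Name LIKE 'A%';

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```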