Today Jonathan presented on analyzing trace data, and Kimberly spoke about troubleshooting plan cache issues and index consolidation. It's really awesome to listen to Kimberly, because she speaks so fast and still delivers such high-quality content – really awesome!
Module 10: Analyzing Trace Data
- Default Trace
  - Logs every time an object is created or deleted
  - This also applies to temp tables => the Default Trace can get really big
  - A lightweight trace defined by the Product Support Team
  - Uses DBCC commands
  - DBCC commands are also logged in the Default Trace
  - Check sys.configurations (“default trace enabled”) to see whether the Default Trace is enabled or disabled (see the sketch below)
  - Can be enabled/disabled through sp_configure
  - The Default Trace always has the ID 1
  - The Default Trace itself can’t be changed
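Here is a quick sketch of checking and toggling that setting – nothing exotic, just the catalog view and sp_configure mentioned above:

-- Is the Default Trace enabled? (value_in_use = 1 means enabled)
SELECT name, value_in_use
FROM sys.configurations
WHERE name = N'default trace enabled';

-- Toggle it through sp_configure ('default trace enabled' is an advanced option)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'default trace enabled', 0;   -- set back to 1 to re-enable
RECONFIGURE;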
- You can create your own “default” trace (see the sketch below)
  - Disable the original Default Trace
  - Create your own
  - TempDb object creations can be excluded
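A rough sketch of such a user-defined trace. The file path and sizes are made-up examples, and the event IDs (46 = Object:Created, 47 = Object:Deleted) and column IDs (14 = StartTime, 34 = ObjectName, 35 = DatabaseName) are the standard SQL Trace IDs as I understand them:

DECLARE @TraceID int, @MaxFileSize bigint, @On bit;
SET @MaxFileSize = 50;   -- 50 MB per file
SET @On = 1;

-- Option 2 = TRACE_FILE_ROLLOVER, 5 rollover files (".trc" is appended automatically)
EXEC sp_trace_create @TraceID OUTPUT, 2, N'D:\Traces\MyDefaultTrace', @MaxFileSize, NULL, 5;

-- Capture object create/delete events with a few useful columns
EXEC sp_trace_setevent @TraceID, 46, 35, @On;   -- Object:Created, DatabaseName
EXEC sp_trace_setevent @TraceID, 46, 34, @On;   -- Object:Created, ObjectName
EXEC sp_trace_setevent @TraceID, 46, 14, @On;   -- Object:Created, StartTime
EXEC sp_trace_setevent @TraceID, 47, 35, @On;   -- Object:Deleted, DatabaseName
EXEC sp_trace_setevent @TraceID, 47, 34, @On;   -- Object:Deleted, ObjectName
EXEC sp_trace_setevent @TraceID, 47, 14, @On;   -- Object:Deleted, StartTime

-- Exclude tempdb object creations (logical AND, comparison 1 = "not equal")
EXEC sp_trace_setfilter @TraceID, 35, 0, 1, N'tempdb';

-- Start the trace
EXEC sp_trace_setstatus @TraceID, 1;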
- 5 x 50 MB files
  - Server activity impacts the retention period
- It writes to the error log path (see the sketch below)
  - The location can’t be changed directly
  - If you change the default error log path, you also indirectly change the location where the Default Trace is written
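To find out where the Default Trace currently lives and to see the object create/drop activity it has captured, something along these lines works (the event names are joined in from sys.trace_events rather than hard-coded):

-- Location of the Default Trace (follows the error log path)
SELECT id, path, max_size, max_files
FROM sys.traces
WHERE is_default = 1;

-- Read the trace file and show object created/deleted events
DECLARE @Path nvarchar(260);
SELECT @Path = path FROM sys.traces WHERE is_default = 1;

SELECT t.StartTime, te.name AS EventName, t.DatabaseName, t.ObjectName, t.LoginName
FROM fn_trace_gettable(@Path, DEFAULT) AS t
JOIN sys.trace_events AS te ON te.trace_event_id = t.EventClass
WHERE te.name IN (N'Object:Created', N'Object:Deleted')
ORDER BY t.StartTime DESC;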
- sp_get_query_template to normalize statements can be expensive!
  - A cursor must be used so that you can apply it to each statement from a trace file result (see the sketch below)
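A hedged sketch of that pattern. The trace file path is a placeholder, and the TRY/CATCH is there because sp_get_query_template raises an error for statements it cannot parameterize:

DECLARE @Path nvarchar(260), @Sql nvarchar(max), @Template nvarchar(max), @Params nvarchar(max);
SET @Path = N'D:\Traces\MyWorkload.trc';   -- example path

DECLARE stmt_cursor CURSOR FAST_FORWARD FOR
    SELECT CAST(TextData AS nvarchar(max))
    FROM fn_trace_gettable(@Path, DEFAULT)
    WHERE TextData IS NOT NULL;

OPEN stmt_cursor;
FETCH NEXT FROM stmt_cursor INTO @Sql;

WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        -- Normalize the statement into a parameterized template
        EXEC sp_get_query_template @Sql, @Template OUTPUT, @Params OUTPUT;
        PRINT @Template;   -- in practice you would aggregate the templates into a table
    END TRY
    BEGIN CATCH
        PRINT 'Could not normalize: ' + LEFT(@Sql, 100);
    END CATCH;

    FETCH NEXT FROM stmt_cursor INTO @Sql;
END;

CLOSE stmt_cursor;
DEALLOCATE stmt_cursor;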
- When you consolidate several databases onto a new instance, ad hoc workloads should be put together, because they share the same problems
  - It doesn’t make sense to mix these workloads with parameterized workloads
- SQLDiag can be used for (remote) SQL Server health checks (see the example below)
  - Extended Events can be used in it through custom tasks
  - SQLskills runs SQLDiag in snapshot mode for SQL Server health checks
  - /C 2 compresses the output into a CAB file
  - SQLDiag needs sysadmin privileges on the monitored server
  - Blocked Process Report
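For example, a one-off snapshot collection from the command line could look like this (the /X and /C 2 switches are documented SQLDiag options; the configuration file and output folder names are placeholders):

REM /X = snapshot mode (collect once and shut down), /C 2 = compress output into a CAB file
sqldiag.exe /X /C 2 /I MyHealthCheck.xml /O "D:\SQLDiagOutput"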
Module 11: DMVs – Part 1
- During Query Compilation the amount of CPU is not taken into account
- Safe means that the plan can be reused
  - When the plan is not parameterized, the plan is not safe, e.g. when you use an IN list
  - A parameterized plan is a prepared plan
  - SQL Server uses the statistics in the background to determine whether the plan is a consistent one
- A unique clustered index seek leads to a safe plan
  - It’s a consistent plan
- A non-unique non-clustered index seek leads to a non-safe plan
  - This leads to a non-consistent plan (see the sketch below)
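A small way to see the difference in the plan cache; the table name is just a placeholder for something with a unique clustered index on its key column:

-- The first query is a candidate for simple (auto) parameterization, i.e. a "safe" plan;
-- the IN-list query stays an ad hoc, non-safe plan
SELECT * FROM Sales.Orders WHERE OrderID = 4711;
SELECT * FROM Sales.Orders WHERE OrderID IN (1, 2, 3);

-- Check what was cached: objtype "Prepared" = parameterized/safe, "Adhoc" = not
SELECT cp.objtype, cp.usecounts, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE N'%Sales.Orders%'
  AND st.text NOT LIKE N'%dm_exec_cached_plans%';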
- Plans in the plan cache don’t have the Actual properties, because the plan cache only stores the plans from query compilation
  - An actual plan can’t be retrieved from the cache (see the sketch below)
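That is easy to see when you pull a plan from the cache: sys.dm_exec_query_plan only ever returns the compiled plan, without actual row counts or actual execution statistics:

-- Compiled (estimated) plans only; "Actual" properties exist only at execution time
SELECT cp.objtype, cp.usecounts, qp.query_plan
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp;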
- sp_executesql leads to prepared statements (see the sketch below)
  - Great for explicit data type casting
  - Leads to parameter sniffing!
  - Every single execution reuses the cached plan
  - sp_recompile just invalidates the object; the recompilation is done during the next execution
  - FORCED parameterization is not always recommended
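A minimal sp_executesql sketch; the table, column, and parameter names are made up for illustration:

-- One prepared plan is compiled on the first call (sniffing @CustomerID = 4711)
-- and reused for every subsequent execution
EXEC sp_executesql
    N'SELECT * FROM Sales.Orders WHERE CustomerID = @CustomerID',
    N'@CustomerID int',          -- explicit data type, no implicit conversions
    @CustomerID = 4711;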
- Plan cache stores can be cleaned up individually
  - The SQL plans cache store can be cleared periodically if you have memory pressure in the plan cache (see the sketch below)
  - DBCC FREESYSTEMCACHE('SQL Plans')
- http://www.sqlskills.com/BLOGS/KIMBERLY/category/Plan-cache.aspx
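A sketch of checking how much of the plan cache is taken up per object type before deciding to clear the SQL plans store:

-- Plan cache usage per object type (Adhoc, Prepared, Proc, ...)
SELECT objtype, COUNT(*) AS plan_count,
       SUM(CAST(size_in_bytes AS bigint)) / 1024 / 1024 AS size_mb
FROM sys.dm_exec_cached_plans
GROUP BY objtype
ORDER BY size_mb DESC;

-- If single-use ad hoc plans dominate, clear just the SQL plans cache store
DBCC FREESYSTEMCACHE('SQL Plans');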
- Optimize for ad hoc workloads (see the sketches below)
  - Only a small plan stub with the query hash is put into the plan cache on the first execution
  - When the values are not yet in the histogram, you can use OPTIMIZE FOR UNKNOWN
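Two small snippets that go with these notes; the stored procedure and table names are placeholders:

-- Instance-level setting: on the first execution only a small plan stub is cached,
-- the full plan is only cached when the same query is executed a second time
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;
GO

-- OPTIMIZE FOR UNKNOWN: optimize for the average density instead of a sniffed value
CREATE PROCEDURE dbo.GetRecentOrders
    @StartDate datetime
AS
    SELECT * FROM Sales.Orders
    WHERE OrderDate >= @StartDate
    OPTION (OPTIMIZE FOR UNKNOWN);
GO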
- Plan Caching in SQL Server 2008: http://msdn.microsoft.com/en-us/library/ee343986(v=sql.100).aspx
- http://www.sqlskills.com/BLOGS/KIMBERLY/post/RemovingDuplicateIndexes.aspx
- http://www.sqlskills.com/BLOGS/KIMBERLY/post/UnderstandingDuplicateIndexes.aspx
- http://blogs.msdn.com/b/bartd/archive/2007/07/19/are-you-using-sql-s-missing-index-dmvs.aspx
Thanks for reading
-Klaus