Friday, March 4, 2011

Creating Custom Data Collectors in SQL Server 2008 R2, Part 1

With the release of SQL Server 2008, Microsoft introduced the Management Data Warehouse and Data Collection. These features are a fantastic tool for DBAs and system administrators who need not only to monitor and track system performance, but also to quickly troubleshoot current issues by comparing live metrics against historical ones. They are also a very handy source of data to report on for management.

There are several canned reports that come with the Management Data Warehouse (MDW), in addition to the canned reports that come with SQL Server Management Studio (SSMS). However, there is a lot more data and functionality in the MDW than the handful of canned reports may initially lead you to believe.

In my opinion, if you haven't used any of these features before, the best thing to do is to set up a development box for your first implementation of MDW and play with it for a while before introducing it into your environment. The default collectors keep a lot of data, so the database can grow quickly. Also, if you have lots of systems collecting to the same warehouse, you can see performance issues if they all try to collect and upload at the same time.
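
While you're testing, it is also worth keeping an eye on how fast the warehouse itself grows. Here is a minimal sketch, assuming the warehouse database is named MDW (use whatever name you gave yours):

-- Overall size of the warehouse database (assumes it is named MDW)
USE MDW;
GO
EXEC sp_spaceused;

-- Largest tables in the warehouse
SELECT TOP (10)
    s.name + '.' + t.name AS table_name,
    SUM(ps.reserved_page_count) * 8 / 1024 AS reserved_mb
FROM sys.dm_db_partition_stats AS ps
INNER JOIN sys.tables AS t ON t.object_id = ps.object_id
INNER JOIN sys.schemas AS s ON s.schema_id = t.schema_id
GROUP BY s.name, t.name
ORDER BY reserved_mb DESC;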

One of the coolest things about Data Collectors and MDW is the ability to create your own custom collectors. So how do we tie this all together? Well, I really like the Disk Usage by Top Tables report that comes canned in SSMS.

However, I really want to be able to see this data every day, over time. Also, my management would like to know how fast our system is growing and how much data we are adding. The solution...?

What I can do first is trace the query that runs behind the SSMS Disk Usage by Top Tables report (there is a sketch of one way to capture it after the links below). Once I have that query, I tweak it a bit to work within the confines of the collector definition. A couple of good links on the subject:
http://msdn.microsoft.com/en-us/library/bb677277.aspx

http://www.sql-server-performance.com/2008/system-data-collection-reports-install/
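
If you don't want to click through the Profiler GUI, a lightweight server-side trace can capture the report's query for you. This is only a sketch; the trace file path and the ApplicationName filter are assumptions you should adjust for your environment:

-- Sketch of a server-side trace to capture the T-SQL behind an SSMS report.
-- The file path and the ApplicationName filter are assumptions; adjust as needed.
DECLARE @traceid INT, @maxsize BIGINT, @on BIT;
SET @maxsize = 5;      -- max trace file size in MB
SET @on = 1;

EXEC sp_trace_create @traceid OUTPUT, 0, N'C:\Temp\ssms_report_trace', @maxsize;

-- Event 12 = SQL:BatchCompleted; column 1 = TextData, column 10 = ApplicationName
EXEC sp_trace_setevent @traceid, 12, 1, @on;
EXEC sp_trace_setevent @traceid, 12, 10, @on;

-- Keep only batches coming from Management Studio (comparison operator 6 = LIKE)
EXEC sp_trace_setfilter @traceid, 10, 0, 6, N'%SQL Server Management Studio%';

EXEC sp_trace_setstatus @traceid, 1;   -- start the trace

-- Run the Disk Usage by Top Tables report in SSMS, then stop and close the trace
-- and read what was captured:
-- EXEC sp_trace_setstatus @traceid, 0;   -- stop
-- EXEC sp_trace_setstatus @traceid, 2;   -- close and delete the trace definition
-- SELECT TextData FROM fn_trace_gettable(N'C:\Temp\ssms_report_trace.trc', DEFAULT);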



Now I can use the T-SQL from the report as the basis for my new custom collector, essentially running that report and storing the data over time on whatever schedule I choose. You can tweak the collection frequency, caching, and the collection/upload jobs either now or after the collector is created, and it can be configured for one database or for many. One thing to notice in the script below: the query is embedded in the TSQLQueryCollector XML fragment passed to @parameters, so any < comparison in the T-SQL has to be escaped as &lt;. Once the collector is created, be sure to enable it and verify that the new custom_snapshots table appears in your MDW (there is a quick sketch of that after the script).





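-- The script below creates a "Table Metrics" collection set on the built-in
-- CollectorSchedule_Every_6h schedule with a 730-day retention period, then adds
-- a Generic T-SQL Query collection item that runs the table-size query against
-- MY_OLTP_DB and writes the results to custom_snapshots.table_stats in the MDW.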
Begin Transaction
Begin Try
Declare @collection_set_id_1 int
Declare @collection_set_uid_2 uniqueidentifier
EXEC [msdb].[dbo].[sp_syscollector_create_collection_set] @name=N'Table Metrics', @collection_mode=1, @description=N'Collects data about the table stats.', @logging_level=0, @days_until_expiration=730, @schedule_name=N'CollectorSchedule_Every_6h', @collection_set_id=@collection_set_id_1 OUTPUT, @collection_set_uid=@collection_set_uid_2 OUTPUT
Select @collection_set_id_1, @collection_set_uid_2

Declare @collector_type_uid_3 uniqueidentifier
Select @collector_type_uid_3 = collector_type_uid From [msdb].[dbo].[syscollector_collector_types] Where name = N'Generic T-SQL Query Collector Type';
Declare @collection_item_id_4 int
EXEC [msdb].[dbo].[sp_syscollector_create_collection_item] @name=N'Table_Stats', @parameters=N'
<ns:TSQLQueryCollector xmlns:ns="DataCollectorType">
<Query>
<Value>

SELECT
a3.name AS schemaname,
a2.name AS tablename,
DB_NAME() AS databasename,
a1.rows as row_count,
(a1.reserved + ISNULL(a4.reserved,0))* 8 AS reserved,
a1.data * 8 AS data,
(CASE WHEN (a1.used + ISNULL(a4.used,0)) > a1.data THEN (a1.used + ISNULL(a4.used,0)) - a1.data ELSE 0 END) * 8 AS index_size,
(CASE WHEN (a1.reserved + ISNULL(a4.reserved,0)) > a1.used THEN (a1.reserved + ISNULL(a4.reserved,0)) - a1.used ELSE 0 END) * 8 AS unused
FROM
(SELECT
ps.object_id,
SUM (
CASE
WHEN (ps.index_id &lt; 2) THEN row_count
ELSE 0
END
) AS rows,
SUM (ps.reserved_page_count) AS reserved,
SUM (
CASE
WHEN (ps.index_id &lt; 2) THEN (ps.in_row_data_page_count + ps.lob_used_page_count + ps.row_overflow_used_page_count)
ELSE (ps.lob_used_page_count + ps.row_overflow_used_page_count)
END
) AS data,
SUM (ps.used_page_count) AS used
FROM sys.dm_db_partition_stats ps
GROUP BY ps.object_id) AS a1
LEFT OUTER JOIN
(SELECT
it.parent_id,
SUM(ps.reserved_page_count) AS reserved,
SUM(ps.used_page_count) AS used
FROM sys.dm_db_partition_stats ps
INNER JOIN sys.internal_tables it ON (it.object_id = ps.object_id)
WHERE it.internal_type IN (202,204)
GROUP BY it.parent_id) AS a4 ON (a4.parent_id = a1.object_id)
INNER JOIN sys.all_objects a2 ON ( a1.object_id = a2.object_id )
INNER JOIN sys.schemas a3 ON (a2.schema_id = a3.schema_id)
WHERE a2.type != N''S'' and a2.type != N''IT''
</Value>
<OutputTable>table_stats</OutputTable>
</Query>
<Databases>
<Database>MY_OLTP_DB</Database>
</Databases>
</ns:TSQLQueryCollector>
', @collection_item_id=@collection_item_id_4 OUTPUT, @frequency=60, @collection_set_id=@collection_set_id_1, @collector_type_uid=@collector_type_uid_3
Select @collection_item_id_4

Commit Transaction;
End Try
Begin Catch
Rollback Transaction;
DECLARE @ErrorMessage NVARCHAR(4000);
DECLARE @ErrorSeverity INT;
DECLARE @ErrorState INT;
DECLARE @ErrorNumber INT;
DECLARE @ErrorLine INT;
DECLARE @ErrorProcedure NVARCHAR(200);
SELECT @ErrorLine = ERROR_LINE(),
@ErrorSeverity = ERROR_SEVERITY(),
@ErrorState = ERROR_STATE(),
@ErrorNumber = ERROR_NUMBER(),
@ErrorMessage = ERROR_MESSAGE(),
@ErrorProcedure = ISNULL(ERROR_PROCEDURE(), '-');
RAISERROR (14684, @ErrorSeverity, 1 , @ErrorNumber, @ErrorSeverity, @ErrorState, @ErrorProcedure, @ErrorLine, @ErrorMessage);

End Catch;

GO
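
The creation script leaves the new collection set stopped. Here is a minimal sketch of starting it and checking that data is landing in the warehouse, assuming the warehouse database is named MDW; the schedule swap is optional and just shows where you would change the upload cadence.

USE msdb;
GO
DECLARE @set_id INT;

SELECT @set_id = collection_set_id
FROM dbo.syscollector_collection_sets
WHERE name = N'Table Metrics';

-- Optional: swap the upload schedule to another of the built-in collector schedules
EXEC dbo.sp_syscollector_update_collection_set
     @collection_set_id = @set_id,
     @schedule_name = N'CollectorSchedule_Every_24h';

-- Enable and start the collection set (this creates and starts the SQL Agent jobs)
EXEC dbo.sp_syscollector_start_collection_set @collection_set_id = @set_id;
GO

-- After the first upload the output table shows up in the warehouse;
-- snapshot_id joins back to core.snapshots for the snapshot time and instance name.
SELECT TOP (20) s.snapshot_time, s.instance_name, ts.*
FROM MDW.custom_snapshots.table_stats AS ts
INNER JOIN MDW.core.snapshots AS s ON s.snapshot_id = ts.snapshot_id
ORDER BY s.snapshot_time DESC;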





OK, Part 1 got a little long, so I am breaking this up into two parts. In Part 2 I'll go over the SSRS report that sends management a weekly email showing top table growth.

1 comment:

  1. Note: If you want to use this for more than one database on multiple servers, I suggest adding @@SERVERNAME to the SELECT statement of the collector so you have a way of differentiating them.
