When it comes to performance, one of the easiest things you can do to identify potential bottlenecks is to routinely review your system's longest running queries (LRQs). Simple changes to stored procedures or inline SQL reports can cause unexpected performance issues over time and, if not kept in check, can have devastating effects on the speed and reliability of your SQL environment.

A simple Google search on “how to view longest running queries” will provide you with a handful of useful scripts for monitoring this. I've noticed that most are set up to show a simple TOP 10, but I find it more useful to pull a percentage of the longest running queries, as it gives you a better snapshot of what is going on.

I found inspiration for the following script in a quick Google search, but I have modified it to include the database name and object name, as well as a better method for displaying the actual code being executed.

```sql
SELECT TOP 10 PERCENT
    o.name AS 'Object Name',
    -- total_elapsed_time is reported in microseconds
    qs.total_elapsed_time / qs.execution_count / 1000000.0 AS 'Average Seconds',
    qs.total_elapsed_time / 1000000.0 AS 'Total Seconds',
    qs.total_physical_reads AS 'Physical Reads',
    qs.total_logical_reads AS 'Logical Reads',
    qs.execution_count AS 'Count',
    -- statement offsets are byte offsets into the Unicode text,
    -- hence the divide-by-2 and the +1 adjustment
    SUBSTRING(qt.text, qs.statement_start_offset / 2 + 1,
        (CASE WHEN qs.statement_end_offset = -1
              THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
              ELSE qs.statement_end_offset
         END - qs.statement_start_offset) / 2 + 1) AS 'Query',
    DB_NAME(qt.dbid) AS 'Database',
    qs.last_execution_time AS 'Last Executed',
    @@SERVERNAME AS 'Server Name'
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
LEFT OUTER JOIN sys.objects o
    ON qt.objectid = o.object_id
WHERE qt.dbid = DB_ID()
ORDER BY [Average Seconds] DESC;
```

You can easily modify this script to limit the number of records displayed if you are only interested in identifying user-designed stored procedures, or if you simply want to tackle your LRQs one at a time.
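For example, to surface only the single worst-offending stored procedure, you might narrow the query like this (a sketch built on the same DMVs; the `o.type = 'P'` filter limits results to SQL stored procedures and is my own addition, not part of the original script):

```sql
-- Find the single stored procedure with the worst average elapsed time
SELECT TOP 1
    o.name AS 'Object Name',
    qs.total_elapsed_time / qs.execution_count / 1000000.0 AS 'Average Seconds',
    qs.execution_count AS 'Count'
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
INNER JOIN sys.objects o
    ON qt.objectid = o.object_id   -- inner join drops ad hoc statements
WHERE qt.dbid = DB_ID()
  AND o.type = 'P'                 -- 'P' = SQL stored procedure
ORDER BY [Average Seconds] DESC;
```

Swapping the INNER JOIN back to a LEFT OUTER JOIN would bring ad hoc and inline SQL back into the results.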

The screenshot below displays the TOP 10 PERCENT of the longest running queries in my DYNAMICS lab environment.

Another good strategy is to create a log table to store the results of your LRQ script so you can monitor, over time, how often certain queries are hitting the list. It's possible, and sometimes likely, that even after performance tuning a query, it will still hit your LRQ list. In some cases that's unavoidable, but in many it's simply an opportunity for improvement.
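A minimal sketch of that logging approach might look like the following (the table and column names here are my own invention, not from the original script):

```sql
-- One-time setup: a table to capture LRQ snapshots over time
CREATE TABLE dbo.LRQ_Log (
    LogID        INT IDENTITY(1,1) PRIMARY KEY,
    CapturedAt   DATETIME2     NOT NULL DEFAULT SYSDATETIME(),
    ObjectName   SYSNAME       NULL,
    AvgSeconds   DECIMAL(18,3) NULL,
    TotalSeconds DECIMAL(18,3) NULL,
    ExecCount    BIGINT        NULL,
    QueryText    NVARCHAR(MAX) NULL
);

-- Run on a schedule to append the current top offenders
INSERT INTO dbo.LRQ_Log (ObjectName, AvgSeconds, TotalSeconds, ExecCount, QueryText)
SELECT TOP 10 PERCENT
    o.name,
    qs.total_elapsed_time / qs.execution_count / 1000000.0,
    qs.total_elapsed_time / 1000000.0,
    qs.execution_count,
    qt.text
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
LEFT OUTER JOIN sys.objects o
    ON qt.objectid = o.object_id
WHERE qt.dbid = DB_ID()
ORDER BY qs.total_elapsed_time / qs.execution_count DESC;
```

Scheduling the INSERT as a SQL Server Agent job would give you a running history, which you can then query by ObjectName to see which queries keep reappearing on the list. Keep in mind the DMV stats reset on a server restart, so the log table also helps preserve history across restarts.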