Including timings, I/O and XML Execution plans in your SQL Test rigs

Published 14 March 2014 4:37 pm

Sometimes, when developing SQL code, you need to run an automated test and benchmark over and over again. In this sort of test run 'with bits', you want a lot of diagnostic information, such as CPU and I/O figures, that can be saved, parsed and compared between runs. You want the results of the SQL so you can compare them with what you expect them to be, and you want all the diagnostic information alongside. If you're feeling sensible, you'll do this in PowerShell, but the code is easily translated to VB or C#.

You need to do this because you want to see whether the code gives the right results and whether it does so in a reasonable way, with adequate performance. If it is slow, you want to know why. You can't do all this conveniently in SSMS if, like me, everything you do is a race against the clock.

The SET STATISTICS options (SET STATISTICS IO, SET STATISTICS PROFILE, SET STATISTICS TIME and SET STATISTICS XML) are the ones you'll want. They all send their data via the message stream. This is a good example of the sort of place where you need to get the contents of the message stream from SQL Server: all that stuff that gets put in a separate pane in SSMS when you view query results in a grid. To do this in .NET, you create a delegate, identifying the method that handles the event, to listen for the InfoMessage event on the SqlConnection class. This gets you all the information, and this would be a rather short blog were it not for the fact that you'll probably want to run a whole batch rather than a single query.

I generally don’t do more than SET STATISTICS IO or SET STATISTICS TIME, because you can sort out which statement the timings or I/O relate to by means of deft PRINT statements (PRINT statements also go down the message stream, to the great bafflement of rookie database developers). However, there are few satisfactions in life like saving all your XML execution plans to disk as you run a batch. It saves so much messing about.
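To show how those PRINT markers let you attribute the statistics to individual statements, here is a minimal sketch in Python (the batch itself runs in PowerShell, but the splitting logic is language-neutral; the `split_by_markers` helper and the abbreviated sample message are mine, not part of the original script):

```python
# A hypothetical, abbreviated message stream: the marker lines come from
# the PRINT statements in the batch, the rest from SET STATISTICS.
message = """first query
Table 'Product'. Scan count 1, logical reads 15, physical reads 0.
 SQL Server Execution Times:
   CPU time = 94 ms,  elapsed time = 96 ms.
Second query
Table 'Employee'. Scan count 0, logical reads 2, physical reads 0.
 SQL Server Execution Times:
   CPU time = 0 ms,  elapsed time = 1 ms."""

def split_by_markers(text, markers):
    """Attribute each chunk of the message stream to the PRINT marker
    that precedes it. Returns {marker: [lines...]}."""
    sections, current = {}, None
    for line in text.splitlines():
        if line.strip() in markers:          # a PRINT marker starts a new section
            current = line.strip()
            sections[current] = []
        elif current is not None:            # everything else belongs to the
            sections[current].append(line)   # most recent marker
    return sections

sections = split_by_markers(message, {"first query", "Second query"})
print(sections["first query"][0])  # the STATISTICS IO line for the first query
```

Once the stream is cut up like this, each section's timings and I/O figures can be saved against the query they belong to.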

Here is, stripped down, the whole works. You’ll just have to imagine that those separate results are being saved to disk as CSV for later analysis, and that we can split the message string into its separate queries and even parse the data if we want (it’s not as hard to do as you might think, but that’s another blog post). You’ll also just have to imagine that this isn’t AdventureWorks; it’s easy if you try.


$SQL=@'
use adventureworks

Set statistics time on
Set statistics io on

print 'first query'
SELECT count(*) as FirstResult
FROM Sales.SalesOrderDetail s
INNER JOIN Production.Product p
  ON s.ProductID = p.ProductID;

print 'Second query'
SELECT BusinessEntityID as SecondResult
FROM HumanResources.Employee
WHERE NationalIDNumber = '509647174';

print 'final query'
SELECT count(*) as ThirdResult
FROM HumanResources.Employee
WHERE JobTitle LIKE 'Production%';

Set statistics time off
Set statistics io off
'@
$message = [string]''

$ErrorActionPreference = "Stop" # make every error terminating; nothing more can be retrieved after one
$conn = new-Object System.Data.SqlClient.SqlConnection("Server=MyServer;DataBase=AdventureWorks;password=WouldntYouLikeToKnow; uid=PhilFactor") # fill this in before you run it!
$conn.Open() | out-null # open the connection
# the delegate that handles the InfoMessage event and accumulates the message stream
$handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] {param($sender, $event) $global:message = "$($message)`n $($event.Message)" };
$conn.add_InfoMessage($handler);
$conn.FireInfoMessageEventOnUserErrors = $true
$cmd = new-Object System.Data.SqlClient.SqlCommand($SQL, $conn)
$rdr = $cmd.ExecuteReader()
do  # load each result set in turn; the reader closes itself after the last one
    {
    $datatable = new-object System.Data.DataTable
    $datatable.Load($rdr)
    $datatable | Format-Table
    }
while ($rdr.IsClosed -eq $false)
$conn.Close() # tidy up the connection now we're done
$message

 


This gives the following output (after the results themselves, which are merely listed as tables):


 

 Changed database context to 'Adventureworks'.

 SQL Server Execution Times:
   CPU time = 0 ms,  elapsed time = 0 ms.

 SQL Server Execution Times:
   CPU time = 0 ms,  elapsed time = 0 ms.
 first query

 SQL Server Execution Times:
   CPU time = 0 ms,  elapsed time = 0 ms.
 Table 'Product'. Scan count 1, logical reads 15, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
 Table 'SalesOrderDetail'. Scan count 1, logical reads 276, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

 SQL Server Execution Times:
   CPU time = 94 ms,  elapsed time = 96 ms.
 Second query

 SQL Server Execution Times:
   CPU time = 0 ms,  elapsed time = 0 ms.
 SQL Server parse and compile time:
   CPU time = 0 ms, elapsed time = 0 ms.
 Table 'Employee'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

 SQL Server Execution Times:
   CPU time = 0 ms,  elapsed time = 1 ms.
 final query

 SQL Server Execution Times:
   CPU time = 0 ms,  elapsed time = 0 ms.
 Table 'Employee'. Scan count 1, logical reads 9, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

 SQL Server Execution Times:
   CPU time = 16 ms,  elapsed time = 1 ms.

PS C:\Users\Phil.Factor>
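Those STATISTICS IO lines are regular enough to pull apart into structured data for the CSV files I asked you to imagine earlier. As a sketch of that parsing step (the `parse_statistics_io` helper is hypothetical, not part of the original script; the sample lines are lifted from the output above):

```python
import re

# Two STATISTICS IO message lines, copied from the output above.
io_lines = [
    "Table 'Product'. Scan count 1, logical reads 15, physical reads 0, "
    "read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.",
    "Table 'SalesOrderDetail'. Scan count 1, logical reads 276, physical reads 0, "
    "read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.",
]

def parse_statistics_io(line):
    """Turn one SET STATISTICS IO message line into a dict of named counters."""
    table = re.match(r"Table '([^']+)'\.", line).group(1)
    # Each counter is either '... reads N' or 'Scan count N'.
    counters = {name.strip(): int(value)
                for name, value in re.findall(r"([a-z -]+ reads?|Scan count) (\d+)", line)}
    return {'table': table, **counters}

rows = [parse_statistics_io(l) for l in io_lines]
print(rows[1]['logical reads'])  # prints 276
```

From here it is a short step to writing the rows out as CSV and diffing them between test runs.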

 
