Laerte Junior

Fun with SQL Server Profiler trace files and PowerShell

25 November 2010

Running Profiler traces against multiple servers becomes a painful process when it’s time to collate and filter all that data. It would be time-consuming, frustrating and messy if Laerte hadn’t written this handy PowerShell script (complete with examples) to help you out.

I was starting to get annoyed: I had all my trace files neatly separated into different folders but, because I’d checked Enable File Rollover when I set up Profiler, it was creating multiple copies of the same trace file. To make things trickier, there were three shifts of DBAs, and each shift was using different trace settings. I tried to bring the columns of the various trace files together into one single file. As you might have guessed, this was really hard to do!

As the cherry on the top of this frustrating puzzle, some of our servers were very busy, and so the possibility of speedily importing the trace files into SQL Server tables was negated by the overhead it would have generated, so I had to handle the import to SQL Server at a later stage. Although it’s a simple procedure to run fn_trace_gettable for one file, when you have multiple files it quickly becomes complicated to generate a uniform and repeatable script… more so if you have to merge all the different trace files.
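
To give a concrete picture of that repetition: importing several files by hand means one fn_trace_gettable statement per file. As a rough sketch of how you might template that boilerplate out (this is just an illustration, with a hypothetical table prefix, not part of the solution), in Python:

```python
def gettable_script(trace_files, table_prefix="TraceImport_"):
    """Return one T-SQL import statement per .trc file."""
    statements = []
    for path in trace_files:
        # Derive a table name from the file name (strip folder and extension)
        name = path.replace("\\", "/").rsplit("/", 1)[-1]
        table = table_prefix + name.rsplit(".", 1)[0]
        statements.append(
            "SELECT * INTO [" + table + "] "
            "FROM fn_trace_gettable(N'" + path + "', DEFAULT);"
        )
    return statements

for statement in gettable_script(["C:\\Temp\\TSQL.trc", "C:\\Temp\\TSQL_1.trc"]):
    print(statement)
```

Even templated like this, the per-file statements don’t solve the real problem of merging traces that have different columns, which is where PowerShell comes in.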

It was then that I decided to use PowerShell.

When I began to think of a solution, one of the requirements was to have SQL Server tables generated automatically ($filetotable). There was no problem in setting this parameter, but I wanted to centralize all the files in a single table and, as I mentioned, these files could come with different events and columns.

This became a problem when I began to see how many lines of code and conditions this solution would require. To be fair, I was thinking in procedural terms, as I had previously written my functions in Visual Basic. I’d forgotten one of the main features of PowerShell: consistency.

Why should I write all this code, if the output of a cmdlet is an object and can be the input of the next cmdlet? With that in mind… let’s check out the code I came up with, as well as a few examples.

First let’s cover the parameters; the function works with four parameters, one of them (the first) being mandatory.

  • $TraceFileName - A mandatory String specifying the full SQL Server trace file path e.g. "C:\Temp\Profiler.trc" or "C:\temp\*.trc"

    This parameter specifies which trace file is to be read. If you want to use more than one file (each of which, as I said, may come with different events and columns), use a wildcard such as *.trc so that all .trc files in the given location are read. Otherwise, just name the single file.
  • $FileToTable - An optional boolean flag instructing the function to insert all data into SQL tables, divided according to .trc file.

    A little more detail: this parameter specifies whether the files read will be sent to tables in SQL Server. The function will create a separate table for each file, named "PowershellTraceTable_" followed by the name of the trace. We will see below that if this parameter is set to true, then we need to pass the name of the server and database for these files; if not specified, the default values will be used.
  • $ServerName - The optional server name string. If this is not specified and $FileToTable is set to true, then the default server will be used when importing the .trc files.
  • $DatabaseName - The optional database name string. As with $ServerName, if this is not specified and $FileToTable is set to true, then TEMPDB will be used when importing the .trc files.
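
As a quick illustration of the naming rule described above (this is just a sketch, not part of the module), the per-file table name is simply the trace file name minus its .trc extension, prefixed with "PowershellTraceTable_"; in Python:

```python
def trace_table_name(trace_file_name):
    """Mirror the naming rule: 'PowershellTraceTable_' + file name minus '.trc'."""
    base = trace_file_name.strip()
    if base.lower().endswith(".trc"):
        base = base[:-4]  # drop the 4-character extension
    return "PowershellTraceTable_" + base

print(trace_table_name("TSQL.trc"))  # PowershellTraceTable_TSQL
```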

Get-SQLProfiler Script Code

Function Get-SQLProfiler
{
 <#
 ----------------------------------------------------------
 Load SQL Server Profiler trace files (.trc)
 ----------------------------------------------------------
 Version 1.0
 Laerte Poltronieri Junior

 $TraceFileName = MANDATORY String. Full SQL Server trace file path, e.g. "C:\Temp\Profiler.trc" or "C:\temp\*.trc"
 $FileToTable   = OPTIONAL Switch. Insert all data into SQL tables, one table per .trc file
 $ServerName    = OPTIONAL String. If not specified and $FileToTable is set, the default server will be used
 $DatabaseName  = OPTIONAL String. If not specified and $FileToTable is set, TEMPDB will be used
 #>

 [CmdletBinding()]
 param(
  [Parameter(Position=1, Mandatory=$true, ValueFromPipelineByPropertyName=$true, HelpMessage="SQL Server Profiler Trace File")]
  [Alias("FullName")]
  [ValidateScript({$_ -match ".TRC"})]
  [String] $TraceFileName,

  [Parameter(Position=2, Mandatory=$false, ValueFromPipelineByPropertyName=$true, HelpMessage="Flag to insert into SQL table. Default False")]
  [Alias("InsertFile")]
  [switch] $FileToTable = $false,

  [Parameter(Position=3, Mandatory=$false, ValueFromPipelineByPropertyName=$true, HelpMessage="Server name. Default localhost")]
  [Alias("SvrName")]
  [String] $ServerName = $env:COMPUTERNAME,

  [Parameter(Position=4, Mandatory=$false, ValueFromPipelineByPropertyName=$true, HelpMessage="Database name. Default TEMPDB")]
  [Alias("DbName")]
  [String] $DatabaseName = "TEMPDB"
 )

 begin
 {
  $VerbosePreference = "Continue"

  if ($FileToTable -and $ServerName -eq $env:COMPUTERNAME -and $DatabaseName -eq "TEMPDB")
  {
   $msg = "Server and Database parameters were not supplied; default values will be used. Server: " + $env:COMPUTERNAME + ", Database: TEMPDB"
   Write-Warning $msg
  }

  # Load the SMO assemblies that provide the trace reader/writer classes
  [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfoExtended") | Out-Null
  [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
  [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
 }

 process
 {
  try
  {
   # Set up the final result and the running line number
   $LineNumber = 1
   $FinalResult = @()

   # Get all matching .trc files (one or several)
   foreach ($TraceFilePath in Get-ChildItem $TraceFileName -ErrorAction Stop)
   {
    try
    {
     # Get the trace file name; it is also used to build the table name
     $TraceFileNameTRC = ($TraceFilePath.PSChildName).Trim()

     [String] $TraceFilePathString = $TraceFilePath

     $TableName = "PowershellTraceTable_" + $TraceFileNameTRC.Substring(0, $TraceFileNameTRC.Length - 4)

     $TraceFileReader = New-Object Microsoft.SqlServer.Management.Trace.TraceFile
     $TraceFileReader.InitializeAsReader($TraceFilePathString)

     if ($TraceFileReader.Read() -eq $true)
     {
      while ($TraceFileReader.Read())
      {
       $ObjectTrace = New-Object PSObject

       # These two properties are always present, whatever columns the trace has
       $ObjectTrace | Add-Member NoteProperty LineNumber $LineNumber
       $ObjectTrace | Add-Member NoteProperty TraceFile  $TraceFileNameTRC

       $TotalFields = $TraceFileReader.FieldCount - 1

       for ($Count = 0; $Count -le $TotalFields; $Count++)
       {
        $FieldName  = $TraceFileReader.GetName($Count)
        $FieldValue = $TraceFileReader.GetValue($TraceFileReader.GetOrdinal($FieldName))
        if ($FieldValue -eq $null) { $FieldValue = "" }

        $ObjectTrace | Add-Member NoteProperty $FieldName $FieldValue
       }

       $FinalResult += $ObjectTrace
       $LineNumber++
      }

      if ($FileToTable)
      {
       try
       {
        $SQLConnection = New-Object Microsoft.SqlServer.Management.Common.SqlConnectionInfo
        $SQLConnection.ServerName   = $ServerName
        $SQLConnection.DatabaseName = $DatabaseName

        $TraceFileWriter = New-Object Microsoft.SqlServer.Management.Trace.TraceTable

        # Re-open the trace as a reader and stream it into the SQL table
        $TraceFileReader.InitializeAsReader($TraceFilePathString)
        $TraceFileWriter.InitializeAsWriter($TraceFileReader, $SQLConnection, $TableName)

        while ($TraceFileWriter.Write()) {}
       }
       Catch
       {
        Write-Warning $error[0]
       }
       Finally
       {
        $TraceFileWriter.Close()
       }
      }
     }
    }
    Catch
    {
     Write-Warning $error[0]
    }
    Finally
    {
     $TraceFileReader.Close()
    }
   }

   Write-Output $FinalResult
  }
  Catch
  {
   Write-Warning $error[0]
  }
  Finally
  {
   $TraceFileReader.Dispose()
   $TraceFileWriter.Dispose()
   $SQLConnection.Dispose()
  }
 }
}



You can download this script from the top of this article or from my SkyDrive. Now, let’s do some tests! I created a module containing just this function and imported it into my PowerShell profile, just to make our tests easier. To import the module, I add the following line to my PowerShell profile:

Import-Module -Name C:\temp\Get-SQLProfiler.psm1 -WarningAction SilentlyContinue  -ErrorAction Stop 

Reading One Trace File Without Inserting into a SQL Table

Let’s say we have a SQL Server profiler trace file at C:\Temp\ProfilerTraces\TSQL.trc which looks something like this:

Figure 1. The sample SQL Server trace file.

This file contains the events and columns of the SQL Server Profiler TSQL template, and if we execute:

Get-SQLProfiler "C:\TEMP\ProfilerFiles\TSQL.TRC"

...we’ll get this result:

Figure 2. The sample trace file, processed by the Get-SQLProfiler function.

The LineNumber and TraceFile properties are present in every object the function outputs: the former is (obviously) the line number, which increases incrementally across the whole result, and the latter is the name of the trace file. The remaining properties match the columns in the trace file, even when I’m merging files with different columns and events.
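
If it helps to picture this merging behaviour outside PowerShell, here is a small Python sketch (an illustration only, not the module’s code): rows from different files, with different columns, all carry the two synthetic properties while keeping whatever columns their own file happens to have.

```python
def merge_traces(files):
    """files: mapping of trace file name -> list of row dicts (columns may
    differ per file). Returns one flat list in which every row carries
    LineNumber and TraceFile, numbered continuously across all files."""
    merged = []
    line_number = 1
    for trace_file, rows in files.items():
        for row in rows:
            record = {"LineNumber": line_number, "TraceFile": trace_file}
            record.update(row)  # whatever columns this particular file has
            merged.append(record)
            line_number += 1
    return merged

rows = merge_traces({
    "TSQL_SP.trc":    [{"EventClass": "SP:Completed", "TextData": "exec p1"}],
    "TSQL_Locks.trc": [{"EventClass": "Lock:Deadlock", "Mode": "X"}],
})
```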

Let’s see just the LineNumber, TraceFile, EventClass and TextData properties, filtering for line numbers less than or equal to 5:

Get-SQLProfiler "C:\TEMP\ProfilerFiles\TSQL.TRC"  | where-object {$_.LineNumber -le 5} | select LineNumber, TraceFile ,EventClass ,TextData | format-table

Figure 3. The trace file, filtered and displaying just a subset of properties.

Reading One Trace File and Inserting into a SQL Table

With the previous example, we’d have the same output if the $FileToTable parameter were set to $true, except that we would now also have a SQL Server table containing this data:

Get-SQLProfiler "C:\TEMP\ProfilerFiles\TSQL.TRC" $true | where-object {$_.LineNumber -le 5} | select LineNumber, TraceFile, EventClass, TextData | format-table

As we’ve not specified the server and database, a warning message is displayed, informing us that the function is using default values:

Figure 4. The Get-SQLProfiler function displays a warning when it is forced to use default values.

And the resulting SQL Server table will look something like this:

Figure 5. The SQL Server table created by Get-SQLProfiler, containing the requested data.

Reading Multiple Trace Files and Inserting into a SQL Table

Let’s say we have a SQL Server Profiler trace file rollover (like the one I described at the start of this post). If you want to merge the results, simply enter the path where the trace files are stored:

Get-SQLProfiler "C:\TEMP\ProfilerFiles\Rollover\*.TRC" $true | select LineNumber, TraceFile, EventClass, TextData | format-table

The figure below shows the point at which the file being read changes:

Figure 6. Reading multiple .trc files using Get-SQLProfiler.

As I mentioned, Get-SQLProfiler will create one SQL table for each trace file. However, if you want all the trace files placed in a single table, save the result of the function to an XML (or text) file, which you can then import using Invoke-Sqlcmd. I have an example demonstrating this below.

Reading Multiple Trace Files With Different Events and Columns

We can also use Get-SQLProfiler to merge two trace files with different columns. In the example here, I merge the files TSQL_SP.trc and TSQL_Locks.trc.

Figure 7. Merging files with different columns using Get-SQLProfiler.

Aha, I can hear you thinking:

“Well that’s all well and good, Laerte, but if I merge multiple files with different setups, how can I find the available properties using Get-SQLProfiler?”

Simple: apply a filter for a single row, such as LineNumber = 1, because LineNumber and TraceFile will always be available in Get-SQLProfiler’s output:

Get-SQLProfiler "C:\TEMP\ProfilerFiles\Merge\*.TRC" | where-object {$_.LineNumber -le 1} | format-table

Figure 8. Determining which properties are represented in the merged data.

Now you just need to take the complete list, choose the fields you’re interested in, and specify them in the | select clause. Remember that when merging multiple files with the $FileToTable option set to $true, Get-SQLProfiler creates several tables, one for each file.
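
Another way to picture the “which properties do I actually have?” question (again, just an illustrative Python sketch, not part of the module) is as the union of the property names across all the merged rows:

```python
def available_properties(rows):
    """Union of every property name seen across merged trace rows."""
    names = set()
    for row in rows:
        names.update(row.keys())  # each row contributes its own columns
    return sorted(names)

props = available_properties([
    {"LineNumber": 1, "TraceFile": "a.trc", "EventClass": "SQL:BatchCompleted"},
    {"LineNumber": 2, "TraceFile": "b.trc", "Mode": 5},
])
print(props)
```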

“But Laerte, I wanted to centralize everything into one table.”

I confess that I wanted to do this through a parameter when I first started writing this solution, and was intimidated by how much code I thought it would require. Yet, as I said, PowerShell can resolve this situation with just a couple more lines of code, and without bloating my function. First I create a table with an XML column; for my tests, I’ll put it in tempdb:

USE [tempdb]
GO

/****** Object:  Table [dbo].[PowershellProfileTable_XML]    Script Date: 11/18/2009 02:27:39 ******/
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[PowershellProfileTable_XML]') AND type IN (N'U'))
DROP TABLE [dbo].[PowershellProfileTable_XML]
GO

/****** Object:  Table [dbo].[PowershellProfileTable_XML]    Script Date: 11/18/2009 02:27:42 ******/
CREATE TABLE [dbo].[PowershellProfileTable_XML](
    [XML] [xml] NULL
)
GO

Now let’s put PowerShell to work: we just save our merged output to XML (still choosing which properties we want to focus on), and then import the resulting file into SQL Server:

(Get-SQLProfiler "C:\TEMP\*.TRC" | Select-Object LineNumber, EventClass, TextData, NTUserName, LoginName | ConvertTo-Xml -NoTypeInformation).Save("c:\temp\test.xml")
$xml = "insert into PowershellProfileTable_XML values ('" + ((Get-Content "C:\Temp\test.xml") -replace "'", "''") + "')"
Invoke-Sqlcmd -ServerInstance $env:COMPUTERNAME -Database "tempdb" -Query $xml
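
One detail worth calling out in the snippet above is the -replace "'", "''" step: any single quote inside the XML must be doubled to survive inside a T-SQL string literal. The same rule, sketched in Python for illustration:

```python
def sql_literal(text):
    """Wrap text as a T-SQL string literal, doubling any embedded single quotes."""
    return "'" + text.replace("'", "''") + "'"

# A quote in the data becomes two quotes inside the literal
stmt = "insert into PowershellProfileTable_XML values (" + sql_literal("O'Brien's trace") + ")"
print(stmt)
```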

Now, with a simple XQuery, we can list all of the data in the XML column:

SELECT t2.Col1.value('(Property[@Name="LineNumber"]/text())[1]', 'int') LineNumber,
       t2.Col1.value('(Property[@Name="EventClass"]/text())[1]', 'nvarchar(255)') EventClass,
       t2.Col1.value('(Property[@Name="TextData"]/text())[1]', 'nvarchar(max)') TextData,
       t2.Col1.value('(Property[@Name="NTUserName"]/text())[1]', 'nvarchar(255)') NTUserName,
       t2.Col1.value('(Property[@Name="LoginName"]/text())[1]', 'nvarchar(255)') LoginName
FROM dbo.PowershellProfileTable_XML
CROSS APPLY [XML].nodes('/Objects/Object') AS t2(Col1)
WHERE t2.Col1.value('(Property[@Name="LoginName"]/text())[1]', 'nvarchar(255)') LIKE '%laerte%'

Figure 9. Querying the merged data of multiple trace files.

The beauty of this solution is that I don’t need any extra parameters to handle this, as the consistency of PowerShell does it all for me; I just need to pipe and save. Now we can read our trace files, save them to XML, filter them by any property, and then merge and import them into a SQL table… and do so effortlessly.

However, being who I am, I’ve kept tinkering with the module, and I’ll briefly walk through some of the final adjustments I’ve made for the (at the time of writing) latest version, which you can download at the top of this article.

The first change I made was to remove the parameters that tell Get-SQLProfiler to insert results into a SQL Server table, as I wanted to see whether it was more useful to insert the data into one common table only after applying filters (or manipulating the data any way you want). Remember, if you want a separate table for each .trc file, you can just use the fn_trace_gettable T-SQL function.

Now that I was filtering and manipulating my data before moving it into a table, I created another function, Save-InfoToSQLTable, to take care of the insert. With this latest version of the module, you can filter, sort, and do essentially whatever you want with your data, and then insert your curated dataset into a SQL Server table.

Let’s see some examples of how this works:

Read All Traces found in C:\Profiler\Compras\ and filter for dbo.fc_pco_round_000 in TextData:

dir "c:\Profiler\Compras\*.trc" | Get-SQLProfiler | Where-Object { $_.TextData -like '*dbo.fc_pco_round_000*' }

Create the SQL Server table and save the filtered data into it (Table Name created by the function):

dir "c:\Profiler\BlockedProcess_210510_1.trc" | Get-SQLProfiler | Where-Object { $_.TextData -like '*dbo.fc_pco_round_000*' -and $_.Reads -ge 100000 -and $_.Spid -eq 90 } | Save-InfoToSQLTable -ServerName Vader -DatabaseName Testes -NewTable

Create the SQL Server table and save the filtered data into it (Table Name specified via a parameter):

dir "c:\Profiler\BlockedProcess_210510_1.trc" | Get-SQLProfiler | Save-InfoToSQLTable -ServerName Vader -DatabaseName Testes -NewTable -TableName TestProfiler

Saving the data into a SQL Server table which has already been created:

Get-SQLProfiler -TraceFileName "c:\Profiler\compras\prf-compras_1.trc" | Save-InfoToSQLTable -ServerName Vader -DatabaseName Testes -TableName Teste

Saving the data into a SQL Server table which has already been created, using a user name and password:

Get-SQLProfiler -TraceFileName "c:\Profiler\compras\prf-compras_1.trc" | Save-InfoToSQLTable -ServerName Vader -DatabaseName Testes -TableName Teste -UserName UserName -Password YourPassword

Before we finish, there are some considerations which you should bear in mind:

  • This module only works with the x86 (32-bit) version of PowerShell.
  • If you’re working with large files, I suggest you use T-SQL and the fn_trace_gettable function; in my tests, T-SQL was faster at uploading the data.

I see the use case for this module as situations where you have small files (e.g. during a rollout) and want to filter on various conditions, as in the examples above. Once you’ve filtered the data, you can save it for easy access at a later date.

Thanks to my friend Shay Levy who, as always, takes time to share his immense knowledge. Jeffrey Hicks too, who offered a lot of tips on mandatory parameters in functions, as well as my SQL Server Day buddy Felipe Ferreira for his XQuery advice, and obviously Chad Miller, the master of PowerShell with SQL Server.

Laerte Junior

Author profile:

Laerte Junior is a Cloud and Datacenter Management MVP, focused on PowerShell and automation, and, through his technology blog and Simple-Talk articles, an active member of the SQL Server and PowerShell community around the world. He is a skilled Principal Database Architect, Developer, and Administrator, specializing in SQL Server and PowerShell programming and automation. He is also a PASS Virtual Chapter Mentor, Co-Lead of the PASS PowerShell Virtual Chapter, and a former PASS Regional Mentor for Brazil. In addition, he is a HUGE Star Wars fan, his main skill. "May The Force be with us!" You should follow him on Twitter as @LaerteSQLDBA.


Subject: This could really help me
Posted by: JoseMinistro (view profile)
Posted on: Monday, March 31, 2014 at 5:14 PM
Message: Hi,

First of all congratulations for your article and thanks for sharing your work.

I have a situation where I need to import Analysis Services trace files into a database, but the function fn_trace_gettable doesn't work for Analysis Services trace files.

Do you think your code could work with Analysis Services trace files?

I will really appreciate a comment on this.

Many thanks.

José Ministro

