Laerte Junior

Gathering Perfmon Data with Powershell

08 July 2010

When you have to routinely collect data from Performance Monitor counters, it soon becomes easier and more convenient to use PowerShell. SQL Server MVP Laerte Junior was inspired to create a script, and guides us through its useful functions.

I was reading an excellent article by Allen White (Twitter|Blog) on Performance Data Gathering (to which I give all the credit for inspiring this module), and that's when a PowerShell apple fell on my head: "Why not make a function that facilitates this?" As a DBA, one of my almost-daily tasks is to capture and analyze Perfmon counters. You may wonder to yourself, "But you can do that with the Get-Counter cmdlet in PowerShell 2.0."

Yes you can, and it is very helpful. However, I need multiple counters, with the results displayed on a single line and all values separated by commas (a format that makes it easy to insert into SQL Server), and that's when Get-Counter gets a little trickier to use. I tried to think of an easy way to choose which counters you want, save that configuration for later use, and then insert the output data into a SQL Server table for further analysis. As a result of my tinkering, I believe I've got something pretty useful to share with you.
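For comparison, here is a minimal sketch (not part of the module) of the kind of plumbing you end up writing by hand with Get-Counter to get one comma-separated line per sample; the counter paths, interval and output file are just illustrative:

# A hand-rolled Get-Counter loop (illustrative only; paths and file are examples).
$counters = '\SQLServer:Buffer Manager\Buffer cache hit ratio',
            '\SQLServer:Buffer Manager\Page life expectancy'

Get-Counter -Counter $counters -SampleInterval 10 -MaxSamples 6 |
    ForEach-Object {
        # Flatten each sample set into one comma-separated line per timestamp
        $values = $_.CounterSamples | ForEach-Object { $_.CookedValue }
        '{0},{1}' -f $_.Timestamp, ($values -join ',')
    } |
    Out-File c:\temp\RawGetCounter.txt -Append

It works, but you have to re-create this plumbing (and the table to load it into) for every set of counters and every server, which is exactly the busywork the module is meant to take care of.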

Before I go on, I want to give a shout out to the people who helped me solve some issues I encountered; as always, my friend MVP Shay Levy (Twitter|Blog), and also MVP JVierra, Trevor Sullivan (Twitter|Blog), and Lauri Kotilainen (Twitter|Blog). A special thank-you goes to MVP Pinal Dave (Twitter|Blog), who invited me to be a guest blogger on SQLAuthority.com, and MVP Grant Fritchey (Twitter|Blog) for the kind words on his blog.

This article will cover how to use this new module in a variety of situations, with some clear and every-day examples to hopefully make everything obvious. To start with, I’ll walk you through how to find out more information on the various Perfmon counters, so that you can choose which ones you want to use, as well as how to save that configuration so that you can reuse it later (and on different servers). Once we’ve covered that, we’ll take a look at how to get the data collected, and two ways to store it once we’ve got it.

Performance Counters

A polished version of this whole module will be available in the next release of SQLPSX - SQL Server Powershell Extensions. For now, you can download it from the top of this article, and I should point out that this module is a V1, so it might (and probably will) have some issues, and you can contact me anytime if you need a hand. Alternatively, you can use the built-in help to see all of the parameters and some examples:

Get-Help Get-PerfCounterCategory -examples

All tests were done on two Hyper-V 64-bit virtual machines: a Windows Server 2008 R2 domain controller with SQL Server 2008 R2 (Obiwan), and a Windows 7 machine with SQL Server 2008 (Chewie). Before you get started, if you want to work with multiple servers, you need to enable the RemoteRegistry service on the remote machines, as you can see in Figure 1:

Figure 1. Ensuring the RemoteRegistry Service is active on the remote machines.

For further reading on why this is necessary, I suggest you take a look at “Why run the RemoteRegistry Service?” by Brian Groth.
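If you prefer to check and start the service from the console rather than from the Services MMC, something along these lines should do it (a sketch, assuming you have administrative rights on the targets and the service is not disabled; Obiwan and Chewie are the machines used in this article):

# Check the RemoteRegistry service on each remote machine and start it if needed
# (assumes admin rights on the targets and that the service is not disabled).
'Obiwan', 'Chewie' | ForEach-Object {
    $svc = Get-Service -Name RemoteRegistry -ComputerName $_
    if ($svc.Status -ne 'Running') {
        Set-Service -Name RemoteRegistry -ComputerName $_ -Status Running
        Write-Host "Started RemoteRegistry on $_"
    }
}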

Top Tip:
If you want to know which information (properties) is returned by a given function, type:
<FunctionName> | Get-Member -MemberType NoteProperty

For example:
Get-PerfCounterCategory | Get-Member -MemberType NoteProperty

Finding the Information

The thing that most bothered me at the start of this project was that, for some counters in Perfmon, I knew their names but could not remember exactly which category they were in. I'll give you an example. Let's say you remember that Buffer Cache Hit Ratio is in one of the SQL Server categories, but which one? We know that it's painful to dig through the GUI for the answer, so now it's much easier; if you want to discover all the registered categories, you can use this command:

Get-PerfCounterCategory

Alternatively, if you want to be more fine-grained and only discover the categories starting with “SQLServer”, ordering by Category Name, then use this command:

Get-PerfCounterCategory -CategoryName "SQLServer*" |
    Sort-Object Category_Name |
    Format-List Machine_Name, Category_Name, Category_Type, Category_Description

Figure 2. Results for the Get-PerfCounterCategory cmdlet.

Each Performance Counter category has a number of instances, or it can have just a single instance. For example, with the Processor Counter, you can have one instance for each processor, and with Buffer Manager you have only a single instance. You can see the information for, as an example, all instances in the Processor category by typing:

Get-PerfCounterCategory -CategoryName "PROCESSOR*" |
    Get-PerfCounterInstance |
    Sort-Object Category_Name |
    Format-List Machine_Name, Category_Name, Instance_Name

What’s that I hear you ask? Can you get all counters in all instances and categories? Yes, of course you can… and with some help information about each counter, too:

Get-PerfCounterCategory |
    Get-PerfCounterInstance |
    Get-PerfCounterCounters |
    Format-List Machine_Name, Category_Name, Counter_Name, Counter_Type, Counter_Help

Figure 3. All counters in all instances and categories, with information.

How about if you want to see all the counters from the Buffer Manager category? Just use:

Get-PerfCounterCategory -CategoryName "*Buffer Manager*" |
    Get-PerfCounterInstance |
    Get-PerfCounterCounters |
    Format-List Machine_Name, Category_Name, Counter_Name, Counter_Type, Counter_Help

And if I want to work with multiple servers? Simple, just pipe the server names into the cmdlet:

"Obiwan", "Chewie" |
    Get-PerfCounterCategory -CategoryName "*Buffer Manager*" |
    Get-PerfCounterInstance |
    Get-PerfCounterCounters |
    Format-List Machine_Name, Category_Name, Counter_Name, Counter_Type, Counter_Help

… Or use a flat text file with the servers’ names inside it:

Get-Content servers.txt |
    Get-PerfCounterCategory -CategoryName "*Buffer Manager*" |
    Get-PerfCounterInstance |
    Get-PerfCounterCounters |
    Format-List Machine_Name, Category_Name, Counter_Name, Counter_Type, Counter_Help

As you can see, we can search for the missing information in various ways, which makes it much easier to select the appropriate counters.

Setting Up an XML Configuration File

When I started writing this module, my big goal was that I should somehow be able to be mobile with the counters that I chose. That is, I should be able to save my chosen configuration and then later use it just as easily on the original server as on any other. After thinking about it, I decided that the gathering of data should start with reading an XML configuration file containing all the counters I'm interested in, and that I should have one file for memory counters, one for processor counters, and so on, to segment the data gathering.

It seems like it might be a really complicated solution, especially when XML is involved, but thankfully it’s not too bad, as you’ll see in a moment. To start with, creating an XML file with all the counters from the Processor category and _Total instance is as simple as running:

Get-PerfCounterCategory -CategoryName "Processor*" |
    Get-PerfCounterInstance -InstanceName "_Total" |
    Get-PerfCounterCounters |
    Save-ConfigPerfCounter -PathConfigFile c:\temp\TemplateProcessor.XML -NewFile

If you then look in your C:\temp folder, an XML file called TemplateProcessor_MACHINENAME.XML will be there, ready to be used.
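If you want to confirm the file is there and take a quick peek at it without opening an editor, you can do so straight from the console (the OBIWAN suffix below is simply the machine name from my test environment):

# Confirm the config file was generated and glance at its first few lines;
# the machine-name suffix depends on the target machine.
Get-ChildItem c:\temp\TemplateProcessor_*.XML
Get-Content c:\temp\TemplateProcessor_OBIWAN.XML | Select-Object -First 20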

Top Tip
To facilitate the ability to identify and use multiple servers in your data-gathering process, the name of the target machine is added to the name of both the output file and the XML config file.


Figure 4. Generating an XML file containing the Perfmon configuration

As you can see, in this example I used the –NewFile switch parameter which, naturally, creates a new .XML file. Perhaps you're wondering whether there are situations where I would not use this parameter? Let's say we have a file specifying just the Buffer Cache Hit Ratio counter from the Buffer Manager category:

Get-PerfCounterCategory -CategoryName "*Buffer Manager*" |
    Get-PerfCounterInstance |
    Get-PerfCounterCounters -CounterName "*cache hit ratio" |
    Save-ConfigPerfCounter -PathConfigFile c:\temp\TemplateBufferManager.XML -NewFile

However, now we need to add the Page Life Expectancy counter to the already-existing file, so we just omit the -NewFile parameter, and the selected counters will be added to the XML file, rather than overwriting it:

Get-PerfCounterCategory -CategoryName "*Buffer Manager*" |
    Get-PerfCounterInstance |
    Get-PerfCounterCounters -CounterName "page life*" |
    Save-ConfigPerfCounter -PathConfigFile c:\temp\TemplateBufferManager.XML

To create the configuration file in such a way that multiple servers can use it, we just need to specify the desired machines…

"ObiWan", "Chewie" |
    Get-PerfCounterCategory -CategoryName "*Buffer Manager*" |
    Get-PerfCounterInstance |
    Get-PerfCounterCounters |
    Save-ConfigPerfCounter -PathConfigFile c:\temp\BufferManager.XML -NewFile

… and a separate file will be created for each server, using the name passed in the –PathConfigFile parameter and adding the server name:

Figure 5. Creating configuration files for multiple servers.

Gathering Data

With all that set up, the next step, starting to actually gather information, is a bit more complicated. Let’s take a look:

To start with, the command below allows us to gather data using the counters defined in C:\temp\TemplateBufferManager_Obiwan.XML, starting the collection job at 08:00:00 on 05/24/2010, ending it at 22:00:00 on 05/30/2010, with an interval of 10 seconds between each data collection, and outputting the values to C:\temp\TemplateBufferManager.txt:

Set-CollectPerfCounter -DateTimeStart "05/24/2010 08:00:00" `
    -DateTimeEnd "05/30/2010 22:00:00" -Interval 10 `
    -PathConfigFile c:\temp\TemplateBufferManager_Obiwan.XML `
    -PathOutputFile c:\temp\TemplateBufferManager.txt

When you run this command, you will notice that the PowerShell session is blocked; the function runs in a loop to gather the data, and the session stays locked for as long as that loop is running. Thankfully, we can resolve this by simply adding the –RunAsJob parameter, which tells PowerShell to perform this procedure asynchronously:

Set-CollectPerfCounter -DateTimeStart "05/24/2010 08:00:00" `
    -DateTimeEnd "05/30/2010 22:00:00" -Interval 10 `
    -PathConfigFile c:\temp\TemplateBufferManager_Obiwan.XML `
    -PathOutputFile c:\temp\TemplateBufferManager.txt -RunAsJob

As you may have guessed, this parameter creates a Job, and when you're working with Jobs there are some things you have to take into consideration. The Job created will be called "PERFCOUNTERS_" plus the name of the XML file and the current time (YYYYMMDDHHMMSS); in the case of my example, the name will be PERFCOUNTERS_TemplateBufferManager_OBIWAN_20100306193300.

Of course, if you’re working with Jobs, then you’ll want to see which jobs are running:

Get-job -state Running

To call the specific data-gathering Job, use ID or Name (which you have discovered using the command above):

get-job -name PERFCOUNTERS_TemplateBufferManager_OBIWAN_20100306193300 | 
format-list

To see if the job is running without errors, run the Receive-Job  cmdlet, and heed Marco Shaw’s (Twitter | Blog) excellent advice:

… when using receive-job, one may want to use the switch parameter -keep. Otherwise, any associated output is lost if receive-job is run again.

With the –Keep parameter, the output is retained so that it is still available the next time you run the Receive-Job cmdlet. So, with that in mind, our investigative command is now:

get-job -name PERFCOUNTERS_TemplateBufferManager_OBIWAN_20100306193300 | 
receive-job -keep

Alternatively, if we want to see all jobs used by the PerfCounters Module, we just need to return all jobs starting with “PERFCOUNTERS…”, so we can use where-object to find what we need:

get-job -State running | Where-Object {$_.name -like "PerfCounters*"} 

And finally, if I want to stop the job before the date set in the Set-CollectPerfCounter command, I just need to type:

stop-job -name PERFCOUNTERS_TemplateBufferManager_OBIWAN_20100306193300

… or, using the job ID (which you can find with Get-Job):

stop-job -id <job id>

Figure 6. Finding out what state the PerfCounter jobs are in.

Uploading Data to a SQL Server Table

We can do this in one of two ways: the first is to bulk insert the .txt file after the event, and the other is to save the data directly into a SQL Server table as it is being gathered. Let's take a closer look.

Bulk Inserting

After you've run your data-gathering job, you will see that the .txt file is ready to be inserted into SQL Server using a simple T-SQL bulk insert, and the Save-PerfCounterSQLTable function will help you do that. The command below will upload the output .txt file, and create a new table to receive the data (using the -NewTable switch parameter):

Save-PerfCounterSQLTable -ServerName Vader -DatabaseName Master -NewTable `
    -PathConfigFile c:\temp\TemplateBufferManager_ObiWan.xml `
    -PathOutputFile c:\temp\TemplateBufferManager.txt

Alternatively, if you want to upload the output .txt file into an existing table, simply omit the -NewTable switch parameter and pass the target table name in the –TableName parameter:

Save-PerfCounterSQLTable -ServerName Vader -DatabaseName Master `
    -TableName PerfCounterSQLTable_20100528100655 `
    -PathConfigFile c:\temp\TemplateBufferManager.xml `
    -PathOutputFile c:\temp\TemplateBufferManager.txt

If you prefer to have a bit more control over your tables, it's a simple matter to combine the two previously mentioned switches to upload the .txt file and create a new table to receive it, with a name chosen by you:

Save-PerfCounterSQLTable -ServerName Vader -DatabaseName Master `
    -TableName MyTableName -NewTable `
    -PathConfigFile c:\temp\TemplateBufferManager.xml `
    -PathOutputFile c:\temp\TemplateBufferManager.txt

As you may have noticed, the PathConfigFile and PathOutputFile parameters, which contain the full paths of the XML configuration file and output file respectively, are required. If you'd like to learn more about the Save-PerfCounterSQLTable command, use:

Get-Help Save-PerfCounterSQLTable –full
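Under the covers this is just a T-SQL bulk insert of the comma-separated output file. If you ever need to do it by hand (say, from a machine that doesn't have the module), a rough equivalent looks like the sketch below; the table name is a placeholder, the target table must already exist with columns matching the file, and the exact column layout the module writes is defined by the module itself, not by this snippet:

# A manual BULK INSERT sketch (assumptions: dbo.PerfCounterData already exists
# with columns matching the output file, and Invoke-Sqlcmd is available).
$sql = @"
BULK INSERT dbo.PerfCounterData
FROM 'c:\temp\TemplateBufferManager.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
"@
Invoke-Sqlcmd -ServerInstance Vader -Database Master -Query $sql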

Saving Directly into a SQL Server Table

To best demonstrate this, let's create a complete example. First, we choose the counters that we want to use; in this case, that's the whole SQL Server Buffer Manager category on the Chewie and ObiWan machines, as well as the Processor category on Chewie. We save this configuration into C:\temp\BufferManager.XML and c:\temp\Processor.XML:

"ObiWan", "Chewie" |
    Get-PerfCounterCategory -CategoryName "*Buffer Manager*" |
    Get-PerfCounterInstance |
    Get-PerfCounterCounters |
    Save-ConfigPerfCounter -PathConfigFile c:\temp\BufferManager.XML -NewFile

"Chewie" |
    Get-PerfCounterCategory -CategoryName "Processor*" |
    Get-PerfCounterInstance |
    Get-PerfCounterCounters |
    Save-ConfigPerfCounter -PathConfigFile c:\temp\Processor.XML -NewFile

Figure 7. The demo configuration files for Obiwan and Chewie.

Now, with the XML configured, we can start gathering data using background Jobs and saving the output directly into SQL Server tables. We do not pass a target SQL Server table name as a parameter, so one table will be created for each server, using the naming format PERFCOUNTERS_XMLFileName_YYYYmmDDhhMMss. Even though you're using a SQL table as the data repository, you must still pass the path to an output file into your command, because the output file is always created. Why, you ask? Let's say you lose your connection to the SQL Server repository; this way, you don't also lose the data, because it will still be stored in the .txt file. In this case, as we're using several different XML files, we'll only pass the target path, without a file name, and the output files will be created in that location using the XMLNAME_MACHINENAME.TXT naming convention.

dir "c:\temp\*.Xml" |
    Set-CollectPerfCounter -DateTimeStart "05/24/2010 08:00:00" `
        -DateTimeEnd "06/30/2010 22:00:00" -Interval 10 `
        -PathOutputFile c:\temp\ -ServerName ObiWan -DatabaseName Testes `
        -NewTable -RunAsJob

Figure 8. Creating the jobs to gather data, and sending the collected data directly to a SQL Server table.

As you can see from Figure 8, three jobs are created, and their names all start with PERFCOUNTERS, so you can use Where-Object to quickly and easily find all the jobs used by the PerfCounters module:

Get-Job -State Running | Where-Object { $_.name -like "Perfcounter*" } 

Now, if we look in ObiWan's SQL Server, we can see that there are three tables with the collected data: one for each XML configuration file we gathered data from:

Figure 9. Investigating the gathered data in SQL Server.

… And the .txt files containing the same gathered output are created as well:

Figure 10. The gathered data, stored in .txt files.
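If you'd rather check for the repository tables from the console than from SSMS, a query along these lines should find them (a sketch, assuming Invoke-Sqlcmd is available and that the database name matches the one used in the gathering command above):

# List the auto-created repository tables (database name Testes is the one
# used in the earlier Set-CollectPerfCounter example).
$query = @"
SELECT name, create_date
FROM   sys.tables
WHERE  name LIKE 'PERFCOUNTERS%'
ORDER BY create_date DESC;
"@
Invoke-Sqlcmd -ServerInstance ObiWan -Database Testes -Query $query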

When you're setting this up, you can use your own table name to make things easier to find; here I am passing the –TableName parameter with "BufferManager", and so for each machine a table called BufferManager_MACHINENAME will be created. In this case, that means BufferManager_Chewie and BufferManager_ObiWan:

dir "c:\temp\BufferManager*.Xml" |
    Set-CollectPerfCounter -DateTimeStart "05/24/2010 08:00:00" `
        -DateTimeEnd "06/30/2010 22:00:00" -Interval 10 `
        -PathOutputFile c:\temp\BufferManager.txt -ServerName ObiWan `
        -DatabaseName Testes -NewTable -RunAsJob -TableName "BufferManager"

Figure 11. Investigating the data stored in our custom-named tables.

Perhaps you're wondering why it's so useful to be able to specify the target table name? Let's say you stop the gathering and want to continue on a completely different day, but outputting the data to the same table. Just pass the -TableName parameter with the name of the original table, and don't pass the –NewTable switch parameter:

Set-CollectPerfCounter -PathConfigFile c:\Temp\BufferManager_Chewie.XML `
    -DateTimeStart "05/24/2010 08:00:00" -DateTimeEnd "06/30/2010 22:00:00" `
    -Interval 10 -PathOutputFile c:\temp\BufferManager.txt -ServerName ObiWan `
    -DatabaseName Testes -RunAsJob -TableName "BufferManager_Chewie"

Remember, in this case you have to explicitly declare the XML file; in the code snippet above, I restart the data gathering for Chewie, with C:\temp\BufferManager_Chewie.xml as the configuration file and the BufferManager_Chewie table as the SQL Server repository. The data is saved without creating a new table, and the output .txt file is, as always, created as a data backup.

In these examples I use Windows Authentication, but you can pass a username and password as parameters. If you get stuck at any point and want some more information, just type get-help <FunctionName> -examples.

Well folks, I hope you get as much good use out of this module as I do. Once again, PowerShell wins.

Top Tip
When I was writing this article, I accidentally created 857 tables in SQL Server, and I had to drop them. Is that complicated? Not at all – here's a hint:

dir | % { $_.Drop() }

...where % is an alias for ForEach-Object.
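To put that hint in context, here is a slightly fuller sketch of the same idea. It assumes you are in a session where the SQL Server provider is loaded (the sqlps host, for example), and the instance, database and name filter are placeholders for your own environment; do check what dir returns before you call Drop() on anything.

# Navigate to the Tables collection of the repository database (placeholders:
# Vader = machine, DEFAULT = instance, Testes = database).
Set-Location SQLSERVER:\SQL\Vader\DEFAULT\Databases\Testes\Tables

# dir returns SMO Table objects here, and each of them exposes a Drop() method.
dir | Where-Object { $_.Name -like 'PERFCOUNTERS%' } |
    ForEach-Object { $_.Drop() }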

Laerte Junior

Author profile:

Laerte Junior is a PowerShell MVP and, through his technology blog and Simple-Talk articles, an active member of the Microsoft community in Brasil. He is a skilled Principal Database Architect, Developer, and Administrator, specializing in SQL Server and PowerShell programming, with over 8 years of hands-on experience. He holds a degree in Computer Science, has been awarded a number of certifications (including MCDBA), and is an expert in SQL Server 2000, SQL Server 2005, and SQL Server 2008 technologies. He also organizes, and is a speaker at, Microsoft community events, attracting hundreds of attendees. Laerte has also recently become a Friend of Redgate in Brasil, has taught classes at universities, and produced webcasts for the community.

You should follow him on Twitter as @LaerteSQLDBA


Have Your Say

Subject: Congrats Laerte
Posted by: Math (view profile)
Posted on: Friday, July 09, 2010 at 8:21 AM
Message: I'd like to say a few things. I downloaded this module the first time Laerte posted it on his blog. We found some problems and he promptly fixed them, answering all my emails quickly. Today I use this module on my 5 SQL Server servers and I can say it works very well. When Laerte says to contact him if you have any doubts, he is telling the truth. In addition to his extensive knowledge, he has a striking simplicity and humility. Congratulations Laerte, including on your MVP; it was more than deserved.

Subject: Re
Posted by: laerte (view profile)
Posted on: Friday, July 09, 2010 at 8:32 AM
Message: Thanks, Math. Actually, none of this would mean anything if it were not for people and comments like yours. Thank you my friend; I think that after all the conversation we have had via email, I can consider you one. I am honored :)

Subject: Another Good Article.
Posted by: Phil Factor (view profile)
Posted on: Friday, July 09, 2010 at 9:34 AM
Message: Laerte, my friend,
Your enthusiasm is infectious. Keep these glorious articles coming.
One day you must tell me how it is possible to accidentally create 857 tables in a database. It is one thing I haven't yet managed to do!

Subject: Re
Posted by: laerte (view profile)
Posted on: Friday, July 09, 2010 at 9:50 AM
Message: Thanks a lot, Phil. It's always a great pleasure to get a comment from you. Hahah, about the tables, just use this script. Actually it was supposed to be 57 (for my tests); I only typed 857 by mistake.

"Compras" | % {
    $TableName = $_
    0..857 | % {
        $SQL = "Create Table tbl_$($TableName)_$_(codigo int, nome varchar(50))"
        Invoke-Sqlcmd -ServerInstance Vader -Database MundoNet -Query $SQL -SuppressProviderContextWarning
    }
}

Subject: 857 tables
Posted by: Phil Factor (view profile)
Posted on: Friday, July 09, 2010 at 10:42 AM
Message: Wow! The power of PowerShell!

Subject: That's the guy.
Posted by: PH (not signed in)
Posted on: Friday, July 09, 2010 at 11:24 AM
Message: That's the guy.

[]´s

My friend.

Subject: Great Article
Posted by: pinaldave (view profile)
Posted on: Friday, July 09, 2010 at 10:23 PM
Message: I have always been an admirer of Laerte's articles.

This article is proof of why I appreciate his knowledge. A well-known community contributor.

Kind Regards,
Pinal

Subject: Re
Posted by: laerte (view profile)
Posted on: Saturday, July 10, 2010 at 8:01 AM
Message: Pinal, such kind words coming from you, my friend, a SQL Server legend, make me even more proud and honored. Thanks a lot for all the support you have given, and continue to give, me. :)

Subject: big thanks
Posted by: rreid740 (view profile)
Posted on: Monday, July 12, 2010 at 6:49 AM
Message: Great article!

This has been on my mind for a while, never enough quiet time to think it out - thanks for sharing it!

roger reid
(who loves a good gui as much as the next guy, but prefers to leave eyes and fingers out of cron-type jobs...)

Subject: Very good
Posted by: Felipe Santana (not signed in)
Posted on: Monday, July 12, 2010 at 6:58 AM
Message: Laerte is always showing the power of PowerShell integrated with SQL Server.
Great article, congratulations once again.

Subject: Powershell Is Great
Posted by: Barbosa (view profile)
Posted on: Monday, July 12, 2010 at 9:23 PM
Message: I Love PowerShell.

Thanxxx for this article.


Subject: RE: Gathering Perfmon Data with Powershell
Posted by: DBA_DUDE (view profile)
Posted on: Thursday, July 29, 2010 at 3:01 PM
Message: Looks great!, is it still available? Getting a file not found from the "the top of this article" link.


Subject: RE
Posted by: laerte (view profile)
Posted on: Friday, July 30, 2010 at 2:43 PM
Message: DBA_DUDE, my friend, the link is OK (it's in PerfCounters at the top of the article, on the right). But if you cannot get it, ping me at laertejuniordba@hotmail.com. It will also be available in the next SQLPSX release (which should be ready in the coming weeks).

Subject: Gathering perfmon data with powershell
Posted by: abdul samad (view profile)
Posted on: Saturday, November 06, 2010 at 3:21 AM
Message: Hi Laerte,

This article is just amazing and I want to congratulate you for writing such awesome script and great explanation.

It really really helped me a lot, Thank you and great work Laerte.

I am facing one problem while executing that script: When I am executing that script as job and I am executing it for 5 minutes with an Interval of 10 seconds,start time= 4 NOV 2010 at 6 PM(IST) and end time: 4 NOV 2010 at 6:05 PM (IST), below is the time schedule from the script in 24 hour clock.

-DateTimeStart
"11/04/2010 18:00:00" -DateTimeEnd "11/04/2010 18:05:00" -Interval 10

The job is started successfully and is running, but it is NOT ENDING (stopping) at the specified -DateTimeEnd parameter; as per my settings it should stop after 5 minutes, but it is not stopping.

I have to manually stop the job by executing the below script suggested by you:

stop-job -id <job id>

apart from that there are no issues in the script as of now.

I will be very much thankful to you if you can help me in this regard.


Abdul Samad




Subject: Re
Posted by: laerte (view profile)
Posted on: Friday, November 19, 2010 at 4:34 AM
Message: Hi Abdul, thanks for the kind words my friend. I am very happy that you enjoyed it. I'll do some testing on the samples that you sent, but please feel free to add me on MSN so we can talk more easily:
laertejuniordba@hotmail.com

Subject: Thank you
Posted by: MortenD (view profile)
Posted on: Thursday, July 14, 2011 at 3:15 AM
Message: This is a really cool and powerful script.
Thank you so much !!

Unfortunately I'm not much of a developer myself, and I ran into an issue gathering process counters (CPU and Private Bytes) on all processes to create a top-x of the most-consuming processes.
The problem is that a process can exist at the time of creating the XML, but once I start gathering, that process might no longer be running.

Then

[$Increment].nextvalue() | Out-Null

fails. Example:

Set-CollectPerfCounter Error Detail: Exception calling "NextValue" with "0" argument(s): "Instance 'splwow64' does not exist in the specified Category."[0]

I bet some of you hardcore PS guys might have a workaround on that. I would really appreciate your help.
Br.
/Morten

Subject: MortenD
Posted by: laerte (view profile)
Posted on: Friday, July 15, 2011 at 4:04 AM
Message: Hi MortenD, thank you for reporting this issue. As we discussed by email, I will create a new release in SQLPSX with the correction we made.

Cheers :)

Subject: Problem solved
Posted by: MortenD (view profile)
Posted on: Tuesday, August 02, 2011 at 5:26 AM
Message: Thanks to Laerte
I had my problem solved.

Now the module can handle monitoring of, for instance, system processes, where a process running at the time the trace profile is created may no longer be running at runtime (once gathering starts).

Thank you for your time and help.
I'm looking forward to future improvements of this GREAT tool
/Morten

Subject: A few issues
Posted by: Zach.Skinner (view profile)
Posted on: Thursday, September 22, 2011 at 10:59 AM
Message: I maintain several instances which contain hyphens in their host names. Not all SQL statements that are built within the module encapsulate the database name with brackets.

Additionally, there is a bug in the Get-ProcessPerfcounter function. When writing to a SQL table, the function does not update the current time to stop the loop. Specifically, the $now variable should be updated before calling continue on line 96.


Subject: a few issues
Posted by: laerte (view profile)
Posted on: Sunday, September 25, 2011 at 8:12 PM
Message: Hi Zach, can you email me? laertejuniordba@hotmail.com

Thanks :)

 
