Laerte Junior

The PoSh DBA: Solutions using PowerShell and SQL Server

18 September 2012

PowerShell is worth using when it is the quickest way of providing a solution. For the DBA, it is much more than getting information from SQL Server instances via PowerShell; it can also be run from SQL Server as part of a system that helps with administrative and monitoring tasks. Laerte explains how.

Why should a DBA learn PowerShell? It is all about solutions. In this article, I want to explain how you can integrate PowerShell, TSQL, SQL Jobs and SQL WMI alerts into a complete solution. I will go further into this topic in my new writing project with three great friends. Stay tuned, as we will soon have a complete guide to day-to-day solutions for the DBA, using PowerShell and SQL Server.

When you read about using PowerShell and SQL Server, you are usually learning about the way that you use PowerShell to access SQL Server. Sometimes, instead, you’ll want to use PowerShell directly from SQL Server to create solutions. You may want to do it from TSQL, getting data back in a form that can then be inserted into a table, or execute it on the server from SSMS. You might want to run PowerShell scripts from the SQL Server Agent, or to set up sophisticated alerts using WMI that then execute jobs that are written in PowerShell. I’ll be showing you how to do all this; but let’s take things in easy stages.

Running PowerShell scripts from SSMS Query Editor

Did you know that you can run your PowerShell cmdlets and functions, along with their parameters, very simply from the Management Studio (SSMS) Query Editor, executing them on the server? Yeah, by using xp_cmdshell. Before I start showing you how to use xp_cmdshell to run PowerShell cmdlets from within TSQL, I must make you aware that, by enabling xp_cmdshell on a server, you’re creating potential security issues. There are good reasons why xp_CmdShell is disabled by default.
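If you do decide to accept that risk, enabling it takes just a couple of sp_configure calls. This is the standard sketch; remember to turn it off again if your security policy requires it:

```sql
-- Enable xp_cmdshell; it is an advanced option, so those must be shown first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
```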

When using xp_CmdShell to run PowerShell in SSMS, you’ll just need to remember three things:

  • Use 'Results to text' in SSMS, from the menu item Query -> 'Results to...' to see an easily-readable output from PowerShell.
  • You are not running PowerShell on your own machine, but on the server that is hosting the SQL Server instance you’re connected to. The user ID that you are using to open the connection will need execute permissions, and you’ll need to make sure that the PowerShell profile on the server has all the functions installed that you need to run. What account is this? If you’re using a SysAdmin account, then the PowerShell command will be executed under the SQL Server service account, which is the security context in which the SQL Server service is running. Otherwise, you’ll need to set up a proxy account on that server using sp_xp_CmdShell_proxy_account.
  • Make sure that PowerShell is installed to the correct version on the host server.
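On the second point, the proxy account can be set up with a single call; the Windows account name and password here are, of course, hypothetical:

```sql
-- Non-sysadmin callers of xp_cmdshell will run under this Windows account
EXEC sp_xp_cmdshell_proxy_account 'MYDOMAIN\LowPrivUser', 'StrongPasswordHere';
```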

Once you’ve enabled xp_CmdShell, and you have the necessary permissions to use it, PowerShell can give you valuable information easily. I’ll show you a couple of examples: getting disk space, and seeing what services are running.

Seeing what services are running

We’ll start simply by listing all the services on the server.

xp_cmdshell 'PowerShell.exe -noprofile Get-Service'

You have the full list of services, whatever their status. What if you wanted only those that had stopped? You’ll need to combine two cmdlets in a pipeline to do this, so it is now time to use a command-line parameter to run the command: the -Command parameter.

xp_cmdshell 'PowerShell.exe -noprofile -command "Get-Service | where {$_.status -eq ''Stopped''}"'

And with only a small change we can see all the SQL services that have stopped.

xp_cmdshell 'PowerShell.exe -noprofile -command "Get-Service -name *sql* | where {$_.status -eq ''Stopped''}"'

But you can also query a server remotely, once it has been configured:

xp_cmdshell 'PowerShell.exe -noprofile -command "Get-Service -computername ObiWan -name *sql* | where {$_.status -eq ''Stopped''}"'

Getting the free disk space

Here, we are using a Get-DiskSpace function (by Aaron Nelson, @sqlvariant) for the host of the SQL Server instance. This requires a function that you can download at the top of the article, and which will need to be placed on the server.

xp_cmdshell 'PowerShell.exe -command "get-diskspace ."'

To get the disk space for a different, remote server, use…

xp_cmdshell 'PowerShell.exe -command "get-diskspace -servername ObiWan"'

To get the disk space for all the servers listed in a file called Servers.txt:

xp_cmdshell 'PowerShell.exe -command "get-diskspace -servername (get-content c:\temp\servers.txt)"'
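In case you are curious about the general shape of such a function, here is a minimal sketch built on WMI’s Win32_LogicalDisk class. It is not the downloadable version, which accepts pipeline input and is rather more complete:

```powershell
# A minimal sketch of a disk-space function; fixed (DriveType=3) disks only
function Get-DiskSpace {
    param([string[]]$ServerName = '.')
    Get-WmiObject -Class Win32_LogicalDisk -ComputerName $ServerName -Filter 'DriveType=3' |
        Select-Object SystemName, DeviceID,
            @{Name='SizeGB';      Expression={'{0:n2}' -f ($_.Size / 1GB)}},
            @{Name='FreeGB';      Expression={'{0:n2}' -f ($_.FreeSpace / 1GB)}},
            @{Name='PercentFree'; Expression={'{0:n2}' -f ($_.FreeSpace / $_.Size * 100)}}
}
```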

Alternatively, to get just the percentage of free disk space, you can use Get-Counter with the counter \LogicalDisk(*)\% Free Space to get all counter instances. You can do this locally, for the host of your instance…

xp_cmdshell 'PowerShell.exe -noprofile -command "Get-counter -counter ''\LogicalDisk(*)\% Free Space'' | select -expand countersamples"'

…or for a remote server

xp_cmdshell 'PowerShell.exe -noprofile -command "Get-counter -computername ObiWan -counter ''\LogicalDisk(*)\% Free Space'' | select -expand countersamples"'

So can you do more than this and run scripts the same way? Well, no, because there is a limitation. You can’t use the double-quote (") character, which is essential for PowerShell, because it is used in the command-line parameter to delimit the script-fragment being executed. To run anything more complex, you’ll need to save the script as a file and execute that.
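Calling a saved script from xp_cmdshell then looks like this; the path and script name are hypothetical:

```sql
-- The -file parameter avoids the double-quote limitation of -command
EXEC xp_cmdshell 'PowerShell.exe -noprofile -file "C:\Scripts\StoppedSqlServices.ps1"'
```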

Returning data from PowerShell

If you run a PowerShell cmdlet in xp_cmdshell, how do you get the data back into SQL as tabular data? We’ve shown you the output, but it is not immediately obvious how to read it. Xp_cmdshell actually returns a table consisting of a single column called ‘output’. You can insert it into a table using INSERT..EXEC, but INSERT..EXEC has certain restrictions: you cannot nest them, and it cannot contain an OUTPUT clause.
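For the simple case, a plain INSERT..EXEC captures xp_cmdshell’s single column directly, as this quick sketch shows (the table variable is my own):

```sql
-- Capture xp_cmdshell's single 'output' column into a table variable
DECLARE @CmdOutput TABLE (theOrder INT IDENTITY(1,1), [output] NVARCHAR(MAX));
INSERT INTO @CmdOutput ([output])
  EXEC xp_cmdshell 'PowerShell.exe -noprofile Get-Service';
SELECT [output] FROM @CmdOutput WHERE [output] IS NOT NULL ORDER BY theOrder;
```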

However we can use this method to return an XML representation of the PowerShell objects being returned. All we then have to do is to shred it into a relational form and create a table.

Taking a more refined version of the previous PowerShell command:

DECLARE @Percentage XML
DECLARE @XML_as_String NVARCHAR(MAX)
--create a table variable for the data to go into, preserving the order of insertion
DECLARE @Output TABLE (theOrder INT IDENTITY(1,1), theXML NVARCHAR(MAX))
--insert the XML into the table, line by line
INSERT INTO @Output (theXML)
  EXECUTE xp_cmdshell 'PowerShell.exe -noprofile -command "Get-counter -counter ''\LogicalDisk(*)\% Free Space'' | select -expand countersamples | select @{Name=''Disk''; e={$_.InstanceName}}, @{Name=''%FreeSpace''; e={''{0:n2}'' -f $_.CookedValue}}|ConvertTo-XML -As string"'
--now assemble the XML as a string in the correct order
SELECT @XML_as_String = COALESCE(@XML_as_String,'') + theXML
  FROM @Output
  WHERE theXML IS NOT NULL
  ORDER BY theOrder
--now save it into an XML variable
SELECT @Percentage = @XML_as_String
-- the entity-attribute-value result that comes from the XML has to be pivoted
SELECT MAX(CASE WHEN attribute = 'Disk' THEN value ELSE '' END) AS [Disk],
       CAST(MAX(CASE WHEN attribute = '%FreeSpace' THEN value ELSE '' END) AS NUMERIC(9,2)) AS [% Free Space]
FROM  ( --shred the XML into an EAV table along with the number of the object in the collection
        SELECT [property].value('(./text())[1]', 'Varchar(20)') AS [value],
               [property].value('@Name', 'Varchar(20)') AS [attribute],
               DENSE_RANK() OVER (ORDER BY [object]) AS unique_object
        FROM @Percentage.nodes('Objects/Object') AS b ([object])
          CROSS APPLY b.[object].nodes('./Property') AS c ([property])
      ) AS EAV
GROUP BY unique_object

This will give the result

Disk                 % Free Space
-------------------- ------------
c:                   91.38
d:                   79.61
_total               83.53

(3 row(s) affected)

Which any DBA will recognise as data! What have we done here? We chose to create an XML representation of the report, which was then returned to SQL Server line by line. We then had to re-assemble it into an XML document and shred it in the way that we needed. This is laborious for an ad-hoc request, but it makes a lot of sense for a scheduled monitoring job.

Running Scripts that have PoSh Jobs in a SQL Agent Job

PowerShell can be run from the scheduler to do regular jobs such as ETL. Although this generally takes little more effort than testing it in the PowerShell ISE, just sometimes PowerShell gives you a culture-shock. Sometimes things happen that you don’t expect, even though they make sense when you think about them later. For example, I recently developed a script that created a lot of PowerShell jobs. For some reason, when I ran it in the PowerShell command-line console, it all worked fine. When I then ran it on the SQL Server scheduler, using a CmdExec job type that called PowerShell, nothing happened, and there was no error message in the job history.

The script invoked a process that retrieved all the Windows updates applied to a list of servers in the past 24 hours, and saved the results into a SQL Server table on a repository server. It was using runspaces, though what I’ll describe will be useful for anyone who is using background jobs. I was using PowerShell to create jobs that ran in parallel, one for each server I was getting information from. I was getting a list of servers from a file called ‘c:\temp\Servers.txt’ and, for each server name, I was starting a background job on the local computer. This job obtained the Windows update information for the server, which was then filtered by a Where-Object cmdlet to only those from the past day. The results were reported back to a tbl_WindowsPatches table in a SQL Server repository. The code is this:

Get-Content c:\temp\Servers.txt | % {
    Start-Job -Name $_ -InitializationScript {Ipmo Functions -Force -DisableNameChecking} `
              -ScriptBlock {
                  Get-WindowsUpdates -ComputerName $args[0] |
                      Where {$_.UpdateDate -ge ((Get-Date).AddDays(-1))} |
                      Select CurrentDate,
                             ServerName,
                             UpdateClientApplicationID,
                             UpdateDescription |
                      Out-DataTable |
                      Write-DataTable -ServerInstance R2D2 `
                                      -Database SQLServerRepository -TableName tbl_WindowsPatches
              } -ArgumentList $_
}

You can download the Get-WindowsUpdates, Out-DataTable and Write-DataTable functions at the top of this article, in the Functions.psm1 file. In this case, I have a module called Functions that contains all these functions. As I am using PowerShell jobs, and each job runs in another runspace, these functions are not visible there, so I need to load them explicitly, from the Functions module, with the initialization script:

-InitializationScript {Ipmo Functions -Force -DisableNameChecking}

Why should it have worked in the PowerShell console, but not when run from the SQL Server Agent? My first test was to create a .bat file and run:

PowerShell.exe "C:\Temp\Automation\GetWindowsUpdates.ps1"  

OK. So what was happening? Nothing was stored. The script seemed to run in an open command console, but not when the console was closed immediately after the script was run. Were the background jobs being closed prematurely when the parent session was closed?

To check that this was the problem, I added the -noexit parameter to the line in the .bat file that executed PowerShell, so as to prevent the closure of the session…

PowerShell.exe -noexit "C:\Temp\Automation\GetWindowsUpdates.ps1"

…and it worked. Why? My script was creating the jobs, each of which was running in its own independent runspace, and then it was closing the PowerShell session. The PowerShell jobs hadn’t completed when the main session was closed, so nothing was returned to the table, but no error was raised. Why was the session closed before the jobs had completed? It is because PowerShell jobs run in another runspace, but within the same session that called them, and that session must not be closed until all the PowerShell jobs finish.

What did I need to do? All I had to do was wait until all the PowerShell jobs were finished; it is as simple as that. I’d forgotten to add a ‘Wait-Job’!

Get-Job | Wait-Job | Out-Null  
Remove-Job -State Completed

The final code is:

Get-Content c:\temp\Servers.txt | % {
    Start-Job -Name $_ -InitializationScript {Ipmo Functions -Force -DisableNameChecking} `
              -ScriptBlock {
                  Get-WindowsUpdates -ComputerName $args[0] |
                      Where {$_.UpdateDate -ge ((Get-Date).AddDays(-1))} |
                      Select CurrentDate,
                             ServerName,
                             UpdateClientApplicationID,
                             UpdateDescription |
                      Out-DataTable |
                      Write-DataTable -ServerInstance R2D2 `
                                      -Database SQLServerRepository -TableName tbl_WindowsPatches
              } -ArgumentList $_
}
Get-Job | Wait-Job | Out-Null
Remove-Job -State Completed

I saved this code into C:\Temp\Automation\GetWindowsUpdates.ps1 on the server.

The command to run as a CmdExec SQL Agent job step is:

PowerShell.exe "C:\Temp\Automation\GetWindowsUpdates.ps1"

Outputting SQL Agent Job PowerShell in Job History

Although I’ve shown you how to get data from a PowerShell job that is running in a batch, there are times when all you need is a record of what a script did, so that you can check afterwards.

If, for example, you have a SQL Agent PowerShell job that deletes old files in a log shipping process, and you want to output a list of the files that were removed into the job history, you can just use Write-Output. This script shows you what happens:

$FilesRemoved = 'Files Deleted : '
gci "c:\test\*.*" | foreach {
    $FilesRemoved += "Name: {0}, " -f $_.Name
    Remove-Item $_.FullName
}
Write-Output $FilesRemoved

Then, if you look at your job history, you’ll see the list.

The trick to getting this to work in a SQL Agent PowerShell job step is that, instead of using the subexpression syntax:

    $FilesRemoved += "Name: $($_.Name), "

you need to use the format specifier:

    $FilesRemoved += "Name: {0}, " -f $_.Name

(SQL Server Agent tries to interpret the $( ) syntax as a job-step token, so it never reaches PowerShell intact.)

Triggering PowerShell jobs with SQL WMI Alerts

This solution uses PowerShell scripts, of course, along with SQL Server jobs and a SQL WMI alert.

Imagine that you have a download folder, called FTPDownload, on a SQL Server host, into which several files are downloaded automatically by FTP. A file with a specific name is downloaded once a day. The contents of this file must be loaded into a staging table on another SQL Server.

Firstly, let’s create the WMI Query Language (WQL) query to monitor the specific file in a specific folder: for us, the folder is c:\FTPDownload and the file is FileImport.CSV.

@wmi_query=N'Select * from __InstanceCreationEvent WITHIN 300 WHERE TargetInstance ISA ''CIM_DataFile'' AND TargetInstance.Drive=''C:'' AND TargetInstance.Path=''\\ftpdownload\\'' AND TargetInstance.Name = ''c:\\ftpdownload\\FileImport.csv'''

For a complete explanation of WMI and WQL, I suggest you read the excellent ebook from my good friend and PowerShell Jedi Ravikanth Chaganti, WMI Query Language via PowerShell.
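Incidentally, you can try out a WQL query of this kind from an interactive PowerShell console before committing it to an alert; Register-WmiEvent takes the same query (I’ve shortened the polling interval to ten seconds just for the test):

```powershell
# Watch for the file from an interactive session; Unregister-Event cleans up
Register-WmiEvent -SourceIdentifier FileWatcher `
    -Action { Write-Host 'FileImport.csv has arrived' } `
    -Query "Select * from __InstanceCreationEvent WITHIN 10 WHERE TargetInstance ISA 'CIM_DataFile' AND TargetInstance.Drive='C:' AND TargetInstance.Path='\\ftpdownload\\' AND TargetInstance.Name = 'c:\\ftpdownload\\FileImport.csv'"

# ...when you have finished testing:
Unregister-Event -SourceIdentifier FileWatcher
```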

Then let’s create the job, called IMPORTCSV, with a PowerShell script called ImportCSV.ps1 in c:\FTPScripts (you’ll need to get these cmdlets from SQLPSX):

try {
    $DataImport = Import-Csv -Path "c:\FTPDownLoad\FileImport.csv" -ErrorAction Stop
    $DataTable = Out-DataTable -InputObject $DataImport
    Write-DataTable -ServerInstance YOURSERVER -Database YOURDATABASE -TableName YOURTABLE -Data $DataTable
    $Msg = "FileImport.csv successfully imported"
    Rename-Item -Path "c:\FTPDownLoad\FileImport.csv" -NewName "c:\FTPDownLoad\FileImport_$(Get-Date -Format 'yyyyMMdd').csv"
    Write-Output $Msg
} catch {
    $ex = $_.Exception
    Write-Error $ex.Message
}

Now it is time to create the SQL WMI alert to monitor the arrival of this file in the FTP folder, based on our WQL query.

Now let’s set the response to the alert to execute the IMPORTCSV job.

Here is the code that creates the alert, if you’d rather do it via TSQL and you already have the ID of the job that you wish to execute when the alert is fired:

USE [msdb]
GO
/****** Object:  Alert [CheckCSVFile]    Script Date: 8/9/2012 8:47:15 PM ******/
EXEC msdb.dbo.sp_delete_alert @name=N'CheckCSVFile'
GO
EXEC msdb.dbo.sp_add_alert @name=N'CheckCSVFile',
    @enabled=1,
    @wmi_namespace=N'\\.\root\cimv2',
    @wmi_query=N'Select * from __InstanceCreationEvent WITHIN 300 WHERE TargetInstance ISA ''CIM_DataFile'' AND TargetInstance.Drive=''C:'' AND TargetInstance.Path=''\\ftpdownload\\'' AND TargetInstance.Name = ''c:\\ftpdownload\\FileImport.csv''',
    @job_id=N'YOUR-JOB-ID'

And all is done. Every time a file called FileImport.csv is created in the folder c:\FTPDownload then, within 5 minutes (the reason for the ‘WITHIN 300’ clause in the WQL), the alert is fired and the job is run. Why 300? The WITHIN clause sets the event polling interval, in seconds, and this allows time for the file to arrive and be written to disk.

If there is a problem with the file and the routine raises an error, the job will still finish without errors, even though you are using Try-Catch. This is because the exit code is 0. The error will be recorded in the job history if you look for it, but the job itself should finish with an error being flagged to SQL Server Agent.

How do we solve this? Just add the line ‘throw "Failure"’ to the catch block; this changes the exit code to 1:

} catch {
    $ex = $_.Exception
    Write-Error $ex.Message
    throw "Failure"
}
and the job will finish with an error:

Now that you’ve got that running, you can send an email, either using PowerShell or from the SQL job, saying whether the job was successful. It is up to you!
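On the PowerShell side, the notification can be as simple as a Send-MailMessage call at the end of the script; the SMTP server and addresses here are hypothetical:

```powershell
# Send a success notification; swap in your own SMTP server and addresses
Send-MailMessage -SmtpServer 'smtp.mydomain.com' `
                 -From 'sqlagent@mydomain.com' -To 'dba-team@mydomain.com' `
                 -Subject 'IMPORTCSV job succeeded' `
                 -Body 'FileImport.csv was loaded into the staging table.'
```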

That is it for now, folks! I hope you liked the PoSh DBA series. Some cool stuff is coming.


As usual, I cannot forget the awesome Jedi who are always helping this young Padawan and, of course, everyone who needs help.

My good friends Ravikanth Chaganti and Shay Levy, my editor Andrew Clarke, the mysterious Sir Phil Factor (thanks for the XML part, Phil), Sir Bob Beauchemin, my brother Mark Broadbent, and all the people who kindly give their time and knowledge to share.


Author profile:

Laerte Junior is a Cloud and Datacenter Management MVP, focused on PowerShell and automation, and, through his technology blog and Simple-Talk articles, an active member of the SQL Server and PowerShell community around the world. He is a skilled Principal Database Architect, Developer, and Administrator, specializing in SQL Server and PowerShell programming and automation. He is also a PASS Virtual Chapter Mentor, Co-Lead of the PASS PowerShell Virtual Chapter, and former PASS Regional Mentor for Brazil. In addition, he is a HUGE Star Wars fan, his main skill. "May The Force be with us." You should follow him on Twitter as @LaerteSQLDBA.


Subject: Thanks for the mention
Posted by: Phil Factor (view profile)
Posted on: Friday, September 21, 2012 at 5:18 AM
Message: Ah! fame at last. Thanks for the mention, Laerte.

Great article, as usual, Laerte.

Subject: Re
Posted by: Laerte (not signed in)
Posted on: Friday, September 21, 2012 at 5:30 AM
Message: Hey Sir, for me you are a reference in SQL Server and also Powershell. In fact I need to deeply Thank you for your tech reviews and ideas. Nothing fairer than the minimum to Thank you for all the help.

Subject: Good Stuff
Posted by: kin (not signed in)
Posted on: Friday, September 21, 2012 at 6:09 AM
Message: Very informative and useful. Implementing some automation for my daily DBA job and this article will serve as a reference.

Subject: Re
Posted by: Laerte (not signed in)
Posted on: Friday, September 21, 2012 at 6:20 AM
Message: Hey Kin, glad you liked my friend :) Thanks for the nice words

Subject: Posh Man
Posted by: Edvaldo Castro (not signed in)
Posted on: Friday, September 21, 2012 at 8:10 AM
Message: Hey Laerte,

Congrats, Nice post with a great and very usefull content... Keept posting and sharing your knowledge...


Subject: Re
Posted by: Alex Rosa (not signed in)
Posted on: Friday, September 21, 2012 at 9:30 AM
Message: Great article, congrats.

Subject: Re
Posted by: Laerte (not signed in)
Posted on: Friday, September 21, 2012 at 9:40 AM
Message: Edvaldo and Alex, Thanks guys. I really appreciate your feedback and I am glad you guys liked it.

Subject: Disk space function code
Posted by: Ameena (not signed in)
Posted on: Monday, October 1, 2012 at 11:54 AM
Message: Hi,

This is very userful and practical article.

The link to Aaron Nelson's function in the reference section you provided does not work. Could you please add the alternate link.


Subject: Disk Space
Posted by: Laerte (not signed in)
Posted on: Monday, October 1, 2012 at 2:29 PM
Message: Hi Ammeena, I am glad you liked it. The link to Aaron´s function is
I made some improvements as accept pipeline input and an array in the ServerName parameter, and you can download it at the top of the article in the functions.rar

Subject: Nice article - just one question.
Posted by: djdevelop (view profile)
Posted on: Wednesday, October 3, 2012 at 7:45 AM
Message: At the end of the section titled "Outputting SQL Agent Job PowerShell in Job History" you describe the trick to get the output from SQL Agent PowerShell Job, but no matter how hard I squint, I can't see the difference between the two lines.


Subject: Re
Posted by: Laerte (not signed in)
Posted on: Wednesday, October 3, 2012 at 11:33 AM
Message: djdevelop you're right, I believe it was a some edition issues .I'll get in touch with my Editor to fix. Thanks Man

Subject: Power Shell in SQL Server
Posted by: Rudy (view profile)
Posted on: Monday, October 29, 2012 at 7:46 AM
Message: Very nice article.

I would question the use of xp_cmdshell as this is a security hole. Most companies I know of don't allow the use of xp_cmdshell.

Other than that, good job.

Subject: xp_cmdshell
Posted by: Paul Brewer (view profile)
Posted on: Monday, October 29, 2012 at 3:48 PM
Message: Hi,
Great article..

If you switch the query analyser windows to SQLCMD Mode, you can type the following:

!!PowerShell.exe -noprofile Get-Service

It's the equivalent of

xp_cmdshell 'PowerShell.exe -noprofile Get-Service'

and works without changing default server configuration settings to allow xp_cmdshell.



Subject: xp_cmdshell
Posted by: Jeff Moden (view profile)
Posted on: Sunday, June 9, 2013 at 1:25 PM
Message: @Paul Brewer,
That's nice... how would you run such a thing from a scheduled job?


I'm not sure why anyone would think that xp_CmdShell is a security issue. The only people that can use it are the same ones that can turn it on and those people must have "SA" or "Control Server" privs. That includes any potential attacker who will have software that turns it on in the blink of an eye. If someone can't get in as "SA", they can't use xp_CmdShell even if it's on.

xp_CmdShell isn't a security issue. Bad security that allows the wrong people in as "SA" is the issue.

I actually think that turning off xp_CmdShell is a "worst practice" because it lulls people into a false sense of security.


Excellent article and my hat's off to you. Keep it up!

Subject: Double Post
Posted by: Jeff Moden (view profile)
Posted on: Sunday, June 9, 2013 at 1:28 PM
Message: I'm not sure why this site double posted for me. I only hit the "Post Comment" button once.
