Laerte Junior

The PoSh DBA: Grown-Up PowerShell Functions

10 May 2012

Laerte goes step-by-step through the process of tidying up and making more reusable an untidy collection of PowerShell routines, showing how pipelines and advanced functions can make PowerShell more effective in helping  to automate many of the working DBA's chores.

There comes a time, after you’ve got used to using PowerShell to automate your DBA chores, that you look at all those PowerShell functions you’ve written and feel slightly dissatisfied. You realize that, although everything works, you could be using PowerShell in a way that is more easily enhanced, reused and maintained. With some fairly simple techniques, our PowerShell functions can become more reusable. They can be made to behave much more like the built-in cmdlets that are provided with PowerShell. You will, for example, want your functions to participate in pipelines, so that you can use filters, sort the order, group objects, write out the data and take advantage of other useful PowerShell facilities. You’d probably also benefit from other features of cmdlets such as standard parameters, debug, ‘WhatIf’, and command-line help. This article explains how these features can help with scripting.

Reusable Scripts

If you take a look at my old scripts on Simple-Talk, you will see some functions that I wrote on the assumption that I needed to send back an object populated with all the information. Something like:

Function Get-MSSQLServerInfo ($ServersList)
{
    [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null

    $FinalResult = @()
    $ServersList = Get-Content $ServersList

    foreach ($svr in $ServersList) {
        $Server = New-Object "Microsoft.SqlServer.Management.Smo.Server" "$svr"
        $Object = New-Object PSObject -Property @{
                      ServerName    = $svr
                      EngineEdition = $Server.EngineEdition
                      Product       = $Server.Product
                      ProductLevel  = $Server.ProductLevel }
        $FinalResult += $Object
    }

    Write-Output $FinalResult
}

This code runs perfectly. However, it isn’t a reusable function. I need to send the function a list of servers as a parameter but, if I need to get information about my servers from within another function, I cannot use this one to do it; I would need to write all of this again inside the other function.

What I need is a series of filters, the first of which outputs a stream of server objects. The next one would add the server information, the next one would filter out just the information I needed, and the next one, maybe, would display it. I need to think in terms of pipelines.

How does a Pipeline Work?

I suspect that you are already familiar with the phrases “I am piping…” or “pipe to”. Other shell languages use this ‘pipe’ concept, but PowerShell is different in that you are “piping” objects and not their text representations. The pipe is more like an assembly line, where a stream of objects is passed from the start to the end of the line, and may be changed, filtered out, or added to. The output of one filter becomes the input of the next. It is represented by the conventional ‘pipe’ character “|”.

In this line …

Get-ChildItem c:\posh -Filter "*.ps1"

… I am looking for files with the PowerShell (.ps1) file-type in the c:\posh folder, and my output is a directory listing like this:

     Directory: C:\posh

 

 

Mode                LastWriteTime     Length Name

----                -------------     ------ ----

-a---        26/04/2012     14:40       2422 eventhandler.ps1

-a---        26/04/2012     14:40        697 filesystemwatcher.ps1

-a---        15/07/2007     22:06        157 install.ps1

-a---        26/04/2012     14:40       1532 sqlbackup.ps1

Now, let’s send the output to a text file:

Get-ChildItem c:\posh -Filter "*.txt" | Out-File c:\temp\FilesTXT.txt

PowerShell uses the pipeline to send a stream of objects, thereby encapsulating all the information and methods within the object in an easily accessible way. In this case, the Get-ChildItem cmdlet outputs System.IO.FileInfo objects with all their information (properties, methods etc.) and the pipeline sends these packaged objects to Out-File. A downstream cmdlet can, for example, get the date and length of each file without having to do any parsing of a text stream: it can determine the properties of the FileInfo object and access them directly.
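To see why this matters, here is a small pipeline over the same c:\posh folder (the folder and the 1KB threshold are just illustrative) that filters and sorts using the FileInfo properties directly, with no text parsing at all:

```powershell
# Each stage works on the FileInfo objects themselves:
Get-ChildItem c:\posh -Filter "*.ps1" |
    Where-Object { $_.Length -gt 1KB } |       # keep only files bigger than 1 KB
    Sort-Object LastWriteTime -Descending |    # newest first
    Select-Object Name, Length, LastWriteTime  # project just three properties
```

Every stage receives live objects from the previous one, which is exactly the behavior we want our own functions to have.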

This would suggest that, if I want to write a function that can be used in a pipeline, I would just need to populate an object. However, real life isn’t that simple. I needed to understand how pipes work in order to write effective PowerShell scripts.

It was then that I was reminded of something I’d read on Microsoft TechNet:

 “The question, however, is whether you’d want to write PowerShell scripts without using pipelines. You can order a banana split and ask them to hold the ice cream; that’s fine, but at that point you don’t really have a banana split, do you? The same is true of the PowerShell pipeline: you can write scripts without a pipeline. But, at that point, do you really have a PowerShell script? “

I needed to clear my head by doing an experiment. Let’s use this example to illustrate what I mean:

function foo1() {
    Write-Host "Foo1 Showing $_"
    $_
}

function foo2() {
    Write-Host "Foo2 Showing $_"
    $_
}

function foo3() {
    Write-Host "Foo3 Showing $_"
}

... and run ...

1,2,3 |  foo1 | foo2 | foo3

We’ve created three functions and piped an array of integers through them. What are you expecting? I’d assumed that the process waited until the first function had processed the entire input and then sent it on to the next function. I thought I’d see something like this…

Foo1 Showing 1

Foo1 Showing 2

Foo1 Showing 3

Foo2 Showing 1

Foo2 Showing 2

Foo2 Showing 3

 

Foo3 Showing 1

Foo3 Showing 2

Foo3 Showing 3 

To my surprise, the output actually was...

Foo1 Showing

Foo2 Showing

Foo3 Showing

 

Doh! I’d forgotten that these objects would be passed in as an enumeration, and that I should have iterated over the $input variable with a foreach loop.

function foo1() {
    foreach ($element in $input) {
        Write-Host "Foo1 Showing $element"
        $element
    }
}

function foo2() {
    foreach ($element in $input) {
        Write-Host "Foo2 Showing $element"
        $element
    }
}

function foo3() {
    foreach ($element in $input) {
        Write-Host "Foo3 Showing $element"
    }
}

1,2,3 |  foo1 | foo2 | foo3

and the output is this…

Foo1 Showing 1

Foo1 Showing 2

Foo1 Showing 3

Foo2 Showing 1

Foo2 Showing 2

Foo2 Showing 3

Foo3 Showing 1

Foo3 Showing 2

Foo3 Showing 3

I could do a lot better than this, since I’m forcing PowerShell to batch up the input to every function until it is all processed, and only then pass it on as an enumeration. In fact, a function or advanced function can be written to accept an input element, process it, and pass the output down the pipeline before the next element in the sequence is processed. This gives you better performance and saves on memory. I should have written code to handle the input stream this way. To do so, we’ll need some other language constructs. These are:

  • Begin block: the code inside this block is executed once, before any input is processed.
  • Process block: the code inside this block is executed once for each value piped in.
  • End block: the code inside this block is executed once, after all the input has been processed.

The Syntax is something like this:

Function Foo {

 

      Begin {}

      Process {}

      End {}

 

}

So firstly, let’s see what happens when our example is rewritten using just the Process block:

function foo1() {
    process {
        Write-Host "Foo1 Showing $_"
        $_
    }
}

function foo2() {
    process {
        Write-Host "Foo2 Showing $_"
        $_
    }
}

function foo3() {
    process {
        Write-Host "Foo3 Showing $_"
    }
}

 

 

1,2,3 |  foo1 | foo2 | foo3

 

And the Output is :

Foo1 Showing 1

Foo2 Showing 1

Foo3 Showing 1

Foo1 Showing 2

Foo2 Showing 2

Foo3 Showing 2

Foo1 Showing 3

Foo2 Showing 3

Foo3 Showing 3

 

Each value in the sequence is being streamed down the pipeline, one at a time.

Now we’ll modify the functions using all the blocks that I’ve listed, just to show you the sequence of events:

 

function foo1() {

    begin {

        Write-Host "Foo1 begin"

    }

    process {

        Write-Host "Foo1 Process $_"

              $_

    }

    end {

        Write-Host "Foo1 end"

    }

}

 

function foo2() {

    begin {

        Write-Host "Foo2 begin"

    }

    process {

        Write-Host "Foo2 Process $_"

              $_

    }

    end {

        Write-Host "Foo2 end"

    }

}

 

function foo3() {

    begin {

        Write-Host "Foo3 begin"

    }

    process {

        Write-Host "Foo3 Process $_"

 

    }

    end {

        Write-Host "Foo3 end"

    }

}

 

 

1,2,3 |  foo1 | foo2 | foo3

 

And the output is :

Foo1 begin

Foo2 begin

Foo3 begin

Foo1 Process 1

Foo2 Process 1

Foo3 Process 1

Foo1 Process 2

Foo2 Process 2

Foo3 Process 2

Foo1 Process 3

Foo2 Process 3

Foo3 Process 3

Foo1 end

Foo2 end

Foo3 end

At this point I have discovered how I can write functions that use pipelines. I can put this to use immediately in Get-MSSQLServerInfo, my function that gets information about servers. But, as I’ve already said, my function is still not reusable. How can I change this?

A function should do one thing well.

We can see from our code that I have functionality that can be generalized for a lot of other operations that I will need to perform in SQL Server. The task of making a connection is one of those. Why not write a function to do it?

function Get-MSSQLServerConnection

{

    param([Parameter(Position=0, Mandatory=$true)] [string]$sqlserver)

 

   $sqlconn = new-object ("Microsoft.SqlServer.Management.Common.ServerConnection") $sqlserver

 

    $sqlconn.Connect()

 

    Write-Output $sqlconn

 

}

Then we can change the code in our Get-MSSQLServerInfo function to use our new Get-MSSQLServerConnection:

function Get-MSSQLServerInfo

{

    param(

    [Parameter(Position=0, Mandatory=$true)] [string]$sqlserver)

 

   

    $sqlconn = Get-MSSQLServerConnection $sqlserver

       $Server=New-Object "Microsoft.SqlServer.Management.Smo.Server" $sqlconn

    $Object = New-Object PSObject -Property @{
                  ServerName    = $Server.Name
                  EngineEdition = $Server.EngineEdition
                  Product       = $Server.Product
                  ProductLevel  = $Server.ProductLevel }

       Write-Output $Object

}

 

As you can see, I am setting the parameter $sqlserver as Mandatory (required) and with Position 0, which means it can also be bound from the first positional argument. (For more information about parameters, type help about_Parameters at the PowerShell prompt.)
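Just to illustrate what Position 0 buys us, these two calls to our function are equivalent (7-PC is simply my test machine name):

```powershell
# Positional: the first argument binds to $sqlserver (Position 0)
Get-MSSQLServerInfo "7-pc"

# Named: the same call with the parameter name spelled out
Get-MSSQLServerInfo -sqlserver "7-pc"
```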

Now I have a function that makes a connection to a server, and it can be used in all other functions that require a connection to SQL Server.

 I can, of course, get information about several servers :

$ServersList = Get-Content c:\posh\Servers.txt

foreach ($svr in  $ServersList ) {

      Get-MSSQLServerInfo $svr

}

But why use foreach if I can receive the instance or server name from the pipeline? You’ll probably be thinking that you just need to use ValueFromPipeline in the parameter options. (For more information, type help about_Pipelines.)

Let’s see if it works. Type the same server name three times and pipe it to the function (in my case the name is 7-PC):

"7-pc","7-pc","7-pc" | Get-MSSQLServerInfo

Were you expecting three rows with the same information? Wrong. The output is:

 

EngineEdition         ProductLevel ServerName Product
-------------         ------------ ---------- -------
EnterpriseOrDeveloper RTM          7-pc       Microsoft SQL Server

Just one line. Why? Remember the Process block? That is what is missing. So:

function Get-MSSQLServerInfo

{

    param(

    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline = $true)] [string]$sqlserver)

 

       process {

   

           $sqlconn = Get-MSSQLServerConnection $sqlserver

              $Server=New-Object "Microsoft.SqlServer.Management.Smo.Server" $sqlconn

              $Object = New-Object PSObject -Property @{
                            ServerName    = $Server.Name
                            EngineEdition = $Server.EngineEdition
                            Product       = $Server.Product
                            ProductLevel  = $Server.ProductLevel }

              Write-Output $Object                                                                       

       }

}

 

Finally, we have the right output:

EngineEdition         ProductLevel ServerName Product
-------------         ------------ ---------- -------
EnterpriseOrDeveloper RTM          7-pc       Microsoft SQL Server
EnterpriseOrDeveloper RTM          7-pc       Microsoft SQL Server
EnterpriseOrDeveloper RTM          7-pc       Microsoft SQL Server

You may be wondering why I put the Get-MSSQLServerConnection call in the Process block rather than the Begin block. It is because I need a new connection for each server. In fact, I would use the Begin block only for initialization tasks such as creating variables or loading assemblies.

At this point I’m still not entirely happy with the code. I was trying to do several processes in the one function: for each server, I was making a connection, getting the information, and then outputting it. In PowerShell, it is easier to write functions that do one discrete operation at a time.

Don Jones has several articles about this, and I’ve provided links at the end of this article.

Now my function is reusable and can be used in any application that needs to connect to a server via SMO.

But something is still missing…..

Hey Function, now behave as a cmdlet

You can do more. If you add the [CmdletBinding()] attribute to your function, it will take on some of the characteristics of cmdlets. We are now well on the road to making an advanced function: a function, written in PowerShell rather than in a Microsoft .NET Framework language, that can perform operations in the same way a built-in cmdlet does.

You can, for example:

  • use Write-Debug and Write-Verbose in your function, and control them using the –Debug and –Verbose parameters of your function. These parameters are added automatically by [CmdletBinding()].
  • make the common parameters available to your function. For more information, type help about_CommonParameters.
  • implement –WhatIf. This allows the user of your function to find out what would happen if he executed the function. In other words, it answers questions like “If I drop all these tables, what would happen?”. By typing –WhatIf, you can get an idea of what would happen before you do it for real. (I will show how to do this in my next article.)
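As a taster of –WhatIf ahead of the next article, here is a minimal sketch; the Remove-MSSQLTestTable function and its body are hypothetical, purely for illustration:

```powershell
function Remove-MSSQLTestTable
{
    [CmdletBinding(SupportsShouldProcess = $true)]
    param([Parameter(Position=0, Mandatory=$true)] [string]$TableName)

    # ShouldProcess returns $false when -WhatIf is passed, so the
    # destructive code is skipped and a "What if:" message is shown instead.
    if ($PSCmdlet.ShouldProcess($TableName, "Drop table")) {
        Write-Host "Dropping table $TableName"
    }
}

Remove-MSSQLTestTable "dbo.MyTable" -WhatIf
```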

Finally, you can see more details of the power of [CmdletBinding()] by typing help followed by:

  • about_Functions_Advanced
  • about_Functions_Advanced_CmdletBindingAttribute
  • about_Functions_Advanced_Methods   
  • about_Functions_Advanced_Parameters

Our (almost) final code is:

function Get-MSSQLServerConnection

{

    param([Parameter(Position=0, Mandatory=$true)] [string]$sqlserver)

      $sqlconn = new-object ("Microsoft.SqlServer.Management.Common.ServerConnection") $sqlserver

    $sqlconn.Connect()

    Write-Output $sqlconn

 

}

 

function Get-MSSQLServerInfo

{

       [Cmdletbinding()]

    param(

    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline = $true)] [string]$sqlserver)

 

       begin {

              [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | out-null

       }

 

       process {

   

           $sqlconn = Get-MSSQLServerConnection $sqlserver

           $Server=New-Object "Microsoft.SqlServer.Management.Smo.Server" $sqlconn

           $Object = New-Object PSObject -Property @{
                         ServerName    = $Server.Name
                         EngineEdition = $Server.EngineEdition
                         Product       = $Server.Product
                         ProductLevel  = $Server.ProductLevel }

           Write-Output $Object                                                                       

       }

}

 

 

Handling Errors

Basic error handling in PowerShell is simple: there is the try/catch/finally block.
For details, see about_Try_Catch_Finally and this link from Steven Murawski - Error Handling.

One obvious source of problems in our function could be the connection to SQL Server, possibly because I pass the name incorrectly or the service is down. This makes it a serious candidate for placing in a try/catch block.

function Get-MSSQLServerInfo

{

       [Cmdletbinding()]

    param(

    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline = $true)] [string]$sqlserver)

 

       begin {

              [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | out-null

       }

 

       process {

   

              Try {

               $sqlconn = Get-MSSQLServerConnection $sqlserver

               $Server=New-Object "Microsoft.SqlServer.Management.Smo.Server" $sqlconn

               $Object = New-Object PSObject -Property @{
                             ServerName    = $Server.Name
                             EngineEdition = $Server.EngineEdition
                             Product       = $Server.Product
                             ProductLevel  = $Server.ProductLevel }

               Write-Output $Object                                                                       

              } Catch {

             

                     Write-Error $Error[0]

             

              }

       }

}

 

 

 

The Comment-based Help

 

The first thing I do when I want to know more about a cmdlet is to get its help. So, if my function is to act like a cmdlet, it must supply help via comment-based help.

 Comment-based Help is written as a series of comments. You can type a comment symbol (#) before each line of comments, or you can use the "<#" and "#>" symbols to create a comment block. All the lines within the comment block are interpreted as comments.

about_Comment_Based_Help

So, with a help comment-block …

#######################

<#

.SYNOPSIS

Gets information about a SQL Server instance.

.DESCRIPTION

The Get-MSSQLServerInfo function returns a custom object with information about a SQL Server instance.

.EXAMPLE

get-content servers.txt | Get-MSSQLServerInfo

.EXAMPLE

"Server1","Server2","Server3" | Get-MSSQLServerInfo

.EXAMPLE

Get-MSSQLServerInfo Server1

.LINK

Get-MSSQLServerInfo

#>

function Get-MSSQLServerInfo …..

…our function can provide help just like a cmdlet. If you type Get-Help Get-MSSQLServerInfo, you will see the magic!

For a complete list of the sections you can use in your help, type:

help about_Comment_Based_Help

The output

I suspect you will have asked yourself: “Why is Laerte using a custom object (PSObject) to output the server properties?” The answer is that I was doing it only for the first example. It is better practice to use the live object in this case, so that you are passing all the information (properties, methods) from the $Server object into the pipe, allowing you to then filter out whatever information you actually need for your purpose.

The Process block would then be:

 

process {

   

           $sqlconn = Get-MSSQLServerConnection $sqlserver

           $Server=New-Object "Microsoft.SqlServer.Management.Smo.Server" $sqlconn

           Write-Output $Server                                                                                       

       }

 

And then I could use it simply in a neat pipeline:

"7-pc" | Get-MSSQLServerInfo | select *

Or

Get-MSSQLServerInfo "7-pc" | select *

However, custom objects are very useful when you need to combine more than one object in a single output. In this example, I need to join some information about the SQL Server with the configuration (WMI Managed Computer) of the SQL instance to provide a single object as output.

function Get-MSSQLServerInfo

{

       [Cmdletbinding()]

    param(

    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline = $true)] [string]$sqlserver)

 

       begin {

              [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | out-null

              [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement ") | Out-Null

       }

 

       process {

              try {

   

           $sqlconn = Get-MSSQLServerConnection $sqlserver

              $Server=New-Object "Microsoft.SqlServer.Management.Smo.Server" $sqlconn

              $ManagedComputer = New-Object "Microsoft.SqlServer.Management.Smo.Wmi.managedcomputer" $sqlconn.server

              $Object = New-Object PSObject -Property @{
                            ServerName         = $Server.Name
                            EngineEdition      = $Server.EngineEdition
                            Product            = $Server.Product
                            ProductLevel       = $Server.ProductLevel
                            ClientProtocols    = ($ManagedComputer.ClientProtocols | Select-Object DisplayName)
                            ConnectionSettings = $ManagedComputer.ConnectionSettings.MachineName }

              Write-Output $Object

              } catch {
                     Write-Error $Error[0]
              }
       }
}

You can see that, in order to combine the information from the two objects, $Server and $ManagedComputer, I am using a custom object ($Object).

Displaying Object Status Updates

When your function is performing operations, it helps to have a mechanism that allows you to opt to display information on the screen that reports on progress. You can use it to debug the command processing as well. In times past, this was called ‘printf-ing’. With advanced functions, we can use the Write-Verbose cmdlet instead of Write-Host.

This cmdlet writes text to the verbose message stream, and this can be switched in or out as we wish.

So we can rewrite our code using this feature:

function Get-MSSQLServerConnection

{

       param([Parameter(Position=0, Mandatory=$true)] [string]$sqlserver)

       try {

              $sqlconn = new-object ("Microsoft.SqlServer.Management.Common.ServerConnection") $sqlserver

       $sqlconn.Connect()

       Write-Output $sqlconn

       } catch {    

              Write-Error $Error[0]

       }

 

}

function Get-MSSQLServerInfo

{

       [Cmdletbinding()]

    param(

    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline = $true)] [string]$sqlserver)

 

       begin {

              [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | out-null

       }

 

       process {

   

           $sqlconn = Get-MSSQLServerConnection $sqlserver

              try {

                      Write-Verbose "Connecting to SQL Server $($sqlserver)"

              $Server=New-Object "Microsoft.SqlServer.Management.Smo.Server" $sqlconn

                      Write-Verbose "Outputting information"

              Write-Output $Server                                                                       

              } catch {

                     Write-Error $Error[0]

              }

             

       }

}

Do you remember what I said above about the characteristics of an advanced function? One of them is that the –Verbose common parameter is enabled. By default, the messages will not be displayed unless we pass –Verbose or change the value of the $VerbosePreference variable. Type help about_Preference_Variables for more information.

Get-MSSQLServerInfo "7-pc" -verbose  | select *
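The preference-variable route achieves the same thing for the whole session; “Continue” switches the verbose stream on until you set it back:

```powershell
# Show verbose messages for everything in this session...
$VerbosePreference = "Continue"
Get-MSSQLServerInfo "7-pc" | select *

# ...and restore the default when done
$VerbosePreference = "SilentlyContinue"
```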

Working with Modules

Once you have your powerful set of functions, you’ll want them to behave as if they were built into your copy of PowerShell. To do this, you can use a script module. A script module is a PowerShell script file with a .psm1 extension. It is a useful way of sharing your functions and calling them in the same way as you would a native cmdlet. When you start a PowerShell session, your modules can be loaded, and you can then, for example, invoke Get-MSSQLServerInfo directly from the PowerShell command line.

One of the characteristics of a module is that you can choose which functions are exported so that they would, for example, show up for the Get-Module cmdlet. You can also load your assemblies just once, so it is never necessary to do it within a function, and you can create aliases for your functions.

I can see that the Get-MSSQLServerConnection function is only useful for connecting to a SQL Server from inside another function, so I don’t want it to show up in the list of cmdlets and functions that can be used.

First, we create an empty .psm1 file. Then we load the assemblies.

Our functions use the SQL Server SMO assembly, so we just add this type:

add-type -AssemblyName "Microsoft.SqlServer.Smo"

We can now remove the Begin block in the Get-MSSQLServerInfo function, because we were only using it to load the assembly.

Now we add our functions :

function Get-MSSQLServerConnection

{

       param([Parameter(Position=0, Mandatory=$true)] [string]$sqlserver)

       try {

              $sqlconn = new-object ("Microsoft.SqlServer.Management.Common.ServerConnection") $sqlserver

       $sqlconn.Connect()

       Write-Output $sqlconn

       } catch {    

              Write-Error $Error[0]

       }

 

}

 

function Get-MSSQLServerInfo

{

       [Cmdletbinding()]

    param(

    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline = $true)] [string]$sqlserver)

 

       process {

   

           $sqlconn = Get-MSSQLServerConnection $sqlserver

              try {

                      Write-Verbose "Connecting to SQL Server $($sqlserver)"

              $Server=New-Object "Microsoft.SqlServer.Management.Smo.Server" $sqlconn

                      Write-Verbose "Outputting information"

              Write-Output $Server                                                                       

              } catch {

                     Write-Error $Error[0]

              }

             

       }

}

How about creating an alias for Get-MSSQLServerInfo? We can use gsql instead:

 Set-Alias -Name gsql -Value Get-MSSQLServerInfo

As I said before, I only want Get-MSSQLServerInfo to be exported, so we add:

 Export-ModuleMember -Function "Get-MSSQLServerInfo" -Alias gsql

As you can see I am exporting the alias as well.

Phew! Our module is done. The final code is:

add-type -AssemblyName "Microsoft.SqlServer.Smo"

 

function Get-MSSQLServerConnection

{

       param([Parameter(Position=0, Mandatory=$true)] [string]$sqlserver)

       try {

              $sqlconn = new-object ("Microsoft.SqlServer.Management.Common.ServerConnection") $sqlserver

       $sqlconn.Connect()

       Write-Output $sqlconn

       } catch {    

              Write-Error $Error[0]

       }

 

}

 

function Get-MSSQLServerInfo

{

       [Cmdletbinding()]

    param(

    [Parameter(Position=0, Mandatory=$true,ValueFromPipeline = $true)] [string]$sqlserver)

 

       process {

   

           $sqlconn = Get-MSSQLServerConnection $sqlserver

              try {

                      Write-Verbose "Connecting to SQL Server $($sqlserver)"

              $Server=New-Object "Microsoft.SqlServer.Management.Smo.Server" $sqlconn

                      Write-Verbose "Outputting information"

              Write-Output $Server                                                                       

              } catch {

                     Write-Error $Error[0]

              }

             

       }

}

 

Set-Alias -Name gsql -Value Get-MSSQLServerInfo

Export-ModuleMember -Function "Get-MSSQLServerInfo" -Alias gsql

 

Now we just need to put it in the module path and import it in our profile (see Jonathan's article), and after that, type Get-Module:

Here it is, named MSSQLServer. Note that only Get-MSSQLServerInfo is shown, because that was all we exported.

And we can type gsql too, rather than the whole name. But let’s see something interesting: if I type gsql - in a script editor, all the parameters are shown.

Wooowww… What is this? My function has only a server-name parameter. Why is it showing a bunch of parameters?

Ha! It is an advanced function, remember? CmdletBinding? Now we can use all the common parameters as well.

That is it, guys. I’ve explained why I believe you can make life easier by writing reusable functions, how they help you avoid rework, how to bring some of the features of advanced functions into your PowerShell scripts, and how to package them as a module. I’ve described how the use of pipes can make scripts run faster and cut out some of the complexity.

In the next article, I will show you how some of the other features of advanced functions can add more power to your PowerShell scripting, studying some examples from the CodePlex SQLPSX library (SQL Server PowerShell Extensions). I hope that you have had as much fun reading this as I’ve had writing it. Any comments or ideas are very welcome.

Acknowledgements

These are the resources for this article, not for PowerShell in general. I am not listing even 1% of the PowerShell Jedis that I like to read.

 

Laerte Junior

Author profile:

Laerte Junior is a PowerShell MVP and, through his technology blog and simple-talk articles, an active member of the Microsoft community in Brasil. He is a skilled Principal Database Architect, Developer, and Administrator, specializing in SQL Server and Powershell Programming with over 8 years of hands-on experience. He holds a degree in Computer Science, has been awarded a number of certifications (including MCDBA), and is an expert in SQL Server 2000 / SQL Server 2005 / SQL Server 2008 technologies. He also organizes, and is a speaker at microsoft community events, attracting hundreds of attendees. Laerte has also recently become a Friend of Redgate in Brasil, has taught classes at universities, and produced webcasts for the community.

You should follow him on Twitter as @LaerteSQLDBA




Subject: Good article
Posted by: jhonatantirado (view profile)
Posted on: Thursday, May 10, 2012 at 8:23 AM
Message: Nice post! I've been using PowerShell for automated backups and data transfers over FTP so I think I'll give pipelines a try.

Subject: Really Nice article
Posted by: Math (view profile)
Posted on: Friday, May 11, 2012 at 1:28 PM
Message: Laerte,
I've been following your blog and articles for a long time and I see the growth that is in them. Congratulations, this was undoubtedly one of your " The best" . Looking forward to the next.

Subject: Thanks
Posted by: Laerte Junior (not signed in)
Posted on: Friday, May 11, 2012 at 9:39 PM
Message: Thanks jhonatantirado and Math for the kind words. I am glad that you guys liked it :)

Subject: Must read
Posted by: Josep (not signed in)
Posted on: Monday, May 14, 2012 at 2:29 AM
Message: Amazing post. A lot of PowerShell concepts in a very well-structured article, explained in an easy way even when the subject is not easy. I wish I had read it before the many headaches I've had!
I'm sending it to my co-workers...


Subject: A technical portrait
Posted by: Woojo01 (view profile)
Posted on: Tuesday, May 15, 2012 at 7:40 AM
Message: One of the best PoSh articles I've read. The material is presented clearly and logically, in such a way that it "paints" a complete portrait of the ideas presented.
I have read and used all of the points mentioned, but had never put them together in such an elegant manner. Kudos to you for sharing it!

Subject: Thanks Guys :)
Posted by: Laerte Junior (not signed in)
Posted on: Tuesday, May 15, 2012 at 9:19 AM
Message: Thanks a lot for the comments, Josep and Woojo01. Kind words like these are very comforting. For me, they're an incentive to continue sharing.
If you guys want to share experiences, please ping me at laertejuniordba@hotmail.com. It will be a pleasure :)

Subject: Should be added to Wikipedia
Posted by: Reiner (view profile)
Posted on: Tuesday, May 15, 2012 at 9:39 AM
Message: Woa!!!
I agree with Math. This is an incredible post. Thanks so much for all your time and effort.

Subject: Wikipedia for a simple padawan ?
Posted by: Laerte Junior (not signed in)
Posted on: Tuesday, May 15, 2012 at 9:49 AM
Message: HAHAH..I am just a Padawan man. Wikipedia is for Jedis with a LOT of Midi-chlorians. LOL
Sweet Bro..Thanks for the words :)

Subject: Super Introduction to Advanced Functions!
Posted by: Peter Kriegel (not signed in)
Posted on: Thursday, May 17, 2012 at 4:19 AM
Message: Very good explanation of advanced functions, the pipeline, and the reuse of code in modules!
Thank you!
(first part of a PowerShell tutorial? ;-))

Subject: Thanks
Posted by: Laerte Junior (not signed in)
Posted on: Thursday, May 17, 2012 at 7:08 AM
Message: Hey Peter, in fact, thanks to you. Good to see that you liked it. And yes, I am doing something like that. Not so deep into the advanced functions' cmdletbinding metadata, but I believe it will be a cool introduction :)

 
