Phil Factor

Automated Script-generation with Powershell and SMO

03 February 2012

In the first of a series of articles on automating the process of building, modifying and copying SQL Server databases, Phil Factor demonstrates how one can generate TSQL scripts for databases, selected database objects, or table contents from PowerShell and SMO.

If you’re a DBA or a database developer, you’ve probably found that point‘n’click script-generation via SSMS becomes an increasingly tedious chore, and one that always increases the chance of error. Now, if you’re happy with the tedium of using SSMS to create your database scripts regularly, then use it, because it works, and it is probably the best way of getting one-off ad-hoc scripts for special purposes.

In my case, however, I want automated scripting, because it gives me the opportunity of getting the result exactly as I want it. I’m then more confident of having an up-to-date database build-script to hand, and I’m more likely to use this technique to perform tedious routine jobs such as integration, source control and creating test databases.

A live database is created from a number of SQL scripts, both DDL and DML. Script-generation is the act of reverse-engineering that live database, or a database component such as a schema or set of routines, into one or more scripts that are capable of creating an exact replica.

You’ll probably want a complete build script, as it is difficult to create a build script for a working database from individual object scripts alone, and impossible unless you are aware of the server settings and have scripted the database settings too. It is therefore wise to regularly save a complete script for building the entire database, and maybe a separate script for the database settings as well. You’ll probably also want to use script-generation to get the source of individual routines, tables and views for source control, for editing and for your own records.

There are a number of choices to be made when creating your database scripts, depending on the problem you’re trying to solve. It’s like buying a frou-frou cup of coffee. You are bombarded with decisions. Are you updating the schema of an existing database, or are you creating it from scratch? Do you only wish to update the routines? Do you want to temporarily disable the constraints in order to import data in the ‘wrong’ order? Do you want to leave out the DRI until you’ve imported all the data?

You need this flexibility in a script-generation solution because you have a fair amount of complexity to deal with. A database application, at any point in time, consists of one or more databases on one or more servers, and may have more than one version or fork being developed. They may be using different data. As well as the code for all the routines, views and tables, there will be database settings and server settings. Finally, there will be data, even if just the basic enumerations and static data without which nothing works. As well as the code in the shared ‘public’ database, you may also have stealth things you are trying out, and sandbox stuff that needs to be preserved. You will also need to script your endpoint configuration and tasks that will go on the SQL Server agent. If you’re doing serious website work, you’ll have queues managed by service broker too.

If the production-DBAs have scripts for all of this, for all current versions, along with migration and rollback scripts, then they are smiling.

As well as squirreling away the code in order to preserve the work of the team, keep track of progress, maintain test-cells, do builds and rollbacks from builds, and relate bugs to code alterations, you need scripts to understand the database. You can understand a minnow of a database such as AdventureWorks through point‘n’click in SSMS, but for an industrial-strength behemoth, it is far quicker to eyeball the build scripts. I know of very few tools that generate database scripts, and a lot of them do it badly, because the scripts are formatted as a machine-to-machine communication. Table build scripts, for example, can be written for legibility or merely in order to get a correct build. In order to learn a database quickly, you need the legible version.

So I had to write my own version, and it has paid dividends. Functions, for example, are easier to understand with a structured comment block listing the parameters and comments in extended properties, and even where they are referenced and what they reference. Tables are far better with the simpler constraints written in-line, and comments on both table and column pulled from the extended properties.

Automated scripting of database DDL

Let’s start with the obvious technique. We’ll use PowerShell and Server Management Objects (SMO). No, don’t panic. SMO is a monster, but that’s because it is written for many purposes, the worst of which is providing an interface between SSMS and the servers. In most cases, there is already a high-level interface for the likes of you and me. I’ll admit that SMO isn’t easy. It is always a bad sign when you've got a problem with a script, and you reach for Google, only to find nothing more than a string of Chinese characters on half a page, and a similar StackOverflow question left unanswered since 2008, save for advice to use SSIS instead.

This sort of thing tends to happen when you're using SMO, which is such a shame because it is so powerful. In fact, almost any job that you can do via SSMS you can do through SMO. With PowerShell and SMO, you can work magic, but with that strange, lonesome feeling that not many people have ever walked down the same path. It's the awful documentation, combined with the intricate, arcane, multi-layered interface that makes SMO hard work for the uninitiated, but it shouldn't be like this.

Microsoft has always shown its ambivalence in letting users loose on SMO, by neglecting to provide anything remotely resembling adequate documentation. All we get is minimal, and probably machine-generated, documentation of the SMO classes, methods and so on. Even the examples have errors. Microsoft has to keep SMO up to date because it is used for SSMS, but there seems to be a great deal of passive resistance to supporting users who need to use it for scripting. For this reason, I’ll be keeping these examples as simple as I can.

Getting stuck in quickly

There is usually a quick way to do things. Here is the PowerShell to script out the MyDatabase database from MyServer into the local directory E:\MyScriptsDirectory (it will do others, of course, if you change the assignments to the three variables at the head of the script). Note that in these PowerShell scripts I've opted to avoid the terse style, mainly because the terse style is less intelligible for those of us who just want to use PowerShell without getting too absorbed.

$Filepath='E:\MyScriptsDirectory' # local directory to save build-scripts to
$DataSource='MyServer' # server name and instance
$Database='MyDatabase' # the database to copy from
# set "Option Explicit" to catch subtle errors
set-psdebug -strict
$ErrorActionPreference = "stop" # you can opt to stagger on, bleeding, if an error occurs
# Load SMO assembly, and if we're running SQL 2008 DLLs load the SMOExtended library
$ms='Microsoft.SqlServer'
$v = [System.Reflection.Assembly]::LoadWithPartialName( "$ms.SMO")
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
   [System.Reflection.Assembly]::LoadWithPartialName("$ms.SMOExtended") | out-null
   }
$My="$ms.Management.Smo"
$s = new-object ("$My.Server") $DataSource
if ($s.Version -eq $null ){Throw "Can't find the instance $Datasource"}
$db= $s.Databases[$Database]
if ($db.name -ne $Database){Throw "Can't find the database '$Database' in $Datasource"};
$transfer = new-object ("$My.Transfer") $db
$transfer.Options.ScriptBatchTerminator = $true # this only works when writing to a file
$transfer.Options.ToFileOnly = $true # no need of string output as well
$transfer.Options.Filename = "$($FilePath)\$($Database)_Build.sql";
$transfer.ScriptTransfer()
"All done"

I reckon this is the simplest PowerShell script that gets you an executable build script, and it isn’t too painful. Die-hard SMO-heads will notice that I have to write to a file via SMO in order to get the batch-terminator GO into the script. Here I’ve just set the options directly on the transfer object; later on, we’ll create a separate ‘ScriptingOptions’ object, which isn’t entirely necessary yet, but will be once we increase the complexity of the task. If you run the script, it will successfully build a database, but there will be a lot missing, because we’ve been using the default options for generating the script. You’ll have no DRI. The tables, in other words, won’t have their constraints and indexes, or any dependent objects at all. It will miss out all the extended properties as well.

There is some work to be done. Not all the defaults for the script options are sensible. A quick bit of PowerShell to query the ScriptOptions object will tell us what the defaults are. I’ll print them all out because this is a useful reference when you’re struggling with a script task. This is the equivalent of the options for the frou-frou cup of coffee, and you’ve just been served ‘black without sugar’. Our scripting options are below, along with the defaults.
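If you want to generate this list for yourself, a few lines of PowerShell will do it. This is just a sketch: it assumes the SMO assembly has already been loaded, as in the script above, and simply reflects over the public properties of a freshly created ScriptingOptions object.

```powershell
# Dump every scripting option and its default value.
# Assumes the SMO assembly is already loaded, as in the previous script.
$My='Microsoft.SqlServer.Management.Smo'
$options = new-object ("$My.ScriptingOptions")
$options.GetType().GetProperties() |
    ForEach-Object { "$($_.Name) $($options.$($_.Name))" }
```

The output is the list you see below; run it against your own version of SMO, since a later revision could change the defaults.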

Options: ‘Do you want cinnamon with that coffee, sir?’

These are the scripting options with their default settings. I haven't explained them, because they are either obvious or undocumented.

FileName
Encoding System.Text.UnicodeEncoding
DriWithNoCheck False
IncludeFullTextCatalogRootPath False
BatchSize 1
ScriptDrops False
TargetServerVersion Version110
TargetDatabaseEngineType Standalone
AnsiFile False
AppendToFile False
ToFileOnly False
SchemaQualify True
IncludeHeaders False
IncludeIfNotExists False
WithDependencies False
DriPrimaryKey False
DriForeignKeys False
DriUniqueKeys False
DriClustered False
DriNonClustered False
DriChecks False
DriDefaults False
Triggers False
Statistics False
ClusteredIndexes False
NonClusteredIndexes False
NoAssemblies False
PrimaryObject True
Default True
XmlIndexes False
FullTextCatalogs False
FullTextIndexes False
FullTextStopLists False
Indexes False
DriIndexes False
DriAllKeys False
DriAllConstraints False
DriAll False
Bindings False
NoFileGroup False
NoFileStream False
NoFileStreamColumn False
NoCollation False
ContinueScriptingOnError False
IncludeDatabaseRoleMemberships False
Permissions False
AllowSystemObjects True
NoIdentities False
ConvertUserDefinedDataTypesToBaseType False
TimestampToBinary False
AnsiPadding False
ExtendedProperties False
DdlHeaderOnly False
DdlBodyOnly False
NoViewColumns False
SchemaQualifyForeignKeysReferences False
AgentAlertJob False
AgentJobId True
AgentNotify False
LoginSid False
NoCommandTerminator False
NoIndexPartitioningSchemes False
NoTablePartitioningSchemes False
IncludeDatabaseContext False
NoXmlNamespaces False
DriIncludeSystemNames False
OptimizerData False
NoExecuteAs False
EnforceScriptingOptions False
NoMailProfileAccounts False
NoMailProfilePrincipals False
NoVardecimal True
ChangeTracking False
ScriptDataCompression True
ScriptSchema True
ScriptData False
ScriptBatchTerminator False
ScriptOwner False

This is a scarily complicated set of options. One can’t help wondering why some of these options would be required. However, we can soon put our script right without too much bother. Your own requirements may be different, but I was aiming for an exact copy of AdventureWorks in my testing!

Making an exact copy of AdventureWorks

$Filepath='E:\MyScriptsDirectory' # local directory to save build-scripts to
$DataSource='MyServer' # server name and instance
$Database='MyDatabase' # the database to copy from
# set "Option Explicit" to catch subtle errors
set-psdebug -strict
$ErrorActionPreference = "stop" # you can opt to stagger on, bleeding, if an error occurs
# Load SMO assembly, and if we're running SQL 2008 DLLs load the SMOExtended and SQLWMIManagement libraries
$ms='Microsoft.SqlServer'
$v = [System.Reflection.Assembly]::LoadWithPartialName( "$ms.SMO")
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
[System.Reflection.Assembly]::LoadWithPartialName("$ms.SMOExtended") | out-null
   }
$My="$ms.Management.Smo" #
$s = new-object ("$My.Server") $DataSource
if ($s.Version -eq  $null ){Throw "Can't find the instance $Datasource"}
$db= $s.Databases[$Database]
if ($db.name -ne $Database){Throw "Can't find the database '$Database' in $Datasource"};
$CreationScriptOptions = new-object ("$My.ScriptingOptions")
$CreationScriptOptions.ExtendedProperties= $true # yes, we want these
$CreationScriptOptions.DRIAll= $true # and all the constraints
$CreationScriptOptions.Indexes= $true # Yup, these would be nice
$CreationScriptOptions.Triggers= $true # This should be included when scripting a database
$CreationScriptOptions.ScriptBatchTerminator = $true # this only goes to the file
$CreationScriptOptions.IncludeHeaders = $true; # of course
$CreationScriptOptions.ToFileOnly = $true # no need of string output as well
$CreationScriptOptions.IncludeIfNotExists = $true # not necessary but it means the script can be more versatile
$CreationScriptOptions.Filename = "$($FilePath)\$($Database)_Build.sql";
$transfer = new-object ("$My.Transfer") $s.Databases[$Database]
$transfer.options=$CreationScriptOptions # tell the transfer object of our preferences
$transfer.ScriptTransfer()
"All done"

If you test a database built with this script against the original, it gives a pretty good account of itself. All that’s missing are some extended properties on indexes, but there is no switch that one can flip to tickle those out of SMO, so I suspect that someone has made a mistake.

We did this intermediate version because it is simple and demonstrates a clean technique which you can take and expand on. It’s great for archiving a complete build script that you can use in source control alongside the individual object scripts. However, I’m going to include a more complete version that will give you a database build script and an object-deletion script as well as the object-build script, all concatenated into one script. You’ll begin to understand why I like to create a ‘ScriptingOptions’ object to store the options, since it is more efficient for this sort of job.

Getting database settings and object drops into a database-script

# set "Option Explicit" to catch subtle errors
set-psdebug -strict
$DirectoryToSaveTo='e:\MyScriptsDirectory\' # local directory to save build-scripts to
$servername='MyServer' # server name and instance
$Database='MyDatabase' # the database to copy from
$ErrorActionPreference = "stop" # you can opt to stagger on, bleeding, if an error occurs
Trap {
   # Handle the error
   $err = $_.Exception
   write-host $err.Message
   while( $err.InnerException ) {
      $err = $err.InnerException
      write-host $err.Message
      };
   # End the script.
   break
   }
# Load SMO assembly, and if we're running SQL 2008 DLLs load the SMOExtended library
$v = [System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.SqlServer.SMO')
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
   [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
   }
$My='Microsoft.SqlServer.Management.Smo'
$s = new-object ("$My.Server") $ServerName # get the server.
$Server=$s.netname -replace '[\\\/\:\.]',' ' # remove characters that can cause problems
$instance = $s.instanceName -replace '[\\\/\:\.]',' ' # ditto
$DatabaseName =$database -replace '[\\\/\:\.]',' ' # ditto
$DirectoryToSaveTo=$DirectoryToSaveTo+$Server+'\'+$Instance+'\' # database scripts are local on client
if (!( Test-Path -path "$DirectoryToSaveTo" )) # create it if not existing
   {$progress ="attempting to create directory $DirectoryToSaveTo"
   Try { New-Item "$DirectoryToSaveTo" -type directory | out-null }
   Catch [system.exception]{
      Write-Error "error while $progress. $_"
      return
      }
   }
<# now we will use the canteen system of SMO to specify what we want from the script. It is best to have a list of the defaults to hand and just override the defaults where necessary, but there is a chance that a later revision of SMO could change the defaults, so beware! #>
$CreationScriptOptions = new-object ("$My.ScriptingOptions")
$CreationScriptOptions.ExtendedProperties= $true # yes, we want these
$CreationScriptOptions.DRIAll= $true # and all the constraints
$CreationScriptOptions.Indexes= $true # Yup, these would be nice
$CreationScriptOptions.Triggers= $true # This should be included when scripting a database
$CreationScriptOptions.ScriptBatchTerminator = $true # this only goes to the file
$CreationScriptOptions.Filename = "$($DirectoryToSaveTo)$($DatabaseName)_Build.sql";
# we have to write to a file to get the GOs
$CreationScriptOptions.IncludeHeaders = $true; # of course
$CreationScriptOptions.ToFileOnly = $true # no need of string output as well
$CreationScriptOptions.IncludeIfNotExists = $true # not necessary but it means the script can be more versatile
$transfer = new-object ("$My.Transfer") $s.Databases[$Database]
$transfer.options=$CreationScriptOptions # tell the transfer object of our preferences
$scripter = new-object ("$My.Scripter") $s # script out the database creation
$scripter.options=$CreationScriptOptions # with the same options
$scripter.Script($s.Databases[$Database]) # do it
# change the database context so the objects aren't created in master
"USE $Database" | Out-File -Append -FilePath "$($DirectoryToSaveTo)$($DatabaseName)_Build.sql"
"GO" | Out-File -Append -FilePath "$($DirectoryToSaveTo)$($DatabaseName)_Build.sql"
# add the database object build script
$transfer.options.AppendToFile=$true
$transfer.options.ScriptDrops=$true
$transfer.EnumScriptTransfer()
$transfer.options.ScriptDrops=$false
$transfer.EnumScriptTransfer()
"All written to $($DirectoryToSaveTo)$($DatabaseName)_Build.sql"

This isn’t entirely what we want for other purposes, of course. What about when you want to create a database without indexes, constraints or triggers, import the data in BCP fast-mode and then add the indexes, constraints and triggers? Yes, you can squirrel away far more test-runs this way, and load them rapidly, but in order to do it, you need a build script without them first, and a second script containing them alone. With the first, you can have a ‘knock-down’ kill script that deletes everything from the database before you start, but you definitely don’t want that in the second script. You’ll soon be eyeing up all those scripting options, though, believe me. I’ll be covering a lot about this in future articles.
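As a sketch of that two-phase approach, the only real difference between the two build scripts is which options are switched on. This assumes $My, $s, $FilePath and $Database are set up as in the scripts above, and note that using PrimaryObject to suppress the bare CREATE statements in the second phase is my assumption about how the option behaves, not something the documentation spells out, so test it against your own database.

```powershell
# Phase 1: bare tables, for fast BCP loading.
# DRI, indexes and triggers are all off by default, so we say nothing about them.
$phase1 = new-object ("$My.ScriptingOptions")
$phase1.ToFileOnly = $true
$phase1.ScriptBatchTerminator = $true
$phase1.Filename = "$($FilePath)\$($Database)_Phase1_Tables.sql"
$transfer = new-object ("$My.Transfer") $s.Databases[$Database]
$transfer.Options = $phase1
$transfer.ScriptTransfer()

# Phase 2: just the DRI, indexes and triggers, to run once the data is in.
$phase2 = new-object ("$My.ScriptingOptions")
$phase2.ToFileOnly = $true
$phase2.ScriptBatchTerminator = $true
$phase2.DriAll = $true
$phase2.Indexes = $true
$phase2.Triggers = $true
$phase2.PrimaryObject = $false # an assumption: leave out the bare CREATEs this time
$phase2.Filename = "$($FilePath)\$($Database)_Phase2_DRI.sql"
$transfer.Options = $phase2
$transfer.ScriptTransfer()
```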

Automated scripting of objects

A second task is to save each object to a separate file. You’ll need to do this to get your local (unshared) database into source control if you’re not using SQL Source Control. The simplest way of doing this, if you are lucky enough to have SQL Compare, is

sqlcompare.exe /s1:MyServer /db1:MyDatabase /mkscr:MyNewDirectory    /q

You can do it in PowerShell, and you will, again, have more knobs you can twiddle to get the individual scripts exactly how you like them.

Here is a simplified script that shows you one of the several methods of doing this. Like SQL Compare, it saves each object type into its own subdirectory.

$ServerName='MyServer' # the server it is on
$Database='MyDatabase' # the name of the database you want to script as objects
$DirectoryToSaveTo='E:\MyScriptsDirectory' # the directory where you want to store them
# Load SMO assembly, and if we're running SQL 2008 DLLs load the SMOExtended library
$v = [System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.SqlServer.SMO')
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
    [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
}
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SmoEnum') | out-null
set-psdebug -strict # catch a few extra bugs
$ErrorActionPreference = "stop"
$My='Microsoft.SqlServer.Management.Smo'
$srv = new-object ("$My.Server") $ServerName # attach to the server
if ($srv.ServerType -eq $null) # if it couldn't find the server
   {
   Write-Error "Sorry, but I couldn't find Server '$ServerName' "
   return
}
$scripter = new-object ("$My.Scripter") $srv # create the scripter
$scripter.Options.ToFileOnly = $true
# we now get all the object types except extended stored procedures
# first we get the bitmap of all the object types we want
$all =[long] [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::all `
    -bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::ExtendedStoredProcedure
# and we store them in a datatable
$d = new-object System.Data.Datatable
# get everything except the servicebroker objects, the information schema and system views
$d=$srv.databases[$Database].EnumObjects([long]0x1FFFFFFF -band $all) | `
    Where-Object {$_.Schema -ne 'sys' -and $_.Schema -ne "information_schema" -and $_.DatabaseObjectTypes -ne 'ServiceBroker'}
# and write out each scriptable object as a file in the directory you specify
$d| FOREACH-OBJECT { # for every object we have in the datatable.
   $SavePath="$($DirectoryToSaveTo)\$($_.DatabaseObjectTypes)"
   # create the directory if necessary (SMO doesn't).
   if (!( Test-Path -path $SavePath )) # create it if not existing
        {Try { New-Item $SavePath -type directory | out-null }
        Catch [system.exception]{
            Write-Error "error while creating '$SavePath' $_"
            return
         }
    }
    # tell the scripter object where to write it
    $scripter.Options.Filename = "$SavePath\$($_.name -replace '[\\\/\:\.]','-').sql";
    # Create a single element URN array
    $UrnCollection = new-object ('Microsoft.SqlServer.Management.Smo.urnCollection')
    $URNCollection.add($_.urn)
    # and write out the object to the specified file
    $scripter.script($URNCollection)
}
"Oh wide one, All is written out!"

This time, we use SMO’s EnumObjects method on the current database object, and then get the scripter object, via its Script method, to script out each object individually and save it to a separate file. Each filename is generated from the name of the object, and the directory name is generated from its object type. You can, of course, be very selective about what you script out, and you’ll see that scripting out a single object type, such as a table, is very simple.
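To be selective, you just build a different bitmap. For instance, a hypothetical selection of stored procedures and views would be a matter of ‘or’-ing the relevant DatabaseObjectTypes flags together before calling EnumObjects, using the same $srv and $Database as the script above:

```powershell
# a hypothetical selection: just stored procedures and views,
# built by 'or'-ing the DatabaseObjectTypes flags together
$objectsToDo = [long]([Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::StoredProcedure `
         -bor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::View)
$d = $srv.databases[$Database].EnumObjects($objectsToDo) |
    Where-Object {$_.Schema -ne 'sys' -and $_.Schema -ne 'information_schema'}
```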

In this following script, we'll save just the tables, scripting different schemas into different directories and adding the DRI, indexes, extended properties and triggers to each table. It would be equally simple to script whatever types of objects you want, just by 'or'-ing the DatabaseObjectTypes to taste (in the previous script, I specified 'all but...').

$ServerName='MyServer' # the server it is on
$Database='MyDatabase' # the name of the database you want to script as objects
$DirectoryToSaveTo='E:\MyScriptsDirectory' # the directory where you want to store them
# Load SMO assembly, and if we're running SQL 2008 DLLs load the SMOExtended library
$v = [System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.SqlServer.SMO')
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
   [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
}
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SmoEnum') | out-null
set-psdebug -strict # catch a few extra bugs
$ErrorActionPreference = "stop"
$My='Microsoft.SqlServer.Management.Smo'
$srv = new-object ("$My.Server") $ServerName # attach to the server
if ($srv.ServerType -eq $null) # if it couldn't find the server
   {
   Write-Error "Sorry, but I couldn't find Server '$ServerName' "
   return
}
$scripter = new-object ("$My.Scripter") $srv # create the scripter
$scripter.Options.ToFileOnly = $true
$scripter.Options.ExtendedProperties= $true # yes, we want these
$scripter.Options.DRIAll= $true # and all the constraints
$scripter.Options.Indexes= $true # Yup, these would be nice
$scripter.Options.Triggers= $true # This should be included
# first we get the bitmap of all the object types we want
$objectsToDo =[long] [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::Table
# and we store them in a datatable
$d = new-object System.Data.Datatable
# get just the tables
$d=$srv.databases[$Database].EnumObjects($objectsToDo)
# and write out each scriptable object as a file in the directory you specify
$d| FOREACH-OBJECT { # for every object we have in the datatable.
    $SavePath="$($DirectoryToSaveTo)\$($_.DatabaseObjectTypes)\$($_.Schema)"
    # create the directory if necessary (SMO doesn't).
    if (!( Test-Path -path $SavePath )) # create it if not existing
        {Try { New-Item $SavePath -type directory | out-null }
        Catch [system.exception]{
             Write-Error "error while creating '$SavePath' $_"
             return
             }
        }
    # tell the scripter object where to write it
    $scripter.Options.Filename = "$SavePath\$($_.name -replace '[\\\/\:\.]','-').sql";
    # Create a single element URN array
    $UrnCollection = new-object ("$My.urnCollection")
    $URNCollection.add($_.urn)
    # and write out the object to the specified file
    $scripter.script($URNCollection)
    }
"All is written out, wondrous human"

Automated scripting of static data

The last task we’ll set ourselves is to script out static data. This will really just include all those small tables without which your database simply won’t work. Now, there is no way that anyone but you will know which tables this includes, but don’t ‘go ape’ with this script, since it is scripting INSERT statements and that sort of thing doesn’t scale effectively for big tables. No, sir: you’ll want native-mode, fast-mode BCP for that.
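For the big tables, the command-line equivalent would be native-mode BCP, something along these lines (the server, table and file names here are placeholders): -n asks for native format, -T uses Windows authentication, and -E keeps identity values on the way back in.

```shell
# export a large table in native format (fast), rather than as INSERT statements
bcp AdventureWorks.Person.CountryRegion out CountryRegion.dat -n -S MyServer -T
# and load it back into a copy of the database, keeping identity values
bcp StagingDB.Person.CountryRegion in CountryRegion.dat -n -S MyServer -T -E
```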

This time, I’ve used a slightly different approach, in that I’ve actually constructed the URNs from the (maybe qualified) names of the tables; this means including the schema if you specify it, and the database too if you want. You just specify what tables you want to script and we just go and do it. With SMO there are always several ways of getting to your destination.

# set "Option Explicit" to catch subtle errors
set-psdebug -strict
$DirectoryToSaveTo='MyDirectory'; # local directory to save build-scripts to
$servername='MyServer'; # server name and instance
$Database='AdventureWorks'; # the database to copy from (AdventureWorks here)
$Filename='MyFileName';
$TableList='product, AdventureWorksDW.dbo.DimCustomer, HumanResources.Department, person.countryRegion';
# a list of tables with possible schema or database qualifications
# AdventureWorks used for this example
$ErrorActionPreference = "stop" # you can opt to stagger on, bleeding, if an error occurs
# Load SMO assembly, and if we're running SQL 2008 DLLs load the SMOExtended library
$v = [System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.SqlServer.SMO')
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
  [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
  }
# Handle any errors that occur
Trap {
  # Handle the error
  $err = $_.Exception
  write-host $err.Message
  while( $err.InnerException ) {
    $err = $err.InnerException
    write-host $err.Message
    };
  # End the script.
  break
  }
# Connect to the specified instance
$s = new-object ('Microsoft.SqlServer.Management.Smo.Server') $ServerName
# Create the Database root directory if it doesn't exist
$homedir = "$DirectoryToSaveTo\$Database\"
if (!(Test-Path -path $homedir))
  {Try { New-Item $homedir -type directory | out-null }
     Catch [system.exception]{
        Write-Error "error while creating '$homedir' $_"
        return
        }
  }
$scripter = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') $s
$scripter.Options.ScriptSchema = $False; # no, we're not scripting the schema
$scripter.Options.ScriptData = $true; # but we are scripting the data
$scripter.Options.NoCommandTerminator = $true;
$scripter.Options.FileName = $homedir+$Filename # writing out the data to file
$scripter.Options.ToFileOnly = $true # who wants it on the screen?
$ServerUrn=$s.Urn # we need this to construct our URNs.
$UrnsToScript = New-Object Microsoft.SqlServer.Management.Smo.UrnCollection
# so we just construct the URNs of the objects we want to script
$Table=@()
foreach ($tablepath in $TableList -split ',')
  {
  $Tuple = "" | Select Database, Schema, Table
  $TableName=$tablepath.Trim() -split '.',0,'SimpleMatch'
  switch ($TableName.count)
    {
      1 { $Tuple.database=$database; $Tuple.Schema='dbo'; $Tuple.Table=$tablename[0]; break}
      2 { $Tuple.database=$database; $Tuple.Schema=$tablename[0]; $Tuple.Table=$tablename[1]; break}
      3 { $Tuple.database=$tablename[0]; $Tuple.Schema=$tablename[1]; $Tuple.Table=$tablename[2]; break}
      default {throw 'too many dots in the tablename'}
    }
  $Table += $Tuple
  }
foreach ($tuple in $Table)
  {
  $Urn="$ServerUrn/Database[@Name='$($tuple.database)']/Table[@Name='$($tuple.table)' and @Schema='$($tuple.schema)']";
  $urn # echo each URN as progress feedback
  $UrnsToScript.Add($Urn)
  }
# and script them
$scripter.EnumScript($UrnsToScript) # Simple, eh?
"Saved to $homedir"+$Filename+', wondrous carbon-based life form!'
"done!"

Now we’re making progress, and I’m hoping that, in these simplified scripts, I’ve given you some useful clues as to how to generate particular types of build scripts. I’ll let you turn them into robust command-line tools with parameters and all the error handling, logging and other bits and pieces you’ll want. In the next article, I’ll explain how you can extend this functionality by actually executing SQL batches via SMO. This will allow you to automate migration scripts and do integration testing and all manner of other things all from the one script.

Phil Factor

Author profile:

Phil Factor (real name withheld to protect the guilty), aka Database Mole, has 30 years of experience with database-intensive applications. Despite having once been shouted at by a furious Bill Gates at an exhibition in the early 1980s, he has remained resolutely anonymous throughout his career. See also :




Subject: Combine with source control
Posted by: RealSQLGuy (view profile)
Posted on: Tuesday, February 07, 2012 at 11:30 AM
Message: Good stuff. I've been doing something similar for the past year, combined with our source control software, to automate version control of our database objects.

http://www.real-sql-guy.com/search/label/Source%20Control

Subject: Re:Combine with source control
Posted by: Phil Factor (view profile)
Posted on: Tuesday, February 07, 2012 at 12:04 PM
Message: @RealSQLGuy
Ah yes, we have trodden the same path, I can see. Quite a struggle eh? It is well worth checking out the site, and a great example of Source-control integration. Lots of useful info here.
See also Grant's approach
http://www.simple-talk.com/sql/database-administration/auditing-ddl-changes-in-sql-server-databases/

Subject: good
Posted by: Anonymous (not signed in)
Posted on: Sunday, February 19, 2012 at 8:49 AM
Message: nice article. dba must read


Subject: Objects created in master
Posted by: johns (not signed in)
Posted on: Wednesday, February 22, 2012 at 12:40 PM
Message: Great Scripts, Thanks!
Testing the script following "Getting database settings and object drops into a database-script"
The database is created fine, but the objects are created in master, oops.
I added
#Change the database context
"USE $Database" | Out-File -Append -FilePath "$($DirectoryToSaveTo)$($DatabaseName)_Build.sql"
"GO" | Out-File -Append -FilePath "$($DirectoryToSaveTo)$($DatabaseName)_Build.sql"
after $scripter.Script($s.Databases[$Database]) # do it
That looks like it fixed that issue.
Thanks again
johns

Subject: SMO is very useful
Posted by: puzsol (view profile)
Posted on: Wednesday, February 22, 2012 at 5:33 PM
Message: I recently stumbled upon SMO and used it to build a C# utility to copy a database (data and all) from a client server where we only have read access. It works quite well, though I had to modify the order of the scripts, as it tries to assign users to roles before the roles are created.

The really neat thing about the Transfer object is that you can use EnumObjects to obtain the order in which to copy table data (I used SqlBulkCopy) with referential integrity intact (already applied to the table structure).

Of course, I could have just used the Red Gate SQL tools, but that wouldn't have been anywhere near as much fun (or undocumented frustration).

One other "error" with SMO: even if you specify that filegroups are to be excluded from the transfer, it still includes the WITH filegroup option on the full-text indexes - so again, manual editing of the generated script.

Subject: re: Objects created in Master
Posted by: Phil Factor (view profile)
Posted on: Thursday, February 23, 2012 at 3:44 AM
Message: @JohnS
Oops. One moment it's there, the next moment it isn't.  I've added in your fix to the script whilst I figure out where the USE <database> disappeared to. Sometimes using SMO directly is like trying to catch a rat with your bare hands.
Thanks for that!
(Later)
Ah. I took out ...
    $transfer.options.IncludeDatabaseContext=$true
...
which gives you the USE <Database>  - because it would only return it as a string rather than write it to file. I then did...
  #now we alter the name to create the copy. This isn't safe with certain database names
  if ($CopyOfDatabase -eq $null) {$CopyOfDatabase="New$Database"}
       (get-content "$($DirectoryToSaveTo)$($DatabaseName)_Build.sql") | `
     Foreach-Object {$_ -replace $Database, "$CopyOfDatabase"} >"$($DirectoryToSaveTo)$($DatabaseName)_Build.sql"
"USE [$CopyOfDatabase]"  >> "$($DirectoryToSaveTo)$($DatabaseName)_Build.sql"
# add the database object build script
...when I tidied it up for publication, I took out the full copying bit but I nicked out the insertion of the 'USE' by mistake!



Subject: Getting the table order.
Posted by: Phil Factor (view profile)
Posted on: Thursday, February 23, 2012 at 3:55 AM
Message: @PuzSol
It sounds like a useful bit of code. The table-dependency order thing is described in a subsequent article. Although it is certainly possible to do using SMO, this is most effectively done, I reckon, with a topological sort, in TSQL. With such a table-function, then deleting the data in every table in a database becomes frighteningly easy, and you can BCP in entire data sets without any kill and fill nonsense!
With SMO, one gets such a wonderful feeling of success when you get it to bend to your will that it is easy to forget the grinding frustration of working against a perfectly horrible documentation
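
For anyone wanting to experiment with the idea outside TSQL, here is a minimal sketch of that topological sort in Python (Kahn's algorithm). The table names and foreign-key pairs are hypothetical stand-ins for what you would actually read from sys.tables and sys.foreign_keys:

```python
from collections import defaultdict, deque

def dependency_order(tables, fk_pairs):
    """Return tables in an order safe for inserting data: a referenced
    (parent) table always precedes the table that references it.
    fk_pairs is a list of (child, parent) tuples, as you might read
    from sys.foreign_keys."""
    children = defaultdict(list)          # parent -> referencing tables
    indegree = {t: 0 for t in tables}     # count of unresolved parents
    for child, parent in fk_pairs:
        children[parent].append(child)
        indegree[child] += 1
    # Start with tables that reference nothing
    queue = deque(sorted(t for t, d in indegree.items() if d == 0))
    ordered = []
    while queue:
        table = queue.popleft()
        ordered.append(table)
        for child in children[table]:
            indegree[child] -= 1
            if indegree[child] == 0:
                queue.append(child)
    if len(ordered) != len(tables):
        raise ValueError("Circular foreign-key reference detected")
    return ordered
```

Reversing the result gives a safe order for deleting data or dropping the tables themselves.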

Subject: Few questions
Posted by: software_now (view profile)
Posted on: Tuesday, March 06, 2012 at 3:16 PM
Message: I scripted one of our databases, with 4,500 objects, to a file using your script. Then I created a new database and ran the script that your script generated (getting confusing? :) )

If I check the number of objects using "select COUNT(*) from sys.all_objects" on PROD versus the new test database, I get a mismatch of 300+ objects.

So, here are the changes I had to make to get it closer.

I added:
$CreationScriptOptions.Bindings = $true
$CreationScriptOptions.Permissions = $true
$CreationScriptOptions.IncludeDatabaseRoleMemberships = $true

I am now down to 245 differences.

A few problems I am aware of:
1. The sequence in which those scripts get executed is part of the problem.
2. Constraint names are not preserved.
3. Not all permissions get scripted, such as:
REVOKE CONNECT FROM [PROD\secureAccount]

Any idea how to get around those?

Still a great script though!!

Subject: Re: Few questions
Posted by: Phil Factor (view profile)
Posted on: Wednesday, March 07, 2012 at 3:34 PM
Message: Some index extended properties got missed in the testing I did. I checked with SQL Compare; you can download it and use it to see exactly what is getting missed. I expect that there is a combination of options that will make it produce a correct script, but without knowing exactly what is getting missed, we are rather guessing. Thanks for the test - that is quite a database.

Subject: Permissions in the script?
Posted by: software_now (view profile)
Posted on: Thursday, March 08, 2012 at 9:19 AM
Message: Do you know of a way to generate permissions? Here is one of the pieces I am missing.

Example:
GRANT CREATE FUNCTION TO [ProductSupportLevel3]
GRANT CREATE PROCEDURE TO [ProductSupportLevel3]
GRANT DELETE TO [ProductSupportLevel3]
GRANT EXECUTE TO [ProductSupportLevel3]
GRANT INSERT TO [ProductSupportLevel3]
GRANT SELECT TO [ProductSupportLevel3]
GRANT UPDATE TO [ProductSupportLevel3]
GRANT VIEW DEFINITION TO [ProductSupportLevel3]

Subject: Re: Permissions in the script?
Posted by: Phil Factor (view profile)
Posted on: Friday, March 09, 2012 at 9:23 AM
Message: Simple (though I generally need to leave all this out). Just include this...

$CreationScriptOptions.Permissions = $true

To get users and logins, do this...

$transfer.CopyAllLogins=$true
$transfer.preserveLogins=$true

Subject: Great Script. I have one small issue. Can you help me?
Posted by: barrynoble (view profile)
Posted on: Tuesday, June 05, 2012 at 10:05 AM
Message: The only issue I have is that for some of the databases I have a partition function that is range right. The script generates the partition function with no right or left making it default to left. I have tried multiple things to correct that but nothing has worked. Do you have any suggestions?

Subject: Re: Great Script. I have one small issue. Can you help me?
Posted by: Phil Factor (view profile)
Posted on: Tuesday, June 05, 2012 at 11:04 AM
Message: I've found exactly the same thing, which I'd say is a bug. Create a database with two range functions, one RANGE LEFT and the other RANGE RIGHT, and even SSMS will script them incorrectly, leaving out the LEFT or RIGHT keyword. If SSMS gets it wrong, I don't think that any tweaking with SMO will get it right. To get it right, I'd do it directly. Unfortunately, sp_helptext won't help you, but you can do it using the values in sys.partition_range_values and sys.partition_functions. The latter has a column called boundary_value_on_right, which records whether the boundary value is included in the RIGHT range of the boundary (1) or the LEFT range (0). For scripting, I quite often use SQL directly just because of issues like this. (I also like extended properties to appear as comments, as well as parameters, within scripts.)

Subject: Re: Great Script. I have one small issue. Can you help me?
Posted by: Phil Factor (view profile)
Posted on: Tuesday, June 05, 2012 at 11:56 AM
Message: SELECT 'CREATE PARTITION FUNCTION '
  + QuoteName(pf.name) + '(' + sys.types.name + ')
AS ' + CASE WHEN pf.type = 'R'
       THEN 'RANGE'
            + CASE boundary_value_on_right
                WHEN 1 THEN ' RIGHT '
                ELSE ' LEFT ' END
       ELSE type_desc END
  + 'FOR VALUES (' COLLATE SQL_Latin1_General_CP1_CI_AI
  + CASE WHEN sys.types.name IN
          ('varchar', 'nvarchar', 'char', 'nchar',
           'datetime', 'datetime2', 'smalldatetime',
           'date', 'time', 'datetimeoffset')
     THEN Stuff(
          (SELECT ', ''' + CONVERT(NVARCHAR(256), value) + ''''
           FROM sys.partition_range_values rv
           WHERE pf.function_id = rv.function_id
             AND pp.parameter_id = rv.parameter_id
           FOR XML PATH(''), TYPE).value('.', 'varchar(max)')
          , 1, 2, '') + ')'
     ELSE Stuff(
          (SELECT ', ' + CONVERT(NVARCHAR(256), value)
           FROM sys.partition_range_values rv
           WHERE pf.function_id = rv.function_id
             AND pp.parameter_id = rv.parameter_id
           FOR XML PATH(''), TYPE).value('.', 'varchar(max)')
          , 1, 2, '') + ')'
   END
FROM sys.partition_functions pf
INNER JOIN sys.partition_parameters pp
  ON pp.function_id = pf.function_id
INNER JOIN sys.types
  ON sys.types.system_type_id = pp.system_type_id
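
The quoting rule buried in that CASE expression - character and date/time boundary values get single quotes, numeric ones do not - can be checked in isolation with a small sketch. This Python version is purely illustrative; the function name and type list are this sketch's own, mirroring the T-SQL above:

```python
def for_values_clause(type_name, boundary_values):
    """Build the FOR VALUES (...) clause of a CREATE PARTITION FUNCTION
    statement. Character and date/time boundary values must be quoted;
    numeric ones must not."""
    quoted_types = {'varchar', 'nvarchar', 'char', 'nchar',
                    'datetime', 'datetime2', 'smalldatetime',
                    'date', 'time', 'datetimeoffset'}
    if type_name.lower() in quoted_types:
        items = ", ".join("'{0}'".format(v) for v in boundary_values)
    else:
        items = ", ".join(str(v) for v in boundary_values)
    return "FOR VALUES ({0})".format(items)
```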

Subject: Great script. Additional challenge
Posted by: ricardodiazjimenez (view profile)
Posted on: Wednesday, June 06, 2012 at 9:21 AM
Message: Phil, thank you - these scripts are awesome. They helped me a lot.

I have an additional challenge, though. I hope you can give me some tips.

I need to script only some of the tables in the database (not all). For this, I have a table called "TableExport" in the database, which contains the names of the tables that I want to script. How can I filter the script generation using the table names in my "TableExport"?

Finally, I would like to create just one file for all the tables in the script, not a separate one for each table. How do you accomplish this?

Thanks again!

Subject: Getting error message for $transfer.ScriptTransfer
Posted by: Atif Shehzad (view profile)
Posted on: Tuesday, September 10, 2013 at 4:26 AM
Message: I am getting the following error message on one of my servers:

$transfer.ScriptTransfer <<<< () + CategoryInfo : NotSpecified: (:) [], ParentContainsErrorRecordException + FullyQualifiedErrorId : DotNetMethodException

The output file is created, with a few objects scripted in it, but the error message is generated later. Could you please advise?

Thanks

 
