Allen White

Test Your SQL Server Backups with PowerShell

23 August 2013

The best way of checking SQL Server backups is to restore them and run DBCC CHECKDB on the restored database. To do this regularly means that you need to automate the task. Allen White shows how, with PowerShell.

You're a good DBA, and you know that the first thing a DBA has to do is make sure that backups are done. Every one of us knows that backups are the single most important thing we're responsible for, so we make sure they're done. So my next question to you is: have you tested them?

Sure, we all periodically have to restore a copy of a database to recover some data that got inadvertently deleted, and we haven't had a problem with that, so we're good, right? Well, not necessarily. Are we really good? Are we testing all of our backups? Are we running DBCC CHECKDB against those databases? Sometimes running CHECKDB can impact server performance in unfortunate ways, so maybe we're not doing that as consistently as we should. Is that bad? (Yes, it really is. You know it, but you're busy, and there are all these user requests that have to get done first, right?)

One of the things I love about PowerShell is the ability to automate tasks, so my solution to this problem uses PowerShell and the new Backup-SqlDatabase and Restore-SqlDatabase cmdlets I presented in my last article.  I've created a script which you can download, use, modify or just study.  It uses two separate instances of SQL Server, which you specify on the command line when you invoke the script, and a directory which is accessible by both instances, and to which the service accounts for both instances have full rights. I’ve added an alternative  version for Red Gate SQL Backup Pro as an attachment to this article.

The first thing we want to do is create a backup of all the user databases in the first instance.  I'll assume that this instance is your production instance, and has the databases which your business relies upon. We don't want to interfere with the log chain in these databases, so we'll do a 'copy only' backup.  As it backs these databases up it writes the backups to the directory you've specified on the command line, and since both Enterprise and Standard editions of SQL Server 2008 R2 and newer support backup compression, we'll include that option as well.
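Before we get to the full loop, here's what that call looks like for a single database. This is only a minimal sketch: the database name is a placeholder, and the instance and share names are the ones from the lab environment described later in this article.

# Sketch: a single copy-only, compressed backup (database name and paths are placeholders)
Backup-SqlDatabase -ServerInstance 'WS12SQL01' -Database 'MyAppDB' -BackupFile '\\WinSrvrDC\BackupWork\MyAppDB_copyonly.bak' -CompressionOption On -CopyOnly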

After the backups are complete we'll connect to the destination instance and determine the default file and log directory locations, so we can intelligently place the data files as we restore them.  We'll also use WMI (Windows Management Instrumentation) to determine the space available on the destination drives, and use that to make sure we have enough space on those drives before attempting to restore the databases.

Finally we'll run DBCC CHECKDB against each of the user databases on the destination server, to make sure that there aren't any corruption issues in the production databases that haven't been caught yet.

Sound like a plan? OK, let's get started.

The first thing we'll do is to provide comment-based help for anyone wanting to use the script. To see the results of this, use the Get-Help cmdlet followed by the path and name of the script.

<#

.Synopsis

 Backs up and tests user databases using a second instance

.DESCRIPTION

 This script will perform a full backup of the user databases on a server, restore them

 to a second SQL Server instance, and perform a DBCC CheckDB on each restored database.

 It will verify before each restore that there's enough free space on the target disk

 drives before performing the restore, and it will kill all processes connected to the

 database being restored so it can overwrite the database there if it already exists.

.EXAMPLE

 ./test-dr.ps1 WS12SQL WS12SQL\TEST01 c:\Workdir

.EXAMPLE

 ./test-dr.ps1 WS12SQL1 WS12SQL2 \\WS12SQL3\Workdir

#>

After the help we'll specify the parameters for the script. We'll precede the parameters with the CmdletBinding() attribute to allow advanced parameter features to be supported in our script. We'll precede each parameter with a comment, which will be displayed to the user should they neglect to include a required parameter, and we'll set each parameter to be mandatory.

# Get the SQL Server instance names and working directory from the command line

[CmdletBinding()]

param(

  # srcinst is the SQL Server instance being backed up

  [Parameter(Mandatory=$true)]

  [string]$srcinst=$null,

  # dstinst is the SQL Server instance where the databases will be restored and tested

  [Parameter(Mandatory=$true)]

  [string]$dstinst=$null,

  # workdir is the directory where the backups will be written

  [Parameter(Mandatory=$true)]

  [string]$workdir=$null

  )

 

As always, we should provide error handling, and I still tend to use the Trap statement for that purpose. (Note that I specify 'continue' at the end of the trap block, because I want to try to restore each database backed up, even if one restore fails.)

# Handle any errors that occur

Trap {

  # Handle the error

  $err = $_.Exception

  write-output $err.Message

  while( $err.InnerException ) {

    $err = $err.InnerException

    write-output $err.Message

    };

  # Continue the script.

  continue

  }

Because the new backup and restore cmdlets are part of the SQL Server 2012 SQLPS module, we need to test to see if it's loaded, and load it if it hasn't already been.  Also, because loading the module changes the current location the script's running in, I added Push-Location to store the current location before loading the module, and Pop-Location after it, to return us to our original location.

# Test to see if the SQLPS module is loaded, and if not, load it

if (-not(Get-Module -name 'SQLPS')) {

  if (Get-Module -ListAvailable | Where-Object {$_.Name -eq 'SQLPS' }) {

   Push-Location # The SQLPS module load changes location to the Provider, so save the current location

    Import-Module -Name 'SQLPS' -DisableNameChecking

    Pop-Location # Now go back to the original location

   }

  }

Now we can connect to the source instance to perform our backups. We'll create an SMO Server object and connect to the instance with that, and pull in the Databases collection.  We'll also create an empty collection to store the database name and backup file name for each backup we do.  We create the backup file name by taking the database name and concatenating the current date and time to it.  For each backup we create a new system object with two note properties, one for the database name and one for the backup file path and name, add that to our collection, and then back up the database.

# Connect to the specified instance

$srv = new-object ('Microsoft.SqlServer.Management.Smo.Server') $srcinst

$dbs = $srv.Databases

$bdir = $workdir

 

# Set up a collection to store the database names and backup files

$bup = @()

 

foreach ($db in $dbs)

  {

  if ($db.IsSystemObject -eq $False)         # Only back up user databases

   {

    $dbname = $db.Name

     $dt = Get-Date -Format yyyyMMddHHmmss

     $bfile = "$bdir\$($dbname)_db_$($dt).bak"

    $bkup = New-Object System.Object

    $bkup | Add-Member -type NoteProperty -name Name -value $dbname

    $bkup | Add-Member -type NoteProperty -name Filename -value $bfile

    $bup += $bkup

     Backup-SqlDatabase -ServerInstance $srcinst -Database $dbname -BackupFile $bfile -CompressionOption On -CopyOnly

   }

  }

If you want to use SQL Backup Pro, you'll need to use Transact-SQL. Normally you'd use the Invoke-Sqlcmd cmdlet to send T-SQL to the server, but there's a bug (since fixed – see the Connect item 'invoke-sqlcmd -querytimeout 0 still times out', reported by Grant Fritchey) that may prevent large databases from backing up correctly. Instead, we'll use ADO.NET directly. (Notice that we open the connection to run the query, then close it as soon as it's done, following database development best practices.)

 

    $cn = new-object System.Data.SqlClient.SqlConnection("Data Source=$srcinst;Integrated Security=SSPI;Initial Catalog=master");

    $cn.Open()

    $q = "execute master..sqlbackup N`' -SQL `" BACKUP DATABASE $dbname TO DISK = ''$bfile'' WITH COMPRESSION = 4, COPY_ONLY`" `' "

    $bk = new-object System.Data.SqlClient.SqlCommand($q, $cn)

    $bk.CommandTimeout = 0              # Set query timeout to allow large backups to complete

    $bk.ExecuteNonQuery()

    $cn.Close()

 

We use these commands instead of the Backup-SqlDatabase command.

The backups are done, so it's time to connect to the destination instance using the SMO Server object, and determine the default file and log locations for the destination databases.

# Connect to the destination instance

$dst = new-object ('Microsoft.SqlServer.Management.Smo.Server') $dstinst

# Get the default file and log locations

# (If DefaultFile and DefaultLog are empty, use the MasterDBPath and MasterDBLogPath values)

$fileloc = $dst.Settings.DefaultFile

$logloc = $dst.Settings.DefaultLog

if ($fileloc.Length -eq 0) {

   $fileloc = $dst.Information.MasterDBPath

   }

if ($logloc.Length -eq 0) {

   $logloc = $dst.Information.MasterDBLogPath

   }

Using the Win32_LogicalDisk namespace in WMI, we can capture the size and free space on each of the disk drives on our destination server.  We'll store just the disk drive (DeviceID) and free space in note properties in a system object and add them to a collection we'll use during the restore process.

# Get the free space on all drives on the destination machine

$dtmp = $dstinst.Split('\')

$dmachine = $dtmp[0]

$dsk = Get-WMIObject -Query 'select * from Win32_LogicalDisk where DriveType = 3' -computername $dmachine

$free = @()

foreach ($d in $dsk) {

    $drv = New-Object System.Object

    $drv | Add-Member -type NoteProperty -name Name -value $d.DeviceID

    $drv | Add-Member -type NoteProperty -name FreeSpace -value $d.FreeSpace

    $free += $drv

   }

Now, for each backup we'll store the filename and database name in simple string variables and create a new backup device item object for the backup.  This backup device gives us a lot of information about the backup, which we'll use when restoring on the destination instance.  We'll also create an empty collection for the relocation objects we'll need to be able to 'MOVE' the restored database data files to the new directories on the destination instance.

First, this is the way to do it for a native backup.

# Now restore the databases to the destination server

foreach ($bkup in $bup)

  {

  $bckfile = $bkup.Filename

  $dbname  = $bkup.Name

 

  # Use the backup file name to create the backup device

  $bdi = new-object ('Microsoft.SqlServer.Management.Smo.BackupDeviceItem') ($bckfile, 'File')

 

  # Create an empty collection for the RelocateFile objects

  $rfl = @()

We have to create an SMO Restore object to be able to read the contents of the backup device, so we'll do that, and add the backup device to the Restore Devices collection.  We can then use the Restore's ReadFileList() method to see what files need to be 'moved'.

   # Create a Restore object so we can read the details inside the backup file

  $rs = new-object('Microsoft.SqlServer.Management.Smo.Restore')

  $rs.Database = $dbname

  $rs.Devices.Add($bdi)

  $rs.ReplaceDatabase = $True

 

  # Get the file list info from the backup file

  $fl = $rs.ReadFileList($srv)

For each file we want to determine the filename and path, and its size on disk.  We can use PowerShell's Split-Path cmdlet with the -Leaf argument to get just the file name without the path (which is useful, since we'll be placing that file into a new directory on a new drive anyway) and again later with the -Qualifier argument on the default file or log path to determine just the disk drive for our space calculation.  We'll create a RelocateFile object for each file, set its LogicalFileName property from the source file list and its PhysicalFileName property from the default file or log directory combined with the physical file name we split out, and we'll save the file size for the next step.

   $fl | foreach {

   $dfile = Split-Path $_.PhysicalName -leaf

   $rsfile = new-object('Microsoft.SqlServer.Management.Smo.RelocateFile')

   $rsfile.LogicalFileName = $_.LogicalName

   $rssize = $_.Size

   if ($_.Type -eq 'D') {

    $rsfile.PhysicalFileName = $fileloc + '\' + $dfile

     $ddrv = Split-Path $fileloc -Qualifier

    }

   else {

    $rsfile.PhysicalFileName = $logloc + '\' + $dfile

       $ddrv = Split-Path $logloc -Qualifier

  }

   $rfl += $rsfile

If you're using SQL Backup Pro, you'd instead use the following:

# Now restore the databases to the destination server

foreach ($bkup in $bup)

  {

  $bckfile = $bkup.Filename

  $dbname  = $bkup.Name

 

 

  # Create an empty string to hold the MOVE clauses for the restore

  $moves = ""

 

 

  # Get the file list info from the backup file

  $cn = new-object System.Data.SqlClient.SqlConnection("Data Source=$dstinst;Integrated Security=SSPI;Initial Catalog=master");

  $cn.Open()

  $ds = new-object System.Data.DataSet "dsFileList"

  $q = "execute master..sqlbackup N`' -SQL `" RESTORE FILELISTONLY FROM DISK = ''$bckfile''`" `' "

  $da = new-object System.Data.SqlClient.SqlDataAdapter($q, $cn)   # the adapter fills the DataSet created above

  $da.Fill($ds)

  $cn.Close()


  $fl=$ds.Tables[0]


  $fl | foreach {

   $dfile = Split-Path $_.PhysicalName -leaf

   $moves += "MOVE ''" + $_.LogicalName + "'' TO "

    $rssize = $_.Size

   if ($_.Type -eq 'D' -or $_.Type -eq 'S') {

    $moves += "''" + $fileloc + "\" + $dfile + "'', "

      $ddrv = Split-Path $fileloc -Qualifier

    }

   else {

    $moves += "''" + $logloc + "\" + $dfile + "'', "

       $ddrv = Split-Path $logloc -Qualifier

  }

We want to make sure we're not going to run out of space, so we loop through the disk drive collection we stored earlier, and subtract the amount of space for the file from the FreeSpace property in the collection for that drive.  Then, after setting a variable called $spaceok to $True, we loop through the drives looking for any where the space has fallen below zero. If we find one, we set the variable to $False.

    #Reduce the free space on the destination drive to ensure we have enough space on disk before restoring

    For ($i=0; $i -lt $free.Count; $i++) {

        if ($free[$i].Name -eq $ddrv) {

          $free[$i].FreeSpace -= $rssize

          }

        }

   }

 

  # Check to see if free space ran out based on the restore size

  $spaceok = $True

    For ($i=0; $i -lt $free.Count; $i++) {

        if ($free[$i].FreeSpace -lt 0) {

          $spaceok = $False

          }

        }

 

 

After checking that our $spaceok variable is still set to $True, we can restore the database. But if someone's connected to the database we won't be able to restore it, so we'll first call the SMO Server object's KillAllProcesses method, passing the name of the database we're about to restore, to kick everyone out of that database. Then we can restore it.

   # Restore the database if free space is ok

  if ($spaceok) {

    # Get everyone out of the database

    $dst.KillAllProcesses($dbname)

    # Restore the database

    Restore-SqlDatabase -ServerInstance $dstinst -Database $dbname -BackupFile $bckfile -RelocateFile $rfl -ReplaceDatabase

   }

  }

With SQL Backup Pro, you’d do this instead:

 

  # Restore the database if free space is ok

  if ($spaceok) {

    # Get everyone out of the database

    $dst.KillAllProcesses($dbname)

    # Restore the database

    $cn = new-object System.Data.SqlClient.SqlConnection("Data Source=$dstinst;Integrated Security=SSPI;Initial Catalog=master");

    $cn.Open()

    $q = "execute master..sqlbackup N`' -SQL `" RESTORE DATABASE $dbname FROM DISK = ''$bckfile'' WITH $moves REPLACE`" `' "

    $bk = new-object System.Data.SqlClient.SqlCommand($q, $cn)

    $bk.CommandTimeout = 0              # Set query timeout to allow large backups to complete

    $bk.ExecuteNonQuery()

    $cn.Close()

   }

  }

 

The last thing we need to do is to run DBCC CHECKDB against all the databases on the destination server.  That's done via an SMO Database method called CheckTables.  Any errors found by CHECKDB will be returned to the console, and will also be posted in the server's error log for later reference.

# Get the databases from the destination, and iterate through them

$dbs = $dst.Databases

foreach ($db in $dbs) {

  # Check to make sure the database is not a system database, and is accessible

  if ($db.IsSystemObject -ne $True -and $db.IsAccessible -eq $True) {

   # Store the database name for reporting

   $dbname = $db.Name

  

   # Perform the database check

   $db.CheckTables('None')

  

   }

  }
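If you also want a persistent record of what each CHECKDB run reported, one option is to capture the messages the CheckTables method returns inside that loop and append them to a file. This is only a sketch; the log file name is a placeholder, and it reuses the $workdir parameter for the path:

   # Sketch: inside the loop above, keep the DBCC messages rather than just letting them scroll by
   $results = $db.CheckTables('None')                # returns a collection of DBCC messages
   $results | ForEach-Object { "$dbname : $_" } | Out-File -FilePath "$workdir\checkdb_results.log" -Append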

  

  

The complete script can be downloaded here for native backup and here for Red Gate SQL Backup Pro. Both are also available from the links to the right of the article title.

 I've used a set of virtual machines I've built with a domain controller (where I've also put the file share to store the backup files), a client machine where I'm running the PowerShell console, and two servers running SQL Server 2012 on Windows Server 2012 Core.  I've set up a set of databases on WS12SQL01, the "production" server, and we'll restore those databases and test them on WS12SQL02.  The share where the databases will be stored is \\WinSrvrDC\BackupWork and the SQL Server service account on both WS12SQL01 and WS12SQL02 has full access to that share.

To test the help features, first I'll use Get-Help with the name of the script.
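For example, assuming the script is saved as test-dr.ps1 in the current directory, something like this will display the synopsis, description, parameters and examples:

Get-Help .\test-dr.ps1 -Full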

Now, to run the script with the proper parameters.
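With the lab environment described above, the invocation looks something like this (named parameters shown for clarity; positional arguments, as in the help examples, work just as well):

.\test-dr.ps1 -srcinst WS12SQL01 -dstinst WS12SQL02 -workdir \\WinSrvrDC\BackupWork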

Using the Backup-SqlDatabase and Restore-SqlDatabase cmdlets allows you to see the progress of the backups and restores while the script's running.

As you can see, one of my production databases is corrupted. The results appear in the command window, and the same messages are recorded in the server's error log. Most importantly, you know that the production database has a problem, and you can take steps to resolve it before it causes greater harm should the application encounter that corrupted data.

By running this script on a regular basis you can test to make sure your backups are sound and usable, and that your production databases don't have hidden corruption, without impacting your production environment.
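To run it on a regular basis, schedule it however you normally schedule PowerShell work, for instance as a SQL Server Agent job step or a Windows scheduled task. As one possible sketch, assuming the ScheduledTasks module that ships with Windows Server 2012 and later, and with a placeholder script path, task name and schedule:

# Sketch: run the test script every Sunday at 2:00 AM (script path, task name and schedule are placeholders)
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File C:\Scripts\test-dr.ps1 WS12SQL01 WS12SQL02 \\WinSrvrDC\BackupWork'
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 2am
Register-ScheduledTask -TaskName 'Test SQL Backups' -Action $action -Trigger $trigger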

Allen White

Author profile:

Allen is a Practice Leader for Upsearch in Northeast Ohio. He's been both a developer and an administrator, so he understands both perspectives on database technology, and he loves sharing his experiences and helping people learn how to use SQL Server. Allen has spent over 35 years in IT, has been using SQL Server since 1992, and is certified MCITP in SQL Server and MCT. Allen has been awarded Microsoft's MVP Award for the last six years. He's President of the Ohio North SQL Server User's Group and maintains a blog at http://sqlblog.com/blogs/allen_white/default.aspx.



Subject: Using sqlps for this, Nicely Done !!
Posted by: Edward J Pochinski (not signed in)
Posted on: Monday, August 26, 2013 at 9:32 AM
Message: I did this 3 years back with T-SQL and pulled the .bak files from a tape server to a restore server. Very nicely done with sqlps, and wow, much less code than I needed. I also had to build a pipe log that I FTP, and this gets loaded & tracked via a web site.
EjP

Subject: Nicely Done
Posted by: timothyawiseman@gmail.com (view profile)
Posted on: Monday, August 26, 2013 at 4:30 PM
Message: I agree, this is nicely done. I rather like the scripts. But why have it create additional backups?

When I test my backups, I normally make a copy of one of the production backups to test rather than take a new backup. That both saves the time to make the new backup and ensures that the actual backups I am relying on are working properly.

 
