Joshua Feierman

The Mindset of the Enterprise DBA: 8 Ways To Centralize Your Work

12 August 2014

Although it is possible to provide good service to small to medium companies as a DBA without putting in place systems and processes to manage the workload and the demands of teamwork, it isn't so in the larger enterprise. As well as standardizing and documenting, the enterprise DBA needs to centralize the management of systems. Joshua Feierman explains.

  • On Adopting the Mindset of an Enterprise DBA
  • The Mindset of the Enterprise DBA: Creating and Applying Standards to Our Work
  • The Mindset of the Enterprise DBA: 8 Ways To Centralize Your Work (this article)
  • The Mindset of the Enterprise DBA: Harnessing the Power of Automation
  • The Mindset of the Enterprise DBA: Delegating Work


In the previous installment of my series on adopting the mindset of an Enterprise DBA, we discussed the practice of Standardization. Based on the comments, I think we all agree that while this practice is sometimes a bit boring or cumbersome, it is worthwhile. In this installment, we move on to the second practice: Centralization.

Defining Centralization

Centralization is about defining and keeping your stuff in as few places as possible, preferably in one. To use a metaphor we database professionals are likely to understand, it is like normalizing database structures. When we practice good normalization, we ensure that information is not kept in several places; the same is true for centralizing our work and systems. There are several advantages to this practice, but two stand out in particular:

  • By keeping things in as few places as possible, we make maintaining them easier since we only have to touch them once.
  • We always know where to find things.

The practicalities of Centralization

Each of these practices could be an article in its own right, so I'm not going to go into too much depth; instead, I'll provide links to relevant information whenever possible (and thanks to the wonderful SQL community, it usually is).

Keep Your Documentation in one Place

Do you remember all that documentation you produced in part one? (You did write all that down, right? If not, go do that right now. Go on, I'll wait. Yes, it's that important.) All that effort will do you no good if it's scattered around in different places and not organized logically. It's not so much where you put it, just that it's all in one place. Do you happen to like SharePoint? Great, make a site and a document library and put it there. Are you more partial to network drives or maybe just don't have anything else? No worries; put everything in some folders organized in a way that you will understand. Again, it's not how you do it, only that you do it at all.

Build a Repository

Every enterprise-minded DBA should have a repository. By ‘repository’, I mean some kind of database in which you keep useful metadata about your systems. Here are some examples of things to store in the database:

  • Server Names
  • Build numbers (a.k.a. patch level, both OS and SQL)
  • Usage - who uses which databases on the server, and how to contact them
  • Environment - is this a production or test server?
  • Special instructions - is there anything special or unusual about this server?

The way that you structure it isn't important, just so long as you keep it updated religiously. I don't want to skip ahead too much, but you'd be wise to figure out a way to automate the updating process so you never have to worry about forgetting to change some detail. Believe me, this will come in very handy when a server goes down in the middle of the night and you need to quickly send a notice to all parties affected by the outage (or know whether the outage is even an issue at all; perhaps the server is only used in the daytime). If you want some pointers in this area, a good place to start would be something like Rodney Landrum's SQL Server Tacklebox or his SQL Server Pro article on the subject.
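To make the idea concrete, here is a minimal sketch of such a repository using SQLite. The table, columns, server name, and contact address are all hypothetical; any structure works, so long as you keep it current.

```python
import sqlite3

# In-memory database for illustration only; a real repository would live
# on a backed-up, highly available server.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE servers (
        name        TEXT PRIMARY KEY,
        os_build    TEXT,
        sql_build   TEXT,
        environment TEXT,    -- 'Production' or 'Test'
        contact     TEXT,    -- who to notify about outages
        notes       TEXT     -- special instructions
    )""")
conn.execute(
    "INSERT INTO servers VALUES (?, ?, ?, ?, ?, ?)",
    ("SQLPROD01", "6.3.9600", "11.0.5058", "Production",
     "app-team@example.com", "Daytime use only"))

# The 2 a.m. question: who do I notify, and does the outage even matter?
row = conn.execute(
    "SELECT contact, notes FROM servers WHERE name = ?",
    ("SQLPROD01",)).fetchone()
print(row)  # ('app-team@example.com', 'Daytime use only')
```

One query answers both who to wake up and whether the outage can wait until morning.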

Use Source Control

Here's a place where we can take a point from the developers. If you are keeping a repository of scripts (you are, right?), they need to be in some kind of version control system. Here again, which one you use isn't important (though if you insist on using something like VSS you may want to seek therapy), only that you actually use it. There are numerous advantages to using source control, but two stand out:

  1. You have a built-in backup whenever changes are made, as well as a way to get an old version of things if need be. Say, for example, you suddenly find a bug in the SQL 2000 version of your maintenance package, and you need to fix it for that one last holdout. Without version control you would have to script everything out manually.
  2. You have a built-in mechanism for pushing changes out to other members of your team. All they need to do is update their checked-out copy, and all the changes are pulled down immediately. No more copying things out of shared folders and the like.

If you're ambivalent about which version control system to use (and you don't have a corporately blessed one), I would suggest either Mercurial or Subversion. Both are very highly rated and I find them both easy to use and robust. Subversion in particular has a great set of documentation and tutorials.
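As an illustration of advantage number one, here is a small Python sketch of the "one last SQL 2000 holdout" scenario, driving git from a script (the file name and commit messages are invented; the same flow exists in Mercurial and Subversion):

```python
import subprocess, tempfile, pathlib

repo = pathlib.Path(tempfile.mkdtemp())

def git(*args):
    subprocess.run(["git", *args], cwd=repo, check=True,
                   capture_output=True, text=True)

git("init")
git("config", "user.email", "dba@example.com")
git("config", "user.name", "DBA")

# Version 1 supports SQL 2000; version 2 drops it.
script = repo / "maintenance.sql"
script.write_text("-- maintenance package v1 (SQL 2000 compatible)\n")
git("add", ".")
git("commit", "-m", "v1: SQL 2000 maintenance package")
script.write_text("-- maintenance package v2 (SQL 2008+ only)\n")
git("commit", "-am", "v2: drop SQL 2000 support")

# The last SQL 2000 holdout needs a fix: pull the old version back out
# instead of rescripting it by hand.
first = subprocess.run(
    ["git", "rev-list", "--max-parents=0", "HEAD"],
    cwd=repo, check=True, capture_output=True, text=True).stdout.strip()
old = subprocess.run(
    ["git", "show", f"{first}:maintenance.sql"],
    cwd=repo, check=True, capture_output=True, text=True).stdout
print(old)
```

The old version comes back with one command, no manual rescripting required.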

Use Central Management Server

Central Management Server, or CMS for short, is a great feature of SQL Server that I sadly don't see a lot of people using. It lets you define groups of servers, which are stored in (of all things) a database on a central server. You can do things like execute statements across multiple servers, but honestly I find that the simple organization value alone is worthwhile. If you're smart, you could even drive your repository off the CMS database, which is actually just a few tables in MSDB. Setting one up is easy and well documented. My one caution would be that you want to ensure your CMS server is highly available, as it clearly won't do you any good if it's not accessible.
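To show the kind of query involved in driving a repository off the CMS, here is a sketch that simulates the layout with SQLite. On a real CMS you would query the sysmanagement views in msdb; the table and column names below are simplified stand-ins.

```python
import sqlite3

# Mimic the CMS shape: a groups table and a registered-servers table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE server_groups (group_id INTEGER PRIMARY KEY, name TEXT)")
db.execute("CREATE TABLE registered_servers (server_name TEXT, group_id INTEGER)")
db.executemany("INSERT INTO server_groups VALUES (?, ?)",
               [(1, "Production"), (2, "Test")])
db.executemany("INSERT INTO registered_servers VALUES (?, ?)",
               [("SQLPROD01", 1), ("SQLPROD02", 1), ("SQLTEST01", 2)])

# One join answers "which servers are in the Production group?"
prod = [r[0] for r in db.execute(
    """SELECT s.server_name
       FROM registered_servers s
       JOIN server_groups g ON g.group_id = s.group_id
       WHERE g.name = 'Production'
       ORDER BY s.server_name""")]
print(prod)  # ['SQLPROD01', 'SQLPROD02']
```

Because the server list lives in one database, every tool you write can start from the same query instead of its own hard-coded list.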

Learn PowerShell

I think that PowerShell is far and away the best tool for writing centralized utilities. What do I mean by that phrase? Simply put, the utility runs in one place, but connects remotely to other network resources to gather information about them or to accomplish some task. PowerShell is excellent for this purpose because:

  1. It exposes WMI in a way that is far easier to work with than in other languages such as VBScript.
  2. It includes some very nice built-in remoting capabilities that help you execute commands remotely. Perhaps not quite as useful for a DBA as for a sysadmin, but gravy nonetheless.
  3. With a little help, PowerShell can have some very powerful parallel execution features that let you run things on multiple threads, thereby getting things done faster. Just imagine the difference between running a query against thirty servers one at a time, versus in batches of ten at once.

Here's a good example of how I used PowerShell in this manner. As a DBA, I wanted to get a morning status report on all my servers, to show me things like failed backups, databases approaching their size limit, or disk drives that were nearing capacity. Rather than rely on distributed alerts which, as wise DBAs have pointed out, can quickly overwhelm your inbox and give you multiple points of failure, I wanted one message every morning, so that I could see the state of things immediately after brewing that oh-so-important cup of coffee. So what's a DBA to do? I created a PowerShell based framework (based on Alan Renouf's excellent vCheck) which connected to my CMS server, spun off a bunch of threads, and got all my information in one place from my various servers. Problem solved.
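The batching pattern behind that framework is easy to sketch. The sketch below is in Python purely for illustration (the servers are hypothetical, and check_server stands in for a real remote query), but the idea of running checks ten at a time carries straight over to PowerShell runspaces or jobs.

```python
from concurrent.futures import ThreadPoolExecutor

# Thirty hypothetical servers, checked in batches of ten at a time
# rather than one by one.
servers = [f"SQL{n:02d}" for n in range(1, 31)]

def check_server(name):
    # In real life: connect to the server and run the morning-report checks.
    return (name, "OK")

# max_workers=10 caps the batch size at ten concurrent checks.
with ThreadPoolExecutor(max_workers=10) as pool:
    results = dict(pool.map(check_server, servers))

print(len(results), results["SQL30"])  # 30 OK
```

With real network round-trips, the batched version finishes in roughly a third of the serial time.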

Build a Tool Server

One of the most frustrating things about getting a new computer (or having an old one remade) is installing and configuring every piece of software required. The good news is, this one is easy to solve: get a server (it doesn't have to be a physical box, in fact I'd recommend a VM for failure protection), install everything you need there, then use it as much as possible for daily work. Yes, this means that you have some overhead of connecting in, but in today's world of remote desktop and application functionality, I'd argue that is minimal. In addition, you no longer lose your ability to work if your laptop goes kablooey. As a bonus, you can usually avoid the headaches around things like firewall rules or cross domain trust problems.

Use Policy Based Management

Policy Based Management, or PBM for short, is a framework developed by Microsoft that lets you ensure compliance with rules set up in a central repository. It's a very powerful tool, though in my experience it takes a little getting used to. You can do simple checks for compliance, as well as enforcement of rules (i.e. preventing a change from occurring), which is done via DDL triggers. PBM ships with a fairly comprehensive set of pre-built rules covering general best practices, but you can and should build your own.
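PBM itself is configured through SSMS, but the underlying idea (evaluate a condition against every server and report the drift) can be sketched in a few lines. The servers, settings, and rules below are invented for illustration; they are not PBM's actual API.

```python
# A snapshot of settings gathered from each server in the fleet.
fleet = {
    "SQLPROD01": {"AutoShrink": False, "PageVerify": "CHECKSUM"},
    "SQLPROD02": {"AutoShrink": True,  "PageVerify": "CHECKSUM"},
    "SQLTEST01": {"AutoShrink": False, "PageVerify": "NONE"},
}

# Each policy is a name plus a predicate over one server's settings.
policies = {
    "AutoShrink is off":      lambda s: s["AutoShrink"] is False,
    "PageVerify is CHECKSUM": lambda s: s["PageVerify"] == "CHECKSUM",
}

# Evaluate every policy against every server and collect the failures.
violations = [(server, policy)
              for server, settings in fleet.items()
              for policy, ok in policies.items()
              if not ok(settings)]
print(violations)
# [('SQLPROD02', 'AutoShrink is off'), ('SQLTEST01', 'PageVerify is CHECKSUM')]
```

The rules live in one place, so adding a new best-practice check means touching one definition, not every server.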

Use SQL Agent Master Servers

A very powerful feature of SQL Agent since SQL Server 2008 is the concept of Master and Target servers. This functionality allows you to define and configure SQL Agent jobs once on a master server, which then pushes them down to the targets you define. Think of it this way: you probably have one or more SQL Agent jobs which you need to run on all your servers on the same schedules. Perhaps you have monitoring jobs for things like TempDB space usage, and you will certainly have common maintenance tasks that need to be executed, such as backups, DBCCs, and index maintenance. Rather than set them up individually on each server, you can define them once on the master server, then push them down to all your targets. This way, when (not if) you need to make changes, such as adding a job step or tuning a parameter, you can make them once, rather than across each and every server. This works very well with several of the robust maintenance frameworks out there, such as Ola Hallengren's award-winning set of scripts.

Concluding Thoughts

While centralizing our work may take some effort up front to get all the supporting pieces in place, the payoff is almost immediate. Knowing where everything is, being able to update it once, and having something approaching the "keep it in one place" model increases our efficiency and gives us peace of mind. In addition, it lays another block in the foundation of preparing for our next (and critical) step in the path to becoming an enterprise DBA: automation. After all, automating is far easier when you can set it up once, rather than multiple times in multiple places. This step, combined with the next, will probably have the greatest effect on your daily workload, in that you'll start to see your time free up since you have less busywork to do. Stay tuned and stay focused.

Further Reading

Policy-based Management

Central Management Server


Source Control for DBAs

Joshua Feierman

Author profile:

After completing a degree in neuro-psychology in college, Josh started his career in IT almost by accident. In what was originally a temporary job creating reports for clients of a financial company, he decided he would rather write some VBA macro code than repeatedly push the same buttons on a keyboard. After working in low-level development and support roles, he transitioned to the role of a full-time SQL Server DBA, supporting both production and development systems, and has never looked back. He currently works as a senior DBA for Gateway Ticketing Systems, advising customers on maximizing SQL Server performance and availability. He also recently co-founded the independent software startup Do It Simply Software. He loves learning about what makes SQL Server tick, and how to make it go faster. For more information, follow him at

