The process of doing SQL code reviews used to be tedious and error-prone, and until SQL Enlight it was difficult to automate for release and deployment. Now that SQL Enlight is both a command-line utility and an SSMS add-in, the database developer can see immediately the parts of the code that would raise eyebrows with the vigilant production DBA.
Probably the greatest cause of database performance problems is badly-written code. A fairly significant proportion of code problems are easily-identifiable “code smells” that are well-known and are easily avoided. Yet, large amounts of code include them. Yes, you can review all the code that passes through your hands before it ends up in your production system; or you can begin to automate the process. This is where SQL Enlight for SSMS comes into play.
SQL Enlight for SSMS is a code analysis tool from UbitSoft. The basic idea is simple. You first establish a set of best practices and guidelines for your code, or adopt the existing standards. Then you find a way of ensuring that your code meets those standards and guidelines. SQL Enlight works within SQL Server Management Studio (SSMS) to provide you with a mechanism for running these checks on your existing databases. It also supplies a way of checking databases from a command line. With the guidelines, and a process for checking your SQL code, you can then automate the process of auditing your systems to ensure that they comply with your coding standards, using, for example, PowerShell.
This tool also represents a new initiative from Red Gate Software. We’ve long been working within SSMS with various tools like the indispensable SQL Prompt or SQL Search. But we’ve recognized a need for a wider set of tools that allow for more specialized support within SSMS. So, we’re establishing the SSMS Ecosystem. Here’s a blog post explaining the concept in more detail. UbitSoft is one of the very first organizations to jump at this new initiative. Let’s take a look at the work they’ve done.
Best Practices and Coding Guidelines
UbitSoft have spent years collecting all the published good advice, warnings of deprecation, and SQL practices that cause bugs or performance problems. If you were to change your code until it passed on all counts, you’d spend a lot of time doing things that may have no effect on the code. You’ll certainly want to know about, and address, a lot of these rules, such as those flagging coding practices that slow performance down; that code will probably need to be refactored. Other advice you may receive with a shrug. Code smells merely indicate that you need to check in more detail: it could be that the code is fine, perhaps doing something unusual but perfectly legitimate in the circumstances. There will be advice you don’t agree with, other advice you merely want to know about without reacting to, code smells that make you frown, and anti-patterns that you’ve really got to fix before the code can get to production. Each shop is different, but SQL Enlight allows you to decide what’s important for you.
Download, Installation, Configuration
You can get a 14-day free trial from UbitSoft to check out the usefulness of this software for yourself. Following the links above to the download page, the process is extremely straightforward: click the download link and you’ll get an executable. Running it, I had a flawless install on my machine. When I opened SSMS for the first time afterwards (it must be closed during the installation), I couldn’t immediately see anything new. Then I saw a menu choice at the top of the screen:
The installation of SQL Enlight went quite smoothly, as you can see. I could have accepted the defaults for the configuration, but I wanted to see what sorts of behavior I could control, so I immediately opened the Options screen.
The first tab on the Options screen is not for configuring the behavior of the software. Instead, it’s a listing of all the possible rules; I’ll get to those in a minute. The actual options are on a second screen, reached by clicking ‘Options’. Now, it’s much more likely that you’ll set the options once and then never go back to them, with most of the work you’ll be doing within the rules. But accessing the rules through the ‘Options’ menu, and then having to do extra work to get to the actual options, seems slightly counter-intuitive.
The first screen in ‘Options’ allows you to set general behaviors, such as importing or exporting rules, and setting up regular updates to the standard set of rules:
The next tab is for setting the ‘Context’, the behavior, of SQL Enlight. The first option is for ‘Analysis Context Load Mode’ with two choices, ‘Basic’ or ‘Full’. I read the descriptions of each of these modes and still didn’t completely understand what they meant. When I looked it up in the documentation, I found that the ‘Basic’ listing covered most of the standard database objects that you would expect, while ‘Full’ added in a number of additional object types:
- Message Types
- Extended Properties
- Event Notifications
Now it makes sense. Instead of saying ‘Only the most commonly used schema information is loaded during analysis’, I’d like to see something clearer, such as ‘Standard set of database/server objects used for analysis’, or something along those lines. You can see the options in the screen below. I didn’t understand what ‘schema information’ and ‘loading’ meant in the context of static code analysis until I saw the listing of objects.
You also get the means to set up default connections for test analysis and for when a script will be validated. You can enable or disable the automatic validation by SQL Server prior to running SQL Enlight. That’s a nice touch:
The final tab allows you to control the behavior of SQL Enlight within SSMS. First, you can decide whether you want the automated analysis to occur before executing or parsing the commands (still independent of SQL Server behavior, as noted in the previous tab). I’ve enabled both for the experiment. You also get to decide which rules are applied. I’ve got mine set to the default, which is all enabled rules: ‘Active Rules’. You could instead decide which rules you want to run on the fly.
The tool uses an execution context, set on the preceding screen, to determine appropriate use of the code during its evaluations. You can disable that context information here. You can also set where, and whether, it caches information to disk.
That’s it. With the system configured, it’s time to check out the behavior within SSMS.
SQL Enlight will run its analysis in two different ways: from the GUI, running against your existing code and structures, and from the command line, for automating static analysis as part of a larger process.
But, before we can run the analysis, let’s talk about what it is that we are going to analyze, the Rules.
SQL Enlight comes with a standard set of rules. You can see these rules by going through the ‘Options’ menu choice. They are a very thorough set of standards that are fairly well established within the SQL Server community, so you can’t go wrong just using their defaults. Let’s walk through some of what we have available.
First, the rules themselves are broken down into groups:
The great thing about having these groups is that you can use them to determine which checks you run against your structure and your code. You can run all active rules any time you want. You can also run any individual rule any time you want. But you may not feel the need to regularly run, say, the Maintenance rules on servers, because it’s unlikely that your maintenance routines, once established, change very much. However, you will want to run the Performance or Naming rules on a regular basis, so you can decide immediately which ones you want to run.
Within the groups, you have a set of rules that can be enabled or disabled. The rules themselves cover a whole slew of different options. Let’s just look at one example to give you an idea of how all the rules work. There is a rule called ‘SA0007 : Pattern starting with "%" in LIKE predicate’. This is a great check to have on your code. If you have a query that looks something like this:
SELECT … FROM … WHERE x.MyVal LIKE '%SomeValue%';
the leading wildcard prevents index seeks, so you will inevitably get index scans on that table for that query; this rule checks for the antipattern within your code. It’s an excellent idea. By default this rule is enabled, but you can choose to disable it. Further, some rules have parameters that let you modify their behavior, and this is just such a rule: it looks for the existence of a comment just before the statement using the leading-wildcard search. If there is a comment, the code analysis assumes that you have an explanation of why you want to violate the rule in this instance, and skips the rule.
You can create your own rules if you’re so inclined. Currently, that is a somewhat difficult process that I don’t want to document for this simple review. You can also edit, save or remove rules from your system. The GUI also gives you the ability to search for specific rules.
You can look through the documentation to see all the different rules. You can also browse through them using the GUI. Let’s see the rules in action.
There are different ways to start a static analysis session with SQL Enlight. I assume you will usually simply right-click on any given database and start the process of analyzing it by making a choice from the context menu:
For this test of the software, I’ll run the analysis of all active rules against my instance of AdventureWorks2012. I haven’t cleaned it up after a recent series of tests, so, in addition to any issues that Microsoft may have introduced, I should have one or two poor choices on display. When I select to begin the analysis of all rules, a modal window opens showing the current status of the evaluations. I also see a script window open, containing a script of all the objects on the system that need to be evaluated from code. These are primarily the programmable elements of the database, such as views, stored procedures, triggers and functions. I’m not sure why that gets created, and I don’t have to keep it around to see the output, which appears in a window at the bottom of my screen:
For my one database I have over 600 warnings from the evaluations of the rules. And they’re interesting. For example, above you see a series of suggested missing foreign keys on a table, MyBusinessEntityAddress. Funnily enough, it’s a table where I demonstrate the importance of enforced foreign keys for the optimizer. In short, these are real problems in my database. Code segments immediately show up back in the scripted window, where you can highlight the suggested error and go straight to it within the code in the T-SQL window.
That is one thing that could prove to be a major problem for very large databases, because the generated script can be too large to open within SSMS. But, as configured, you get a quick view of the code that is problematic based on the analysis you ran.
However, that detail aside, this is a fantastic collection of rules that will immediately identify a large number of very easy-to-fix issues in your system.
If you enable SQL Enlight through the ‘Options’ screen to check your queries as you execute or parse them, you get even more from the tool. In fact, I’d go so far as to suggest making this a standard part of pre-check-in for all code. Just as you would write a unit test and ensure that your code passes the unit test before checking it into source control, you could also run the code through your SQL Enlight checks. Here is a piece of code that I wrote for AdventureWorks2012:
SELECT soh.SalesOrderNumber,
       sod.OrderQty
FROM Sales.SalesOrderHeader AS soh
JOIN Sales.SalesOrderDetail AS sod ON sod.SalesOrderID = soh.SalesOrderID
WHERE soh.Comment LIKE '%Dude%'
It’s not that important what the code does or even if it’s working for the purposes of this review. But, when I execute the code, I immediately get the SQL Enlight window showing that it’s doing checks against my T-SQL and then this window:
With that, you get the opportunity to stop and fix the issues that have been identified. In the case of the sample code above, there are a couple of issues, one minor and one potentially huge:
In addition to showing the above list, the T-SQL edit window is updated to show you exactly where all the issues can be found:
Then it’s just a matter of going back through the code and making the adjustments where appropriate. This is how you should be able to manage your T-SQL code within SSMS just as if it were a true Integrated Development Environment (IDE).
While it is very useful to be able to simply browse the warnings and errors generated through SSMS, you’ll need a lot more than this if you’re planning on adding any type of automation to your build and deployment processes, especially when it comes to checking on all those coding standards and guidelines that SQL Enlight can audit for you. That’s where the command-line utility comes into play.
Because you have a command-line application available, you can run this as part of just about any process you want. You can call it from PowerShell. You can use SQL Agent to schedule jobs. You can incorporate it into a continuous deployment process. The command line itself is reasonably straightforward:
Enlight.exe command [parameters] [/quiet /nologo]
The commands you have available are:
The one we’re primarily concerned with will be Analyze.
As with any software, there’s a little more to it than immediately appears. The documentation is a little light, so I had to do a series of experiments and exchange a couple of emails with UbitSoft, but this command line worked:
Enlight analyze /targettype:serverobjects /database:AdventureWorks2012 /objecttype:database /objectname:* /report:"c:\reports\mydb.xml" /server:DOJO\RANDORI
Let’s break that down just a little. Some of the options are very straightforward: server and database shouldn’t need much explanation. You can use a trusted connection or supply a user name and password. The Target Type can only be files or server objects. This means that you can run the analysis against a set of files, probably after you check them out of source control, for example. I’m running it directly against a server in this example. You can also limit the objects accessed through the ObjectType designator:
- Server Trigger
- Dml Trigger
- Database Trigger
- User Defined Function
- Database Programmability Object
- Stored Procedure
You can also supply a specific Object Name, but in this instance, I went after all the objects.
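As a sketch of wrapping that command line in a script (the executable path and the function names here are my own assumptions for illustration; the parameters match the example above), you could assemble and launch an analysis run like this. The same argument list could just as easily be built in PowerShell:

```python
import subprocess

# Hypothetical install path; adjust for your machine.
ENLIGHT_EXE = r"C:\Program Files\Ubitsoft\SQL Enlight\Enlight.exe"

def build_analyze_command(server, database, report_path, enlight_exe=ENLIGHT_EXE):
    """Assemble the argument list for an 'analyze' run against server objects,
    mirroring the example command line in this article."""
    return [
        enlight_exe, "analyze",
        "/targettype:serverobjects",
        f"/database:{database}",
        "/objecttype:database",
        "/objectname:*",
        f"/report:{report_path}",
        f"/server:{server}",
    ]

def run_analysis(server, database, report_path):
    """Run the analysis; the return code can be used to pass or fail a build step."""
    result = subprocess.run(build_analyze_command(server, database, report_path))
    return result.returncode
```

Because the command is just an argument list, a SQL Agent CmdExec step, a PowerShell script, or a CI build task can all invoke it the same way.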
This is just about perfect; it’s everything I would want. The one and only issue I have with it is that the output is XML only, but both PowerShell and T-SQL can deal with that. Here’s a sample output:
<Target Name=&"[dbo].[ufnGetContactInformation]" Type="Function">
<Issue Rule="EX0018" Type="RuleViolation" Message="EX0018 : Missing Index with impact 88.1453 on table [Sales].[Customer] for columns([PersonID],[StoreID])." Level="Warning" Line="61" Offset="1" />
<Issue Rule="EX0018" Type="RuleViolation" Message="EX0018 : Missing Index with impact 88.1482 on table [Sales].[Customer] for columns([PersonID],[StoreID])." Level="Warning" Line="61" Offset="1" />
<Issue Rule="SA0010" Type="RuleViolation" Message="SA0010 : Use TRY..CATCH or check the @@ERROR variable after executing data manipulation statement." Level="Warning" Line="20" Offset="3" />
<Issue Rule="SA0010" Type="RuleViolation" Message="SA0010 : Use TRY..CATCH or check the @@ERROR variable after executing data manipulation statement." Level="Warning" Line="31" Offset="3" />
<Issue Rule="SA0010" Type="RuleViolation" Message="SA0010 : Use TRY..CATCH or check the @@ERROR variable after executing data manipulation statement." Level="Warning" Line="46" Offset="3" />
<Issue Rule="SA0010" Type="RuleViolation" Message="SA0010 : Use TRY..CATCH or check the @@ERROR variable after executing data manipulation statement." Level="Warning" Line="61" Offset="3" />
<Issue Rule="SA0011" Type="RuleViolation" Message="SA0011 : SELECT * in stored procedures, views and table-valued functions." Level="Warning" Line="18" Offset="19" />
<Issue Rule="SA0011" Type="RuleViolation" Message="SA0011 : SELECT * in stored procedures, views and table-valued functions." Level="Warning" Line="27" Offset="19" />
<Issue Rule="SA0011" Type="RuleViolation" Message="SA0011 : SELECT * in stored procedures, views and table-valued functions." Level="Warning" Line="42" Offset="19" />
<Issue Rule="SA0011" Type="RuleViolation" Message="SA0011 : SELECT * in stored procedures, views and table-valued functions." Level="Warning" Line="57" Offset="19" />
<Issue Rule="SA0020" Type="RuleViolation" Message="SA0020 : Always use a column list in INSERT statements." Level="Warning" Line="20" Offset="3" />
<Issue Rule="SA0020" Type="RuleViolation" Message="SA0020 : Always use a column list in INSERT statements." Level="Warning" Line="31" Offset="3" />
<Issue Rule="SA0020" Type="RuleViolation" Message="SA0020 : Always use a column list in INSERT statements." Level="Warning" Line="46" Offset="3" />
<Issue Rule="SA0020" Type="RuleViolation" Message="SA0020 : Always use a column list in INSERT statements." Level="Warning" Line="61" Offset="3" />
<Issue Rule="SA0078" Type="RuleViolation" Message="SA0078 : Statement is not terminated with semicolon." Level="Warning" Line="67" Offset="5" />
</Target>
What I like about it is that each object has the full output of all rules, so you can see that this function, ufnGetContactInformation, has 15 different violations of the rules. There are several violations of certain rules, like 'Always use a column list in INSERT statements', which had four different instances within the query. But the fact that I either have to browse XML or write a report mechanism against it is a pain. Sorry, but there it is. I’m in love with the concept, but I’d just like slightly easier access to the results. Maybe a standard output format such as JUnit, so that continuous integration tooling can produce standard reports from the results?
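That said, a few lines of script are enough to pull a summary out of the report. Here is a minimal sketch that counts violations per rule, assuming the report is shaped like the fragment above (the Target/Issue element and attribute names match that sample; everything else, including the summarizing function itself, is my own illustration):

```python
import xml.etree.ElementTree as ET
from collections import Counter

def summarize_report(xml_text):
    """Count rule violations per rule ID in a SQL Enlight XML report fragment."""
    root = ET.fromstring(xml_text)
    counts = Counter()
    for issue in root.iter("Issue"):
        counts[issue.get("Rule")] += 1
    return counts

# A trimmed-down fragment in the shape of the sample output above.
sample = """<Target Name="[dbo].[ufnGetContactInformation]" Type="Function">
  <Issue Rule="SA0011" Type="RuleViolation" Message="..." Level="Warning" Line="18" Offset="19" />
  <Issue Rule="SA0020" Type="RuleViolation" Message="..." Level="Warning" Line="20" Offset="3" />
  <Issue Rule="SA0020" Type="RuleViolation" Message="..." Level="Warning" Line="31" Offset="3" />
</Target>"""

counts = summarize_report(sample)
print(counts)  # Counter({'SA0020': 2, 'SA0011': 1})
# In a build step you might then fail the deployment when any violations exist,
# e.g. sys.exit(1 if sum(counts.values()) else 0)
```

A similar summary could be produced in PowerShell with Select-Xml, or in T-SQL by loading the file into an XML column.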
But, that’s really something of a nit that I’m picking. The main point is that I can completely automate validation of my code as part of any deployment process I have set up. That’s a huge win. Add in the fact that I can also create my own customized code checks, and suddenly my deployment processes are more efficient than ever. I’ll go one better: UbitSoft had the foresight to add MSBuild and NAnt tasks to assist you with your deployment automation, above and beyond what’s supplied through the command line alone. That truly makes this tool useful.
This is a very useful tool. I love the concept and I love most of the execution. I think the interface and the command line have a few rough spots that can be fixed over time. They’re not show stoppers, just irritants. There’s no escaping the utility provided by this tool, to be able to check such a wide variety of issues in such an easy manner, and one that you can automate.
It is so useful to be able to build it right into my automated deployment scripts for a much more useful process overall. I can decide which rules I want to run, and when, and can even, to an extent, change the way that the rules are applied. This makes for some very flexible auditing mechanisms.
While you’re likely to hit a few snags with the documentation and the slightly inscrutable interface, I’m positive you’ll find that the rest of the experience so far outweighs these minor shortcomings, and that you’ll be incorporating SQL Enlight into your own development environment in no time.