At Red Gate, we use the SmartAssembly automated error-reporting system in the applications we sell. When an error or exception occurs in the wild, it is reported back to Red Gate HQ, along with details about the error (stack trace, deployment environment, local variables, and more). Using the ‘SmartAssembly Sync for JIRA’ system, these errors are then automatically filed as bugs in our JIRA bug-tracking system. I should point out that Sync for JIRA is rather clever: it adds stack traces to existing issues, it doesn’t file duplicates, and so on.
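To give a feel for what an automated error reporter does at the moment of a crash, here is a minimal sketch in Java. SmartAssembly itself is a proprietary .NET tool, so this is not its actual code; the class, field names, and report format below are all hypothetical, but the shape is the same: install a last-chance handler for unhandled exceptions, gather the stack trace and some environment details, and ship the report off instead of letting the crash disappear silently.

```java
import java.io.PrintWriter;
import java.io.StringWriter;
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of an automated error reporter, in the spirit of
// SmartAssembly's reporting (which is proprietary .NET code, not this).
public class ErrorReporter {

    // Gather the kind of data described above: the stack trace plus
    // details about the environment the error occurred in.
    static Map<String, String> buildReport(Throwable error) {
        Map<String, String> report = new LinkedHashMap<>();
        StringWriter trace = new StringWriter();
        error.printStackTrace(new PrintWriter(trace));
        report.put("exception", error.getClass().getName());
        report.put("message", String.valueOf(error.getMessage()));
        report.put("stackTrace", trace.toString());
        report.put("os", System.getProperty("os.name"));
        report.put("runtime", System.getProperty("java.version"));
        return report;
    }

    public static void main(String[] args) throws InterruptedException {
        // A worker thread that dies with an unhandled exception,
        // standing in for an error "in the wild".
        Thread worker = new Thread(() -> {
            throw new IllegalStateException("simulated crash in the wild");
        });

        // Last-chance handler: a real reporter would POST the report to
        // an ingestion endpoint; here we just print one field of it.
        worker.setUncaughtExceptionHandler((thread, error) -> {
            Map<String, String> report = buildReport(error);
            System.out.println("reported: " + report.get("exception"));
        });

        worker.start();
        worker.join();
    }
}
```

A server-side companion (in the role of Sync for JIRA) would then deduplicate incoming reports, typically by hashing the stack trace, and file or update an issue in the bug tracker.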
We have now been using this system for about a year, and were recently surprised to discover that it files more bugs than our top 20 testers put together. In fact, SmartAssembly finds about 15 times as many bugs as one of our most prolific testers.
66% of the issues (bugs, enhancements, etc.) raised in the past year came from SmartAssembly Error Reporting.
In the past year, we have learned (rather embarrassingly) of thousands of different errors in our products. Previously, we would have had to rely on good-hearted customers sending in limited amounts of reproduction data, but thanks to SmartAssembly Automated Error Reporting we automatically get all the data we need to reproduce a bug.
However, the bugs found by this handy tool tend to be of a different type: unhandled exceptions, crashes, and unexpected environments. ‘Flesh-and-blood’ testers, on the other hand, tend to find UI glitches, workflow problems, and the like. So SmartAssembly error reporting definitely complements real testers rather than replacing them.