A breathing-space for Software corpulence

Published 28 April 2011 3:16 pm

A guest editorial for Simple Talk

The greatest struggle for anyone designing, building or maintaining applications is the fight against complexity. It chokes the lifeblood from any application. Chuck Moore, one of the founding fathers of IT, recently said in an interview with Simple-Talk:

“Complexity is the problem. Moving it from hardware to software, or vice versa, doesn’t help. Simplicity is the only answer. … I despair. Technology, and our very civilization, will get more and more complex until it collapses. There is no opposing pressure to limit this growth.”

This is not the howl of a lone wolf. Brian Kernighan once wrote, “Controlling complexity is the essence of computer programming.” And Ray Ozzie too, “Complexity kills. It sucks the life out of developers, it makes products difficult to plan, build and test, it introduces security challenges and it causes end-user and administrator frustration.” Probably the most memorable take on the problems of software complexity comes from Tony Hoare. “There are two ways of constructing a software design. One way is to make it so simple that there are obviously no deficiencies. And the other way is to make it so complicated that there are no obvious deficiencies.”

A long time back, commercial-scale applications became too complex for mortals to track without using complex application-management applications. Project and configuration management systems were joined by bug-tracking systems, build management, automated testing, continuous integration and release management. With tools like these, we could all cope with more complexity, and run larger software teams. Integration was ever the issue, especially with the temptation to include the entire life-cycle of an application, including the architectural, deployment, and production-management phases; this is particularly true when sourcing your tools from different vendors. Within the past five years, Application Lifecycle Management (ALM) tools, such as Microsoft's TFS, HP's ALM Suite and IBM's Rational Team Concert, have promised the obvious solution: accepting a degree of single-vendor lock-in in return for a single integrated set of tools in which application providers can manage all aspects of their work.

This increasing automation is well-suited to promote 'Agile' methodologies, since it allows much shorter build cycles and provides much closer cooperation between developers and testers. It certainly promotes the flow of information and makes it easier to spot potential problems.

My own nervousness comes partly from thinking of those ground-breaking tools such as distributed source control, software testing frameworks or development wikis that would never have penetrated through wall-to-wall management suites based on less open-ended ideas. I also wince at the subtle problems that would ensue from integrating cross-application databases into an ALM. Most of all, I wonder if there is now even less motivation to tackle the root cause of the increasing inflexibility of application management: the uncontrolled complexity of applications. Now that we can throw more silicon at corpulent software so cheaply, and use applications to manage the complexity of applications, where is the cost-benefit of keeping things as simple as possible?

Your views would be very welcome, as always.

8 Responses to “A breathing-space for Software corpulence”

  1. BuggyFunBunny says:

    So long as 1) coders rule the process and 2) coders are rewarded, however indirectly, by LOC counts, then complexity is guaranteed to rule. I first saw this with VAR software (i.e. vertical market "total solutions", general ledger driven suites) running on early RDBMS in the very late 1980s. Oddly, on *nix, which is itself built on the principle of small, independent tools working together. Odder still, Linux was built on a monolithic kernel (the part Linus did) at about the time that micro-kernels were au courant. Linus took a good deal of heat for years. As to which is inherently more complicated… but micro-kernels are little heard of these days.

    In the world of this site, BCNF datastores provide the basis for enforcing simplicity and modularization. That ole one fact, one place, one time voodoo that coders hate to hear about. Interface generation from schemas will be the silver bullet, but only for those brave enough, and with little enough legacy that can be ignored.

    I would offer, to emulate Ben’s father’s friend, one word: orthogonal. With that notion always in mind, complexity can be handled. It’s when integration, and all the sharing that entails, takes over that spaghetti rains down. Unix was a reaction to Multics; the naming might be apocryphal, but the motivation was real.

  2. thensley says:

    >> where is the cost-benefit of keeping things as simple as possible?

    In the cloud. You’re going to be charged for your CPU, disk usage, and bandwidth.

  3. mbird says:

    A related favorite quote of mine that many software developers could learn from

    “Perfection is achieved, not when there is nothing left to add, but when there is nothing left to remove.” — Antoine de Saint-Exupéry

  4. paschott says:

    What’s the benefit of keeping things simple? The biggest one is knowing that you’re going to need to touch that code again in the future. We have areas of our apps that nobody really wants to touch because they were poorly written the first time around, but we’ve agreed that we’ll have people tackle those areas to enhance our ability to maintain the code – just because we want anyone on the team to be able to do that work. I also agree with thensley – when you move some of this to the cloud, you get charged for all of your usage. That’s a great incentive to keep things simple and small as much as possible. No more “pull back everything just because that’s what we’ve always done”, because that adds up quickly.

    I have mixed thoughts on scaling up vs. code efficiency. Sometimes it’s more cost effective to buy a newer server until we have time to do a re-write. It takes someone who’s aware of that problem to keep an eye on that so we do eventually tackle the problem. The hard part there is delivering those new features and changes in a timely manner while tackling that legacy code. Customers may understand a period or two of few to no changes, but they’ll eventually start looking elsewhere if they keep seeing no major improvements, even if your code base is getting more and more efficient.

    I’ll still maintain that to a seasoned developer, the major benefit of simplicity is being able to go back and work on your code without pain. The speed of our business pretty much demands that.

  5. DThrasher says:

    There’s a difference between essential complexity and accidental complexity.

    Some problems are truly difficult to solve and require sophisticated solutions. To paraphrase Einstein: Everything should be made as simple as possible, but no simpler. I’ve worked on several applications where the problem domain was tough, with conflicting requirements and priorities, and the resulting business logic was tough for any layman to follow. But that’s precisely what made the application so valuable.

    What crushes many software projects is the accidental complexity. And accidental complexity, as its name implies, creeps in inadvertently despite our best efforts. Over time, the cruft accumulates, obscuring the essential details of the system. It’s software entropy, and like thermodynamic entropy, it takes work to overcome it.

    Better tools can give us more leverage over the problem of accidental complexity, but it will always remain a core development challenge.

  6. GDrauch says:

    Thermodynamics has something called entropy. One way of looking at entropy is it is a measure of the complexity of a thermodynamic system. The system ALWAYS gets more complex because entropy is always increasing. Only at absolute zero does entropy stay constant. It NEVER reduces. Software design is like that. Complexity never stays the same, it always gets worse…unless hell freezes over.

  7. BuggyFunBunny says:

    @GDrauch:
    You’ve got the notion of entropy exactly backwards. Entropy is the reduction of complexity, eventually to stasis. Increasing entropy yields less complexity, as there is less going on in the system.

    You needn’t take my word for it. Here’s a quote from the Wikipedia:
    These processes reduce the state of order of the initial systems, and therefore entropy is an expression of disorder or randomness.

    Here’s the cite:
    http://en.wikipedia.org/wiki/Entropy

    Unless your assertion is that software complexity is a function of randomness. If so, then I suspect that most would disagree. The randomness of entropy is both Brownian in nature and a measure of energy in the system; i.e. entropy is manifest in the diminution of global energy in the system. Which, itself, is the essence of thermodynamics: systems devolve to absolute zero if no new energy is injected. The expanding universe is the ultimate manifestation, it’s devolving to an infinite expanse of just background radiation. Dante was right: the center of hell is cold, we won’t be burning in hell.

  8. paschott says:

    I seem to remember having an Admin who thought this way. If we just drop the database the users are trying to access when there’s a problem, that small problem goes away and we can close the ticket. The system is greatly simplified at that point and the problem can then be solved by simply removing access to the app. All problems solved and simplicity is achieved. :)

    Personally, I think I’d like to avoid entropy in our main projects. They keep us employed. Good reminder about the difference between entropy and chaos. Now if we could just find that balance between chaos and simplicity when we’re working on a system…..
