Software Engineering: Just How Immature is it?

"Facts and Fallacies of Software Engineering" by Robert L. Glass has become a classic of Software Engineering as cherished as 'The Mythical Man-Month: Essays on Software Engineering' by Frederick P. Brooks. They seem as radical today as when first written, mainly because the software industry repeatedly fails to learn from its mistakes. Dwain Camps reviews the book.

‘He that will not apply new remedies must expect new evils, for time is the greatest innovator’

“The Essays” by Francis Bacon 1561-1626

I often wonder whether ‘Software Engineering’ can really be called “engineering,” given the obvious immaturity of the science. The term ‘engineering’ implies a deterministic process, whereas software development seldom rises above the chaotic.

When I was in university, I taught in an engineering faculty and took many engineering courses. Each engineering discipline that I encountered was generally governed by some set of fundamental rules. For example, structural engineers will be familiar with “statics” (the rules governing the forces on objects at rest) and “dynamics” (the rules governing the forces on objects in motion). These rules are all based in fundamental mathematics, more complex than 2+2=4, but mostly deterministic in their results all the same.

After thirty years or so in the business of developing software, I’ve found that the results of software “engineering” are rarely deterministic. In fact, the results, in many cases, could almost be characterized as random, and have been termed “chaotic” – as in the Capability Maturity Model (CMM) level one. Higher levels of the CMM emphasize repeatability as a goal, and it is an important aspect of the discipline of software engineering to be able to produce software systems with a satisfactory and predictable outcome, in much the same way that the structural engineer seeks to build bridges that remain standing, and to do so repeatedly.

Some Well-researched Facts and Fallacies

I recently reread the book “Facts and Fallacies of Software Engineering” by Robert L. Glass. If you haven’t read that book, and you have any level of involvement in the management of software projects, I suggest that you’ll benefit from reading it. Note that it has also been listed as one of the top 10 books and resources to become a great programmer. Even though it was published in 2002, it remains solidly relevant in my experience. I’m not saying that you’ll enjoy reading it, because it may just shatter some of your perceptions about what you know of software engineering. I can promise you, though, that it will probably formalize many things that in your heart you know are true, but were probably too scared to admit.

There are many nuggets of wisdom that he summarizes from the hard-earned experience of the sixty or so years that software has been developed. I particularly liked

  • His definition of what constitutes quality in a software development context.
  • His take on research in the software domain, particularly as I’ve always been what he calls a “practitioner.”
  • His clarity on software defects, and particularly his re-designation of “the testing cycle” to “error correction.” I’ve always thought that “testing phase” or “testing cycle” never really properly emphasized the goal of that step.
  • His strong belief in code inspection.
  • His observations on measurements and metrics.

He also has this incredible ability to express his wonderment at how we keep forgetting all of these hard-learned lessons, and repeating the same mistakes.

Perhaps most of all, I like his lack of fear of slaying sacred cows.

There is a common theme pervading much of that book: we fail to learn from experience. As human thinking machines, we possess the phenomenal capacity to learn from our mistakes. Why does this adaptive behavior fail to kick in with software engineering?

In the book, Glass presents fifty-five facts, bolstered by references to other writing. These facts cover topics ranging across management, the software lifecycle, quality, and research and education. Typical of these facts is ‘For every 25 percent increase in problem complexity, there is a 100 percent increase in complexity of the software solution’. Some of these facts are truisms that are familiar to any seasoned professional programmer, such as ‘Adding people to a late project makes it later’, but many are unfamiliar and often thought-provoking, such as ‘Understanding the existing product consumes roughly 30 percent of the total maintenance time’.
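
To make the arithmetic of that first example concrete, here is a minimal sketch (my own illustrative reading of the fact, not a formula from Glass's book): if every 25 percent increase in problem complexity doubles the complexity of the solution, then solution complexity grows exponentially with problem complexity.

    # Illustrative only: a naive reading of Glass's complexity fact, assuming
    # solution complexity doubles for every 25% of added problem complexity.
    def solution_complexity(problem_growth_pct, base=1.0):
        """Relative solution complexity after the problem grows by the given percentage."""
        return base * 2 ** (problem_growth_pct / 25.0)

    for pct in (0, 25, 50, 100):
        print(f"problem +{pct:3d}%  ->  solution x{solution_complexity(pct):g}")
    # problem +  0%  ->  solution x1
    # problem + 25%  ->  solution x2
    # problem + 50%  ->  solution x4
    # problem +100%  ->  solution x16

Read this way, a modest-looking increase in what the users ask for can quadruple (or worse) the complexity of what must be built, which is one reason scope changes hurt so much.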

Some “Facts” of my Own

Instead of reiterating the facts and fallacies in that book, I will now presume to mention a few “facts” from my own experience.

Fact 1: Software engineering is immature as a discipline because it lacks fundamental rules to govern the activities of its practitioners.

This is probably one of those facts that you’d prefer to ignore, and it is one that goes against the grain for anyone who proudly wears the “software engineer” title. You may not accept this fact because the community already has CMM, and what is that if not a set of fundamental rules for developing software? Let’s just say that CMM also has its detractors, who may best be represented by its parody, the Capability Immaturity Model (CIMM). Or perhaps it is just that CMM itself is immature (and while this link too is a bit old, it is not at all dated, as the CMM has not significantly changed since the critique was written).

If you’ve read the Facts and Fallacies of Software Engineering, you’ll realize quite early on that many of its facts implicitly lead you to this conclusion, what I consider to be my much more general “fact.” Many of those facts are telling us precisely what things we are doing wrong, and precisely why these wrongs persist. Perhaps the problem is that failure is less obvious than in traditional engineering professions. If aeronautical engineers lacked fundamental rules, planes would drop out of the sky like autumn leaves, and aeronautical engineering would be declared an immature discipline. Because failures in software engineering can be much more subtle, and because those failures may often not be immediately detected, it is no wonder that software projects often struggle and too often fail. For example, hidden requirements are often a subtle point of failure in a delivered software system, and while that doesn’t necessarily imply that the entire project was an utter failure, it is likely to generate plenty of rework that will push at least some of the project metrics into the failure range.

Throughout the sixty-odd years that programming, later rebranded as software engineering, has been around, many academicians and industry heavyweights have attempted to add maturity by introducing or updating software engineering tools and processes. As a result, tools and methodologies abound. Since Glass makes a clear case that tools tend to represent incremental improvements, one can argue that this represents overall improvement. In my opinion, since any tool or methodology introduced at this stage of the game is unlikely to be revolutionary, these incremental improvements may just be muddying the waters; for example, when they require you to broaden the skills inventory within your development teams, decreasing the level of specialization of those same teams. And yet, because software engineers and technology companies love shiny new tools, they get adopted, used and too often discarded (as Glass pointed out) when the next, shinier tool comes along. Not to mention the folks who make their living by selling these tools: they have a vested interest in seeing new stuff come out so that it then gets purchased!

Fortunately, since I’m not the only person saying this, and some of the people who are saying it are a lot smarter than I am, there may be a light at the end of a very long and discouraging tunnel. Enter Software Engineering Method and Theory (SEMAT). Here we have an organized attempt by a serious group of professionals (academicians and practitioners, in fact), who really have a handle on what discipline currently exists in software engineering, to formalize the methods and theory that should guide it but are oftentimes seriously lacking in day-to-day project undertakings. Let us sincerely hope that their recent release of the fundamentals (mentioned in the conclusion of this article) offers the community a fresh and revolutionary approach, one that doesn’t suffer the failings of its predecessors.

Fact 2: The scope of any software project will consistently be under-defined at the project’s inception, and will expand to encompass the project’s minimum requirements by the time the project is accepted.

This is better known as “scope creep” and is indeed the bane of every project manager. In Glass’s book, he is quite specific that project estimates are being done at the wrong time (at the start of the project), so my fact is closely related to his. You either:

  • Embrace this fact (as in Agile software development methodologies),
  • Absorb it and end up with cost/schedule overruns,
  • Contain/manage it through change requests (which may also result in cost/schedule overruns but at least you don’t look like such a crappy project manager),
  • Or you can end up with a runaway project wherein requirements never stabilize (as described by Glass).

To state the blatantly obvious, engineers solve problems and software engineers do this through software. Since you need to define any problem (“the requirements”) before solving it, one has to wonder why as software engineers we are collectively not so great at initially defining the problems we are trying to solve. Is this not a measure of immaturity, leading us back to my first fact?

Normally at the start of a project, the intrepid software engineer must provide an estimate and a scope of work. Glass’s facts about estimation suggest that estimates are overly optimistic at this point. I would say that the same thing is true for project scope, and undoubtedly for nearly all of the same reasons. Many of these reasons are political in nature, but others are due to the fact that the people asking for the work to be done have merely an inkling of what it is they want, or of which of their many business problems the software system is to solve. Scope evolves (changes) along with the expectations of the system’s ultimate end-users.

Sometimes the scope creep occurs because a project runs late and the business environment changes. Ultimately it doesn’t matter why it changes, only that it does, and that this impacts the project’s likelihood of success. Whoever is measuring that success will be taking the amount of scope creep into account, and this may impact that measurement in a negative or a (counterintuitively) positive way.

The statement that, at acceptance, the project meets the minimum requirements is probably true by definition, because we assume that the users who commissioned the system’s development only accept it if it does.

Fact 3: The most important activity of software testing is unit testing. If unit testing were done better (more thoroughly), the overall testing effort by an independent testing team would be reduced.

There are those who will probably argue with my opinion here, and rightly so, as there are many important types of software testing. Some of these are important because of the niche they apply to (not all types of software testing apply to all types of projects). Other types are important but too broad to expect that all of them can be done within unit testing. But perhaps the naysayers will at least agree that unit testing is required in all software development projects, so perhaps it is more widely important than most of the others.

Just like the waterfall software development methodology, software testing has a series of phases that it goes through, while testing different aspects of a system’s suitability to its purpose. Often, if software fails in an early stage of that testing, the later stages cannot be executed with any reasonable degree of effectiveness.

Unit testing is arguably the earliest of the software testing stages. The software engineers who are writing the code (the developers) are usually the ones doing this unit testing, although I have seen instances of independent testing teams doing unit testing as well. Unit testing is the foundation of any software testing project. Code that has been poorly unit-tested will never get through the later stages of testing.

I’m sure you’ve heard it said that developers are the least well suited to doing software testing. Overall I would agree with this; however, in the unit-testing stage it is imperative that the developers perform this function as thoroughly and completely as they possibly can, mainly so they don’t waste the time of the testing engineers who will follow.

So why shouldn’t software developers be responsible for testing? As the argument goes, they are biased towards testing things that they know will work, and ignoring things that they didn’t code for. Only someone independent to the coding step is unbiased enough to look for the latter.
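
To illustrate that bias, here is a minimal sketch in Python (the function and test names are hypothetical examples of my own, not anything from Glass or this article): the first test is the happy path a developer naturally writes, while the remaining tests probe the edge cases that are easy to skip under schedule pressure, and which an independent tester would otherwise end up finding.

    import unittest

    def parse_quantity(text):
        """Parse a user-entered quantity; hypothetical example function."""
        value = int(text.strip())
        if value < 0:
            raise ValueError("quantity cannot be negative")
        return value

    class ParseQuantityTests(unittest.TestCase):
        # The happy-path test a developer is most likely to stop at.
        def test_plain_number(self):
            self.assertEqual(parse_quantity("42"), 42)

        # Edge cases that tend to be skipped during unit testing and then
        # surface later as defects for the independent testing team.
        def test_whitespace_is_tolerated(self):
            self.assertEqual(parse_quantity("  7 "), 7)

        def test_negative_is_rejected(self):
            with self.assertRaises(ValueError):
                parse_quantity("-1")

        def test_garbage_is_rejected(self):
            with self.assertRaises(ValueError):
                parse_quantity("lots")

    if __name__ == "__main__":
        unittest.main()

The point of Fact 3 is simply that if the developer writes the second kind of test as well, the independent testers can spend their time on integration and acceptance concerns instead of rediscovering these basics.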

If an engineer is only as good as the problems he or she solves, and we assume that every engineer strives to be the best that they can be, why are they considered unable to cast loose the chains of their own biases and thoroughly test the software that they produce? Could this be a training issue? Is it possible to instill the discipline necessary in developers, such that they can be relied upon to thoroughly test the software that they produce?

Since both coding and testing are learned skills, shouldn’t it be possible to train both facets into a developer’s (or tester’s) skill set so that they are sufficiently adept at both, and thus improve the results of unit testing? While there may be university courses that focus on unit testing, I’d be surprised if they’re common.

This then brings us back around to our first fact. How can software engineering be considered mature if professional software engineers cannot be trained to thoroughly test the software that they produce?

I will leave my valued readers to ponder this line of thinking, and decide for yourselves whether my arguments have merit, or are simply circular and used to justify my original thesis. Keep in mind that I am not saying that independent testers aren’t important to the proper testing of software. What I am saying is that developers should be better at proving that their work is ready for the independent testers to take a crack at. And that they probably would be if the discipline of software engineering were a little bit more mature.

In Glass’s book, he suggests that unit testing often suffers from schedule pressures, i.e., that developers have pressure on them to complete their coding task(s). I won’t argue with that. I will suggest that there’s another factor at play here too: ego.

Let’s try a thought question. How would you rate your driving skills? If you said “better than average,” you’d be among the 95% of drivers who think their driving skills are better than average. It is statistically impossible (assuming driving skills fall into a normal distribution) for more than 50% of drivers to be better than average. I would argue that if you ask developers this question (not about driving skills of course, but rather their developer skills), you’d see a statistically impossible number rate themselves as better than average. This means that developers probably believe that once they’ve coded something, because they’re better than average, it probably works, so they don’t even do the minimum unit testing required to confirm it. How immature is that?
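
As a quick sanity check on that statistical point, here is a throwaway simulation (illustrative only; the 95% survey figure above is the author’s): under a symmetric distribution of skill, only about half of any population can sit above its average, however the individuals rate themselves.

    import random

    # Draw "skill" scores for a large population from a normal distribution
    # and count how many land above the population average.
    random.seed(0)
    skills = [random.gauss(100, 15) for _ in range(100_000)]
    average = sum(skills) / len(skills)
    above = sum(1 for s in skills if s > average) / len(skills)
    print(f"fraction above average: {above:.3f}")  # prints roughly 0.500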

And in shops where developers are distinct from testers, I’d say that this can exacerbate the problem, because “developers develop” and “testers test.” Less unit testing is certainly going to make the testing team’s job easier when their metric is “number of bugs identified.” That’s because they’re identifying bugs that should have been caught during unit testing. How many times have you seen a software system fly through its testing cycle, and then zoom even more quickly through its User Acceptance Testing (UAT) because it has very few issues to find? I’ve seen it at times, and let me tell you that there is no better way to relieve schedule pressures than to get through both of those software development phases quickly!

Fact 4: Process cannot replace initiative, innovation or technical expertise.

CMM is mostly about process, and since it is about the most widely accepted current example of “fundamental rules and practices” in the software engineering discipline, I’m taking a direct shot at it. Fundamentally, process is a way of applying the hard-learned lessons and experience of what works and what doesn’t across the great unwashed masses. OK, perhaps that was neither fair nor nice. But coming into the profession there has always been this huge glut of fresh-faced and inexperienced engineers straight out of university, who have all the academic knowledge and none of the seasoning that the experienced professionals have. So process is meant to guide them, and help them to avoid mistakes and pitfalls.

But is that all it is? Remember what I said about developers’ skills and the bell curve above. If half of all engineers right out of university (or seasoned engineers for that matter) are of less than average skill, perhaps much of this process stuff is meant to uplift them, or make up for shortfalls in their skill sets. There’s nothing wrong with that, although it seems like the elephant in the room that nobody wants to talk about.

I would say that there’s a corollary to this fact:

Great software is not developed by average developers.

I would expect that the best you can hope for with a team of all average developers might be slightly better than average software, assuming that enough process is included to allow the work effort to proceed without too many glitches.

Technical expertise usually comes with experience. Who do you go to in your company when you need to solve a particularly challenging technical problem that you’re struggling with? I know, your ego says that never happens! If it does, it is likely that this will be a pretty senior person on your team. And that person is probably senior because he or she has been around the block.

Sometimes seniors become seniors early because they exhibit the traits of innovation and initiative. These are the movers-and-shakers. Everybody who’s been around for a while knows at least one (and maybe it’s you). It is highly unlikely that a person becomes senior, without developing deep technical expertise in something, if their sole redeeming quality is that they know how to follow process.

Justification and the Fallacies I Chose to Omit

Unlike Glass, I have not done extensive literature searches to back up my “facts.” Instead I am relying on lessons learned through many of the projects I have personally managed, or in others that I’ve been simply involved in. I do however encourage my readers to engage in a bit of informal peer review by posting comments to this article.

I have omitted listing any specific fallacies because most of my facts can be restated in the negative as a fallacy (although that is not what Glass did for his fallacies).

I have also omitted many other facts, some of them elucidated by Glass and others that have simply become very well known to me after many years in the industry. Many of these facts might also lead one to the conclusion that software engineering is inherently immature, which raises the question of whether I selected my facts to back up my conclusions. Since that is the general thesis of this article, it would seem inevitable that I have done so.

However I will also say that I cannot think of a single fact that I’ve come to believe after years of experience that would back up the alternative argument: that software engineering is a mature discipline. I’ve seen tools, methodologies and trends that are attempts to improve maturity. But what I have not seen is any panacea that works in 100% of the cases 100% of the time. While that is probably an unattainable goal, ask yourself if you’ve ever seen something that will work in a significant fraction of the cases, even most of the time.

2+2 always equals 4. That is something that works 100% of the time. Addition is a rule that works in 100% of the cases. Arguably (and Glass makes a great case for this), software development is significantly harder than addition. Johann Wolfgang von Goethe said “everything is hard before it is easy.” Perhaps this can also be construed as a measure of maturity.

Conclusions and Recommended Reading

I’m not really expecting anyone to come along and tell me I’ve proven my case that software engineering is immature. I’ll be pretty happy if all I have are a handful of people nodding their heads in agreement as they read through my “facts” and then deciding maybe they should read the facts and fallacies of a true champion in the software research field.

Another thing that would make me extraordinarily happy is to learn that SEMAT has taken a strong foothold, especially if it begins to really address, within my lifetime, some of the challenges that are keeping software engineering immature.

If perchance you do agree with me that software engineering is not as mature as people would like to think, your next step (if you haven’t already done so) is to visit the SEMAT web site and download the Essence Kernel. At this point, I believe you’ll find that some of it is a bit on the abstract side. However, you do need to start somewhere and perhaps this is it. By all means, read some of the background and vision espoused by the organization’s members. You can even see that list, and I’d bet that there are a few names there you’ll recognize (for example, Robert L. Glass).

After you visit SEMAT and read the Essence Kernel, decide for yourself whether the goals expressed by that organization are worthy and whether this is a good first step.

Then you should come back here and express your opinion of my opening question.

  • Robert young

    It’s the parent
The real engineering disciplines are, by definition, the implementation of science. Chemical engineering :: chemistry. Electrical engineering :: physics. Civil engineering :: statics and mechanics. And so on.

    Software zealots like to have it both ways: we are engineers who discover new stuff every day, aka we do the science, too. Balderdash. The only structured bit of science in commercial software was the relational model. It ran up against COBOL, and thence java, and so has been largely exiled.

    We need more structured data and less structured programming.

  • Eric

    Advances in Testing Practices
    Just witness, in the past ten years, the growing adoption of the Test Driven Development approach, the fruit of many practitioners promoting engineering discipline in software development. I would say that the "professional" status of a developer nowadays can be judged not only on the computing efficiency his code achieves, but also on the comprehensiveness of the test cases he constructs along with his deliveries of code. I think it is one area that shows real progress in the software engineering discipline, compared with 20 years before.

    It would be a dream job for me (I'm not there yet) to work on a development team equipped with tools that could alert the project manager, with detailed tracking and follow-up tickets sent to the involved parties automatically, should the product fail an automated build overnight.

    Even though it would not help the practitioner win every battle, or necessarily the most important ones, it would still be so cool to me…

  • David K Allen

    Yes, to unit testing
    I agree with the author about the importance of unit testing as well as the follow-up functional testing. I think writing acceptance criteria on stories, and reviewing them with the team, is a helpful way to mitigate blind spots in developer testing.

    And I agree with the commenter Eric that TDD is a powerful tool. But it is rare for me to find someone who feels strongly enough about TDD to do it. If I found such a place, I would love it.

  • Steve Naidamast

    I agree with the article but have some of my own notes…
    I completely agree with the author’s contentions in his article; software engineering as practiced by the profession is still a rather immature endeavor.

    However, I cannot place the fault on software engineering itself considering that many good books have been written on its practice, with many good concepts.

    Software engineering itself is not a hard and fast concept but incorporates basic, common-sense solutions to the many problems that the software development community faces on an international scale.

    That being said, the problem facing the adoption of good software engineering practices basically falls into the realm of simple common-sense.

    At the top of the problem list you have senior management that did everything possible to destroy the vital hierarchical foundations of Information Technology organizations (i.e. business analysts, system analysts, quality control, etc.) by outsourcing such functionality on a massive scale, as well as simply eliminating such positions, both in the cause of cutting costs.

    Technical management for the most part in the United States is fairly poor and rather ignorant of the very processes they are overseeing. I can count on the fingers of one hand how many good managers I have worked with in my 41 years in the profession.
    But let’s not stop here.

    Software vendors are right up there as well with their penchant for pushing out new technologies at an ever-increasing rate whether they are needed or not. Nonetheless, any decent mature technology that has acquired a good knowledge base within its supporting community is then shunted aside as “legacy” tools and techniques, with the subsequent effect of encouraging all developers using such tools to “move on”.

    And then we have the developers themselves, many in the younger generations, who simply can’t seem to grasp the idea that mature technologies have a lot to offer over the newer tools precisely because they are mature. So they, along with the vendors, help rid the profession of good tools and techniques in their narrowly focused desire to implement the latest and “greatest” technologies so they can feel “cool” and “up to date”.

    The constant trauma of such a volatile process makes it nearly impossible to implement sound software engineering practices in organizations simply because there is no real maturity to the development environment itself and few managers to guide such process.

    Of course we have the constant age discrimination in the field, which only adds to the problem by eliminating true “senior” personnel in favor of younger, less experienced personnel.

    In other engineering professions this level of trauma simply does not exist. For example, in the aeronautical engineering profession it takes approximately 7 years to design and produce a new prototype based on existing, sound engineering principles. If you upset this process by introducing the level of trauma we have in our profession, you end up with what happened with Boeing’s design of the 787 Dreamliner; and not that this is a bad plane. On the contrary, it incorporates some very new technologies, which individually are quite sound. However, instead of taking into consideration the amount of time that would be needed to produce such a design as a result of new technologies and new processes, the plane was given about the same amount of incubation time as its predecessors.

    Serious and dangerous issues have occurred with the 787 as a result, and I believe a grounding of the entire fleet was recently warranted. This is something that had never happened to the Boeing Corporation before.

    Let’s take another example in the same engineering arena with the development of the McDonnell Douglas F-15 Fighter. It is a phenomenal design, and why? Because its designers and production managers understood how this plane had to be built, which was completely around its ALR avionics package that was key to the entire aircraft.

    Today we have the F-35, which is a design disaster for all of the same reasons we have so much software project failure: too many cooks in the kitchen, subsequently too many changing requirements, and not enough emphasis on what the US Air Force actually needed. In this rush for a 5th-generation fighter the US has also discontinued the absolutely terrific A-10 Thunderbolt that the Army relied on heavily for ground support.

    Many in our profession can understand the issues with the F-35 for the reason that they have experienced similar issues with their own software development work.

    To cite a specific example in our own profession I submit the current and ongoing divisiveness in our field over Standard ASP.NET and ASP.NET MVC, which I have written on many times.
    Standard ASP.NET was designed around certain requirements and as a result quickly became a standard in Internet software development. To attract people from the Java Community and its own Castle Project community, Microsoft introduced ASP.NET MVC around 2010. This attracted a substantial number of newer developers and fairly soon we had Standard ASP.NET being called a “legacy” technology. The MVC variant was the better and correct way to program the Internet.

    So who elected any of the MVC proponents to determine that MVC is superior to its more “mature” sibling?

    Yes, there are issues with Standard ASP.NET, but there are as many with ASP.NET MVC that its proponents simply ignore. The result is a far more complex Internet development environment with less capable and mature tools, all doing the same thing that Standard ASP.NET did and still does (while also still being refined by Microsoft).

    In this case an entire decade of knowledge is being replaced with a completely newer set of tools that in all reality benefit no one; especially not the business organizations that are to be supported.

    This is the direct result of industry hype that is a result of many of the factors I have already outlined.

    In reality, there is no real need for the theories that comprise the SEMAT software engineering concepts, though there is no reason to consider them unsound.

    However, software engineering is simply following a set of development guidelines that are based upon common sense. You don’t need a host of patterns and processes to determine that one is following sound software engineering practices, and even so, using them without common sense can get you a bad product just as easily.
    Stephen McConnell probably wrote the bible for software engineering with his 1996 “Rapid Application Development”, which does not in any way imply a way to get software projects completed at unheard-of speeds. Instead this publication denotes a set of tested practices that, if used properly, will result in a quality product.

    Subsequently, until the people in our profession mature, stop attempting to accumulate expertise with the latest tools via the hype of the vendors (and the open-source projects) that produce them, and instead promote sound, mature technologies, true software engineering will never become an environmental practice in our profession. It simply has too many factors undermining it to make this a reality…

  • Steve Naidamast

    Comment Appears 3 Times
    My comment has appeared on this page 3 times. Yet, I remember quite distinctly hitting the "Post Comment" button only once.

    I guess this page was not "engineered" very well…

  • Robert young

    Commentary
    If you Refresh while on the article page, the engine replicates your last comment. Keep hitting Refresh, and the comment replicates. Consider it Tribble Commenting. A note to the Editor, and he’ll zap the extras. I’ve done it a few times.

  • Phil W

    Yes, immature
    It’s been 60 years and code is still written (in fact mostly assembled) into complexities that are rather untestable. You still can’t be sure that your personal data won’t be stolen from some web site. This is a sorry state of affairs, especially when compared to hardware engineering of devices built from proper re-usable components. If there ever is software engineering it’ll look like hardware engineering where you pick a bunch of well-tested standard parts and plug them together into your product with a minimum of new components you add yourself.

  • Robert young

    Smarter Than A Fifth Grader?
    — If there ever is software engineering it’ll look like hardware engineering where you pick a bunch of well-tested standard parts and plug them together into your product with a minimum of new components you add yourself.

    but, but, but… isn’t that what Object Oriented Design/Programming gives us? Isn’t it???

  • puzsol

    Out with the old, in with the new
    Steve Naidamast gave a good example of technology churn, ASP.NET into ASP.NET MVC… but having gone through the transition I can say the user experience is worth it, and so is the developer experience. In my opinion, it’s a case of the right technology to meet the user demands. If all you have are forms for entering data, the tried and true technology is great. The minute you want some part updated via AJAX, or want to add some custom user-side scripting, it’s a pain. The paradigm of the whole web form being re-constructed from the postback just to send a couple of bytes of data wastes bandwidth and CPU, and the performance suffers. MVC actually works with the browsers rather than trying to shoe-horn a Windows-application-like layer over them.

    Yes the technology is nowhere near as easy to use, and there are still a whole lot of development standards that could/should be implemented. But if you ask me, it is still the right (or at least way closer to it) technology for the modern use of the web. Perhaps it’s too flexible at the moment, but that’s what I love about it. I will never go back to classic asp.

  • puzsol

    Testing is the key
    I didn’t disagree with anything Dwain wrote; is it the whole truth? Probably not.

    One thing I’m struck by is the difference between other engineering disciplines and software when it comes to testing. If you are building a plane, the real world will show whether it works or not… but hopefully you put the design through some wind tunnel testing and knew that already, and before that ran some general algorithms over the proposal to see if it was even worth making a model…

    What strikes me is the lack of generic tests that can be applied to software… you can have something that passes all its unit tests, but fails the integration test or user acceptance tests… and they all have to be custom written (usually by the same people who wrote the code)… and then there is a human matching the output with the specification… I think business-requirements-driven testing is definitely a step in the right direction, but still no guarantee of correctness.

    So my question is, where is the wind-tunnel for (all) software?
    Is it possible?

  • Anonymous

    Requirements
    I am closing in on forty years in software development. From straight assembler to Java and C#. All of the technologies worked or could be made to work. The consistent problem has been we do not know what we are building. It is not that there are not requirements, it is that we in the industry are unable to define them. Just questioning the reasoning and requirements behind a request causes the parties being questioned to soon try to find others to do the work because we ask too many questions while trying to define the requirements from what were submitted as supposed requirements. One of my favorite requests was "Make it work like this product". When asked about the documentation of the requirements for the product or someone who could even explain how it worked, the answer was "No clue. We need it by x. We already sold it to a client.". I personally believe this is one of the largest reasons we as an industry have so many product and project failures. We build a lot of stuff but most of it does not seem to meet the needs of those that requested it as we complete the software and finally start defining requirements.

  • Robert young

    RE: Testing is the Key
    — What strikes me is the lack of generic tests that can be applied to software…
    — So my question is, where is the wind-tunnel for (all) software?
    Is it possible?

    And the reason for that is: there is no (again, save the RM) underlying science which would be the basis for testing. Science provides the "theory" while engineering provides the "practice". Here are the equations for the supercritical airfoil: https://en.wikipedia.org/wiki/NACA_airfoil

    Engineers tried to implement this as far back as the 1940s. The science was ahead of the practice. Science tells the engineer what to do, but not how. That bit is the engineer’s remit. Testing’s purpose is to tell the engineer whether he has implemented the science correctly. It doesn’t tell the engineer that the science is wrong.

  • MartyP

    Thanks
    Just had to say… Loved the article and the resulting comments. I agree with about 99% of points made here.

  • GetOffMyLawn

    Software Engineering Challenges today
    Hi thanks for the article,

    In addition to the technical challenges that are growing, especially in modern web development in the cloud, continuous integration and delivery, etc., I find the biggest challenge is team collaboration. We have a lot of bright millennials coming into software development with some great technical chops, but they are quite immature when it comes to collaboration, project management and leadership. I find many organizations promoting rock stars with superior technical abilities into leadership positions where they have zero experience or training.

    Sometimes that sort of non-technical skill set makes or breaks a project. You can have the smartest technical people and still have a project be an abysmal failure because they don’t focus on the right things. They don’t tend to read about Agile or LEAN principles and focus on optimization of the organization as it relates to its purpose. They don’t seem to evaluate tools, frameworks, etc. with a focus on efficiency and team-enablement. They seem to have a tendency to focus on their own personal preferences and aspirations for this as a career field, and in business software development, that’s not what you’re paid to do. If you want to do that, you need to go do your own project. This field isn’t about the passion or trade of software development. It’s about producing widgets that can be sold. That’s also how you get your pay check and have a job to continue to go to.

    I’m usually the old gray hair on the team, and I have to fight this battle endlessly with the young people. It’s like I’m the only one thinking about certain relatively common-sense things that completely escape these technically bright people. It is very frustrating.