
Simple-Talk columnist

Software Tuned to Humanity

Published 18 July 2013 4:26 pm

I learned a great deal from a cynical old programmer who once told me that the ideal length of time for a compiler to do its work was the time it took to roll a cigarette. For development work, this is oh so true. After looking intently at the editing window for an hour or so, it was a relief to look up, stretch, focus the eyes on something else, and roll the possibly-metaphorical cigarette. This was software tuned to humanity.

Likewise, a user’s perception of the “ideal” time that an application should take to move from frame to frame, to retrieve information, or to process their input has remained remarkably static for about thirty years, at around 200 ms. Anything outside that window appears, and always has, either fast or slow. This could explain why commercial applications, unlike games, simulations and communications, aren’t noticeably faster now than they were when I started programming in the Seventies. Sure, they do a great deal more, but the SLAs I negotiated for application performance in the 1980s are very similar to the ones negotiated today.
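
By way of illustration only (the 200 ms budget below is simply the perceptual threshold described above, and the wrapped operation is a hypothetical stand-in, not anything from a real application), a Python sketch of holding a user-facing action to that budget might look like this:

    import time

    RESPONSE_BUDGET = 0.2  # roughly the 200 ms threshold at which a response stops feeling instant

    def within_budget(label, operation, *args, **kwargs):
        """Run a user-facing operation and report whether it stays inside the budget."""
        start = time.perf_counter()
        result = operation(*args, **kwargs)
        elapsed = time.perf_counter() - start
        verdict = "feels instant" if elapsed <= RESPONSE_BUDGET else "feels slow"
        print(f"{label}: {elapsed * 1000:.0f} ms ({verdict})")
        return result

    # Stand-in for "retrieve information": sort a million numbers.
    within_budget("retrieve", sorted, range(1_000_000))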

To prove to myself that this wasn’t just some rose-tinted misperception on my part, I cranked up a Z80-based Jonos CP/M machine (1985) in the roof-space. Within 20 seconds from cold, it had loaded WordStar and I was ready to write. OK, I got it wrong: some things were faster 30 years ago. Sure, a modern machine would have given me all sorts of animations, whizzy graphics, and other comforting features, but it seems a pity that we have used all that extra CPU and memory to increase the scope of what we develop, and the graphical prettiness, but not to speed up the processes needed to complete a business procedure.

[Image: an Elliott computer]

Never mind the weight, the response time’s great!

To achieve 200 ms response times on a Z80 or similar, performance considerations had to influence everything one did as a developer. If it meant writing an entire application in assembly code, applying every smart algorithm and shortcut imaginable to get the application to perform to spec, then so be it. As a result, I’m a dyed-in-the-wool performance freak and find it difficult to change my habits. Conversely, many developers now seem to feel quite differently. While all will acknowledge that performance is important, it’s no longer the virtue it once was, and other factors such as user experience now take precedence.

Am I wrong? If not, then perhaps we need a new school of development technique to rival Agile, dedicated once again to producing applications that smoke the rear wheels rather than pootle elegantly to the shops; that forgo skeuomorphism, cute animation, or architectural elegance in favor of the smell of hot rubber. I struggle to name an application I use that is truly notable for its blistering performance, and would dearly love one to do my everyday work – just as long as it doesn’t go faster than my brain.

16 Responses to “Software Tuned to Humanity”

  1. Robert Young says:

    It’s the revenge of infotainment. While gamers know no bounds with liquid cooled over-clocked i7s, commercial applications still toe the line of a VT-100 attached to *nix and its RDBMS over RS-232 at 14.4K.

    While Windows was bloating up, so was the symbiosis of Microsoft and Intel, aka the Wintel monopoly. One needed software bloat to justify the inevitable leap-frogging of CPU power, while the other needed the increased clock speeds to make the software barely usable.

    With what amounts to the thin-client paradigm back in the saddle, both Intel and AMD, and possibly ARM, are facing the bandwidth problem. The bottleneck is the wire, and they’ve nothing to offer to extort monopoly rents.

    Finally, kiddies spend their time writing all that infotainment code. When was the last red-blooded, built-from-scratch commercial application that you can name? Everybody who isn’t writing Angry Birds is patching COBOL and Java “legacy” apps. With legacy frameworks.

  2. ilias says:

    I couldn’t agree more. Some of the worst offenders are the bloated IDEs. IMO, the software that you write your code in subconsciously sets the benchmark for how fast the eventual application will turn out to be. The only application that I use for its speed over anything else is perhaps Sublime Text.

  3. sreindl says:

    hmmm. I’m usually a little bit concerned about “everything was better in the past”, but …
    I agree that over the last decade or so UI-centric, “wanted” features have been the target of everything (I don’t know anybody who has used every button in Visual Studio or Word). I started coding on a PDP-11 with Pascal and 48 KB of main memory per process (plus procedure swapping). The editor (EDT) was small and sufficient. Later I used Cobol, Fortran and C on VMS and Unix with overblown editors like emacs and EVE (on VMS). That was when developing became easier (code completion, bracket search) and more complex (using the mouse instead of the keyboard). There have been versions of Word in which it was not possible to work without the mouse. These days I like the features of modern IDEs, but I’m using only about 20% of the power of the tool. The question is whether it would be possible to strip those tools down (as Microsoft did in some places with Windows 8) to the features that most users really need, and move the rest to plugins (as TeX or emacs do).
    In my opinion the problem is that development of tools and applications is currently driven by wanted features, not needed features. As a result we see overblown software (OSes, IDEs, office applications …).
    Coming back to the beginning: in the past you focused on needed features because of limited resources (memory, CPU, disk speed/size, terminal capabilities …), whereas in recent years the focus has been on UI and wanted features. IMHO this has to change, as I personally do not want to buy a new computer with every release of Visual Studio/Eclipse/….

  4. Bryant McClellan says:

    I won’t claim that everything was better in the past, and I don’t think Phil is claiming that either.

    I, too, remember Wordstar and CP/M and the miserable attempt to convert Wordstar to the Windows paradigm. Sure it was a PITA to remember all those formatting codes (which still exist but are hidden in massive XML documents in Office products) but, like everything else, once you got the hang of it, it was pretty easy to work with.

    Still, there must be a reason that Visual Studio and SQL Server Management Studio (among others) still have a command-line switch to turn off the splash screen to make them load faster. Someone obviously recognizes bloat when they see it but apparently can’t get rid of it. Perhaps that and some of the other observations are reflections of the Semmelweis Reflex, among other things?

  5. dessyf says:

    I don’t know much about everything mentioned here, but I do know that I prefer faster to prettier! Thanks for the article.

  6. Keith Rowley says:

    Thinking through every program I use on even a semi-regular basis, I can’t think of a truly FAST one either.

    Now I am discontented. I was happy (or at least contented) with the speed of most of my software till you told me it was slow. :-)

  7. Bob Fazio says:

    I’m almost 50 now, and I can relate to everything you’re saying about the things we did/do for performance. As long as we are not talking about tuning that sacrifices the stability of the application, I agree 100%.

    If we are sacrificing maintainability (and I can’t think of any example at the moment), then, if it can be proven to pose an issue, I might be willing to make that sacrifice.

    Even though every developer deals with deadlines, know that even temporary code often goes on to live forever. It takes a special programmer to take crappy code that works, and fix it, just because it’s the right thing to do. So be that special person, and do it right the first time.

  8. Lee Dise says:

    Back in the halcyon days of my programming youth, I worked on a large national-defense project. As a database programmer employing Cullinet’s IDMS and either COBOL or PL/I, I inhabited that project’s intellectual slum. Even though our deliverables were software systems, the comp-sci guys comprised only a minority of the staff, the remainder being physicists, mathematicians, and aeronautical engineers. “We’re not just coders,” these actual scientists would proudly proclaim, “We’re engineers who do code.” We the coders, whose job it was to integrate their modules into a single deliverable, tended to agree: they’re engineers hoodoo code.

    The hoodoos often took the form of inefficiency. My boss, Tom, who was a mathematician by training but a Fortran programmer by trade, once challenged one of our young buck mathematicians over a poorly-performing module. The youth stood his ground: he could see no problem with his code, and bet a pitcher of beer that Tom couldn’t improve on it by even, say, 10%. Tom responded, “That’s not worth my while. Let’s make it interesting: if I can’t make it 100 times faster, you win the bet.”

    “No way!!!!!” the young math guy hotly replied, “You’re on!”

    It took Tom about two hours to make the module 104 times faster, and the sheepish hoodooer of code was forced to make good on his failed wager at our next pub rally.

    Of course, the output of our program was just a series of text reports. No pretty pictures of missile trajectory arcs. No graphical mushroom clouds. No images of Slim Pickens riding a bomb all the way to ground zero yelling “Yee-hah!”. No friendly little “Help” widget with a smiling icon of Hans Bethe explaining with detached interest what this bomb would do to that target. The output looked like something only a math, engineering, or physics geek could love; it took not an instruction manual to understand the output, but an education, maybe two or three.

    You couldn’t deliver a program like that today, not to any user. Personally, I blame Bill Gates – the Walt Disney of silicon — and his insipid talking paperclip. People like to anthropomorphize their machines, and a talking computer makes even more sense than a crooning cricket or an apoplectic duck.

    This is software’s Baroque era. We long ago mastered building software huts with software thatched roofs and software thresholds, and are now spending our hours designing picturesque porticos and ornate gargoyles. Performance is as much taken for granted today as when Thomas Jefferson presumed that Monticello would keep out the rain.

    As much as it offends my sensibilities, economics dictates that performance is not a problem until a user complains about it. There is so much code and so little time. Like my buddy Tom, you have to focus your attention on a few things that you know might have an adverse impact, and otherwise trust your staff to do a workmanlike job. The principle is still the same today as it was 25 years ago: one I/O = 18 bazillion instructions. Plan your database accesses accordingly.
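
    A minimal sketch of that last principle, using Python’s built-in sqlite3 purely as a stand-in for a real database (the table and figures are invented for illustration): the point is one set-based statement instead of a round trip, an “I/O”, per row.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
        conn.executemany("INSERT INTO orders (amount) VALUES (?)",
                         [(i * 1.5,) for i in range(10_000)])

        # Row-at-a-time: one round trip (one "I/O") per order -- the expensive way.
        total = 0.0
        for (order_id,) in conn.execute("SELECT id FROM orders").fetchall():
            (amount,) = conn.execute("SELECT amount FROM orders WHERE id = ?",
                                     (order_id,)).fetchone()
            total += amount

        # Set-based: let the database do the work in a single statement.
        (total_set,) = conn.execute("SELECT SUM(amount) FROM orders").fetchone()
        assert abs(total - total_set) < 1e-6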

  9. snaidamast says:

    I have to agree with the author of this article. However, my take on it is from a slightly different perspective.

    Most cleanly designed business applications today run at very acceptable speeds and it is unfair to compare them to gaming applications, which are designed around a completely different set of paradigms. For example, if you were to design a First-Person-Shooter, one of your overriding concerns would be performance.

    However, in the business world, technical managers and the like all tout the need for high performance of their applications while at the same time loading down the development schedules with many unneeded features simply to appease users.

    On the other hand, you also have to consider the environment in which most business applications are currently developed, which is the Internet, where in practically all cases performance will be constrained by many variables such as hardware, server routes, and the like.

    You cannot demand performance and then develop for the web, where the primary and overriding engineering concern is concurrency.

    With all the touting of smart devices and the Internet, the oft-ignored environment of the desktop-based client-server application is dismissed as “old”, when it is in fact the best area in which to gain substantial performance for applications that require it, even with some of the bloat that is often found in web-based apps.

    However, client-server simply doesn’t have enough appeal for most developers to suggest that many departmental-level applications be built in this environment rather than on the web. Its inherent architecture simply has better performance characteristics than something similar on the web, and the comparative costs for such applications are substantially less. The idea that you have to actually do a minor install with the application is no longer valid with modern development environments.

  10. jerryhung says:

    When one has to push out products faster than ever, speed/performance loses to cost of time.

    Take cars, for example: not everyone wants a Corvette for its performance; people want cars for their looks, reliability, and lower cost.

    Nowadays, many things are good enough that nobody cares about that 1 second anymore, unless you’re saving millions of those 1-second delays (in a bank or an ERP system, say).

    It’s not wrong, really, as businesses continue to satisfy their customers’ needs, even when those needs change over time (from speed to a better experience/UI at the expense of other things).

  11. Timothy A Wiseman says:

    It seems there is room for both. Performance is always an issue, but it isn’t the only thing. For instance, developer time can be a scarce resource so I happily trade the speed of execution for the speed of writing in Python on a regular basis. This is especially true for code that will be run in the background while no one is particularly waiting on it, which is true for a fair number of automated reports and software meant to analyze large data sets.

    Similarly, many users and some managers are pleased by the graphical niceties, so I add them in even if it would run faster without them. Making my users happy is far more significant than making it run faster or achieving some form of code purity.

    Of course, I always think about performance and when I have time to play I like to go back to old code and see if I can make it do the same thing faster. But while performance always matters, it is rarely the only thing.

  12. paschott says:

    I’m going to agree with Lee. Yes, we’ve lost some aspects of performance tuning. Some problems lie in the fact that too many coders do not understand how to truly use a database to do work. Some in the ever-decreasing time to release. Of course, I’m blessed to work with people who try to force the time to re-write and optimize older code to make things better.

    That being said, I think there are quite a few people who will take the trade offs of “good enough” performance over writing assembly code when that’s just not needed for most businesses today. Re-using code can be really helpful and memory/SSD drives are pretty cheap to help speed things up. Of course, that’s no excuse to write bad code, but we have much better technology now to help increase speed in other ways.

    And as much as we may miss those days when a word processor would be just a word processor, I also appreciate the fact that I can now make great-looking publications at home without a lot of fuss and use those pretty graphical tools to arrange things, work with multiple documents, and refer to other resources while doing so.

    As for the new development technique, I think there’s a lot to be said for the DevOps practice – where Developers and IT people work together to tune the software for the good of the business with the best server setup combined with the best software. There’s a sharing of information and resources to make things better rather than a pointing of fingers saying “if only that team would do their job”. It’s helped quite a bit in our company and broken down some of the walls that typically separate the two groups to deliver a better experience for everyone.

  13. Joss says:

    I’ve been in a battle to improve performance on a product our company produces. The “agile” attitude of the team is to first make it work; then make it fast. Those are almost mutually exclusive for large code sets written under sprint deadlines.

    C# and .NET are inherently slow. The very concept of IL with JIT makes them slower. That they are “new world” Microsoft automatically makes them suspect.

    Evernote, a product that sings, was rewritten in C++ after performance became a serious problem. That’s a real lesson to learn from.

  14. Robert Young says:

    Bill Gates is reputed to have said, paraphrasing, “if Windows is too slow, let the hardware fix it”. I can’t track down a citation for that, but this amounts to the same thing:

    Software gets slower faster than hardware gets faster.

    — Wirth’s law

  15. paschott says:

    @Joss – you may want to change your Definition of Done if the bare minimum requirement is “it works”. You can work with the team to agree that “working” includes certain performance constraints as well. Of course, it may take several instances of issues raised about performance to get that to happen, but you can get there. We had to bring that up for several pieces of code that “worked” until we hit a large customer with many rows. The good thing is that we were able to fix it as part of a sprint and we brought up the idea that code needs to be usable with larger datasets or when the system is under load or whatever constraint is affecting performance. That was added to our definition of done so just saying “it works” isn’t enough to be shippable code.
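
    A rough sketch in Python of what one of those Definition-of-Done checks could look like (the routine, the 200,000-row dataset, and the half-second budget are all invented for illustration):

        import time

        def top_customers(rows, n=10):
            # Stand-in for the code under test; swap in the real routine.
            return sorted(rows, key=lambda r: r["total"], reverse=True)[:n]

        def test_meets_performance_definition_of_done():
            # "Large customer" data: far more rows than a developer's sample database.
            rows = [{"id": i, "total": float(i % 9973)} for i in range(200_000)]
            start = time.perf_counter()
            top_customers(rows)
            elapsed = time.perf_counter() - start
            assert elapsed < 0.5, f"Definition of Done violated: took {elapsed:.2f}s"

        test_meets_performance_definition_of_done()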

  16. DeafProgrammer says:

    I have to agree with Lee. If you look at the history of Edgar F. Codd, he studied mathematics. Google Codd’s theorem, which is a result proven in his seminal work on the relational model: it equates the expressive power of relational algebra and relational calculus. It still works today. As Lee says, “Plan your database accesses accordingly”!!
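
    For reference, a compact (and paraphrased) statement of the result being cited is:

        \[
        \{\text{queries expressible in relational algebra}\}
        \;=\;
        \{\text{queries expressible in domain-independent relational calculus}\}
        \]

    That is, every query you can write in one formalism has an equivalent in the other.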
