Over the history of personal computing, it seems that the best software was written by a team of five or fewer programmers. CP/M, the first PC operating system, was built by one man, Gary Kildall, though he took parts from others such as Gordon Eubanks. MSDOS originated in QDOS, again written by one person, Tim Paterson.

PCs flourished because they could be used as word processors, running software called WordStar, written by one person, Rob Barnaby. Microsoft Basic took two people to write, Bill Gates and Paul Allen, though I gather that Paul did most of the work. Charles Simonyi, who created MS Word, once told me, in the late eighties, that only three people were working on the core product. The rest were working on libraries and drivers.

The first and greatest version of Lotus 1-2-3 was written by one person, Jonathan Sachs, in assembler, in six months. The great Borland products were each written by one programmer: Turbo Pascal, for example, was written by Anders Hejlsberg. A year ago, I met and interviewed the person who created Borland Sidekick. I’ve said enough to make the point.

Charles Simonyi, like many others, did his work using a project management style he called metaprogramming. The metaprogrammer was the designer, decision maker, and communication controller in a software development team. Other programmers in the project were not allowed to make any design decisions about the project, but had to concentrate on writing the code as described by the metaprogrammer. If they hit a problem, they would take it to the metaprogrammer, who would come up with an answer or transfer the question to another programmer. He believed that this was the most efficient method of organizing programmers to write software, but it’s a long way from the current Agile belief in consensus and discussion.

Microsoft grew wealthy because Bill Gates knew all the best programmers around and gave them good job offers. He cherished them and, if they succeeded, they were well-rewarded. As Microsoft expanded, it forgot the ‘metaprogramming’ principles that gave it the Office suite.

I remember vividly the time in 1987 when I visited the Microsoft team tasked with producing their new operating system, OS/2. I was ushered into an enormous open-plan office, packed with earnest-looking programmers tapping away busily or engaged in serious technical discussions. It was Microsoft’s second attempt to replace MSDOS, and after the Xenix debacle they were determined to make it stick. Of course, the third and ultimately successful replacement for MSDOS, Windows NT, was still some way off; it would be developed by a small team from DEC led by Dave Cutler.

The eighties, in particular, were a time of dazzling progress in the development of new tools and applications, including the introduction of relational databases, spreadsheets and object-oriented programming languages. The current decade just isn’t. Why not?

Commentary Competition

Enjoyed the topic? Have a relevant anecdote? Disagree with the author? Leave your two cents on this post in the comments below, and our favourite response will win a $50 Amazon gift card. The competition closes two weeks from the date of publication, and the winner will be announced in the next Simple Talk newsletter.


  • pavlindrom

    Short and to the point; makes me want to go home and write some code that doesn’t benefit anyone right now. But then again, maybe one person could write it because the bar was so low for so many of these applications, and now, with the UX revolution, things can’t possibly be done in isolation in the time allotted. It also makes me think of open-source alternatives to very polished and well-established products – that seems like a waste of time, and encourages despair in the person undertaking the work. Where the lone programmer seeking fame should put their efforts is into project ideas that are new, radical, and possibly useful if completed. Sorry, 1c short of the 2c requirement. What could I do? I only had my wallet to pull from…

    • Carlos Rodriguez

      Been there, done that; I wrote a complete mobile system for the restaurant industry and placed it in 23 restaurants around the Atlanta and Tennessee area. Long story short, the system was based on Palm Pilots, but then Palm decided to go against MS and MS ate Palm’s lunch. I got frustrated and stopped the project after not being able to get any financing; I did not/refused to move to the new iPhone era, and now the rest is history. 🙁

      PS: I still don’t see anything like what I designed and created with my son and a student Intern.

      • pavlindrom

        Have you thought of resurrecting it? It doesn’t have to be coded in Obj-C or in Swift; you can use one of many third-party libraries. You still need a Mac, though, to compile, emulate, and package it 🙁

        • Carlos Rodriguez

          Actually, I did resurrect it. All I needed was to migrate the UI portion from the C++ used on the Palm Pilot to HTML and Cordova; I will be using React or Angular in the near future. The problem I have is with getting some funds to market it. I demoed it to my last customer at a NY restaurant, and he loved it so much that he came to Atlanta to view it last year.

  • Peter Schott

    I think the issue now would be more that the people who want to be metaprogrammers _shouldn’t_ be metaprogrammers. The ones back then who were driving all of these great apps and changes had the knowledge of what the program should do as well as the knowledge of how to do it. Today, that’s rarely the case. You may have someone with vision and no knowledge, or knowledge with little vision, but not as many who can really see everything and guide those efforts.

    I’ve run into several architects in my career who put in place systems that were full of somewhat patchy ideas, but they got through because they came from the architect. I’ve worked with one or two who could possibly fit the idea of a metaprogrammer – knowing enough to guide all aspects of a project and having vision. Sometimes they’re recognized, but often not put in the correct positions. Worse, they’re sometimes moved to the wrong positions because management thinks they can manage/train others to do that. They then get frustrated and leave with nobody able to take their place. 🙁

  • Camilo Reyes

    I think the metaprogrammer has been superseded by healthy competition. Sure, stay at your job working alone, but you’ll be inundated with a plethora of vendors willing to do the job for you. Organizations undercut individual contributors, offer less compensation, and the work is not as challenging. In the eighties there were few who knew computers, let alone how to write good programs. Nowadays it is a competitive field where you are more valuable as a member of a team. Teams can solve problems in unexpected ways and deliver faster. To meet expectations, teams drive towards standardization with a common set of tools. The tools have improved over the last decade and will continue to do so because of more digital data and more demand. It’s not worthwhile to be the “lone programmer” anymore.

  • Mike Fowler

    Interesting question. Many things likely happened. The first that comes to mind is that technology became commercialized. The hacker ethos is largely gone. Technology is a means to an end for many instead of the end itself.

  • I worked with Charles in the early days of Office (then called the “Multi-tool” series, believe it or not). While Charles espoused the concept of the metaprogrammer in his doctoral thesis, it did not really fly at Microsoft. The predominant model of work was to break the problem (like “create a great graphical spreadsheet program” – Excel) into many largely independent tasks and give them to the quite capable, though young and lightly experienced, programmers they were hiring. Charles and other senior managers at MS then coordinated this work by divvying up tasks to those who were capable of solving them.

    Charles did experiment with a form of pair-programming in the early days; the developers that were paired with him largely found the process intimidating and stressful; after a few days working like this, most reverted to working independently.

    I think the predominant work style of the day was more collegial; developers would bounce ideas off each other when they ran into tricky problems, but would then go back to their (private) offices to pound out the code.

    The original Excel team also had our programming “stars” (Doug Klunder, in particular); but each of the developers I worked with created high quality and often elegant solutions in each of their assigned feature areas.

    • Fascinating. Yes, I’d imagined that Charles’ views would be rather at odds with the Microsoft culture at the time. He’d have been better off working at Digital Research, where Gary used the same technique to produce his PL/1 compiler. I had a very mundane task on the internationalisation of the Multi-tools for CP/M, working for Xerox who then were working with Microsoft on the project of a European version. I only met Charles later when Word for Windows was released.

  • Keith Rowley

    I would love to be the metaprogrammer, but I would hate to be the person working under them; not being allowed to make any decisions or provide any creative input sounds like hell to me.

    • Keith, I’ve run such a team back in the late 90’s and we produced some amazing software including a cross-platform competitor to Crystal Reports, a better framework for cross-platform C++ GUI development than anything on the market and an end-user product loved by tens of thousands of people. I had a team of mostly very junior programmers working with me.

      I think you made a bit of a leap from “not allowed to make any decisions” to (not) “provide any creative input”.

      Nothing I’ve read in the article above or elsewhere claims that Simonyi dictated things to the degree where people could not make creative input.

      In the team I mention above, I made the final decisions but a lot of creative ideas were from group discussion. As part of that, I refined my idea of a Design Decisions Diary (written up in SE mag a couple of times) explicitly so the decision making process was recorded WITH people’s alternative ideas going “on the record”.

        • As Charles explained it to me, it was much more about preventing programmers getting in each other’s way when working on a project. The metaprogrammers were closer to technical architects in that they were responsible for determining the modules and interfaces and specifying how they should work. This was achieved by program design. In the case of MS Word and Excel, a lot of it was achieved via a domain-specific language and its precise definition. I went on to use exactly the same technique myself when implementing a large telephone banking system for a major retail bank. Every programmer likes to think they are a creative designer, but the vast majority, in my experience, are much happier working with clear instructions to a precise, testable specification. Agile sometimes makes demands that are unrealistic. Metaprogramming is designed to allow creative input and open-ended work for every team member, but it doesn’t demand it. Metaprogramming, done well, can make project management a lot easier too, though that isn’t always the case.

        • One of the amazingly productive things about the design decisions diary recording ALL the alternatives is that it acts as a venting mechanism. It means you can get people’s bright ideas and both

          1) Keep them for revisiting if the way you choose to do things is not good enough,
          2) Satisfy their need to be heard – they get at least a minor “win”.

          The latter is important and I’m not being cynical. I have worked in very dysfunctional agile situations where the dynamic is to beat people into submission, with a lot of good feedback lost along the way. If good ideas from less dominant agile members are being recorded, it both defuses emotion and also avoids wasting as much time on the struggle.

          • rogerthat

            In any larger-scale software project, there have to be key decision makers who end up identifying the path to take. I’ve never worked with a person with “metaprogrammer” in their title, but I’ve worked with plenty of “architects” who essentially laid out the path to follow – that is not a prison of mindlessness; that is preventing a barrage of tugs of war over multiple personal ideas about the best path.
            The best leaders are those who listen, clarify the why of the path, and help the other developers understand and agree to the path.

  • Keith Rowley

    As a side note, have you ever looked at the writing program Scrivener? This is a great example of an amazing software product primarily written by one person that has been created in the last few years.

    My point being that these programs are still out there; you just have to look a bit harder to find them in the hash of other products out there.

  • willliebago

    Hey there are some recent fun apps created by a single developer – Minecraft & Flappy Bird 🙂

  • Eric Huggins

    The underlying assumption in the question posed seems to be that Agile’s focus on group participation has somehow stifled innovation – when in fact Agile is a project management methodology, not a project visioning methodology. A well-managed Agile project consists of well-written features and stories that can be implemented by a team of programmers on a priority basis, as defined by the customer or stakeholder. While Agile assumes the majority of customers do not know what they want before they see it, it does not preclude a case where a customer knows exactly what they want and just needs it managed – arguably in just the hierarchical fashion described.

    By definition, a metaprogrammer is an individual with vision, and in each of the cited cases, a unique vision. Metaprogramming is, in fact, still a thing. Node.js was envisioned and developed by a single individual, and Stephen Wolfram controls the vision and direction of the Wolfram Language. In both the current and past examples, the key factor is vision. As Peter Schott noted previously, not everyone has the capacity or capability to be a visionary metaprogrammer. A number of other factors are also at work: the collective “GooglePlex” of Internet knowledge was not available in the early days of computing; sifting through the far larger and far more complex programming landscape is more difficult and time-consuming today; and the rate of environmental change is much greater than in earlier times.

    Innovation seems to me more a product of evolution than happenstance; by that I mean that ground-breaking innovation can often occur only after environmental change sets the stage. The programming innovations the author cites were all enabled by the democratization of computing brought on by the rapid rise of the PC. That change required new tools to take advantage of the newly available resources. The rapid rise of cloud technologies and the rapidly expanding array of tools seem well-poised to spark the next wave of innovation, quite possibly via another metaprogrammer visionary.

    • Where did you get Agile from as the limiting factor? It is not mentioned anywhere in the article. Nothing in the metaprogramming articles I’ve read say that they require all design up front – it’s about how development decision-making proceeds.

      “envisioned and developed by a single individual” is not metaprogramming – because it’s about more than an individual programmer doing all the work.

  • Carlos Rodriguez

    While at DEC, I worked with Dave Cutler when he was working on what became the OpenVMS operating system, later the template for what is now Windows NT, or just plain Windows 2012 and all of its derivations. At the time I was working on TOPS-10 and DEC Rdb, now part of the Oracle database engine. Dave Cutler worked all by himself in a locked and very secret room, decorated with camouflage; the only way to get into the room was with an escort. In Dave’s case, he was the manager, the programmer, and everything in between. Oh man, those were the days… just sharing some memories…


    I hadn’t heard this described as “metaprogramming” (which has other meanings) but as Chief Programmer Teams. I think it was in the late 80s that this was discussed in detail in an SE history course I was taking, but the concept supposedly dates back to IBM around 1971 and was described in The Mythical Man-Month. See for some interesting contrarian opinions.

    In Lammers’ “Programmers at Work”, Simonyi talks about Hungarian Notation and metaprograms. You can read the entire interview online now.

    “In the Bravo days, the supervision of the programmers was very, very direct. Once I actually wrote an incredibly detailed work order, which was called a metaprogram. It was almost like a program, except written in a very, very high-level language. We hired two bushy-tailed guys from Stanford as ‘experimental subjects.’ They wrote the program exactly the way I wanted it to be written and we accomplished two things. First, it was easier for me to work in this incredibly high-level language, essentially programming these people. Second, they really learned the program much better than if I had finished it and given them the listing and said, ‘Study this program.’ They learned it because they wrote it. See, everybody could claim that they wrote the program. I wrote the program and they wrote the program. That was really great.

    I think the best way to supervise is by personal example and by frequent code reviews. We try to do code reviews all the time.”


    First, it is really hard to be dazzled by one thing when the background is full of spotlights being swung around by rabid squirrels.

    There is a LOT of noise in the modern software ecosystem. So maybe some things which are equally good advances just don’t get noticed as much?


    Looking at the intro on which points to the online Lammers book, it seems someone there (unnamed blog author) agrees, talking about the book:

    “when you read the book you see that our profession has not evolved much over the last quarter of a century. We still face the same challenges of handling complexity, good design is still critical and still very hard, we still debate whether software is engineering, art, science or a craft, we still use programming languages that mirror the execution model of our computers, we still use only textual programming languages that has to be parsed from text like the punch cards legacy forced us to do, we still only have very primitive tools, quality is still a big problem, finding and nurturing programming talent is still a challenge, programmer training and education is still broken, and so on. So in this regard it is a depressing read.”

    And, sadly, I tend to agree with them, having started work in 1983. We have had at least one lost decade due to J2EE and similar thinking of trying to turn programming into a commoditised, factory operation.

    I think there’s a hopeful future, however. Languages are improving. There’s some IDE progress, after decades of stalling (still catching up to Smalltalk’s environment and C++ tools such as ObjectMaster) especially with live environments such as Playgrounds and LightTable. These are all enablers that will help entrepreneurs overcome the drag of the “glamour tax” that requires a lot of extra work put into polish.

    One other thing that *has* changed – lots more people are programming, not just professionals. We have a lot of “little wins” which mean as much to some folks and their friends as the big highlights did to the professionals.

  • rogerthat

    You’ve struck a high interest – this month will be hard to judge.
    I suspect the real issue here is your bias, which comes from your vast PC experience.
    While there are still vast issues with the cloud, that is a relatively new venture and way beyond the reach of a spreadsheet.
    In more concrete terms, how about the maturity of virtual machines, the ability to spin up an entire farm of new, ready and running servers with a few keystrokes and mouse clicks?
    Moving the focus to personal devices, the phone has become the most used platform and has changed every day life.
    There are apps that let you see who is at your door, and others that let you communicate with someone across the world in real time.
    Virtual Reality is getting a real start…
    See how long you can go without using Google (or, for those dedicated Microsoft fans, Bing).
    While I see your point that the PC realm could be considered to be mature (not stagnant), I see tremendous innovation all across the computing landscape.
    My take is that the PC is no longer the main development platform – most demand / interest is now cloud-based, device-based.

  • buggyfunbunny

    — I worked with Charles in the early days of Office (then called the “Multi-tool” series, believe it or not).
    that’s because MS Multiplan was its version of VisiCalc (which was made by Bricklin and Frankston), long before Office was even a twinkle in Gates’ eye. Excel was a much later effort, which came when it became obvious that “DOS ain’t done til 1-2-3 won’t run” wasn’t going to succeed.

    beyond that specific example, what you’ve described, if you’re old enough, is the distinction from the 60s (and earlier) mainframe world of Analyst and Programmer. the Analyst wrote a spec sheet, which the Programmer coded, either in assembler or COBOL, generally. the problem with the PC world is that few of the early “innovators” had any experience doing significant applications. after all, the PC (and earlier microprocessor machines; my favorite being the Tektronix 4051) was built as a single-user, single-program device without a legitimate OS. QDOS was, explicitly, “quick and dirty”; we got the virus epidemic just because it gave any code access to the hardware. games, and 1-2-3, needed such access in order to run. the irony, of course, was that MS was a unix licensee (long before the DOS adventure) and considered its version, called Xenix, its real OS product. MS, at the beginning, was a tool maker, not an application maker. Borland gave Gates fits. IBM was the impetus to “adopt” QDOS, and Apple to make Windows and all those GUI applications.

    the bottom line, so to speak, is that “metaprogramming” is just half-century old industrial drone coding with a sexy new name.