Don Knuth and the Art of Computer Programming: The Interview

Fifty years after starting ‘The Art of Computer Programming’ (TAOCP), Don Knuth is still working hard at the project. He has almost completed the first five volumes. The book is considered to be among the "hundred or so books that shaped a century of science". Richard Morris asks him how things are going, and finds out more about his many achievements.

In the nearly fifty years since he began ‘The Art of Computer Programming’, the book that has defined computer programming almost as much as it has defined him, Donald Knuth has received awards including the Turing Award (1974), the National Medal of Science (1979), and the Kyoto Prize (1996). He is an extraordinary man. As well as inventing ‘literate programming’ and writing the most important textbook on programming algorithms, he is famous for designing and programming one of the most widely used digital typesetting systems ever, even designing the fonts that went with it. He also pioneered the use of open-source software.


Knuth is a man of engaging charm and enthusiasms who combines a knowledge of history, music, art and mathematics with a unique insight into the art of computer programming.

Don Knuth has always viewed the writing of ‘The Art of Computer Programming’ as the most important project of his life.

Knuth’s full-time writing schedule means that he is ‘pretty much a hermit… concentrating intensively and uninterruptedly on one subject at a time, rather than swapping a number of topics in and out of my head. I’m unable to schedule appointments with visitors, travel to conferences or accept speaking engagements, or undertake any new responsibilities of any kind.’

The irony is that computer science nearly lost Knuth from its ranks because of his love of music (his house is built around a two-storey pipe organ that he designed himself), and he says he intends to return to it once he has completed the expected seven volumes of ‘The Art of Computer Programming’.

Don, the last time we spoke we touched on complexity and you said that simplicity was the only answer to this. Why is it important to find simple solutions to software problems?
It is important to find simple solutions instead of stopping as soon as a first solution is found.

I suppose people usually stop early because most problems do not have a simple solution; thus they figure there’s no reason to spend any time looking further. If we start with the assumption that a simple solution does exist, we’re much more likely to find one.

You have an essay about developing TeX where you talk about going over to a pure, destructive QA personality and trying your hardest to break code. Do you think most developers are good at that?
You are right that it takes a certain kind of mentality to create test cases that will detect subtle bugs. For example, I know that I’m not sneaky enough to ever become an expert on cryptography or computer security.

On the other hand I have been reasonably successful at designing torture tests for software that I’ve written, mostly by
   (1) imagining myself as the enemy of the system, rather than as its friend;
   (2) thinking of inputs that are legal but bizarre and unlikely ever to be useful;
   (3) embedding an incredibly complicated construction into another that’s even less scrutable.

Some parts of my test programs for TeX and METAFONT required many hours of thought before I could convince myself that the program did the right thing. But in the process I discovered bugs that I’m pretty sure wouldn’t have been found by any other method that I’ve ever heard about.

Even better results will presumably be obtained if several different people independently create the torture tests. I can imagine that test creation would be a satisfying career.
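
Knuth’s three rules can be turned into a miniature torture harness. The sketch below is only an illustration (not one of his actual TeX or METAFONT tests): it hammers Python’s standard `textwrap.wrap` with legal-but-bizarre inputs and checks invariants that any correct wrapping must preserve.

```python
import textwrap

def torture_wrap(text, width):
    """Wrap `text`, then verify invariants that must hold for ANY legal input."""
    lines = textwrap.wrap(text, width=width, break_long_words=True)
    # Invariant 1: no output line may exceed the requested width.
    assert all(len(line) <= width for line in lines), (text, width, lines)
    # Invariant 2: no non-space characters may be lost or reordered
    # (wrap collapses whitespace, so compare with whitespace stripped).
    assert "".join(lines).replace(" ", "") == "".join(text.split()), (text, width)
    return lines

# Rule (1): play the enemy -- minimal width against one enormous word.
torture_wrap("x" * 500, 1)
# Rule (2): legal but bizarre, and unlikely ever to be useful -- all whitespace.
torture_wrap("   \t   ", 10)
# Rule (3): one pathological construction inside another -- a 200-space run
# sandwiched between single characters, at near-minimal width.
torture_wrap("a" + " " * 200 + "b", 2)
```

The point is less the specific function than the habit: state the properties the program must never violate, then search for inputs chosen to break them.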

I guess I do tend to use systems in unexpected ways, and I get some satisfaction when this reveals a flaw. For example, I remember having fun while visiting my sister and playing with a ‘shoot-em-up’ game that my nephew showed me.

Since I’m kind of a pacifist, I tried shooting bullets at the wall, instead of trying to kill the hidden attackers. Sure enough, I could spell my name on that wall, by making bullet holes in an appropriate pattern. But later when I came back to that same position, after wandering around further in the game, my name was gone! I think that was a bug in the software.

Long ago when I was a teenager, probably in 1952 or 1953, I saw my first “computer game”, a demonstration of tic-tac-toe at a museum in Chicago where visitors were challenged to beat a machine.

The exhibit was designed by AT&T, and I think it was controlled by relay switches rather than by a real general-purpose computer.
‘Hmm. No Angry Birds here!’
(Don Knuth with an IBM 650 in 1958)

I knew that I could never win if I made normal responses; so I intentionally made the stupidest possible moves. The machine got to a position where it had two ways to beat me by completing three in a row. I decided not to block either possibility. In response, the machine made both of its winning moves … and this was clearly in violation of the rules. So I had won a moral victory.

Thus, it appears that an encouragement to “think outside the box” helps for designing test cases, just as it does in many other situations.

If you were to start over and design TeX today, would the advances in computing or your understanding change the design in dramatic ways or would it turn out mostly the same?
I’m not sure if anybody can still write such a program today, without paying a fortune to license patented ideas. But suppose we ignore such problems; then I would keep the system basically as it is now.

The only serious mistake that I regret making with respect to TeX is that I used binary arithmetic internally but decimal arithmetic in the user interface; I should have done all computations in decimal.
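
The mismatch he regrets is easy to demonstrate. (TeX actually stores dimensions as binary fixed-point fractions of a point rather than floating point; the Python sketch below just illustrates why binary internals can surprise a user who thinks in decimal.)

```python
from decimal import Decimal, getcontext

# A user who specifies 0.1 ten times over expects the total to be exactly 1.
# Binary arithmetic cannot represent 0.1 exactly, so the sum drifts:
binary_sum = sum([0.1] * 10)
print(binary_sum == 1.0)            # False on IEEE-754 doubles

# The same computation done in decimal matches the user's mental model:
getcontext().prec = 28
decimal_sum = sum([Decimal("0.1")] * 10)
print(decimal_sum == Decimal("1"))  # True
```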

What would you suggest to someone who wants to be a better software designer? Decide if they want to make money or whether they want to do science? Do you think we need better incentives to make better software?
I think existing incentives are working fine, except of course that I wish more people would discover the advantages of literate programming.

Is there any truth in the rumour that Chuck Moore had an influence on your thinking about algorithms, or that you had any influence on Chuck?
We haven’t intersected at all, as far as I know.

What does your process look like when you’re doing exploratory programming and readying it for publication?
I write about five programs each week, on average, and I continue to enjoy making them “literate”. I often tear up the first draft or two, after I see how things are going, because algorithms tend to be full of surprises. I often make close studies of code written by the leading experts in whatever kind of application I’m currently studying; but I also try to rethink everything from scratch. I make many mistakes, but afterwards I try to help my readers to avoid them.

How do you define the idea of designing a programming language? Is it a tool to express ideas or a tool to express goals?
I think of a programming language as a tool to convert a programmer’s mental images into precise operations that a machine can perform. The main idea is to match the user’s intuition as well as possible. There are many kinds of users, and many kinds of application areas, so we need many kinds of languages.

I discuss language design in detail in Chapter 11 of my recent book ‘Selected Papers on Computer Languages’. I was asked in the 1960s to write about this topic, and the best way I could figure out to explain principles of good design was to violate them all and to present the design of BL\I, “Bad Language One”. But I hesitated for more than 40 years before publishing anything about BL\I, because I was afraid people might implement it and start to use it. Finally, when preparing that book, I decided to take a chance.

I realise this is an enormous question, but what is the link between the design of a language and the design of software written with that language?
The software writers who have particular ways of thinking about algorithms need a language that matches their thoughts so that they can efficiently transform those thoughts into working code.

Different thought processes result in different structures.

What are the major problems in the computer industry today?
D’uh. I’m just an academic who writes about programming; I never have understood industry or economics.

Do you prefer freedom or order? Do you prefer one way to do a single thing, or a thousand ways to reach the same goal?
(a) Freedom AND order. (b) I guess I prefer maybe three ways, having different characteristics, together with the knowledge of how to convert each of them into the other two.
In what ways do you think you have influenced the computer industry as a technologist and what lessons should people learn from your experience?
I can’t say what influence I’ve had, but I can summarize what I’ve been trying to accomplish during my career: I’ve attempted to organize many of the best ideas that have been discovered related to programming, and to explain them effectively, because I believe computer science is a beautiful body of knowledge.

I’m glad that this knowledge has proved to be very useful, although I would actually want to organize it and explain it even if the only motivation were intellectual curiosity and a desire to explore fascinating patterns.

  • Alan G. Stewart

    Name On The Wall
    I’ve actually done the shoot my name on the wall trick myself. The fact that it disappeared was not a bug, but a method of dealing with the limitations of the computer – it didn’t have the memory to keep track of every small change (bullet holes, broken windows, etc) in a round of play, so after a period of time, it forgot the older changes. Newer games, designed for computers with more memory, do remember them.
    Games are an interesting place to look at "torture tests". An amazing number of people play them, and they try every possible thing. In fact, sometimes the real game is "find the glitch". It’s you against the designers and programmers.
    Good testing requires a special mindset. Some years back (more than I care to remember), I had a junior programmer who just couldn’t seem to build very good code. He just didn’t have the right mindset. I didn’t want to let him go, but I couldn’t keep him where he was. In desperation, I assigned him exclusively to testing – and he was brilliant at it. That same "out of left field" point of view that was such a problem while writing code was exactly what was needed to find the problems.

  • Terry A. Davis

    Hey Knuth? You wanna work on TempleOS.?
    I’m concerned about PSect in TempleOS. It’s okay, now, but it doesn’t scale well. My solution is not to make it have to scale.

    You see how I’m in Hotel California? I have to do everything. Therefore it will never have to scale. It took 10 years to write 100,000 lines. In another ten years, there will be 200,000 and it should still be okay.

    I guess I’ll switch from “next_added/last_added” to a binary tree.

    I cannot overstate how central these hash tables are to the operating system. I really don’t want a lock.

    We can just bracket with PUSHFD;CLI;…POPFD. That protects against same-core interference. Multicore is still, potentially, a problem. We operate master/slave multicore, not SMP, so we can just make a policy that other cores mustn’t do that! LOL

    SMP is good for multiuser main frames but foolish for home computer video games. I like to joke “It’s better to run one app fast, not two.” Core#0 is master and basically explicitly places helper tasks on other cores. It sounds like insanity to have a scheduler shifting tasks around between cores! Stupid.

    Linux set out to make a multi-user mainframe.

    I set out to make a souped-up 64-bit Commodore 64.

    My founding principles were ring-0-only 64-bit identity-mapped. I wanted no paging but 64-bit mode requires it, so I identity-map.

    Believe it or not, 640×480 is not real mode.
    Believe it or not, identity-mapped is not monotasking.

    It’s finished. I worked ten years and wrote all 130,000 lines of code. A Commodore 64 ROM was 20K, let’s say 20,000 lines of code. Mine is 130,000 lines of code.

    For the past three years, if you asked me, “Is it ready to be a ROM?” I would have said, “Yes”.

    I’m in Hotel California.
    I’ve been a professional operating system developer since 1990 when, at age 20, I was hired to work on Ticketmaster’s VAX operating system in VAX asm. Operating systems include compilers and other tools. My boss, Denny, wrote our PASCAL compiler in one year; Troy wrote the assembler and linker.

    I’m a professional and thought nothing of doing a compiler, while everybody else seems Jedi-mind-tricked these days.

    Linux code is ugly. Those excited by the concept of “open source” are in for a disappointment. Mine is simple enough for a mere mortal. The C64 was simple and not intimidating.

    Art is an arms race and high res hurts programmers who operate without artists. High res is an arms race because it puts you right at the limit of performance. God wants kids to feel good about offerings.

    Multiple GPU support is out of the question. If you don’t do all GPUs, it’s worthless for developers wishing to ship product. Anything higher than 640×480 is impossible without GPUs if you do full-screen 3D 60fps graphics, certainly for amateurs like kids doing offerings in God’s temple. I know you will argue, but there’s a reason GPUs exist! You’re like the fools who do asm-only. Don’t you think the industry knows better?

    I decided no PCI devices, just single-driver devices. PCI devices make it ugly.

  • Paul Dulaney

    Some memories
    I was studying electrical engineering at Berkeley in the late ’70s when I read a flyer announcing a short series of lectures to be given on the Berkeley campus by Prof Knuth regarding his new typesetting programs (TeX and MetaFont). I had always been a typography nut, so I was eager to attend. After thirty-some-odd years I don’t remember much. I think the main thing he said about TeX was that it used dynamic programming to minimize the badness of paragraph line breaks. With regard to MetaFont, he said he had stumbled upon a simple but effective means of deciding whether or not a pixel should be filled: if the curve passed through a diamond whose vertices were the midpoints of the pixel sides, then the pixel should be filled in.
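
That diamond rule is simple enough to sketch. Assuming unit pixels, the diamond whose vertices are the midpoints of a pixel's sides is just the set of points within L1-distance 1/2 of the pixel centre, so a toy rasterizer (nothing like METAFONT's actual code) need only sample the curve densely and test that condition:

```python
import math

def rasterize(curve, samples=2000):
    """Fill pixel (i, j) iff some sampled point of the curve lies inside the
    diamond whose vertices are the midpoints of that pixel's sides, i.e.
    |x - cx| + |y - cy| <= 1/2 where (cx, cy) is the pixel centre."""
    filled = set()
    for k in range(samples + 1):
        x, y = curve(k / samples)               # curve: [0, 1] -> (x, y)
        i, j = math.floor(x), math.floor(y)     # the pixel containing the point
        if abs(x - (i + 0.5)) + abs(y - (j + 0.5)) <= 0.5:
            filled.add((i, j))
    return filled

# Example: a circle of radius 5 centred at (8, 8), in pixel coordinates.
def circle(t):
    return (8 + 5 * math.cos(2 * math.pi * t),
            8 + 5 * math.sin(2 * math.pi * t))

pixels = rasterize(circle)   # a thin ring of filled pixels
```

The appeal of the rule is that a curve skimming a pixel's corner does not turn that pixel on, so the rendered ring stays one pixel thick instead of blobbing out.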

    To my thinking, The TeXBook remains the best technical manual ever written. Thanks, professor!

  • Anonymous

    Terry A. Davis
    Great interview.

    STFU, Terry A. Davis!

  • Anonymous

    I truly admire this guy’s contributions to software and computer science. But it’s truly amazing how many professional coders have, or appear to have, dyslexia. Never mind being literate; that’s truly beyond a lot of people, whether they write code or not.

    I love the idea of literate programming, but it’s just not practical, wading through a book to find a snippet of code. I do use a literate tool, FunnelWeb, which is far more usable than the original Pascal or CWEB versions. The idea lives on, but thankfully we don’t have to suffer the full feature set – which, you might say, resembles controlling a huge pipe organ. Nice if you’ve got the time.

  • Anonymous

    Apologies for using the word *truly* so many times.


  • Adrian Tawse

    Simple Solutions
    There may not be simple solutions to all problems but there are always an infinite number of more complex ones (bad ones). These have been the bane of my life. So I still say "Find the essential simplicity in the problem".

  • JEB

    Don Knuth was my engineering partner MJO’s role model. Now I know why – thanks for the interview. And Mick, where are you??

  • Michael Sh., Haifa

    Don’t praise this AOCP too much:
    He was completely wrong in his vision of computer science: 1) programming developed to be based on high-level languages and not on esoteric assemblers, 2) programming developed to be structured and object-oriented and not Kirchhoff-law-based spaghetti code, 3) and above all, programming became engineering, not any art or wizardcraft. But as you know, sitting in the ivory tower, one can see only very narrow horizons…

  • Robert young

    RE: Don’t praise this AOCP too much
    Yes, there has been a level of reverence for Knuth that puzzles me. His insistence on an assembler of his own devising (he says because any real assembler or language would eventually go out of fashion, and he’d have to rewrite; yes, he would, and a good thing too) has been criticized for some time. There are much more relevant algo books in C/C++ that would serve one better.

    As such, the transferability of his constructs is up for debate. Moreover, real CPUs these days don’t even run their own assemblers, but RISC-y microcode, so all that bit fiddling might be for naught – if only we had the RISC-y ISA to inspect.

    AOCP is the "Ulysses" of coding; nearly everyone worships it, but few if any have ever read it, and fewer still understand it. And no Molly’s soliloquy at the end.

    I for one prefer the RM and SQL (close enough for government work).

  • Voidy

    RE: Don’t praise this AOCP too much
    TAOCP is neither a technical reference, nor a programming language manual, nor a textbook on algorithms. It is a brilliant computer science monograph, though. That’s why I study it: first and foremost to boost my problem-solving skills, expand my horizons, and acquire the right mindset for cracking even seemingly impossible problems. Consider it a self-improvement activity, much like yoga exercises or meditation – it does have TAO in its name, after all.

  • Geoff Stephenson

    RE: Don’t praise this AOCP too much
    Anyone who has read TAOCP and doesn’t understand the function of the assembler in the context of the book is unlikely to get much else from it or understand why it was so important to programmers in the 1960s. Object orientation has nothing to do with the programming language it is how you use it. Just like Joyce uses the same language as you but gets rather different results.

  • JonathanR

    Great books but are they useful nowadays?
    I bought a couple of his books and found the algorithms very good and very clever. However, now computing seems to have moved away from needing to know any of this as it’s all wrapped in packages and libraries that we just need to learn how to call. It’s only a very small fraction of programmers that will need to know the details of the algorithms to write the libraries.

  • zafarali

    to make friendship
    hi, I don’t have any idea what you are actually talking about because I’m new to programming. thank you

  • ignorantOne

    Usefulness of AOCP
    I am just making this comment to tell my experience. Firstly I want to tell that I don’t have any computer background. I am basically an electrical engineer & I enjoy reading the book.

    Now coming to the point. When I first came across pointers in C, I was hardly convinced of their power. It was in a lab course in my engineering degree, "Microprocessor", where we used to program in mnemonics, that the power of pointers and the physical manipulation of RAM and ROM became clear to me.

    I agree with the opinion that data manipulation can be best demonstrated by Assembly language.

    Now, to those in favour of a high-level language, I differ because:
    1) High-level languages are good to put to work, whereas assembly language definitely has an edge for teaching at least, apart from performance;
    2) Using assembly language removes the black boxes around how the machine works, which remain if you use a high-level language.

    I agree that the problems in the book are bent towards elegant ways of attacking a problem, or rather elegant ways of tackling computational problems.
    Programming is just a way to implement them.

    Thanks all.