Chuck Moore: Geek of the Week

05 August 2009
by Richard Morris

Charles Moore is one of the great software developers. The Forth language he invented is still in use today, particularly by NASA, and has never been bettered for instrumentation and process control. He still argues persuasively that the only way we can develop effective software quickly is to embrace simplicity. Like Niklaus Wirth, he remains a radical whose views have become increasingly relevant to current software development.

A descendant of the language that Charles Moore developed was, at one time, the most widely implemented of all computer languages; it was not so much the language of commercial applications as of process control, instrumentation, and computer peripherals. Forth was extraordinarily advanced for its time, and is still being developed. It spawned other languages such as STOIC and PostScript, and proved so versatile that even a commercial spreadsheet was written in it. In a sense, if you use a PostScript printer, or send .ps or .eps files, you’re using a software technology that Charles Moore first developed.

"I despair. Technology, and
our very civilization, will get
more and more complex until
it collapses. There is no
opposing pressure to limit
this growth."

It may sound hard to believe, but four decades ago, when computing was almost exclusively the province of a handful of scientists, giant corporations, and the military, it was governments that were the chief funders of personal computing research. They bought the machines on which the early researchers worked and were visionary advocates who imagined that technology held far-reaching economic implications, well beyond the immediate military uses.

It was the era in which ideas being pursued at two laboratories located on opposite sides of Stanford University would re-shape the way we communicate.

At one end of the Stanford campus was Doug Engelbart’s Augmented Human Intellect Research Institute, dedicated to the concept that powerful computing machines would be able to substantially increase (augment) the power of the human mind. In contrast, John McCarthy’s Artificial Intelligence Laboratory at the other end of the university began with the goal of creating a simulated human intelligence.

Charles ‘Chuck’ Moore, best known for his work in developing the Forth programming language, was a disciple of McCarthy, and his development of simple but powerful languages and applications owes a lot to his tutor’s clear mind.

Born in McKeesport, Pennsylvania, near Pittsburgh, in 1938, Moore grew up in Flint, Michigan. He was granted a National Merit scholarship to MIT, where he joined the Kappa Sigma fraternity.

Awarded a degree in physics, he arrived at Stanford, where he studied mathematics for two years from 1961. He then worked in Fortran II on the IBM 704 to predict Moonwatch satellite observations at the Smithsonian Astrophysical Observatory. Around this time he began learning Algol for the Burroughs B5500 and, as Charles H. Moore and Associates, wrote a Fortran-Algol translator to support a timesharing service.

"Get a bare-bones application running quickly.
 Demonstrate it and get feedback from users.
Then modify and expand capability: much
more satisfactory than planning in advance"

Moore co-founded Forth, Inc. in 1971 with Elizabeth Rather, after which he developed the Forth-based chip that was taken to market in the mid-1980s by Harris Corporation as the RTX2000, derivatives of which are still widely used by NASA. At Computer Cowboys, he designed the wonderfully named Sh-Boom microprocessor, and then co-founded iTV, an Internet appliance manufacturer.

During the 1990s, Moore used his own CAD software to design several custom VLSI chips, including the F21 processor with a network interface. More recently, he invented colorForth and ported his VLSI design tools to it. His software is still in widespread use today.


RM:
"Chuck, am I right in saying Forth came out of Fortran? What was the background to developing Forth? Designing a language in the 1960s was not something to be done lightly, was it?"
CM:
"Fortran was the principal language in the late 60’s. But the work I did on my pre-Forth interpreter was done in Algol at Stanford. The first Forth was indeed written in Fortran on an IBM 1130 at Mohasco Industries. But the second Forth was written in Cobol, again at Mohasco. And the third Forth in assembler at NRAO.
The name Forth derived from fourth (as in generation) not from Fortran.
Forth became a language, not just an interpreter, at NRAO when I wrote a compiler for it. I was indeed intimidated by doing that since industrial-scale teams wrote compilers. But Forth was a very simple language and I approached it bottom-up, making sure I could add features as they became necessary.
The reason for writing a compiler was to make Forth faster. Not that it was too slow, but the scale of projects kept growing. I very much wanted an interactive language, so didn’t want the classic sequence of edit, compile, link and load. Forth simplified that to edit and load. That meant that library routines such as FFT had to be efficiently described in Forth. Forth has always been a language that compiles directly to RAM. The only object code is the kernel."
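The model Moore describes — a dictionary of named words, a parameter stack, and colon definitions compiled straight into memory so that development is just "edit and load" — can be sketched as follows. This is a Python illustration of the idea only, not Moore's implementation; real Forth compiles words to machine code in RAM.

```python
# Minimal sketch of a Forth-style interpreter: a parameter stack, a
# dictionary of "words", and colon definitions whose bodies are stored
# ("compiled") into the dictionary and executed later by name.
stack = []
words = {
    "+":   lambda: stack.append(stack.pop() + stack.pop()),
    "*":   lambda: stack.append(stack.pop() * stack.pop()),
    "dup": lambda: stack.append(stack[-1]),
}

def interpret(source):
    tokens = source.split()
    i = 0
    while i < len(tokens):
        tok = tokens[i]
        if tok == ":":                       # compile a new word
            name = tokens[i + 1]
            end = tokens.index(";", i)
            words[name] = tokens[i + 2:end]  # store its body in the dictionary
            i = end + 1
            continue
        execute(tok)
        i += 1

def execute(tok):
    entry = words.get(tok)
    if callable(entry):
        entry()                              # primitive word
    elif entry is not None:
        for t in entry:                      # user-defined word: run its body
            execute(t)
    else:
        stack.append(int(tok))               # otherwise treat as a number

interpret(": square dup * ; 7 square 1 +")
print(stack)  # [50]
```

There is no separate edit-compile-link-load cycle: defining `square` and using it happen in one interactive pass, which is the property Moore wanted.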
RM:
"How did you get to work at the National Radio Astronomy Observatory (NRAO) and was your success down to Forth?"
CM:
"I met George Conant while I was working at SAO (Smithsonian Astrophysical Observatory) in Cambridge. I had a part time job computing predictions for the first artificial satellites. I learned Fortran to do this and wrote an input interpreter to avoid recompiling the program.
George was impressed and some years later, when I left Mohasco, he hired me to work for him at NRAO. I had since expanded that interpreter into Forth and proceeded to use it to program data collection and telescope control. He was disturbed by this because all other programming was done in Fortran.
Was my success due to Forth? Absolutely it was. Your characterization of weeks instead of years is correct: not only fast development of an application, but quick and easy modification of it.
The environment was particularly conducive to this. I spent several months during each thunderstorm season in Tucson, testing software on the radio telescope. I had to be productive to get anything done before observing resumed.
I’ve retained that style: Get a bare-bones application running quickly. Demonstrate it and get feedback from users. Then modify and expand capability: much more satisfactory than planning in advance."
RM:
"Forth has an amazing array of space-related applications, including the Shuttle, Rosetta, and Deep Impact missions; spacecraft flight-system controllers; on-board payload experiment controllers; ground support systems; and hardware or software used to build or test either flight or ground systems.
What’s your favourite application and why? It must have been an enjoyable time at the NRAO?"
CM:
"You seem to know more about Forth’s space-related applications than I do. I’m very pleased with them.
I think the reason is that Forth is well suited to resource-constrained situations. It doesn’t need lots of memory and doesn’t have much overhead. It can take full advantage of whatever hardware or interfaces exist. Moreover its inherently interactive nature shortens development time and improves reliability. So it’s appropriate for portable or mobile applications.
I was told of one application with synthetic aperture radar. Forth allowed debugging while on orbit.
I liked working for NRAO. It was fun because I could produce quick, impressive results. On the other hand, that was never a requirement. NRAO was associated with universities. Its headquarters was on the University of Virginia campus and the Tucson office at the University of Arizona. It was a very academic environment. That became frustrating because there was no pressure for results. I thought commercial applications would be more rewarding. But since then I’ve met too many people who want to make a career out of a project instead of completing it. That’s life.
My favorite application has to be OKAD, my VLSI design tools. I’ve spent more time with it than any other; have re-written it multiple times; and carried it to a satisfying level of maturity.
It’s an ambitious undertaking involving some 1400 blocks (kilobytes) of source. The dictionary has to hold a thousand definitions. Actually there are many more, but not resident at the same time. With that many words, they can’t all be short, mnemonic English. So there are spelling conventions for naming things, which is generally unsatisfactory. But it works, with a little help from comments.
It’s an excellent example of factoring: breaking a problem into smaller pieces. This is very much a fractal process, with the lowest level being the machine-coded words that make everything else work."
RM:
"After Forth came colorForth. Why did you develop it?"
CM:
"In about 2001 I was starting off on a new project: designing a multi-computer chip. I wasn’t real happy with OKAD 1. In particular, John Rible had been urging me to switch to a hierarchical design with asymmetric tiles. To do this I needed to sharpen the tools I was using. A simple version of Forth would do what I wanted, but I thought to try some new ideas. In particular, a smart editor that would simplify the compiler. Move some of the interpreter to the editor where it needn’t be repeated for each compile. This led to adding a tag to each word, preparsing words, packing characters, displaying in color. I have written many Forths and it doesn’t take long. A few weeks and I had not only colorForth, but the basic design of OKAD 2 and the chip. Of course, it took years to perfect."
RM:
"What are the chief differences between Forth and colorForth?"
CM:
"ColorForth is a dialect of Forth. If you’ve mastered Forth, the transition to colorForth is quick and easy. The hard things about Forth are managing the parameter stack and factoring the application. These are the same. But there are significant differences. Charley, an enthusiastic supporter, recently said that colorForth is to Forth as Forth is to C. It’s simpler, prettier and more efficient.
Forth has some ugly punctuation that colorForth replaces by coloring source code. Each word has a tag that indicates function; it also determines color. This seems a small point, but it encourages the use of functions, such as comments or compile-time execution, that would be inconvenient in Forth.
By having words preparsed, the compiler is twice as fast. Another small point, since compiling is virtually instantaneous, but this encourages recompiling and overlaying the modules of an application. Smaller modules are easier to code, test and document than a large one.
There is a great similarity between colorForth and classic Forth: 1024-byte blocks. Factoring source code into blocks is equivalent to creating paragraphs in English. It breaks a wall of text into pieces that highlight related ideas. Many Forth implementations abandoned this advantage and organized source into files of arbitrary length. Bad idea. Hugely debated.
In addition, the text in a block can be displayed on a monitor without scrolling. A quantized unit of source code all visible at once. With that text in color, it’s easily readable and a joy to work with.
Forth has gradually become more complicated to suit the taste of contemporary programmers and the complexity of modern computers. ColorForth resets the style to simpler times. A good idea? Not many users yet."
RM:
"Would you consider developing a new language from scratch?"
CM:
"No. I develop languages all the time. Each application requires one. But they’re all based on Forth. Forth has been called a language tool-kit. It is the starting point for an application language, much as the infant brain has the ability to learn any human language. Forth can do anything, but some things are easy. For example postfix notation. Many people, including me, have implemented infix notation. That’s not hard, but it doesn’t lead to any useful result. I believe in simple. I believe in efficient. And Forth embodies these."
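Moore's point about postfix can be seen in a tiny sketch (again in Python, purely as an illustration): a postfix expression evaluates with nothing more than a stack, whereas infix needs precedence rules and parentheses.

```python
# Postfix evaluation needs no parentheses or precedence rules: operands
# go on a stack and each operator consumes the top two. The infix
# expression "(3 + 4) * 2" becomes the postfix "3 4 + 2 *".
def eval_postfix(expr):
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b}
    stack = []
    for tok in expr.split():
        if tok in ops:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(int(tok))
    return stack[0]

print(eval_postfix("3 4 + 2 *"))  # 14
```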
RM:
"Do you think education is the answer to developing better software and that somehow we will get away from the 'we must do it first and ship it now no matter how buggy it is' way of thinking?"
CM:
"Education is a good thing. But its lack is not the root cause of buggy software; lack of experience is. Software is never rewritten. Projects last longer than expected; programmers get bored or burned out; management moves on to newer challenges. The attitude of ‘good enough’ reflects reality.
Instead of being rewritten, software has features added. And becomes more complex. So complex that no one dares change it, or improve it, for fear of unintended consequences. But adding to it seems relatively safe. We need dedicated programmers who commit their careers to single applications. Rewriting them over and over until they’re perfect. Such people will never exist. The world is too full of more interesting things to do.
The only hope is to abandon complex software. Embrace simple. Forget backward compatibility. A simple word processor is easy. Would anyone learn to use it? It seems that there’s a window of willingness to learn to type. Once learned, people are not willing to relearn."
RM:
"The technology industry has developed out of all recognition since you began your career. What do you think are the good things about this sea-change, and what are the bad?"
CM:
"I learned to type on a key-punch. The keystrokes were long and slow and noisy. For a while I had an IBM Electric typewriter. The touch was superb: soft and quiet with real tactile feedback. When I tried an IBM keyboard, I was bitterly disappointed. It clattered. Keystrokes were vague and noisy. Feedback was a joke. Didn’t cost much, though. Technology seems to be a random walk. Some things get better, some worse. It’s like evolution, not goal-directed. Cell phones get smaller, ‘till you can’t use them; then they get bigger.
An example is USB, the Universal Serial Bus. It’s a success that people must cope with. It’s also a disaster. The universal protocols it dictates require a huge amount of software to support. To develop such software requires reading thousands of pages of specs. Which are neither complete nor accurate. So what happens? Someone develops an interface chip that encapsulates the complexity. And then you must learn to use that chip, which is at least as complex.
Complexity is the problem. Moving it from hardware to software, or vice versa, doesn’t help. Simplicity is the only answer. There was a product many years ago called the Canon Cat. It was a simple, dedicated word processor; done very nicely in Forth. Didn’t succeed commercially. But then, most products don’t.
I despair. Technology, and our very civilization, will get more and more complex until it collapses. There is no opposing pressure to limit this growth. No environmental group saying: Count the parts in a hybrid car to judge its efficiency or reliability or maintainability.
All I can do is provide existence proofs: Forth is a simple language; OKAD is a simple design tool; GreenArrays offers simple computer chips. No one is paying any attention."

© Simple-Talk.com