Features: Issue 1.05, September 1995

The Physicist

By Stewart Brand



He's only 35 years old, but Nathan Myhrvold has enough to say about computing to last him at least another 35. With a PhD in physics from Princeton University, he taught himself programming, read Donald Knuth, and founded a software company with grad-school cronies. He holds the most prestigious of jobs in a company built on prestige: after joining Microsoft in 1986, Myhrvold became chief "technologist" by 1989, the head honcho overseeing 650 serfs in research and advanced product development. "I work directly for Bill, and research works directly for me," he says. And this year he's one of six executives promoted to Microsoft's Bill-centered "office of the president." Meanwhile, Myhrvold remains a nerd among nerds who reads science magazines and dreams of deploying the insights of physical science to master that still unkempt discipline of economics.

Stewart Brand met up with Myhrvold during the TED conference this past spring to hear him hold forth on everything from software efficiency to sci-fi.

Wired: What has come to be called Moore's Law was first proposed by Gordon Moore in 1965. He said that the number of components on a microchip had doubled every year since 1959, and that the trend would continue until 1975. These days, Moore's Law is treated as a general statement that computers get drastically better every year - faster, cheaper, smaller - and that this will occur indefinitely. I take it that you see Moore's Law as something quite fundamental?
Myhrvold: The way Moore's Law occurs in computing is really unprecedented in other walks of life. If the Boeing 747 obeyed Moore's Law, it would travel a million miles an hour, it would be shrunken down in size, and a trip to New York would cost about five dollars. Those enormous changes just aren't part of our everyday experience.

The whole hardware industry has experienced the phenomenon in which every time computers get cheaper, they appeal to a new set of users; every time they get more powerful, old customers upgrade. But it turns out that like hardware, software also has to undergo something like Moore's Law. I did a study of a variety of Microsoft products: I counted the number of lines of code for successive releases. Basic had 4,000 lines of code in 1975. Currently, it has perhaps half a million. Microsoft Word was at 27,000 lines of code in the first version. It's now about 2 million. So, we have increased the size and complexity of software even faster than Moore's Law. In fact, this is why there is a market for faster processors - software people have always consumed new capability as fast or faster than the chip people could make it available.
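
As a rough sketch of that arithmetic, here is the compound growth implied by the line counts quoted above. The only added assumptions are reading "currently" as 1995 and taking the Word dates from the 1982 figure mentioned later in the interview.

    #include <stdio.h>
    #include <math.h>

    /* Growth factors implied by the line counts quoted in the interview,
     * reading "currently" as 1995 (Basic: 1975-1995, Word: 1982-1995). */
    int main(void)
    {
        double basic = 500000.0 / 4000.0;      /* roughly 125x over 20 years */
        double word  = 2000000.0 / 27000.0;    /* roughly 74x over 13 years  */
        printf("Basic grew %.0fx, doubling about every %.1f years\n",
               basic, 20.0 * log(2.0) / log(basic));
        printf("Word grew %.0fx, doubling about every %.1f years\n",
               word, 13.0 * log(2.0) / log(word));
        return 0;
    }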

How do you measure software efficiency?
If you write your own software, which is generally true for mainframes or supercomputers, for every dollar that you spend on software development, you get about a buck's worth of software. It's a 1 to 1 ratio. Now, if you look at minicomputers, you will find that the people in those industries typically have 1,000 to 10,000 customers. They can afford to spend considerably more on development and still maintain a decent return on investment. When you spend US$10,000 to buy a minicomputer software package, the company probably spent $100,000 to $1 million developing it. So, if you're buying from that market, for every dollar you spend, you get somewhere between $10 and $1,000 worth of software. Then you come to the PC industry, where for every $100 you spend, you get a piece of software that cost someone $100 million to develop. A dollar buys a million dollars' worth of software.

Call it Myhrvold's Law. As the software becomes more complex, doesn't the likelihood of problems tend to increase? And when you always have to have new software, won't it always have new problems?
We have gone from 27,000 lines of code to 2 million lines of code for the same money. If I say I've got two versions of Word - that old one from 1982 that's perfect, with zero defects; or the new one that's got all this cool new stuff, but there might be a few bugs in it - people always want the new one. But I wouldn't want them to operate a plane I was on with software that happened to be the latest greatest release!

What ever happened to the old aesthetic that fewer lines of code must be better? Has it disappeared because of Moore's Law?
No. You absolutely want that, at any point in time. There was a hilarious phase during our relationship with IBM when we ran into a lot of difficulty because the key metric for programmer productivity at IBM was the number of lines of code produced. Our people would go in and reduce the number of lines of code, generating negative productivity in IBM's eyes. Bill used to call this "the race to build the world's heaviest airplane."

So, we now have 25 years of Moore's Law and 25 years of Myhrvold's Law. Will we have 25 more years of both?
On the hardware side, I'm pretty confident there'll be another 20 years at least, which is another factor of a million. A factor of a million reduces a year to 30 seconds. Twenty years from now, a computer will do in 30 seconds what one of today's computers would take a year to do. So, for particularly big computational problems there's no point in starting. You should wait, and then do it all in 30 seconds 20 years from now! That is the hardware side. The growth of software is certain, because it's only limited by human imagination.
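
A quick back-of-the-envelope check on that figure, assuming annual doubling so that 20 years is a factor of 2^20:

    #include <stdio.h>

    /* Twenty more years of annual doubling is 2^20, about a million;
     * a year holds roughly 31.6 million seconds, so the same work
     * collapses to about 30 seconds. */
    int main(void)
    {
        long factor = 1L << 20;                    /* 1,048,576 */
        double year = 365.25 * 24.0 * 3600.0;      /* ~31.6 million seconds */
        printf("factor: %ld\n", factor);
        printf("a year of computing shrinks to about %.0f seconds\n",
               year / factor);
        return 0;
    }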

Well, what kinds of problems do you see requiring that much computational power?
Everything will. At every point in the history of this technology, people have argued that we don't need that kind of power. And every single time, they have been dead wrong. There's no feedback mechanism yet for people to stop asking that question, despite the fact that it has such a miserable track record.

When you double something every year, you're asking the culture to handle a violent, recurrent change in the rate of change. Instead of an extra something every year, you get two, four, eight, sixteen, thirty-two times as much. Is that always a good thing?
I tend to be an optimist about this. It's a requirement for the job. This is about enabling people to communicate, enabling people to do stuff. There was an old TV special from the 1960s I saw not long ago. They had a young Walter Cronkite, with dark hair, interviewing people who were terrified that computerisation was going to steal their jobs.

I remember that. The secretaries union at Stanford fought against the university buying word processors because they would be put out of work.
Of course, it didn't really happen that way. I'm not saying you can't find someone who is affected negatively, but the net positive so far has been very big. I would make an evolutionary argument that says the thing driving this is a positive feedback cycle. We like what we get, we want more, which spurs people to create more. If there is a lack of positive feedback, the thing just slows down.

You started as a scientist, a physicist. What disciplines do you most identify with now?
I noticed to my surprise and delight that, carried over from past years, my accountant still had physicist down as my occupation on my tax forms. But I also think of myself as a programmer. So astrophysics, then computers.

Do you see it as a challenge to blend what you think would be interesting in science with what is possible with computers?
I think we're in a phase where some of the theoretical underpinnings of our society are changing. We're in the process of understanding how society will be impacted by the phenomena of widespread information and ubiquitous communication. How will business be changed by that? Can we create such a thing as an effective, scientific version of economics?

I don't mean to be rude to economists, but economics has not been something that can be reduced to engineering. Economists are at the stage of weather forecasters: they come up with a new explanation every year. Yet, some of the real basic stuff can be done by looking at the mathematics of emergent behaviour and evolution.

Who are you paying attention to mostly?
The coolest single thing is Tom Ray's stuff. (See US Wired 3.02) It demonstrates the robustness of the evolutionary process. Ray has created a system that replicates many key features of biological life, at least in terms of evolution. That this is possible is not so surprising. But it was done so easily, with such a straightforward approach - his initial stab at defining an instruction set actually worked the first time out.

Where does this lead? Do you eventually get artificial life models of the economy that provide valid guidelines for business, or what?
Ray and others have proposed that people will create software husbandry. They'll go off somewhere and breed programs. Almost eight years ago, a couple of people at Microsoft and I wrote a program like that - it uses some genetic things for finding short code sequences. Windows 2.0 and 3.1, NT, and almost all Microsoft applications products have shipped with pieces of code created by that system.
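
Myhrvold doesn't describe the tool's internals, but the flavour of "genetic things for finding short code sequences" can be suggested with a toy sketch: a stochastic search that mutates candidate instruction sequences for a made-up two-register machine until one computes a target function (here, multiply by 10 using only shifts and adds). The instruction set, fitness test, and target are all invented for illustration; none of it comes from Microsoft's system.

    #include <stdio.h>
    #include <stdlib.h>

    enum op { A_ADD_B, A_SHL, B_SHL, B_MOV_A, NUM_OPS };
    #define PROG_LEN 4

    /* Run a candidate program on input x: two registers, both starting at x. */
    static unsigned run(const enum op *prog, unsigned x)
    {
        unsigned a = x, b = x;
        for (int i = 0; i < PROG_LEN; i++) {
            switch (prog[i]) {
            case A_ADD_B: a += b;  break;
            case A_SHL:   a <<= 1; break;
            case B_SHL:   b <<= 1; break;
            case B_MOV_A: b = a;   break;
            default: break;
            }
        }
        return a;
    }

    /* Fitness: total error against the target 10*x over a handful of inputs. */
    static long score(const enum op *prog)
    {
        long err = 0;
        for (unsigned x = 1; x <= 10; x++) {
            long got = (long)run(prog, x), want = 10L * (long)x;
            err += got > want ? got - want : want - got;
        }
        return err;
    }

    int main(void)
    {
        enum op cur[PROG_LEN], best[PROG_LEN], trial[PROG_LEN];
        for (int i = 0; i < PROG_LEN; i++)
            cur[i] = best[i] = (enum op)(rand() % NUM_OPS);
        long cur_err = score(cur), best_err = cur_err;

        for (int gen = 0; gen < 200000 && best_err > 0; gen++) {
            for (int i = 0; i < PROG_LEN; i++)
                trial[i] = cur[i];
            trial[rand() % PROG_LEN] = (enum op)(rand() % NUM_OPS); /* mutate one slot */
            long e = score(trial);
            /* Accept mutants that do no worse; the occasional random step
             * keeps the walk from getting stuck on a plateau. */
            if (e <= cur_err || rand() % 20 == 0) {
                for (int i = 0; i < PROG_LEN; i++)
                    cur[i] = trial[i];
                cur_err = e;
            }
            if (cur_err < best_err) {
                for (int i = 0; i < PROG_LEN; i++)
                    best[i] = cur[i];
                best_err = cur_err;
            }
        }
        printf("best error %ld, ops:", best_err);
        for (int i = 0; i < PROG_LEN; i++)
            printf(" %d", (int)best[i]);
        printf("\n");   /* error 0 means it found e.g. a+=b, a<<=1, a+=b, a<<=1 */
        return 0;
    }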

For example?
One is doing a binary coded decimal conversion. You have numbers encoded in binary and you want to change them to ASCII. You've got to shift a bunch of bits around and divide by 10 and then do all kinds of stuff. This BCD program found a way of doing that much more efficiently. A human programmer comes at a problem with a mind-set that causes him or her to solve it one way; in fact, there's a large space of other solutions - and evolution can find them.
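
The evolved routine itself isn't shown, but the straightforward version of the task he describes - peel off decimal digits with divides by 10 and emit them as ASCII - looks something like this textbook sketch (not Microsoft's generated code):

    #include <stdio.h>

    /* Convert an unsigned binary integer to its ASCII decimal representation:
     * repeatedly divide by 10, collect remainders low-order first, then
     * reverse them into the output buffer. */
    static void utoa_decimal(unsigned int value, char *buf)
    {
        char tmp[12];
        int i = 0, j = 0;
        do {
            tmp[i++] = (char)('0' + value % 10);   /* low-order digit first */
            value /= 10;
        } while (value != 0);
        while (i > 0)
            buf[j++] = tmp[--i];                   /* reverse into output */
        buf[j] = '\0';
    }

    int main(void)
    {
        char out[12];
        utoa_decimal(1995u, out);
        printf("%s\n", out);   /* prints 1995 */
        return 0;
    }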

Does your Advanced Technology Group in Microsoft do basic research, or is it more product oriented?
We have a group called Microsoft Research that operates at the same level of exploration as a university or some of the industrial research labs like Bell Labs or Xerox PARC. We've got around 100 people doing a lot of forward-thinking stuff. We spend quite a bit of money in the areas of speech understanding, natural-language understanding, various ways of reasoning under uncertainty, and new paradigms for programming - stuff we may not productise for five years, or ever.

How much does research drive Microsoft?
Well, we're in this technologically driven world, and we're all surfing this wave of technology. There are two ways you can go with that. You can put a guy who doesn't understand technology in charge of your company. He's going to be up on that surfboard while a bunch of guys on the beach shout, "Go to the left! Go to the right! Over to the side!" It's a bad way to manage a technology company, in our view. One of the reasons we think we're going to be successful - and god knows, maybe we won't be - is that the guy who's on our surfboard, Bill, understands that wave of technology and grew up with it. Bill controls and manages the evolution of technology across the company. I work directly for Bill and the research group works directly for me - so, fundamentally, when you have a new thought in research, you have to convince two people, me and Bill.

As a manager of researchers, do you have any rules of thumb about what projects might be promising to pursue? By "rules of thumb" I mean the kind that I used as an editor - you publish something only if it is new, true, important, and well written.
The difference between research and advanced development is this: doing something that no one thinks is possible - but you do, though you don't know when - that's research. Doing something that's possible - people agree that it is, but no one's ever done it - that's development. When we evaluate new research proposals, some of it's based on how cool they are.

What makes a project cool?
It's intellectually interesting; it does something that really would be an advance.

Every company has some mottos or slogans that everyone in the company knows about. What's one of yours?
Programmability is a drug. I'm an addict. Everyone is. All successful programs are programming environments of one sort or another. You're using the fantastic capability of computers to be customised for a specific purpose, and you are programming them, whether you know it or not. The magic of programmability is something that most parts of society, most parts of our culture, haven't touched yet. And when they do, they're going to get just as damn hooked as I or anybody else is. The magic of having the world's most pliable, flexible machine whose characteristics can utterly change just by sliding in a disc - that is intrinsically powerful.

That's an empowering, if busy, future. What else?
Positive-feedback-cycle economics. This phenomenon is essential to understanding how you move into the phase where you have million-to-one leverage rather than one-to-one leverage.

What do you read? Books? Magazines?
I read a ton of science things. Science, Science News, New Scientist, Scientific American, Discover. If it's about science, I get it. I don't read newspapers.

Is science fiction something that's of interest or use to you?
I've read science fiction novels off and on for years. But I've always thought it should be more useful to what I do than it is. We've had science fiction authors like Greg Bear and David Brin come over to Microsoft and give talks; to date that's been less successful than I would have liked. I think the real thing is smart, free-thinking individuals - of which science fiction writers may be an example, but they don't have any particular lock on it.

Is there anything all of your peers believe that you don't, anything you ramble on about so people mutter, "Oh god, there's Nathan going on about X again"?
I'm very interested in economics. We're getting to a point where there has got to be a way for us to understand the dynamic processes in economics at the same level we understand dynamic processes in computing and the physical sciences. And perhaps biology.

This is a physicist talking. A biologist would never say that.
Today's biologist wouldn't. Before Newton, physicists wouldn't have either.

There is a fundamental difference between systems that build on their own unique histories, like life, and systems that don't, like fire or the behaviour of atoms. Economics is more like biology than physics.
Well, then, I guess I'll have to disagree with a peer!

Stewart Brand is a co-founder of The Well, the Hackers' Conference, and the Global Business Network. His 1987 book, The Media Lab, is still in print.