Alan Kay’s title slide, up during the intro, says “Is the Best Way to Predict the Future to {Invent, Prevent} It?” with {Invent, Prevent} alternating between the two words. He jokes that this afternoon’s talk can be summed up by the fact that he has to wear two microphones instead of one. The talk was billed as “Software Engineering vs. Computer Science,” so I’m anxious to see how his title relates to that. I have some more photos too.
Much of what is wrong with our field is that many of the ideas from before 1975 are still the current paradigm. He has a strong feeling that our field has been mired for some time, but because of Moore’s law, there are plenty of things to work on. The commercialization of personal computing was a tremendous distraction to our field, and we haven’t recovered from it, and may never.
One of Alan’s undergraduate degrees is in molecular biology. He can’t understand the field anymore, despite having tried to review new developments every few years. That’s not true in computer science: the basics are still mostly the same. On most campuses there is a single computer science department, and the first course in computer science is almost indistinguishable from the first course in 1960. It’s still about data structures and algorithms, despite the fact that almost nothing exciting about computing today has to do with data structures and algorithms.
The Internet is like the human body: it has replaced all of its atoms and bits at least twice since it started, yet it has never stopped working. Attacks on the ‘Net aren’t really attacks on the ‘Net; they’re attacks on machines on the ‘Net. Very few software systems, maybe none, are built in ways that sustain operation in spite of being continually rebuilt and continually growing.
The future five years out is easy to predict because all of the forces acting on computer science are trying to keep it the same as it is now. Likely, the future will be more of what we have now.
Are computer science, software engineering, OOP, etc. oxymorons? Alan reminisces about Bob Barton, an early Utah professor. Bob said that “systems programmers are the high priests of a low cult” and that “there are few things known about systems design, but the basic principle of recursive design is: make the parts of the same power as the whole.” Bob Barton has a classic paper that contains seven of the most important things that people know about software today. Another quote: “My job is to disabuse you of any firmly held notion you held when you came into this classroom.” The best way to get people to think is to destroy their current thinking. Preconceived notions are largely reactions to what vendors are saying. Alan says that his course from Barton was the most valuable one he took in college because Barton garbage collected their minds.
Engineers should read a book about how the Empire State Building was built. Including the demolition of the building that was on the site before, it went up in 11 months with 3,000 people. We don’t know how to do this in computing. Whatever we think engineering is, it can’t mean the modern use of the term. Alan doesn’t know of a single computing system attached to you that will almost certainly kill you if it fails. That’s what happens with jet engines. That’s engineering.
Hurricane Katrina is Alan’s new favorite story. When the city started flooding, only four pumps kept pumping until the levee actually broke; the youngest of those was made in 1929. The newer pumps all stopped well before that. Try to imagine a computing system that will still be working 90 years from now. It’s impossible to imagine.
“Americans have no past and no future; they live in an extended present.” This describes the state of computing. We live in the ’80s extended into the 21st century. The only thing that’s changed is the size. Windows XP has 70 million lines of code. It’s impossible for Alan to believe that it has 70 million lines of content. Microsoft engineers don’t dare prune it because they don’t know what it all does. Cathedrals have one millionth the mass of pyramids. The difference was the arch. Architecture demands arches.
Computers are artifacts. In order to have a science of computing, we have to make them. This isn’t unheard of: to have a science of bridge building, people had to start building bridges so they could be studied. He shows a clip of the Tacoma Narrows Bridge collapse. After the failure was studied, the bridge was rebuilt and hasn’t come down. This reminds Alan of software systems. We’re much better at building software systems than we are at predicting what they will do. There are no good models. If we were scientists, we’d be trying to build models.
“Science is not there to tell us about the Universe, but to tell us how to talk about the Universe.” (Niels Bohr) Science helps us be more reasonable about reasoning. It’s set up so that we get better and better maps (abstractions), not perfect maps.
Alan uses John McCarthy and Lisp as an example of real science in computer science. McCarthy showed that you can build a system that is also its own metasystem. Lisp is like Maxwell’s equations. Much of what is wrong with Java is that it lacks a metasystem and that the metasystem tacked onto it later is missing key parts. To find the most interesting things about our field, you have to go back 30 or 40 years.
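To make the “system that is also its own metasystem” point concrete, here is a minimal sketch of the metacircular idea, written in Python as my own illustration rather than McCarthy’s original Lisp: a page of code that evaluates programs expressed in the same list structure the evaluator itself manipulates.

```python
# A tiny evaluator for Lisp-style expressions written as nested Python lists.
# This is only an illustration of the metacircular idea, not McCarthy's code.

def eval_expr(x, env):
    if isinstance(x, str):                    # a symbol: look up its value
        return env[x]
    if not isinstance(x, list):               # a number or other literal: itself
        return x
    op, *args = x
    if op == 'quote':                         # (quote e) -> e, unevaluated
        return args[0]
    if op == 'if':                            # (if test then else)
        test, then, alt = args
        return eval_expr(then if eval_expr(test, env) else alt, env)
    if op == 'lambda':                        # (lambda (params) body) -> a closure
        params, body = args
        return lambda *vals: eval_expr(body, {**env, **dict(zip(params, vals))})
    fn = eval_expr(op, env)                   # application: evaluate the operator,
    return fn(*[eval_expr(a, env) for a in args])   # then the operands

env = {'*': lambda a, b: a * b}
# ((lambda (n) (* n n)) 7)  =>  49
print(eval_expr([['lambda', ['n'], ['*', 'n', 'n']], 7], env))
```

McCarthy’s actual eval/apply does this over Lisp’s own s-expressions, which is why a Lisp can describe, and extend, itself.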
Alan used McCarthy’s method to design an object-oriented system. He spent only a month implementing it because of the metasystem.
We build finite artifacts, but the degrees of freedom grow faster than we can reason about them. Thus, we’re left with debugging.
We are creatures who live in a world of stories, and we’re always looking for simple stories to explain the world. He shows a clip, called Private Universe, of Harvard grads at their graduation trying to explain the seasons. Almost everyone thought the seasons were caused by an elliptical orbit of the earth around the sun. It didn’t matter whether the students were science majors or not. Interestingly, people know that the seasons are reversed between the northern and southern hemispheres. Their stories are inconsistent with what they know, yet they persist in believing them, even though they have knowledge that contradicts their theory.
When people react instantly, they’re not thinking; they’re doing a table lookup.
Engineering predates science by thousands and thousands of years because we don’t have to understand things to engineer them. He uses one of my favorite quotes: an engineer is someone who can make for a dollar what any fool could make for two.
Making computing into a science means that we have to understand what to do about our beliefs. When we talk, we do nothing but tell stories that people will find interesting. There’s danger in that because stories can create resonance without being scientific.
Handwritten books were prohibitively expensive. Gutenberg’s books cost only “2 to 3 times a clerk’s yearly wage.” Aldus Manutius created portable books; he measured saddlebags to size them, and they had to be cheap enough that losing one wasn’t a tragedy. What made the big change was that people discovered you could argue effectively with books. This took centuries. There’s a big gap between a technology and its powerful media. The real revolution isn’t a revolution in technology, but a revolution in thought.
Imitating paper on digital media isn’t the real computer revolution. What is the real computer revolution? The only reason personal computers look like they do is that we thought of them as time sharing terminals (which look like typewriters with screens) without the mainframe. Alan’s epiphany was that computers don’t have to look like that.
PowerPoint is a terrible idea because it takes everything interesting about what computers can do and reduces it to an expensive form of paper. Alan now jumps into a demo of the system he’s using, which isn’t PowerPoint. He draws a car and then animates it with a script. This is based on what children do with his system, Squeak.
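For flavor, here is roughly the kind of script the car demo implies, sketched in Python since the actual demo uses Squeak’s tile scripts: every tick the car moves forward a little and turns a little, so it drives in a circle.

```python
# A rough stand-in (my sketch, not the actual Etoys tile script) for the kind
# of script a child attaches to the drawn car.

import math

x, y, heading = 0.0, 0.0, 0.0                 # the car's position and direction
for tick in range(8):
    x += 5 * math.cos(math.radians(heading))  # "forward by 5"
    y += 5 * math.sin(math.radians(heading))
    heading = (heading + 5) % 360             # "turn by 5"
    print(f"tick {tick}: x={x:.1f}  y={y:.1f}  heading={heading}")
```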
He discusses the power of parallel programs and loose coupling for building systems. We need to teach these concepts more than data structures and algorithms.
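As a toy illustration of what loose coupling looks like (my example, not one from the talk): two workers that share no state and never call each other directly, interacting only through message queues, so either side can be replaced without touching the other.

```python
# Two loosely coupled workers: no shared state, no direct calls, only messages.

import threading, queue

requests, results = queue.Queue(), queue.Queue()

def producer():
    for n in range(5):
        requests.put(n)            # send work as a message
    requests.put(None)             # sentinel: no more work

def worker():
    while True:
        n = requests.get()
        if n is None:
            results.put(None)
            break
        results.put(n * n)         # reply with another message

threading.Thread(target=producer).start()
threading.Thread(target=worker).start()

while (item := results.get()) is not None:
    print(item)                    # 0 1 4 9 16
```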
Squeak is written in 230,000 lines of Smalltalk. They think it could be 20,000. There are 59,000 methods in 3.5Mb of object code, for about 59 bytes per method. There are about 5 million objects and it was implemented by ten people. The system is self-bootstrapping, so it’s all in there. Learning how to do this ought to be part of computer science but regrettably it’s not.
Q: What are one or two things we can do to be better computer scientists?
A: A lot of the success at PARC was because we didn’t know what we were doing; specifically, we weren’t trying to make a product. The silliest thing you can find at a research university is today’s laptop. We invented the Alto, and it allowed us to do things that weren’t commercially viable until the ’80s. A research computer is two cubic feet of the fastest computing on the planet. You make it, then you figure out what to do with it. How do you program it? That’s your job! Because real thinking takes a long time, you have to give yourself room to do your thinking. One way to buy time is to use systems not available to most people.
UCLA has one computer science department, but 25 full departments of biology (not counting the medical school). Why? Biologists are smarter than we are. When things are bogging down, the best thing to do is to go create a new department. To do creative work in computing, you must get past what you think is normal. Write down the 20 things you think are true of computing and try to demolish them.
Q: Are all the video codecs in the stats you give for Squeak?
A: Yes, it’s self-bootstrapping. The problem with Java is that it’s a paper spec. Squeak is defined in Smalltalk, as a simulation. It takes less than two days to port Squeak to a new piece of hardware. This Smalltalk book shows how the VM can be created in itself. This is the metasystem.
Lisp is the most important idea in computer science. The good guys (late binders) lost in the late ’70s. The early binders won.
Good ideas don’t often scale.
Most people who graduate with CS degrees don’t understand the significance of Lisp. Alan’s breakthrough in object-oriented programming wasn’t objects; it was realizing that the Lisp metasystem was what we needed.
Q: What do you mean by “object” and “object oriented system?”
A: In 1966, the thing that knocked me over was that Sketchpad was maintaining dynamic relationships. That’s the key to building large systems. The other thing that happened in 1966 was thinking about the ARPANet: the idea of having a million machines without any central control. Simula provided a programming form of what Sketchpad was doing. My background in biology and math led me to think about algebra and tissues. Barton taught me that “the basic principle of recursive design is making the parts as powerful as the whole.” These ideas led me to an insight on November 11, 1966: the timesharing people got it right with processes, but the process was too big.
The interesting thing about the object idea is that it’s everywhere. The perversity of science is that the world doesn’t change just because we get a different perspective on it.
The secret of PARC’s success was to design the best virtual machine we could and then to build hardware that optimized that. We’ve got that concept backwards today.
via http://www.windley.com/archives/2006/02/alan_kay_is_com.shtml