This is a really long entry about some lectures I attended today. Extremely abbreviated versions follow for those who are more interested in silly stories than serious computer science stuff.

Update: Phil Windley also did a recap, and since he had access to a transcription, his is more complete.

I just got back from two lectures given by Alan Kay at the University of Utah.

The first was mostly about how the terms "computer science" and "software engineering" are misnomers for what is currently practiced by almost everyone in the computer field.

He started out by naming one of his teachers, Bob Barton, who "didn't like grad students" and "didn't like talking". Barton's philosophy was that the good ideas in computer science are already written down, that to learn them you need them in written form so you can stop, think, and re-read at will, and that lecturing only fills students' heads with stories that don't accurately reflect reality. So, he told his class, I'm not going to lecture. Instead, I'm going to teach you how everything you think you know is wrong. Alan said "he garbage collected our brains".

The theme of "everything you think you know is wrong" ran through both lectures, but I more or less missed it until the Q&A after the first one. He showed a video clip of several Harvard students (and one professor) at graduation being asked to explain why we have seasons. Every single one of them (including one who had studied astronomy and physics) said that it's because the earth gets farther from the sun in the winter. These are Harvard graduates - can we really expect to do better in our own field of "expertise"?

He then named a few examples of actual engineering and science and compared them with the way "computer science" and "software engineering" are practiced today.

The pyramids in Egypt were an example of non-engineering. The Egyptians basically brute-forced (using about 200,000 slaves) a huge pile of junk together and covered it with a pretty user interface - limestone. The analogy probably doesn't need to be spelled out, but just in case: he also mentioned that Windows has 70 million lines of code in it, with nowhere near 70 million lines of content. But they're terrified to take anything out, because they don't know what else might rely on it. All they can do is add to it.

He contrasted the pyramids with the Empire State Building. If I recall correctly, it was completed to occupancy by 3,000 people in less than eleven months. A program as huge (for a program) as the Empire State Building is (for a building) being completed by so few people in so little time is absolutely unheard of. (Yes, 3,000 people is a huge number for a software project, but I imagine that in skyscraper construction it's a small to medium-sized team.)

He also talked about jet engines and how fundamentally simple they are - any third grader can understand how they work. And they're consistently built so well that thousands of people every day strap themselves to one and go hurtling across a couple thousand miles of ocean without thinking twice. That is engineering. What we computer geeks do doesn't come close.

His "new favorite" was the pumps that ran in New Orleans when Hurricane Katrina hit. There were a handful of pumps in the city designed to keep water out in the event of a flood, but only two of them kept working the entire time. One was made in 1920, the other in 1912.
All of the newer pumps failed before the levees broke. I think there are some systems controlling trains in Europe that have been running continuously for a couple of decades, but such systems are by far the exception in software.

People with backgrounds in mathematics, physics, and chemistry (which made up the bulk of the early computer scientists) "know B.S. when they see it", he said. But computer scientists today, who generally don't receive a strong education in the hard sciences, never learn what it means to actually be an engineer or a mathematician. As he spoke about this he made a circular, cycling motion with his hands, which I took to mean that computer scientists are stewing in their own recycled bullshit: we learn it from our teachers, grow up and get teaching jobs, and repeat it to our students. Kind of like environmental scientists, but I won't go there. Not today.

Finally, he pointed out that there have been no significant advances in computer science for several decades. He has some background in biology, and for years he tried to keep up with the field. Every couple of months he'd go over the new developments, but several years ago things started moving far too fast for him to keep up. Not so in computer science. We're still mostly catching up with Lisp.

Speaking of Lisp, he named it the single most important idea in computer science.

At the end of the lecture I asked him to name the one or two most important things I could do to become a better computer scientist. His answer, in short, was to make a list of everything I think I know about computer science, and then do my best to smash every item on it. He made an analogy to visiting a different culture. At first everything looks strange, but eventually you start to see that the ways they do things are just different solutions to the same problems we face in the US. More importantly, their ways of solving those problems are no better than ours; they're just different. But in computer science, we see only the way we've been taught, and have no other perspectives.

I'm leaving out a lot of what he said about the development of printing: how a Gutenberg bible was the equivalent of an early workstation; how, about a hundred years after Gutenberg, an Italian (whose name I'm forgetting) decided that a book should be cheap and portable and started making books sized to fit in a gentleman's saddlebags (the size that persists today in paperback books, incidentally, just as railroad tracks are the same width as chariots); and how those books were the equivalent of personal computers. Before Gutenberg, a book cost the equivalent of millions of dollars in today's money - the book itself was worth more than the gems inlaid into its cover. A Gutenberg bible was cheap: it only cost two or three years of a clerk's wages. Libraries had between a dozen and a couple hundred books, and no shelves - each book had its own table, to which it was chained so you couldn't steal it. The analogy between early books and early workstations is obvious.

The second lecture was about building the $100 laptop, bringing it to children in the third world, and the impact he hoped it would have on education there. There was a lot of talk about the laptop's specs, a little about how he'd had the idea for it back in the 1950s, and some discussion of how to prevent the laptops (which would be given to the children for free) from being resold to buy food.
One thing he mentioned was personalizing the laptops so thoroughly that they would be impossible for a third party to use.

When he mentioned this, I immediately thought of the Young Lady's Illustrated Primer from Neal Stephenson's The Diamond Age. Apparently a lot of other people did too, because two people brought it up in the Q&A following that lecture. The first time, Kay gave a sort of disgusted grunt and said, "Stephenson was such a latecomer." The second time it was mentioned, he stopped and explained why it bothered him. He's friends with both McCarthy and Stephenson, and apparently there isn't a single idea in The Diamond Age (except maybe nanotech...) that McCarthy didn't have first. "If you want to read something good", he said, "go read McCarthy's papers on the subject."

Afterward we all went to the alumni building for free food. I didn't even bother trying to get some time talking to him, though I do plan to email him about getting his laptops into Morocco during their next run of 100,000 units. There was a handful of Comp Sci students there that I hung out with for a bit, and they told me about a lunch they had all attended earlier in the day with Alan Kay. Apparently one of the students said to Kay, "I don't see why you need anything more than C to write an operating system." This is like saying to Einstein, "I don't see why we need anything better than Newtonian physics to describe the universe." A good 15 minutes of humiliation followed.