Here's a list of some of the books I read while writing Lauren Ipsum, a children's novel about computer science, and its upcoming sequel. Lauren and the Jargonauts will be about how communications and the internet really work. This list is a mix of the history of technology & social change, logical intuition, and the fundamentals of computer science itself.

Neal Stephenson

The internet is the largest and most interesting artifact created by mankind. Every country, every town, every no-longer-lonely island contains a piece of it. The rest lies under the waves. There aren't many tellings of the story of how undersea cables came to be, but this piece by Stephenson is my favorite. I had no idea about the eternal war between fishermen and cablemen, about how unlikely and fragile the whole system is, the cutthroat deals that go into every thread across the water, or why the British dominate the cable industry so hard. It's a book-length article that appeared in Wired, available for free online.

Arthur C Clarke

Many of the same basic facts of the story of the undersea cable network are given in this book, but in more technical detail, e.g. every major development in repeater technology. The second half wanders quite a bit into personal reminiscence, but perhaps that's expected from the guy who invented satellites. While gathering links for this post I discovered that it's pretty hard to get a copy of this book. I found mine in the dollar bin at my local used bookstore. Go figure.

John Steele Gordon

This narrative focuses on Cyrus W Field, the man who single-handedly convinced everyone that a telegraph cable across the Atlantic was possible. This book gives nearly zero time to the technology, which was disappointing. On the other hand, the other two books gave zero attention to Field.

Thomas Kuhn

A Googler recommended this to me after reading Pascal's Apology. Scattered throughout the sometimes dense language are quotes designed to piss scientists off:

Mopping-up operations are what engage most scientists throughout their careers... The scientific enterprise as a whole does from time to time prove useful, open up new territory, display order, and test long-accepted belief. Nevertheless, the individual engaged on a normal research problem is almost never doing any one of these things.

He has a point. The gist is that science is a social process that progresses in stages. First you have a pile of facts. There is no mental framework, or paradigm, to tell people which ones are important, or to whom. So a motley crew of interested souls pick a few facts and try to explain them. Sometimes they get lucky, but almost never in the way they expect.

For example, early scientists researching electricity studied different phenomena at random. Some played with bits of chaff, others with balls of sulfur and cats. Everyone had a pet theory and none of them were correct. The ones who imagined it was an invisible fluid decided, by analogy, to try to catch some in a jar. This borders on magical thinking. But after many failures, and by sheer dumb luck, it happened to work. Once the fluidists could store electricity in a Leyden Jar, they knew they were onto something. They were still wildly wrong from our perspective, but just right enough to point them in better directions.

To be accepted as a paradigm, a theory must seem better than its competitors, but it need not, and in fact never does, explain all the facts with which it can be confronted... Throughout the eighteenth century those scientists who tried to derive the observed motion of the moon from Newton’s laws of motion and gravitation consistently failed to do so.

Once there is a good-enough paradigm to hang your facts on, the normal business of science can commence: the mopping up, the building of devices to observe phenomena that the paradigm says must exist. Filling in the details. It all becomes a puzzle, and attracts the sort of people who like both a mental challenge and the reassurance that there actually is an answer, one that fits pre-determined rules.

Kuhn's book goes into great detail about the mechanisms of science, and is much more even-handed than the quotes suggest. His point is that science evolves via punctuated equilibrium. For long periods there is an overarching paradigm that drives what new facts are discovered, as well as which ones get ignored as instrument error. Then along comes a paradigm shift that changes what a fact is, and the game starts anew.

After reading this book I'm more convinced that computer science is in very early days, almost pre-paradigmatic. For example, why isn't there a standard diagram of how different data structures are related? Every one of the millions of people who've rubbed up against them has had to develop their own ways to cope, and it shows.

Clifford Pickover

Summaries of hundreds of important discoveries in math, with very pretty pictures. I use it mainly as a way to generate ideas for characters or situations. (I also use the Dictionary of Algorithms and Data Structures.) For instance, glancing through the book I came across Hilbert's Grand Hotel. From there I came up with Prince Hilbert, son of Count Modulo of the Infinite Isles, who needs to find an empty Isle to rule.
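If you haven't met Hilbert's Grand Hotel before, the paradox is simple enough to sketch in a few lines of code. This is my own illustration, not from Pickover's book: the hotel has a room for every natural number and every room is occupied, yet a new guest can always be accommodated by asking each guest to move up one room.

```python
def room_after_new_guest(current_room: int) -> int:
    """Guest in room n moves to room n + 1, freeing room 0 for the newcomer."""
    return current_room + 1

# The first few occupied rooms, before and after the shift.
before = list(range(5))                               # rooms 0..4, all full
after = [room_after_new_guest(n) for n in before]     # rooms 1..5, room 0 now free
print(before)  # [0, 1, 2, 3, 4]
print(after)   # [1, 2, 3, 4, 5]
```

Because there is no "last room," the shift never runs off the end, which is exactly what makes infinite sets behave so differently from finite ones.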

Jeremy Kubica

I really like this book. It's a collection of stories adapted from Kubica's blog of the same name, expanded, updated, and woven into a complete adventure story. "In rare cases, there might even be laughing involved." The morals of the stories range from basic algorithms, to deep computational intuition, to pragmatic lessons from software engineering.

Charles Petzold

This is a classic book, and informative. I think it makes the mistake of going through the system "bottom up", from switches to circuits to binary logic, etc. Many people love it but when I put on my beginner's hat I found my attention wandering too much.

Fazlollah Reza

I found a first edition of this book on a sidewalk seller's blanket in Venezuela. It was paired with another book in a series about electrical engineering. To my lasting regret I bought only this one. It was written in the late 1950s, not long after Claude Shannon blew the doors wide open with his theory about coding and error correction. This book was one of the first attempts to summarize the state of the field, and it has aged well.

John Freely

When taking in the long view of history, individual ideas are less important than how they move and mutate. Freely's book is a very good overview of the development of the scientific tradition from its birth in a Greek colony off the coast of Turkey, to its heyday in Athens, its "third act" in Alexandria on the northern coast of Africa, its assimilation by various Muslim empires, and its later fusion with what was left in scattered monasteries in Europe. Restless knowledge thrives at the edges of civilization, not the center.

I especially love that this book gives equal time to triumphs and wrong turns. Sure, Aristarchus got the structure of the solar system right long before Copernicus, and you can shake your head at all the silly people who didn't listen to him. But it's important to remember that the other theories sounded just as reasonable. Copernicus's initial model didn't give more accurate predictions than Ptolemy's; people liked it because it was a bit cleaner. And as much as Tycho Brahe admired Copernicus he went to his grave believing that only Mercury and Venus orbited the Sun, which orbited a stationary Earth.

Growing up in the West it's easy to get the impression that Muslims merely kept great Greek thought safe until civilization returned. This is not just wrong, it's insulting. Aladdin does a good job of reviewing the original contributions of Muslim scientists, which were in turn adapted by Newton, Harvey, and everyone else in Europe, though not everyone gave credit.

Another interesting fact: the Byzantine and Muslim empires both managed to shoot themselves in the foot with spasms of religious fundamentalism. In Byzantium, Greek science was associated with dirty old paganism and snuffed out as Justinian solidified his hold on power. In the Islamic world, al-Ghazali's popular attacks on the scientific tradition drove people back to alchemy and mysticism. Eventually the Mongols sacked Baghdad and burned all the books.

It gives one pause, when reading stories like this, to realize how much of our history and culture was rescued by people running for their lives, and how much more has been lost forever.

Armand Mattelart

I often buy dollar-bin books solely on the title. This one turns out to be an excellent review of how technology, demography, economics, politics, and trade flowed and changed around the world as that world slowly learned how to talk to itself. I didn't know, for instance, that all the major news wires were founded around the same time, just after the birth of the global telegraph network they depended on. They also made an agreement early on to divvy up the world into oligopolistic chunks, which is why Agence France-Presse is big in Latin America but not in the States.

Henry Petroski

I've had this slim little book for over twenty years. It's a wonderful and sometimes frightening story about how and when and why things go wrong in physical engineering. Every time a plastic dongle snaps or a metal hinge falls victim to fatigue I think about Petroski making charts of the failure rates of his child's toys and his kitchen knives. It was written in the late 80s or early 90s, and gives an amusingly short dismissal of computer modeling. Apparently it takes away an engineer's feel for structure.

David Flannery

This one was recommended to me by a redditor who had read Lauren Ipsum. The square root of two is a deep and subtle subject. You could write a book about it, but David Flannery already has. (There are two other books with the same title. Be careful; one of them is just a list of the first 5 million digits!)

It starts with a seemingly simple question: what is the square root of 2, as in the actual value? Like any good story it leads not to the simple answer you expect, but to a profoundly different way to think about the question itself. The story is written as a dialog between a master and a student, in the tradition of Galileo's Salviati & Simplicio. This style can be annoying at times because Flannery didn't write the voices to be different enough. The student makes rather intelligent leaps of deduction several times in a row, and rarely really goes off on a tangent the way a real student might.
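To give a taste of where that question leads (this is my sketch, not Flannery's presentation): you can generate fractions that get arbitrarily close to the square root of 2 using its continued-fraction convergents, yet no fraction ever squares to exactly 2. That gap between "as close as you like" and "equal" is the profound shift in thinking.

```python
from fractions import Fraction

def sqrt2_convergents(n: int) -> list:
    """First n continued-fraction convergents of sqrt(2): 1, 3/2, 7/5, 17/12, ..."""
    p, q = 1, 1
    out = []
    for _ in range(n):
        out.append(Fraction(p, q))
        p, q = p + 2 * q, p + q   # classic recurrence for sqrt(2)'s convergents
    return out

for c in sqrt2_convergents(6):
    # Each convergent squares to something close to 2, but never exactly 2.
    print(c, float(c * c))
```

The squares march toward 2 from alternating sides without ever landing on it, which is one way of seeing that the square root of 2 cannot be a ratio of whole numbers at all.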

Brian W Kernighan

I was surprised and pleased to learn that Brian Kernighan, one of the most famous programmers in the world, wrote an introductory computer science text and self-published it. The subtitle is What a well-informed person should know about computers and communications, and it is exactly that. Like Petzold's Code it is structured bottom up, but it's shorter, less strict about the progression, and overall is better-written. There is a simple metaphor or example approximately every three paragraphs, which helps enormously to anchor in the reader's mind not just the mechanism but the relative importance of whatever he's discussing.

Willard Quine

I found this one in a second-hand clothing shop in London. I never knew Quine was such a good writer, but I haven't finished this book yet. I have a hard time slogging through formal logic even though I know there is really good stuff there.

Neville Dean

Pithy and good. I took it on vacation and filled a small sketchpad with notes and equations, thumbfingered and slow, but happy to stretch my mind a bit. About halfway through I finally understood why Carroll's What the Tortoise Said to Achilles is actually a proof of infinite regress buried in the mechanism of modus ponens. I liked the idea so much that Ponens became a character in Ipsum along with his cousin Tollens.
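The Tortoise's trick is easy to mechanize. In this toy sketch of mine (not from Dean's book), every attempt to justify the inference from A and "A implies B" to B by writing the inference rule down as another premise just produces a new, longer premise that itself needs justifying. The premise list grows forever and B is never reached.

```python
def tortoise_premises(depth: int) -> list:
    """Build the ever-growing premise list the Tortoise demands of Achilles."""
    premises = ["A", "A -> B"]
    for _ in range(depth):
        # "If all the premises so far hold, then B" becomes yet another premise...
        premises.append("(" + " and ".join(premises) + ") -> B")
    return premises

for p in tortoise_premises(2):
    print(p)
```

Modus ponens has to live outside the list of premises, as a rule you simply use. The moment you demand it be written down as one more premise, the regress begins.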

Paul Lockhart

This is a phenomenal new book, published a couple of weeks ago. Paul Lockhart became famous for Lockhart's Lament, a privately-circulated polemic against the current math education regime. Measurement is his vision for what it should be.

It's written as a conversation between him and the reader. All along he stresses that math is an art form consisting of asking questions, posing creative answers, and then finding even more creative ways to prove they are true. Or false.

What I really love about this book is the ha-ha-only-serious mood; he feels genuine joy every second he spends doing real mathematics. He cheerfully does battle with the fundamental ideas of the universe, knowing full well that he knows nearly nothing at all. It truly is his favorite thing in the world, second only perhaps to getting other people to feel the same joy.

Your first essays in this craft are likely to be logical disasters. You will believe things to be true, and they won't be. Your reasoning will be flawed. You will jump to conclusions. Well, go ahead and jump. The only person you have to satisfy is yourself... You will declare yourself a genius at breakfast and an idiot at lunch... The real difference between you and more experienced mathematicians is that we've seen a lot more ways that we can fool ourselves.

Douglas Hofstadter

The book that every programmer has heard about, many own, and some have even read. I have to confess that I had this book for years before making it all the way through. I didn't understand it the first few times through, either. It alternates between chapters with exercises and discussion, and dialogs between Achilles & Tortoise.

This book gave me the idea to steal those characters for my own book. It turns out Hofstadter stole them from Lewis Carroll so I guess it balances.

Hans Magnus Enzensberger

This is fundamentally a good book, and lushly illustrated. It goes a long way toward introducing mathematical intuition, not just facts. My major complaint is that it makes up a whole set of new Jargon instead of using the real terms, e.g. 4 vroooom instead of 4 factorial, and hangs many puns on top of them. All Jargon, even your own supposedly easier Jargon, is going to seem random to most people. I mean, radish instead of square root? You might as well use the real words.

John Seely Brown & Paul Duguid

This book masterfully describes and then demonstrates the flaws in the techno-utopian world envisioned in the late 1990s. The computer is the least important bit in the equation. What matters are the people who use the technology, and how.

Malba Tahan

A fabulous, lyrical story about math and reason. Written in the style of the Arabian Nights, you wouldn't think that it would teach you much in such a short space. But it does, and makes you smile too. I do wish the author spent more time on the mathematical reasoning, but it was intended as a puzzle book so giving away the secret was not the point.

James Burke

I found myself re-watching the accompanying TV series a lot while writing Ipsum. Burke is a master at explaining how a piece of technology works and why it's important. The Pinball Effect is a later book in the same vein. The margins contain a remarkable cross-index of all of the ideas, people, places, and technologies mentioned. This allows you to jump from one page to any other in a few hops, cutting your own path through the web of technology. I don't use the phrase tour de force often, but this book is one.