Once upon a time, a "computer" was a human being, usually female, who did calculations set for her by men in suits. Then, in the 1940s, something happened: computers became machines based on electronics. The switch had awesome implications; in the end, it spawned a technology that became inextricably woven into the fabric of late-20th- and early-21st-century life and is now indispensable. If the billions of (mostly unseen) computers that now run our industrialised support systems were suddenly to stop working, our societies would very rapidly grind to a halt.

So the question of where this Promethean force sprang from is an intriguing one, as interesting in its way as the origins of the industrial revolution. And, as with most such things, we have a creation myth – which starts with Alan Turing and his idea of "a single machine which can be used to compute any computable sequence" and then forks into two versions. One is British and goes via the "Colossus" computer built by Turing's wartime colleague, Tommy Flowers, for use at Bletchley Park to enable the cracking of the German Lorenz cipher. The other version is American and starts with the construction of the ENIAC machine at the University of Pennsylvania in 1943 and continues through the industrialisation of that technology by companies such as Univac and IBM, which made the huge mainframe computers that powered and shaped the industries of the mid-20th century. The two versions then converge with the arrival of Xerox, Apple, Intel and Microsoft on the scene, and we eventually arrive at a world in which nearly everything has a computer in it somewhere.

In a remarkable new book, Turing's Cathedral, intellectual historian George Dyson sets out to give this creation myth a revisionist makeover. He focuses on a small group of mathematicians and engineers working on the hydrogen bomb, led by John von Neumann at the Institute for Advanced Study (IAS) in Princeton, New Jersey (but not at Princeton University), who not only built one of the first computers to realise Turing's vision of a universal machine, but – more importantly – defined the architectural principles of a general-purpose "stored program computer" on which all succeeding computers were based. Dyson's argument, crudely summarised, is that the IAS machine should be regarded as the fons et origo of the modern world rather than the ENIAC or Colossus machines that preceded it.

It sounds technical – and it is – but actually Dyson's account of how the Von Neumann machine was conceived and built is a beautiful example of technological storytelling – as good, in its way, as Tracy Kidder's The Soul of a New Machine (about the creation of a Data General minicomputer) or Steven Levy's Insanely Great (which told the story of how the Apple Macintosh came to be). But because George Dyson is a kind of undercover polymath, Turing's Cathedral is much more than a chronicle of engineering progress: it includes fascinating digressions into the history and physics of nuclear weapons, the fundamentals of mathematical logic, the mathematical insights of Hobbes and Leibniz, the history of weather forecasting, Nils Barricelli's pioneering work on artificial life and lots of other interesting stuff.

Accidents of birth and temperament gave Dyson a head start in this particular venture. His father, Freeman, is a celebrated theoretical physicist; his mother, Verena Huber-Dyson, is a mathematician; and his sister, Esther, is a prominent technology investor and commentator. As a child, George lived at the IAS because his father was the occupant of one of its prized professorial chairs. He ran away from this high-octane environment when he was 16 and wound up in British Columbia building kayaks to an ancient design. In the years since then he has oscillated between boat-building and exploring the history of technology. His 1997 book, Darwin Among the Machines, is one of the most thoughtful books I've read on the implications of distributed, networked computing power.

Turing's Cathedral is a worthy successor to that earlier book. Having finished it, I emailed George Dyson to explore some of the ideas in it that had intrigued me. Here is an edited transcript of our online conversation.

JN Why did you embark on the book? It was a huge undertaking.

GD I had no idea how much work it would be when I started! But I believed that the role of the engineering work performed at the IAS was under-appreciated. And although I was using computers, I did not truly understand them, and the way to truly understand something is to understand how it began.

JN But it's not just the engineering work that's been under-appreciated. After I'd finished the book, I went back to look at the accepted "popular" histories of digital computing, and it seems that the IAS machine has effectively been airbrushed out of the picture. In most accounts, the story starts with the ENIAC machine in Pennsylvania and the Colossus machine built at Bletchley Park. But these were not stored-program machines and so were not really ancestors of the computers we use today, whereas the IAS machine was. So were you also trying to rescue Von Neumann's architecture from the oblivion accorded it by popular history?

GD There are several levels on which to answer this. First of all, the book is not about the "first" computer. It is an attempt to tell the story of what really happened, not to establish who (except for Turing, in the mathematical sense) was "first".

Secondly, there was an important twist to the story: the Von Neumann group designed the IAS machine, and developed the codes to run on it, and were then delayed by hardware problems for a couple of years. And during that period, while under great pressure to start running bomb calculations, they realised that they could go back and reconfigure the ENIAC as a true stored-program computer, so that it would run the kinds of codes they had written for the IAS machine. And this worked really well – so well that, like the proverbial time-traveller who goes back and kills his grandmother, they may have diminished their own prominence as pioneers. "Oh, that was already done on the ENIAC," some people say!

The third level, as I hinted at in several places, is that for a long time the IAS actively avoided drawing attention to what had happened there. Partly this was distaste for engineering, and partly it was reluctance to get drawn into the ENIAC patent dispute (the largest case in US legal history, at the time). Personally I think it was also at least partly a result of the H-bomb work. Oppenheimer was in many ways a willing martyr to the public perception that he had opposed the development of the hydrogen bomb. It didn't fit with this public image to draw attention to the fact that much of the critical numerical work that led to the H-bomb had actually been performed, under his directorship, at IAS.

JN How long did the book take to write?

GD It is now exactly 10 years since I decided to go to Princeton and start digging up material, and (thanks to Charles Simonyi) was invited to spend a year at IAS. I love doing research, I enjoy editing, but I have great trouble forcing myself to do the writing that is necessary in between. I cannot write at my boat-building workshop, because of the distractions, and I cannot write at home, because there are no distractions. So I end up going back and forth a lot, and eventually something begins to take form. From there it is all downhill, with something like 30 rewrites before anything is ready for print. The sobering thought is that the Bigelow-Von Neumann group conceived, designed, built, and began solving serious problems with their computer in less time than it took me to write about it!

JN Where did the title come from?

GD I owe the title to Alan Turing's views (as he expressed them in 1950) on how we should approach true machine intelligence: "In attempting to construct such machines we should not be irreverently usurping His power of creating souls, any more than we are in the procreation of children: rather we are, in either case, instruments of His will providing mansions for the souls that He creates."

In 2005 I visited Google's headquarters, and was utterly floored by what I saw. "We are not scanning all those books to be read by people. We are scanning them to be read by an AI," an engineer whispered to me. And at that moment, I started thinking, "This isn't Turing's mansion, this is Turing's cathedral!" And that became the title of the book.

JN You write very intimately about John von Neumann. Does this imply that you knew him well as you were growing up? Or is it just a reflection of the extent of your research into him and his contemporaries?

GD This intimacy is mostly a result of being granted access by the Von Neumann family to two decades (1937-1957) of private correspondence between Johnny and Klári von Neumann (née Dán): stacks of handwritten letters, recording both technical and intimate details of everything that was going on in their remarkable lives in those remarkable times. The power of handwritten letters is amazing (and I owe thanks to Gabriella and Béla Bollobás of Cambridge for their careful translation of the Hungarian sections – the letters drift back and forth between English and Hungarian, according to the subjects being discussed).

Von Neumann had essentially left the IAS for his work as atomic energy commissioner in Washington by 1955, when I was two years old, so he was not a figure in my childhood. One of my earliest memories, however, is of being taken to a cocktail party and being placed in a crib in a child's bedroom, and I remember standing unhappily at the bars of the crib, unable to escape. A very cheerful, friendly man came into the room and spoke to me, and gave me a sip of his drink. Maybe this was Von Neumann, though probably it was someone else!

JN The book made me realise something that I hadn't properly understood up to now – the intimate relationship between military requirements and the origins of computing. This is something that I guess most people nowadays don't know: they think computing began with IBM or maybe with Bill Gates. And your story is suffused with the complex inter-relationships between warfare and applied mathematics.

GD We may well owe the original development of the human mind to the development of command buffers for storing the sequence of movements necessary to hit a moving animal (or a fellow human) with a rock – with language developing as an opportunistic adaptation of those idle command buffers for something else. So, yes, poetry and violence were probably intertwined from the start.

This inter-relationship is epitomised by what happened at Los Alamos: if the scientists designed the weapons, they could do all the pure science they wanted with the rest of their time, no questions asked. And we owe most of the great developments of the past century, from computing to our understanding of genetics, to work that originated in such military labs.

JN Another theme that comes over strongly relates to GH Hardy's famous misconception about the "uselessness" of pure mathematics. You trace very clearly the progression from Hilbert to Gödel to Turing to Von Neumann to the IAS machine. My guess is that nobody at the time could have supposed that arguments about the foundations of mathematics would ever have a practical outcome.

GD Yes! It is quite astonishing, for instance, that Turing, who was more or less an outcast, except among a small group of fellow logicians, during the two years he spent in Princeton, was recently voted the second-most influential alumnus of Princeton University (and this from a field going back to 1746!).

JN Another significant moral of the tale is the importance of open publication. The documentation for the IAS machine was all published, which meant that the machine could be cloned elsewhere (and indeed was by commercial companies such as IBM, as well as other research institutes), whereas the guys who built the ENIAC lodged patents, started a company and in due course became enmeshed in litigation. In our time, the computing industry is increasingly enmeshed in the same kinds of patent wars, so maybe there's a lesson here for us. Is there a correlation between openness and innovation?

GD Yes, indeed. And what is amazing – and would horrify Abraham Flexner [the founding spirit of the IAS] – is that academic institutions are now leading the way in proprietary restriction on the results of scientific research! Of course there are arguments that this will fund more science, but those arguments do not make sense to me. Again, back to the original agreement made between Oppenheimer and the army at Los Alamos: the weapons would be secret, but the science would be open. And the more we backtrack on that agreement (whether with the military or with industry) the more we lose.

The inner sanctum of the IAS is the climate-controlled Rosenwald rare book room in their main library, which holds priceless classical manuscripts and later texts. A full set of the bound volumes of the Electronic Computer Project Interim Progress Reports is now shelved there, next to first editions of Newton and Euclid, where they belong.

• This article was amended on 1 March 2012 to make it clear that the "Colossus" computer was built by Dr Thomas Flowers, based on Turing's theoretical work.