The International Business Machines Corporation is celebrating its 100th birthday this year. To mark the occasion, the company has released a video lecture on its history that contains at least one very contestable assertion: that IBM gets credit for the personal computer.

"There was a remarkable breakthrough that wasn't about a chip or a thing, it was rather the integrated whole that mattered," explains IBM Innovation Vice President Bernie Meyerson. "And the thing that IBM did that changed history, frankly, and all of us are familiar with, is we invented the personal computer."

That's the full quote. Apparently Meyerson didn't have time to back up this assertion, and the lecture quickly moves on to celebrate IBM's release of the Selectric typewriter, magnetic stripes, and bar codes. The dubious PC claim was caught, however, by prolific technology journalist Robert X. Cringely and a small army of tech bloggers.

"IBM didn't invent the personal computer but they don't know that," Cringley titles his blog post. "This sin shall not go unpunished. Among his milestones IBM's VP of Innovation completely forgets to mention the company having helped automate the Third Reich."

Only in context

That last quote refers to the revelations contained in Edwin Black's disturbing book IBM and the Holocaust, which is another interesting topic. But we are going to focus on a happier question here: who actually invented the PC? The best answer is found in Professor Paul E. Ceruzzi's excellent study, A History of Modern Computing.

Henry Edward Roberts, designer of the Altair 8800 computer, "deserves credit as the inventor of the personal computer," Ceruzzi asserts. "Although calling Roberts the inventor makes sense only in the context of all that came before him . . . he does deserve the credit."

C'mon IBM, have a heart. Even Roberts' Wikipedia page acknowledges him as the engineer who developed "the first commercially successful personal computer." When he died last year, Microsoft's Bill Gates and Paul Allen praised him as "the father of the PC."

"The day our first untested software worked on his Altair was the start of a lot of great things," their statement concluded. "The Altair ultimately failed in the marketplace, but it sold thousands of units and jump-started the entire personal computer industry," Ars' Jeremy Reimer notes in his history of personal computer market share figures.

But Ceruzzi's book explores another important theme, broadly suggested by his comment: the PC, like almost all crucial innovations, was really invented by a lot of people. Let's follow the narrative in Ceruzzi's chapter on the personal computer to get a sense of how complex that achievement was.

Spacewar and the PDP-10

"Ready or not, computers are coming to the people," wrote Stewart Brand in Rolling Stone in 1972. "That good news, maybe the best since psychedelics." Brand penned this very sixties-ish comment after watching various techies at Stanford's Artificial Intelligence Laboratory play Spacewar on a Digital Equipment Corporation PDP-10.

The PDP-10 was a big machine that cost around $500,000. Surrounded by its necessary accoutrements, it pretty much sucked all the oxygen out of your typical computer room. It was also a workhorse of the early ARPANET, the forerunner of the modern Internet.

But the PDP-10 was also miles ahead of previous machines in terms of convenience and cost. And thanks to the innovation of time-sharing (multitasking systems that switched among users in microseconds, allowing hundreds of programmers to use the same mainframe simultaneously), the PDP-10 created a new user experience.

"Of all the early time-sharing systems, the PDP-10 best created an illusion that each user was being given the full attention and resources of the computer," Cerruzi writes. "That illusion, in turn, created a mental model of what computing could be."

In addition, the computer's TOPS-10 operating system allowed users to do something that we take for granted today—store data blocks in a terminal. This also gave programmers the "illusion" that they were in personal control of the machine.

Calculators

The small handheld calculating devices of the early 1970s, built around ever more sophisticated integrated circuits, also made personal computing seem more possible. Earlier calculators from Hewlett-Packard and Olivetti cost just under $5,000 and $3,000 respectively. The Bowmar company rocked the 1971 Christmas market with the Bowmar Brain, which cost a mere $250.

From this point onward, calculators got cheaper and more powerful; by 1976, some cost as little as $50. The more expensive models could perform previously unheard-of tasks: logarithms and complex trigonometry. "Within a few years the slide rule joined the mechanical calculator on the shelves of museums," Ceruzzi notes.

They were also programmable. In the 1970s, my father, an electrical engineer, bought a relatively expensive device that, if you correctly followed a long ritual of number-and-letter inputs, allowed you to play a simple game. What I did not realize at the time was that all that initial input amounted to keying a computer program into an interpreter. The code, unfortunately, could not be saved, and therefore had to be re-entered each time I wanted to play. But someone had gone to the trouble of creating this process on a little handheld gadget.
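The keystroke-program idea described above can be sketched in miniature. The operations and their encoding here are hypothetical, invented purely to illustrate how a sequence of entered steps becomes a program that an interpreter replays; they are not the instruction set of any real calculator:

```python
# A toy model (hypothetical encoding) of a 1970s programmable calculator:
# each keystroke entry is stored as a (operation, operand) step, and an
# interpreter replays the stored steps against a running value. Nothing
# persists between sessions -- power off, and the "program" is gone.

def run(keystrokes, x):
    """Interpret a list of (operation, operand) keystroke steps."""
    for op, n in keystrokes:
        if op == "+":
            x += n
        elif op == "*":
            x *= n
        elif op == "-":
            x -= n
        else:
            raise ValueError(f"unknown keystroke: {op}")
    return x

# Keying in the "program" one step at a time, then running it:
program = [("+", 5), ("*", 2), ("-", 3)]
print(run(program, 10))  # ((10 + 5) * 2) - 3 = 27
```

The point of the sketch is the separation: the stored keystrokes are data, and only the interpreter gives them meaning on each run, which is why losing the stored steps meant re-entering the whole ritual.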

Calculator users, as much as the more widely celebrated mainframe "hackers" of computer lore, created the market for PCs. There were tens of thousands more calculator users than hackers, Ceruzzi observes. "Their numbers—only to increase as the prices of calculators dropped—were the first indication that personal computing was truly a mass phenomenon."

Microprocessors

It was the creation of a device that integrated all the functions of a computer's central processing unit (CPU) into a single chip that took these innovations to the next level. Ten years before the invention of the PC, Gordon Moore, later a founder of Intel, observed that as innovators discovered new ways to stuff more and more circuits onto a single integrated circuit, the day approached when one processor could perform all the functions of a mainframe.

Intel pioneered its microprocessor for a Japanese calculator firm, Busicom, and assigned Marcian E. "Ted" Hoff to the account. Inspired by the PDP-8, an early "minicomputer," Hoff saw that fewer chips with more general logic power would do the trick: subroutines would handle specific tasks, supported by separate ROM, RAM, and I/O chips, and then return control to the main program. The resulting patent, "Memory System for a Multi-Chip Digital Computer," was authored by Hoff, Stanley Mazor, and Federico Faggin of Intel.

Now all the capabilities and concepts were in place. The Altair had two predecessors. In 1973, French entrepreneur Thi T. Truong released the $2,000-a-unit Micral. But his company never saw the full commercial potential of the device, and sold it primarily for industrial purposes.

Intel also offered software for writing new code on its microprocessors. The company hired a California teacher named Gary Kildall to write the PL/M language for its Intellec-8 development system. With this, Intel "had in fact invented a personal computer," Ceruzzi writes. "But the company did not realize it."

But amateurs did. They came up with a host of "home brew" systems on smaller machines. They had names like the Mark-8 and TV-Typewriter. At the same time, Hewlett-Packard released a programmable calculator. It was in this context that the Altair 8800 emerged.

Less than $400

It's difficult today to imagine the Altair 8800 as the founding PC. The machine had no keyboard. It had no video monitor. It lost its data when you shut it off.

But when it was advertised in the January 1975 edition of Popular Electronics, a critical mass of readers quickly saw that they could adapt the device to their computing needs. Its Intel 8080 microprocessor could address far more memory than earlier chips and permitted much more extensive use of subroutines. And the Altair offered an "open bus" that let users connect storage, video display, and alphanumeric devices, often built by the users themselves.

"So while it was true that for $400 hobbyists got very little," Cerruzi explains, "they could get the rest—or design and build the rest. Marketing the computer as a bare-bones kit offered a way for thousands of people to bootstrap their way into the computer age, at a pace that they, not a computer company, could control."

The tail end of the Altair story is well known. Even though critics dismissed the BASIC programming language as a "toy language" that encouraged bad programming practices (thanks to its evil GOTO command), Roberts opted for BASIC because of its simplicity. He found two young programmers at Harvard's computing center who wrote a version of it for him on a PDP-10, working from the written specifications for the Intel 8080.

One of them, William Gates, went on to cofound Microsoft, which made software for the Altair revolution, including a version of BASIC for a far better machine: the soon-to-arrive Apple II.

IBM arrives

IBM did eventually join this exploding market. Its Personal Computer was released in 1981, and the company quickly dispelled the myth that huge size prevented innovation. The IBM PC could be bought with word processing software and, for its time, a very fast spreadsheet program called Lotus 1-2-3.

"This combination of the IBM Personal Computer and Lotus 1-2-3 overtook Apple in sales and dispelled whatever doubts remained about these machines as serious rivals to mainframe and minicomputers," Cerruzi observes.

But back to the original question: who invented the PC? The answer is Ed Roberts, but also many others. In a sense, the PC was pioneered by everybody who played Spacewar, and by the programmers who made time-sharing systems their online homes. The machine was furthered by consumers who bought a 1970s-era calculator and pushed the device's capabilities to the limit, or who did the same with Intel processors.

And while Roberts built the Altair 8800, it was Altair users who proved its capabilities—software and hardware developers who painstakingly expanded the machine's potential.

In any event, if you take VP Bernie Meyerson's commentary very literally, he's right. After all, IBM did invent the IBM PC. But there's a deeper observation embedded in his remarks. "We"—everyone who was there and participating at the time—invented the personal computer. The PC was indeed "a remarkable breakthrough that wasn't about a chip or a thing, it was rather the integrated whole that mattered."

Just one correction: the "integrated whole that mattered" arrived on the scene long before Big Blue did.