The Manhattan Project, Silicon Valley, the World Wide Web. Wherever you look in the information age, Vannevar Bush was there first.

Vannevar Bush is a great name for playing six degrees of separation. Turn back the clock on any aspect of information technology - from the birth of Silicon Valley and the marriage of science and the military to the advent of the World Wide Web - and you find his footprints. As historian Michael Sherry says, "To understand the world of Bill Gates and Bill Clinton, start with understanding Vannevar Bush."

Bush's best years - he was born in 1890 - came before professors were millionaires and venture capitalists were presidents' pals. Almost forgotten today, he essentially invented the world as we know it: not so much the things in it, of course, but the way we think about innovation, what it means, and why it happens.

Bush started small. In the 1930s, as a professor of electrical engineering at MIT, he designed what were then the world's most powerful computers: room-sized mechanical devices that took days just to set up for a new problem. When those contraptions were supplanted by digital ones beginning in the early 1940s, he envisioned a revolutionary personal information machine that would store and retrieve not just all essential human knowledge, but its owner's specific memories.

The device, which foreshadowed both the PC and the Web, was just one of Bush's many seminal contributions. During the early 1940s, at the behest of then President Roosevelt, he led the drive to build the first atomic bomb, organizing the Manhattan Project and setting the stage for every US Big Science project from the H-bomb to the Moon race and Star Wars. He conceived the National Science Foundation and the Advanced Research Projects Agency, helping guarantee US supremacy in cutting-edge technologies by judiciously channeling federal funds to new frontiers.

Bush was also among the first to see the importance of venture capital and the way risk-taking inventors, drawing on top-flight universities, could spawn whole new industries - and, in the process, destroy the inefficient corporate oligarchies that ruled America from the turn of the century until the 1980s. At MIT, he began forging research partnerships with local companies and later cofounded Raytheon, then a radio-tube supplier, today a defense electronics giant.

And while laying the groundwork for the high tech Route 128 corridor around Boston, he made what may be an even more crucial contribution to the industrial history of this century: he helped create Silicon Valley by instilling in one of his graduate students, Frederick Terman, a belief that regional economies would someday depend on a strange brew of risk capital, hard-charging entrepreneurs, and dreamy academics. After World War II, Terman went off to Stanford - and played a pivotal role in engineering the academic-business partnerships that gave rise to what is now the greatest concentration of high tech power in the world.

But if Bush's historic influence is forgotten or misunderstood, his technical inspiration is not. Even before his death in 1974, many on the cutting edge of computing considered him the godfather of the information age, a gifted seer into the future wrought by computers and electronic networks. Doug Engelbart, who invented the mouse and helped launch the Internet's forerunner, the Arpanet, credits Bush with awakening him to the potential of computers to manage information, not just crunch numbers. For Engelbart and a legion of other leading-edge engineers, Bush's 1945 Atlantic Monthly article, "As We May Think," is a foundation text. "It's our bible," says San Francisco software designer Z. Smith, who was handed a copy a decade ago as a fledgling engineer at Xerox PARC.

"As We May Think" describes a device - Bush called it a "memex" - that was meant to tame the then-novel problem of information overload by enhancing human memory (hence its name). Bush envisioned it as a universal library, relying on microfilm to store vast amounts of text, crammed onto a desktop. Bearing a striking resemblance to the personal computer, the memex promised the added benefit of letting its owner link together disparate pieces of information, thus automating a process of retrieving associated ideas and data. These personal associations, or "trails," could be shared among people, Bush thought, even passed down from parent to child, giving their creators a measure of immortality.

The birth of the PC in the mid-1970s brought Bush renewed attention. Software designers took off on Bush's ideas of associative trails. Ted Nelson, who popularized the notion of hypertext, thanked Bush for inspiration. And the rise of the Internet cemented Bush's reputation as a prophet of cyberculture, with some enthusiasts even arguing that "As We May Think" laid the intellectual seeds for the World Wide Web.

That influence continues today. "Bush's vision is extremely relevant," says Andries van Dam, a professor of computer science at Brown University. "And the core of that vision hasn't been realized yet. So you can't just say, 'Been there, done that.'" Compared to Bush's ideal, van Dam points out, "the Web is embryonic. Its retrieval systems, for instance, are incredibly primitive. The mechanisms are disgusting. Bush talked about the amplification of the human mind. We don't have that today. Even the search engines on the Web do everything by brute force, rather than retrieving personalized links laid down by the user, which is why you get so much junk."

Finding useful information amid the junk is the great technical problem of the moment. "We are drowning in information," declared Interactions, the journal of the Association for Computing Machinery, in a tribute to Bush last year, "while precious little is in drinkable form. Bush knew a computer connected to a global information network could solve a problem that, in 1945, barely existed yet. We are just now learning how."

Some ambitious efforts to tame the Web's chaos are avowedly inspired by Bush. At Twisted Systems Inc. in Providence, Rhode Island, engineer Gregory Lloyd is designing better ways to record a user's associations between different Web sites. "There are Web tools that manage bookmarks, that help you find your place," Lloyd says. "Bookmarks are a start. But then the problem is managing your bookmarks. They can degenerate into a slush pile, which is not what Bush wanted." Lloyd is tight-lipped about his work toward a solution, but says flatly, "I'm building a memex, the holy grail."

Alexa Internet, a San Francisco company, is similarly engaged. "What we're doing is right down Bush's line," says founder Brewster Kahle. The company's centerpiece is a navigation service that provides information about where a user is, and might go, on the Web, as well as prepackaged paths through selected subjects. Another Bushian feature to come: archived annotations or reaction comments made by earlier travelers on a particular trail. "Bush's great insight was realizing that there's more value in the connections between data than in the data itself," Kahle says.

Bush is a surprising forebear of the freewheeling netizen. Straitlaced and conservative, he oversaw the creation of highly centralized technologies that computer zealots later rebelled against. A descendant of hardy Cape Cod seafarers and whalers, he was an electrical engineer by profession, part of a distinctly American breed of can-do tinkerers, a line that began with Ben Franklin and connected Eli Whitney, Thomas Edison, the Wright brothers, Steve Wozniak, and even Bill Gates, in a grand tradition of hacker-inventors.

Politically, Bush was more influential than any of that illustrious group except Franklin. (We'll see about Gates.) During World War II, when England teetered on the edge of defeat and the Nazis seemed invincible, the popular magazine Collier's began a profile of him with the simple declaration, "Meet the man who may win or lose the war."

Bush's persona was designed to reassure the public. He was cast as a folksy American whose wit and charm prompted comparisons with the comic Will Rogers. Yet as I learned more about him - combing musty archives around the country, reading old news clips, and talking to people who knew him - Bush began to remind me more and more of the Smoking Man in The X-Files - the shadowy character sitting in a darkened room, surrounded by minions yet nearly invisible, smoking and pulling the strings that manipulate Mulder and Scully and everybody else.

Bush also labored in a haze of smoke, and not just the one emanating from the pipe that was his constant companion. Secrecy was his byword. Before he began organizing scientists and engineers on behalf of the military, most research in what would come to be called high technology was open and public. For reasons of national security, almost none of it was for decades thereafter. During the war, Bush seemed everywhere and nowhere. He was a phantom with a high IQ.

The only son of a Unitarian minister, Bush grew up in working-class Chelsea, just outside Boston. A math whiz in school, he went on to nearby Tufts University, where he put on one of the first radio broadcasts. He took his doctorate at MIT, then stayed on to build room-sized differential analyzers, electromechanical ancestors of today's computers that could painstakingly simulate the actual workings of electric power grids, calculate bomb trajectories, and analyze other such complex operations. On the eve of World War II, he designed code-breaking machines for the Navy's super-secret OP-20-G, the forerunner of today's dreaded National Security Agency.

Bush's leap into the public eye came during World War II, when President Roosevelt appointed him director of the Office of Scientific Research and Development (OSRD), a special agency reporting directly to the White House. As Roosevelt's chief adviser on military technology, Bush organized the Manhattan Project and hired 6,000 civilian researchers around the country to conduct weapons work under contract. He and the president together made the final decision to go ahead with an all-out drive to build an atomic bomb. He also oversaw the creation of scores of formidable, though lesser-known, military tools such as radar and the proximity fuze.

One of Bush's pet projects was an ultrapowerful longbow for use by the European resistance against the Nazis. A recreational archer, he took satisfaction in improving a centuries-old weapon that required the individual's skill and moxie - a reminder that even in an age of impersonal, instant death through massive air bombardment or atomic annihilation, it was still possible for an individual to make a difference. Bush was no stranger to espionage, either. He set up an ultrasecret research group within the OSRD to build special weapons for the Office of Strategic Services, the precursor of the CIA. One dubious line of work involved mind-altering drugs that could be slipped into enemy agents' drinks.

Once an Allied victory over Germany and Japan seemed inevitable, Bush was eager to start thinking about organizing science and engineering for peaceful purposes. In "Science, the Endless Frontier," a 1945 report to President Truman, he presented a blueprint for a permanent system of federal support for civilian science and engineering, which at its height pumped tens of billions of dollars annually into research and development. Bush's plans led directly to the two crown jewels of this federally funded innovation system: the National Science Foundation, which funds university professors, and the Advanced Research Projects Agency, the Pentagon's chief avenue for basic research. "Bush is responsible for the whole architecture of government support for science," says Paul Ceruzzi, a curator at the Smithsonian Institution. "Today, everyone thinks these terrific innovations came from the minds of bright kids, but they don't realize that these kids needed an environment to be in. It came from Bush. He said, 'Give these people money, let them play, and they'll come up with something.'"

But Bush also wanted to help the iconoclastic innovator, the driven thinker whose best work was done alone. Despite his long involvement with powerful institutions, Bush personally recoiled from bureaucracies and their stifling rules, preferring an early version of Silicon Valley's golden rule: Act first and request permission later. "My whole philosophy on this sort of thing is very simple," he once said. "If I have any doubt as to whether I am supposed to do a job or not, I do it, and if someone socks me, I lay off."

Indeed, even as Bush helped build the mammoth business and military institutions that dominated postwar America, he worked to reduce empire-building by government agencies. He began "liquidating" the OSRD even before the end of the war. And well into the 1950s, he complained about the proliferation of overlapping military research programs and the tendency for large corporations to stifle the innovator. He even singled out for special criticism IBM and General Motors, which exemplified one of Bush's favorite dicta about American industry: "In mass, we do not seem to make much sense."

That cult of bigness threatened Bush's cherished idea about the power of individuals and promised to put the organization man at center stage. "The individual to me is everything," he once wrote. "I would circumscribe him just as little as possible."

But how could the individual remain free to dissent and create outside the reigning orthodoxy imposed by organization men in gray flannel suits? It was to answer that question that Bush conceived what we would today recognize as the personal computer and the Web.

Bush's "As We May Think" essay, published just a few weeks before he attended the Trinity atom-bomb test in the New Mexico desert, promised that technology would "give man access to and command of the inherited knowledge of the ages." Bush imagined the memex machine sitting on a desk, with viewing screens, a keyboard, and sets of buttons and levers. Printed and written material, even personal notes, would be stored on microfilm, retrieved rapidly, and displayed on screen by a high-speed "selector."

Bush's description of using a memex eerily echoes today's Net:

"The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected on his screen. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. When it becomes evident that the elastic properties of available materials had a great deal to do with the bow, he branches off on a side trail which takes him through textbooks on elasticity and tables of physical constraints. He inserts a page of longhand analysis on his own. Thus he builds a trail of his interest through the maze of materials available to him."

The whole process of linking information across many data sites can be reproduced - and shared with others who can insert it into their own memexes. Bush even imagined products - for instance, sets of sophisticated trails running through databases - that could be purchased and dropped into a memex. He also foresaw the rise of new professionals who, not unlike today's Web designers or writers of data-mining software, "find delight in the task of establishing useful trails through the enormous mass of the common record."
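Rendered in modern terms, Bush's trail-building amounts to a simple linked data structure. The sketch below, in Python, is purely illustrative - the Item, Entry, and Trail classes and their methods are inventions for the example, not anything Bush specified:

```python
# A minimal, illustrative model of Bush's "associative trails": items of
# information tied together in the order a reader linked them, with personal
# comments and side trails, and whole trails that can be copied and handed to
# another memex owner. All names here are hypothetical, for the sketch only.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Item:
    """A single piece of the record: an article, a book page, a longhand note."""
    title: str
    content: str


@dataclass
class Entry:
    """One stop on a trail: an item, an optional comment, and any side trails."""
    item: Item
    comment: Optional[str] = None
    side_trails: List["Trail"] = field(default_factory=list)

    def branch(self, name: str) -> "Trail":
        """Start a side trail from this point, as Bush described."""
        side = Trail(name=name)
        self.side_trails.append(side)
        return side


@dataclass
class Trail:
    """An ordered sequence of entries laid down by one reader."""
    name: str
    entries: List[Entry] = field(default_factory=list)

    def add(self, item: Item, comment: Optional[str] = None) -> Entry:
        """Tie another item onto the main trail, optionally with a comment."""
        entry = Entry(item=item, comment=comment)
        self.entries.append(entry)
        return entry

    def share(self) -> "Trail":
        """Copy the trail so another memex owner can insert it into their own."""
        copy = Trail(name=self.name)
        copy.entries = [Entry(e.item, e.comment, list(e.side_trails)) for e in self.entries]
        return copy


# Bush's own example: a reader tracing the origin of the bow and arrow.
main = Trail("origin and properties of the bow and arrow")
first = main.add(Item("Encyclopedia article", "interesting but sketchy"))
main.add(Item("History text", "another pertinent item"))
side = first.branch("elastic properties of materials")
side.add(Item("Textbook on elasticity", "..."))
side.add(Item("Table of physical constants", "..."))
main.add(Item("Longhand analysis", "a page of the owner's own notes"))
```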

Bush never came close to building a memex. He had little appreciation for the power of software, and his cherished microfilm readers were unable to operate at the speeds necessary to create and retrieve associative trails. Wedded to the materials of his day, Bush never dreamed that it would be the microprocessor, not microfilm, that would make a PC possible. Still, he was hardly the first computer visionary to fail in practice - a tradition that stretches back to Charles Babbage in the 19th century. "The people who see the problem don't always have an answer to it," says author Howard Rheingold, whose book Tools for Thought explores the checkered early history of computing.

In identifying the central problems, however, Bush probably achieved something more important than actual nuts-and-bolts engineering. He educated ordinary people about the benefits of automating thought. He anticipated a mass market for mechanical memory aids at a time when designers could build only room-size computers and expect that a handful of them would satisfy a nation. Alone among the early computer pioneers, he realized that human-machine interaction, or interface, would be the most exciting area of computing.

Until his death, Bush kept pondering the possibilities of the memex. He considered his mythical machine significant for three big reasons.

First, the memex would greatly reduce information overload, which even in the postwar US was becoming a serious threat to creative thinking. In words that could have been borrowed from any frustrated Web surfer, on the tenth anniversary of "As We May Think," Bush wrote: "Our libraries are filled to overflowing, and their growth is exponential, yet in this vast and ever-increasing store of knowledge we still hunt for particular items with horse-and-buggy methods. As a result there is much duplication and repetition of research. We are being smothered in our own product. While we record with great care the work of thousands of able and devoted men, full of significance or timeliness to others, a large and increasing fraction of their work is, for all essential purposes, lost simply because we do not know how to find a pertinent item of information after it has become embedded in the mass."

Second, Bush's memex would record intimate thoughts, or "associative trails," as he called them. "The personal machine," Bush wrote in 1965, will deliver "a new form of inheritance, not merely of genes, but of intimate thought processes. The son will inherit from his father the trails his father followed as his thoughts matured, with his father's comments and criticisms along the way. The son will select those that are fruitful, exchange with his colleagues, and further refine for the next generation." The computer, then, promised a measure of immortality, and relief from the ravages of time. "No longer, when a person is old, will he forget."

Finally, the memex would engender a family of thinking aids that could someday make possible human-machine consciousness. In 1959, Bush described a mind amplifier that would not be controlled by a keyboard or even a human voice. The device, connected to the memex, would comprehend "the activity of the brain without interfering with its action."

Though Bush's predictions made occasional public waves, for decades they were dismissed by a computer priesthood bent on finding ever-faster ways for large, centralized machines to perform complex calculations. Well into the 1980s, big-iron designers hardly cared about helping individuals deal with information; their goal was to support large-scale, impersonal military and corporate systems. Whether computers tracked incoming missiles or business orders, people were expected to mold their behavior to the system's demands. "Do not fold, spindle, or mutilate" - the instructions on mainframe punch cards - was a perfect metaphor.

Despite its great power, that priesthood came under attack during the 1960s. A new generation of computer scientists, mirroring the countercultural insurgency in other aspects of American life, wanted to build computers that served people, not the other way around.

Looking for a figure who could validate their approach, they embraced Bush and picked up on his trail. Doug Engelbart, genuinely smitten by Bush's ability to imagine computers as individual tools for thought, began openly describing Bush as his patron saint. J. C. R. Licklider, another influential designer of alternative computers, proudly noted the connections between Bush's vision of machine-augmented intelligence and his own research work in computer graphics, at a time when conventional computers displayed and recognized only text. In the introduction to his 1965 book, Libraries of the Future, Licklider credited "As We May Think" as the "main external influence" on his ideas.

Bush also inspired Ted Nelson, who transformed Bush's notion of associative trails into hypertext. Nelson was among the first to blend the ideal of a PC with the counterculture's appetite for liberation.

In Nelson's view, linear thinking was the Establishment's principal error, and hypertext was the antidote. Nelson's embrace elevated Bush to cult status among the computer cognoscenti. "Much of what Bush predicted is possible now," Nelson declared at a landmark conference on interactive computing in 1972. "The memex is here. The trails he spoke of - suitably generalized, and now called hypertexts - may, and should, become the principal publishing form of the future."

Nelson's celebration of Bush was premature, and perhaps unwarranted. While certainly an inspiration behind the PC and the Web, Bush - like Nelson - had his blind spots. "He thought it would be wonderful for people to follow all the threads and interconnections," says Rheingold. "But we know from living in the Web that that in itself is a problem."

More significantly, Bush failed to appreciate the enormous potential of digital computers. His own analog devices were powerful enough to calculate bomb trajectories and simulate electric power grids. But during World War II, scientists and engineers were already conceiving vastly more powerful digital machines.

Digital enthusiasts irked Bush, who fretted that they would impede the war effort by dividing resources and attention. In September 1940, MIT mathematician Norbert Wiener proposed building a digital computer and appealed to Bush, then chair of the government's National Defense Research Committee, for funding. After studying Wiener's memo for two months, Bush turned him down. "While your device would also be of aid on defense matters," he wrote, "it is undoubtedly of the long-range type and it appears essential that at the present time those individuals who are particularly qualified along these general lines be employed as far as possible on matters of more immediate promise."

Bush's refusal might have said more about Wiener's reputation as a great catalyst but weak finisher than about digital computing itself. But Bush also declined to support the most ambitious of the wartime digital projects, the so-called ENIAC, which promised speeds 1,000 times those of existing mechanical devices; after he turned the proposal down, the US Army Ordnance Department eventually picked up the $500,000 cost. Bush's opposition to the ENIAC exposed him to ridicule later on. But he took pride in having accurately predicted that the digital machine would be of no help during World War II: it tackled its first problem in December 1945, four months after Japan's surrender.

Bush's fans are less concerned with his failures than they are impressed by his attempt to humanize the brute force of computing - to place these machines in the service of personal and social goals.

Consider Ian Adelman, design director for Microsoft's Slate magazine and a 25-year-old graduate of the Rhode Island School of Design. A graphical illustration of Bush's mythical memex hangs above the PC in Adelman's office, serving as a constant reminder of computing's past and perhaps its future. Of all the computer pioneers, he says, "Bush is for me the most inspiring," because the central intellectual problem of Bush's life - how to automate the organization of information - "is what everyone's trying to deal with on the Web."

Bush does not necessarily have all the answers, Adelman says, "but he's asking the right questions."