On December 9, 1968, Douglas Engelbart sat in front of a crowd of 1,000 in San Francisco, ready to introduce networked computing to the world. Engelbart was no Steve Jobs. He was a shy engineer with no marketing background. His goal was to speak directly to other engineers, showing them that they could use computers in new ways to solve complex human problems.

That message was radical enough in 1968. Most programmers of the day used punch cards to carry out quantitative tasks like tabulating census data, writing banking code or calculating a missile’s trajectory. Even in the futuristic 2001: A Space Odyssey, which came out in April 1968, the HAL 9000 was an enhanced version of the same thing. It could play chess and make small talk with crew members (and ultimately sabotage the whole mission), but its job was still to compute numbers and run systems. HAL didn’t give its users a way to write, design or collaborate on documents.

Engelbart didn’t just come up with the notion of using computers to solve the urgent and multifaceted problems facing humanity. He also gave the first-ever live demonstration of networked personal computing. Today, it’s known as “the mother of all demos,” a precursor to every technology presentation that’s happened since—and arguably more ambitious than any of them.

When Engelbart walked onstage, he was wearing a headset with a microphone so he could talk to other members of his team at the Stanford Research Institute in Menlo Park. To make the link, Engelbart’s team had run 30 miles of cables along the highways from Menlo Park to San Francisco. To project the demo onto a 22-foot by 18-foot screen, they’d borrowed a projector from NASA.

Engelbart started with a provocative question: “If in your office, you, as an intellectual worker, were supplied with a computer display backed up by a computer that was alive for you all day, and was instantly responsive to every action you have—how much value could you derive from that?”

Then he began to type, using a keyboard with numbers and letters instead of inputting information with a punch card. Text appeared on the screen: Word word word word. “If I make some mistakes, I can back up a little bit,” he noted, proudly showing off his new delete function. He announced that he was going to save the document. “Oh, I need a name,” he explained, and titled it “Sample File.” He showed that he could copy the text—and paste it again and again.

Next, Engelbart pulled up a shopping list on the screen: apples, bananas, soup, beans. He moved the items up and down the list with simple clicks, organizing produce with produce, canned goods with canned goods, dairy with dairy.

“But there’s another thing I can do,” he declared. He pulled up a map of his route home, with stops along the way. “Library. What am I supposed to do there?” he asked. A click on the word Library pulled up another list. “Oh, I see. Overdue books.” He went back to the map and clicked on the word Drugstore. Another list popped up, showing items like aspirin and Chapstick.

It wasn’t just the software that was revolutionary. Engelbart had also invented a new tracking device with the help of Bill English, an engineer on his team. As the small device rolled, a dot on the screen rolled along with it. “I don’t know why we call it a mouse,” Engelbart remarked. “Sometimes I apologize. It started that way and we never did change it.”

Engelbart called his program the oN-Line System, or NLS. His larger goal, beyond any of the specific functions he’d introduced, was for people to collaborate. Toward the end of his presentation, he alluded to an “experimental network” that would allow different users to collaborate from as far away as Harvard and Stanford. He was describing the ARPANET, a program that was just starting to burgeon at the Advanced Research Projects Agency (ARPA) under the U.S. Department of Defense.

Engelbart expected his presentation to attract hundreds of engineers eager to join him in this new wave of computing. He had, after all, introduced word processing, document sharing, version control and hyperlinks, and he’d integrated text, graphics and video conferencing. He’d even foreshadowed the internet. He thought the audience members would line up afterwards to ask how they could join his network and help develop his ideas.

Instead, they gave him a standing ovation and then filed out of the auditorium.

**********

I found out about Engelbart almost by accident, in 1986, when I was working on a TV show about Silicon Valley for the PBS station in San Jose. I was looking for B-roll footage in the Stanford library when Henry Lowood, a librarian, mentioned a film reel he had from a computer demonstration in 1968. I was riveted.

After our program aired, Engelbart asked us to produce a video about his ideas. We never did make the video, but as I sat down to talk to him, I realized that what he was describing could actually change the world. It certainly changed me. I went to graduate school at Harvard and studied educational technology, and we worked closely together until his death in 2013.

Engelbart’s entire career was based on an epiphany he had in the spring of 1951. He had just gotten engaged and was working at NACA, the precursor to NASA, in Mountain View, California. He’d come a long way from his Depression-era childhood in rural Oregon, where he’d spend his days roaming the woods and tinkering in the barn. He realized he had achieved both of his major life goals: a good job and a good wife. He pondered what he should aim for next.

Then it hit him. “It just went ‘click,’” he told me later. “If in some way, you could contribute significantly to the way humans could handle complexity and urgency, that would be universally helpful.” He had a vision of people sitting in front of computer monitors, using words and symbols to develop their ideas, and then collaborate. “If a computer could punch cards or print on paper,” he said, “I just knew it could draw or write on a screen, so we could be interacting with the computer and actually do interactive work.”

At that time, there were relatively few computers in the world. The University of California at Berkeley was building one, so he went there for his PhD. He earned several patents, and in 1962, while working at the Stanford Research Institute, he published a paper titled “Augmenting the Human Intellect: A Conceptual Framework.” At its core was the idea that computers could augment human intelligence. He outlined innovative ways of manipulating and viewing information, and then sharing it over a network so people could work together.

When he demonstrated this revolutionary idea in 1968, why didn’t he get the response he’d been hoping for? I got some insight into this when I interviewed some of the engineers who’d attended his demo. They told me they’d been awestruck, but that nothing he’d described had any relation to their jobs. He was asking them to take too big a leap, from doing calculations on punch cards to creating a new information superhighway.

In the mid-1970s, Engelbart’s lab, which he called the Augmentation Research Center, used government funding to support the quickly growing ARPANET. In a highly unorthodox move, he hired young women who’d graduated from Stanford with degrees in fields like anthropology and sociology. Engelbart, who had three daughters himself, believed that women were ideally suited to building new cultures. He sent his new hires out to other institutions to build “networked improvement communities.”

This got him in a lot of trouble. The ARPANET’s funders couldn’t see why real people needed to support users. They saw his hires as a sign of failure—his systems weren’t easy enough to use on their own. What Engelbart failed to communicate was that these women weren’t just teaching people which keys to press. He wanted them to bring together thinkers who could, collectively, change the way the networks collected and analyzed information. Before long, the government reduced his funding, foreshadowing the end of his Augmentation Research Center.

Later in the 1970s, Engelbart lost his key engineers to the Xerox PARC lab, a lavish and well-funded research center a few miles away. Among its leaders was Alan Kay, 15 years Engelbart’s junior—an upbeat, brilliant guy who knew how to inspire people. The laboratory chief was Engelbart’s former funder from ARPA, Robert Taylor. For Engelbart, networks had always been an inextricable part of his vision. But under Kay’s direction, the engineers created a personal computer, geared toward individual productivity rather than collaboration. Their software included more user-friendly versions of a few of Engelbart’s original ideas, including multiple windows, text with integrated graphics, and the mouse. A cruel joke of the time was that Engelbart’s Augmentation Research Center had been a training program for PARC.

In 1979, Xerox allowed Steve Jobs and other Apple executives to tour its labs twice, in exchange for the right to buy 100,000 shares of Apple stock. Once Jobs began working on these ideas, they became even more streamlined. Engelbart’s mouse had three buttons, which he used in different combinations to perform a range of tasks. After licensing this invention from the Stanford Research Institute, Apple decided it would be simpler to give it just one button. Engelbart lamented that the mouse’s capability had been dumbed down to make it “easy to use.”

Ironically, the mouse was the one invention that earned Engelbart widespread recognition, though it never earned him more than an initial lump sum of $10,000 from the Stanford Research Institute. He was bewildered that the simplest artifact from his grand vision had been the most widely embraced. After all, he’d foreshadowed just about everything Apple and Microsoft went on to create—at a time when Jobs and Bill Gates were just 13 years old. Alan Kay himself once remarked, “I don’t know what Silicon Valley will do when it runs out of Doug’s ideas.”

Engelbart’s refusal to compromise was one of the main reasons he had a hard time gathering momentum. He often ended discussions by declaring, “You just don’t get it.” That catchphrase cost Engelbart dearly. His detractors snidely remarked that the great proponent of collaboration was, ironically, unable to collaborate.

I myself was on the receiving end of Engelbart’s insults on several occasions. But no matter how irritably he behaved as a colleague, I knew he had great love for me as a person. And I understood why he so often felt frustrated. As I saw it, his ideas were so ahead of their time that there was often no language to describe them. When I asked him in 2006 how much of his vision had been achieved, Engelbart answered, “About 2.8 percent.”

Because his system was designed to present the same information from different angles, it was more than a rudimentary version of the software we use today. I believe it was better equipped than Apple’s or Microsoft’s programs to tackle challenges like peace, income inequality, sustainable development and climate change. He designed it for sophisticated knowledge workers—writers, designers, data analysts, economists. Even Google’s collaborative apps are less well suited to serious work that integrates libraries of data, documents, graphics, text and information maps. Engelbart’s system came with a learning curve, but he believed the result was worth it. When people praised other software for being more intuitive, he asked them whether they’d rather ride a tricycle or a bicycle.

Although he earned over 40 awards—including the National Medal of Technology & Innovation, the $500,000 Lemelson-MIT Prize and several honorary doctorates—Engelbart often felt demoralized. He died in 2013, after suffering from kidney failure. But many of us are still inspired by his dream. As a professor, I’ve brought his ideas to the classroom and seen them change the way my students think. As one of them wrote in a letter to our university president, “Team members are thinking together and tapping into the collective IQ to augment individual performance, and the whole of our group is much greater than the sum of its parts. It is an exhilarating and rewarding experience.” Even in this interconnected age, the world could use more of that.