Steve Jobs is, in many ways, the prototype for how many innovators see themselves. Brash and headstrong, he had an unfailing commitment to his vision and steamrolled anyone who dared to stand in his way. While he had failures as well as successes, no one can deny that he made a profound impact on the world.

So while researching my upcoming book, Mapping Innovation, I was surprised to find that the vast majority of great innovators I talked to were nothing like Steve Jobs. In fact, rather than ego-driven megalomaniacs, I found them to be some of the most helpful and humble people you can possibly imagine.

The notion of a lone genius has always been a myth. As W. Brian Arthur observes in The Nature of Technology, innovations are combinations, so it is unlikely that anyone ever has all the pieces to the puzzle. Even Steve Jobs depended on a small circle of loyalists. Now, because of digital technology, the ability to collaborate is becoming a key competitive advantage.

The Power of Platforms

In the 20th century, the key to business strategy was the linear value chain. The goal was to maximize bargaining power with buyers and suppliers, while at the same time minimizing threats from new market entrants and substitute goods. So strategy was like a game of chess and you had to maneuver all the right pieces in all the right places.

Yet today, that linear world has broken down and we live and operate in a semantic economy where everything connects and open beats closed. It no longer matters what resources you control, but what you can access, and many of the best assets lie outside your organization. When everything is connected, closing yourself off means you're more likely to lose access to valuable resources than to protect anything proprietary that can't be duplicated elsewhere.

That's why we now need to use platforms to access ecosystems of talent, technology and information. Nobody, not even powerful organizations and governments, can go it alone anymore. Strategy in a networked world can no longer focus solely on efficiency, which is increasingly handled by automation, but must also widen and deepen connections.

Make no mistake. Today, every business must become a platform. Try to run your organization in the old linear way and you simply won't have access to the resources you need to compete.

Open Source As A Strategic Imperative

On September 17th, 1991, Linus Torvalds released the first version of the Linux operating system. Unlike commercial versions developed by companies like Microsoft, Linux was free for anyone to use and alter for their own purposes. In fact, users were actually encouraged to contribute code and make enhancements.

The corporate world was not amused. Microsoft CEO Steve Ballmer called Linux "a cancer" and argued, essentially, that anybody who used open source software was putting their business at risk. He also urged the government not to support open source projects. For big companies like Microsoft, the arrival of Linux signaled a mortal threat.

Yet times have changed and industry has embraced open source. IBM was one of the first. It began shipping its hardware with Linux installed in the late 1990s and has regularly contributed patents to protect open source communities. More recently, Tesla open sourced its patents. Today, even Microsoft says it loves Linux.

To understand why open source communities have become so important, look at why Google open sourced TensorFlow, its library of machine learning tools. Although Google has no lack of capability or expertise, open sourcing allowed it to access the talents of tens of thousands of engineers across the world. "Since we made the decision to open-source, the code runs faster, it can do more things and it's more flexible and convenient," one of its executives told me.

So if Google, one of the largest and most sophisticated enterprises in the world, can't go it alone, who can?

Forming Consortia To Tackle Big Problems

In the mid-1980s, the American semiconductor industry seemed doomed. Although US firms had pioneered and dominated the technology for decades, they were now getting pummeled by cheaper Japanese imports. Much like cars and electronics, microchips seemed destined to become another symbol of American decline.

The dire outlook had serious ramifications for both US competitiveness and national security. So in 1987, the American government helped create SEMATECH, a consortium of government agencies, research institutions and private industry. By the mid-1990s, US companies had reclaimed leadership in the industry, and they continue to dominate it today.

In recent years, the SEMATECH model has been expanded to programs that are creating a new breed of innovators, such as the JCESR program at Argonne National Laboratory that is building next generation batteries and the National Network for Manufacturing Innovation that is setting up advanced manufacturing hubs around the country.

Even without government involvement, private companies are setting up consortia to tackle big problems. For example, Google, IBM, Microsoft, Amazon and Facebook have recently set up a partnership to advance understanding and promote best practices in artificial intelligence. Others have set up a working group to address the security issues that quantum computing poses for encryption.

As the challenges we face increase in size and complexity, we can expect consortiums that integrate the capabilities of government, industry and academia to take on a larger role.

A New Era Of Innovation

The decades immediately following World War II were a time of great technological transformation. We harnessed the power of the atom, built jet engines and broke the sound barrier, unlocked the principles of genetics, and created transistors and microchips. These were completely new paradigms that created unprecedented prosperity.

Since the 1970s, though, we've mostly been expanding on that early work. Air travel has become cheaper and more efficient. Computers have gotten smaller, faster and more pervasively integrated into our economy. But essentially, these are the same technologies we used in the late 20th century, and they're beginning to reach their limits.

In the next decade, Moore's Law will end, and advancement in lithium-ion batteries, which we depend on to power our devices and electric cars, will slow considerably before grinding to a halt. The risks of climate change will grow increasingly untenable, and chronic diseases like cancer, diabetes and Parkinson's threaten to bankrupt our economy.

Still, while the demands of the future are formidable, the opportunities may be even greater. New architectures like quantum computing and neuromorphic chips can create machines far more powerful than anything we imagined before. These, in turn, will power entirely new technological paradigms, such as genomics, nanotechnology and robotics.

As Dr. Angel Diaz, IBM's VP of Cloud Technology & Architecture, put it to me, "To truly change the world today we need more than just clever code. We need computer scientists working with cancer scientists, with climate scientists and with experts in many other fields to tackle grand challenges and make large impacts on the world."