You may have had the bad luck of being stuck on a runway when a router failure in Utah grounded commercial flights around the country for several hours. Or maybe you were frustrated by not being able to access government websites the day the .gov domain administration had a glitch in its system. These minor mishaps over the past decade are early rumblings of an uncomfortable truth: The Internet is more fragile than it appears.

The problems with the .gov websites and the FAA were caused by accidents, but such accidents can have widespread effects. In 2008, censorship efforts by the government of Pakistan unintentionally caused YouTube to become inaccessible throughout the world. In another incident in 2010, much of the Internet was rerouted through China for a few hours, including traffic between US military sites. China Telecom plausibly claimed this was also an accident, but scenarios like this could be easily arranged.

The vulnerability of the Internet would be just an annoyance if the network carried only YouTube and email, but it is becoming increasingly fundamental to our financial system, our national security, and other vital services. Already, many food and fuel supplies are dispatched through the Internet. Even the telephone and electrical systems are becoming Internet-dependent, and thus more vulnerable to accidental breakdowns and deliberate attacks.

The Internet was designed decades ago to connect a set of friendly collaborators, most of them in academic research. At that time, no one envisioned that it would become a critical part of our infrastructure. Decisions were made then that make it impossible to retrofit the Internet with the kind of safety and dependability that we require today. Even if it were technically possible, it might not be politically acceptable, since the Internet is now controlled by an international consortium, many of whose members like it the way it is.

That's why it is time to build a second Internet that is safe and reliable. Our current system is well optimized for speed, openness, and rapid innovation. The second Internet needs to be optimized for security and dependability. It would not replace the first but, rather, would operate alongside it. Nor would it have to be a multibillion-dollar government project. It might use the same fibers as the current Internet, but its protocols would be different, designed from the beginning to be less vulnerable to accidental failure or attack.

At first, this second network would carry special traffic that needs to be dependable, like the communications required for coordination of the electric grid. Its initial users would be utilities, hospitals, air traffic control, emergency services, and the banking system. They would choose it for its dependability, but eventually the rest of us could turn to it for other reasons too.

Since the second Internet would complement the first rather than replace it, its adoption could be gradual. All that's required to start is for an Internet service provider to offer an additional type of connection using a different set of protocols—protocols that reflect different priorities. Unlike traditional Internet service, this second service would support, among other things, a guaranteed minimum bandwidth, strong authentication of the origin of packets, and more controlled administration of the network. These features would probably come at the cost of reduced bandwidth efficiency, slower innovation, and less anonymity, but for some kinds of data, security is much more important than openness. Services that our lives depend on need a higher degree of certainty, but for social interaction, entertainment, and innovation, the free-for-all that is the current Internet would continue to thrive.

Danny Hillis is cofounder of Applied Minds.

This article is part of our “Save the Net” series, featuring bold solutions to the biggest problems facing the Internet today.