The arc of innovation has reached an inflection point: technological change now threatens to overwhelm us. Discovery is unstoppable, but it must be shaped for good. We ourselves—not just market forces—must manage it.

Ash Carter, former US secretary of defense, is the director of Harvard Kennedy School’s Belfer Center for Science and International Affairs and its project on Technology and Public Purpose. He is also an Innovation Fellow at MIT. This op-ed is adapted from his Ernest May Lecture, given at this summer’s Aspen Strategy Group.

My mentors in subatomic physics hailed from the Manhattan Project. They were proud to have created nuclear weapons that helped end World War II. But they stressed that with the ability to make great change came great responsibility.

Today, we face similarly disruptive advances in three big categories: digital, biotech, and jobs and training. But it’s not clear that tech leaders have the same fierce commitment to align technology with public purpose. Many are inherently distrustful of government and believe that public good will emerge through a popular and supposedly freer mechanism. They assume that past technological disruptions were weathered without major government intervention. But that’s not the case.

Take the farm-to-factory migration. Though it ultimately improved living standards for millions, it took decades to sort out and created ugly side effects, including communism and urban ghettos. Only through Progressive reforms, including child labor laws, compulsory public education, boards of public health, the Sherman Antitrust Act, muckraking journalism, and labor unions, were we able to sand down this period’s roughest edges. Our charge today is to revive that kind of effort, leavening today’s disruptive change so we get the good with less of the bad.

Collaboration between technologists and policymakers is essential. That's why, as secretary of defense, I founded the Defense Digital Service, the Defense Innovation Unit Experimental (DIUx) in Silicon Valley, and the Defense Innovation Board, which included Eric Schmidt, Jeff Bezos, Reid Hoffman, and Jen Pahlka. I found a hunger among most technologists to be part of something bigger than themselves and their firms.

There are few challenges bigger today than setting ethical norms for artificial intelligence. As a senior Pentagon official, I issued a directive that stated that for every system capable of carrying out or assisting the use of lethal force, a human must be involved in the decision. In the Pentagon, we cannot avoid responsibility by blaming a machine. The same goes for designers of a driverless vehicle that kills someone. Such systems must enable the tracing of decision methods.

Some Google employees have raised concerns about Project Maven, a DOD effort to use AI to analyze drone footage. Those concerns are misplaced. First, Maven is required to abide by the directive I issued; our nation takes its values to the battlefield. And who better than tech-savvy Googlers to steer the Pentagon rightly?

Social media is another arena where we need to better align technology and public purpose. Today’s platforms are wonderful enablers of commerce and community, but also of darkness, hatred, lies, and isolation; invasion of privacy; even attack. According to Pew, 91% of Americans feel they’ve lost control of how their personal data is collected and used.

The hearings with Facebook CEO Mark Zuckerberg earlier this year were a chance to pave the way toward solutions. Instead, both Facebook executives and lawmakers missed a historic opportunity to devise what everyone agreed is needed: a mix of self-regulation by tech companies and informed regulation by government.

The US has a long history of regulating communications and information systems. Some economists argue that since Facebook and Google are free, consumers suffer no economic harm and thus the government has no antitrust authority. That view would be alien to both Senator Sherman, of the Sherman Antitrust Act, and Justices Brandeis and Douglas, who wrote the early antitrust opinions.

How might we construct different algorithmic approaches to social media curation and delivery? One would organize content by maximizing advertising and platform revenue, essentially the prevailing model. A second would reflect individual choice, based on past patterns. A third would emphasize what’s “trending.” A fourth might be profit-based but would share the profit with the owners of the data, another form of subscription-free service. A fifth would have content curated by journalists. Ideally, consumers could freely switch channels and shop, compare, and pay accordingly.
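The five models above can be thought of as interchangeable ranking strategies a consumer could switch between. A minimal sketch in Python, with all names and scoring fields hypothetical, not drawn from any real platform:

```python
# Hypothetical sketch: each curation "channel" is just a different key
# function for ranking the same pool of posts. Higher scores surface first.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Post:
    text: str
    ad_revenue: float = 0.0    # expected platform revenue if shown
    affinity: float = 0.0      # match to this user's past behavior
    trend_score: float = 0.0   # platform-wide engagement velocity
    editor_rank: float = 0.0   # score assigned by human journalists

STRATEGIES: Dict[str, Callable[[Post], float]] = {
    "revenue":      lambda p: p.ad_revenue,   # the prevailing model
    "personal":     lambda p: p.affinity,     # individual choice
    "trending":     lambda p: p.trend_score,  # what's "trending"
    "profit_share": lambda p: p.ad_revenue,   # same ranking, but revenue
                                              # would be split with the user
    "curated":      lambda p: p.editor_rank,  # journalist-curated
}

def feed(posts: List[Post], channel: str) -> List[Post]:
    """Return the feed ordered by the chosen channel's strategy."""
    return sorted(posts, key=STRATEGIES[channel], reverse=True)
```

The point of the sketch is the switch itself: the same content yields a different feed depending on which channel the consumer selects, which is what would let people shop and compare among curation models.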

As transformative as digital disruption has been, the looming biosciences revolution—driven by breakthroughs like CRISPR—will be at least as consequential in coming decades. Until recently, these innovations sprang from laboratory techniques requiring PhD-level talent and institutional scale. Today, however, they are becoming platforms on top of which amateurs (who may know nothing about the underlying science) can innovate.