The White House recently hosted a technology summit to discuss the potential challenges and opportunities posed by artificial intelligence. While artificial intelligence has delivered benefits to people in health care, food delivery, energy and transportation, there is widespread concern that it will make several types of jobs obsolete, including in sectors like finance.

As such, it is more important than ever to teach American workers to take advantage of artificial intelligence through new skills and learning programs. But what should the actual programs look like? The answer is not obvious, given that the relative earnings power of a college degree has been flattening in recent years. If college is not enough, then what is?

A new study offers some hopeful answers. One of us, with a coauthor, has developed an index, based on Labor Department survey data, that allows jobs across the country to be classified as information technology intensive or not. We apply the classification to monthly data on hourly wages among individuals from the Current Population Survey between 2000 and early 2018.

This yields some interesting findings. The first is that while much has been made of the decline in manufacturing jobs over the last several decades due to automation and offshoring, we now can analyze what replaced those jobs. The data show that between 1980 and 2015, automation may have led to job destruction, but it also increased job creation elsewhere.

While previously employed workers in manufacturing struggled due to the loss of traditional jobs, information technology intensive jobs fared quite well over these years. Even within manufacturing, nearly all of the lost jobs were in low information technology intensity areas. Manufacturing itself became more high technology, leading to a much greater demand for workers with computing and technical skills.

A second, related point concerns earnings. College educated workers earn more per hour than their non-college educated counterparts, but even among the college educated, those in information technology intensive jobs earn 3 percent more than those in non-information technology jobs. Moreover, the gap has widened significantly since the financial crisis. These differences in compensation and labor utilization reflect an increasing demand for workers in information technology intensive jobs.

If the United States wants to capitalize on the opportunities created by artificial intelligence, companies and policymakers need to think critically about how to encourage workers to invest in new skills and become lifelong learners. General familiarity with computers and information technology is becoming a requirement for workers. The traditional model of a four-year college degree as the pathway to a middle-class lifestyle still works, but only up to a point.

Continued investment in paid apprenticeship programs and in retraining programs within firms, along with a focus on science, technology, engineering and mathematics, matters more than ever. Educational institutions will increasingly need to train students to become adaptive and flexible learners throughout their careers to remain competitive. Technological change may be inevitable, but our response to it will determine whether it is a boon or a curse for the vast majority of American workers.

Christos Makridis (@camakridis) is a fellow at the MIT Sloan Initiative on the Digital Economy and a fellow with the Harvard Kennedy School Cyber Security Initiative. Aparna Mathur (@aparnamath) is a resident scholar in economic policy studies at the American Enterprise Institute.