Inequality and computers

Tim Noah's excellent series on inequality continues today with a piece exploring one of my chief interests: the awkwardly named "skills-biased technological change."

"Skills-biased technological change," which I will not mention again in this post, essentially suggests that technological changes have transformed the economy in a way that created lots of opportunities for high-skills workers and took opportunities away from workers with fewer skills. If that change happened faster than the labor market could adapt, it's a plausible culprit for rising inequality.

As you might expect, the obvious agent here would be the computer: There are lots of jobs for people who can use computers and lots of jobs that were replaced by computers (think bank tellers), and so maybe inequality is just another way of saying that not enough people are on the right side of the computer revolution.

But as Noah says, that story is more intuitive than it is true. For one thing, the rise in inequality begins in the 1970s. The computer revolution does not: Personal computers don't really transform American industry until years later. And then, in the 1990s, the Internet revolution takes off, and inequality actually goes down a bit.

Moreover, we're not the only ones who had a computer revolution. Europe had one, too. But it didn't have anything like our rise in inequality. So there's something going on here that isn't simply explained by computers.

Finally, Noah doesn't make this argument, but the graph from his piece that I've copied atop this post does: A lot of the rise in inequality has been concentrated among a small sliver of the population. The top 1 percent, for instance, has gone from capturing about 8 percent of the national income to 18 percent. But there's no obvious skills differential between workers in the top 1 percent and the workers directly beneath them. It's not like hedge fund managers are the only guys able to use Excel.