Something a bit strange is happening in the American economy. After years of depressed demand for workers, employers are suddenly reporting that they have lots of job openings. But hiring, while it's picking up, remains relatively sluggish. New data confirms that, indeed, companies are getting slower to fill vacancies. Right now, open positions are staying open for longer than ever recorded in the 15-year history of the data:

How many open jobs are there?

A ton. The data on job openings doesn't go very far back, but it suggests that employers have about as many open positions to fill as they did before the dot-com boom unraveled.

But the data on actual hiring looks very different:

Consequently, the ratio of hires to openings is all out of whack and continuing to get weirder:

Is the problem that workers lack skills?

The most popular explanation for this trend among business executives is that the American economy is suffering from a lack of skills on the part of workers. Academic economists often refer to this as "skills mismatch," suggesting that many Americans may have skills for jobs — travel agent, roofer, bank teller, anthropology professor, daily newspaper beat reporter — that there isn't much demand for these days, even as employers are looking to hire yoga instructors, chemical manufacturing workers, app developers, math tutors, and viral content creators.

Proponents of this theory, however, suffer a bit from "boy who cried wolf" syndrome.

People were insisting (with not much evidence) three or four years ago that skills mismatch was holding back the economy, yet job creation has strengthened over time without any skills revolution. Part of the issue is that some mismatch always exists between the skills people currently have and the skills employers would like them to have, so the mere fact that mismatch exists doesn't really explain much.

This time it's different

That said, at the end of the story, a wolf really does show up. And survey data of small-business owners conducted by the National Federation of Independent Business really does show a recent rise in the number of small-business owners who cite "labor quality" as a big problem for their business:

In other words, just because the skills mismatch hypothesis was false in 2011 doesn't mean we should dismiss it in 2015. Economic conditions have changed.

Where are the wage gains?

The traditional counter to the skills mismatch story has been to say that if the economy is really suffering from widespread shortages of qualified workers, then that ought to manifest itself in the form of higher pay for those who are qualified. A world of skills mismatch is a world in which businesses should be aggressively poaching workers and even preemptively raising salaries to ensure loyalty. But wage gains have been distinctly muted for years.

Catherine Rampell quotes Steven Davis of the University of Chicago offering a different theory. Perhaps the shortage of appropriately skilled workers leads companies to settle "for workers who have less of [or] lower-quality versions of the desired skills," which could actually hold wages down.

Tyler Cowen endorses this analysis, but it seems to bring the problem full circle. If employers are hiring underskilled workers at a discount, then why are jobs staying open for so long? A shortfall of skills could explain stagnant wages (workers don't have what it takes to earn more) or it could explain sluggish job filling (nobody can do the job), but it's hard to see why both would happen simultaneously. Markets are supposed to adjust to imbalances between supply and demand through prices. The question is why they've gotten less efficient at doing so.

Too many applicants

The biggest problem with the skills mismatch theory may be that the same small-business data that says the mismatch is real also says that it's not, by historical standards, especially large:

This leads me to one theory that I haven't seen discussed among economists, but that pops out of my practical experience participating in a few recent hiring processes: it's too easy to apply for a job these days.

What's changed over the past 15 years is that the internet has dramatically decreased the cost of identifying an open job listing and sending in an application. But digital technology has done essentially nothing to make it easier to evaluate candidates. The stack of résumés you need to read through in order to start scheduling interviews has just gotten longer. Rationally, of course, one could respond to this by randomly tossing out 50 percent of the résumés sight unseen. But in the real world, nobody does this. The easier you make it to apply for jobs, the slower the hiring process becomes.
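As a back-of-the-envelope illustration of that mechanism (every number here is a made-up assumption, not data), if reading a résumé takes a fixed amount of time, then a tenfold jump in applications means a tenfold jump in screening time before interviews can even be scheduled:

```python
MINUTES_PER_RESUME = 10  # assumed fixed cost to read one résumé


def screening_hours(n_applications, minutes_each=MINUTES_PER_RESUME):
    """Total hours spent just reading résumés before interviews start."""
    return n_applications * minutes_each / 60


before = screening_hours(20)    # a hypothetical pre-internet applicant pool
after = screening_hours(200)    # the same opening once applying is nearly free

print(f"20 applications:  {before:.1f} hours of screening")
print(f"200 applications: {after:.1f} hours of screening")
```

The evaluation cost scales linearly with the pile of applications, while the internet has pushed the cost of joining that pile toward zero, which is exactly the asymmetry the paragraph above describes.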

Tragically, I haven't found much research that directly addresses this hypothesis.

But Peter Kuhn of UC Santa Barbara has found that internet adoption has lowered the average quality of job applications that employers receive, while Constantin Mang of the Ifo Institute in Munich finds that internet adoption improved the ultimate quality of the job matches that applicants and employers make.

A story in which the average quality of applications is going down but the average quality of matches is going up is consistent with a big increase in the volume of applications, followed by employers carefully going through them all to find the best candidate. (Falling average application quality could also explain why employers perceive a skills shortage even as the data says the workforce has never been better educated.)
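A toy simulation makes the arithmetic of that story concrete. The numbers and distributions below are my own assumptions, not anything from the Kuhn or Mang studies: "committed" applicants are drawn from a higher-quality distribution, while "casual" applicants only show up because applying is nearly free.

```python
import random

random.seed(0)  # deterministic toy run


def applicant_pool(n_committed, n_casual):
    # Hypothetical quality scores: committed applicants average 60,
    # casual applicants average 40 (both with the same spread).
    committed = [random.gauss(60, 10) for _ in range(n_committed)]
    casual = [random.gauss(40, 10) for _ in range(n_casual)]
    return committed + casual


small = applicant_pool(20, 0)     # costly applications: committed only
large = applicant_pool(20, 180)   # cheap applications: a flood of casual ones

for label, pool in [("small pool", small), ("large pool", large)]:
    mean_q = sum(pool) / len(pool)
    print(f"{label}: {len(pool)} applications, "
          f"mean quality {mean_q:.1f}, best candidate {max(pool):.1f}")
```

In this sketch the average quality of the big pool drops sharply, yet strong candidates are still in the pile; the employer's problem is simply that finding them now requires sifting through ten times as many applications.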

The only problem is that doing the work takes time.