One convenient way to summarize the information contained in a large number of indicators is through the use of so-called factor models. Following this methodology, Federal Reserve Board staff developed a labor market conditions index from 19 labor market indicators…
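To make the idea concrete, here is a minimal sketch of how a single "conditions index" can be pulled out of many noisy indicators. It runs on simulated data and uses plain principal components as a stand-in for the dynamic factor model the Fed staff actually estimate; every series and number below is made up for illustration.

```python
# A minimal sketch of the factor-model idea on simulated data, using plain
# principal components as a stand-in for the Fed staff's dynamic factor
# model. All series and numbers here are made up.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_months, n_indicators = 120, 19

# Simulate 19 indicators that share one underlying "labor market" factor
# plus idiosyncratic noise -- the structure a factor model assumes.
true_factor = rng.standard_normal(n_months).cumsum()
loadings = rng.uniform(0.5, 1.5, n_indicators)
indicators = np.outer(true_factor, loadings) + rng.standard_normal((n_months, n_indicators))

# Standardize each series, then take the first principal component as the
# estimated common factor: a one-number-per-month "conditions index."
z = StandardScaler().fit_transform(indicators)
index = PCA(n_components=1).fit_transform(z).ravel()

# The estimate tracks the true factor up to sign and scale.
print(abs(np.corrcoef(index, true_factor)[0, 1]))
```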

This sort of approach — throwing a bunch of variables into the mix and seeing what comes out — is increasingly common in economic analysis, and its implications are worth pursuing. (This is all about economics, but Nate Silver has done great work applying a similar approach to election polls.)

A few other examples:

I recently read a paper by two of the top macro-econometricians — Jim Stock and Mark Watson — analyzing the recent downturn by throwing 200 variables into something called a dynamic factor model.

The Kansas City Fed sees and raises the D.C. Fed: The Kansas City labor market index uses 24 variables to track the labor market, compared with the Washington Fed’s 19.

The Chicago Fed runs a national economic activity index summarizing 85 variables which they use to gauge inflationary pressures.

Researchers at Goldman Sachs recently built a new forecasting model that randomly draws six variables from a basket of 114, runs a forecast and then repeats that procedure 100,000 times (a rough sketch of the idea follows these examples).

I myself have gotten into the game a little bit, mashing wage indicators together to get at the underlying pace of wage growth.
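Here is that Goldman-style random-draw procedure made concrete, again on simulated data, with an ordinary linear regression standing in for whatever forecasting model they actually run; nothing below reflects their implementation.

```python
# A rough sketch of the random-draw forecasting procedure: repeatedly pick a
# small random subset of predictors, fit a simple model, forecast, and
# average. The data and model are toy stand-ins, not Goldman's.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_obs, n_vars = 200, 114                      # 114-variable basket, as in the article

X = rng.standard_normal((n_obs, n_vars))      # candidate predictors (simulated)
y = X[:, :10] @ rng.uniform(0.2, 1.0, 10) + rng.standard_normal(n_obs)
x_latest = rng.standard_normal(n_vars)        # most recent readings to forecast from

n_draws, subset_size = 10_000, 6              # the article cites 100,000 draws;
forecasts = np.empty(n_draws)                 # trimmed here so the sketch runs fast

for i in range(n_draws):
    cols = rng.choice(n_vars, size=subset_size, replace=False)
    model = LinearRegression().fit(X[:, cols], y)
    forecasts[i] = model.predict(x_latest[cols].reshape(1, -1))[0]

# The headline forecast is the average across draws; the spread shows how
# sensitive the answer is to which six variables you happen to pick.
print(forecasts.mean(), forecasts.std())
```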

Though such models have been around forever, a confluence of factors is contributing to their increased use in statistical economic analysis: the increased availability of big data and the computing power to crunch it; the changing correlations between economic variables from their historical relationships; and the limits of economic theory.

It’s not just that the unemployment rate signal is jammed by a variety of economic and demographic forces, including an unusually high number of labor force dropouts and aging boomers. A number of key macroeconomic relationships are increasingly confusing to many practitioners.

Take, for example, the classic workhorse of central banks: the Phillips curve, which plots the historically negative relationship between wages or prices and unemployment. When the job market is slack, we’d expect less price or wage pressure, and vice versa. But for decades now, inflation’s been pretty stable, even while unemployment has operated in its usual cyclical fashion, up in recessions, down in recoveries.
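In one generic textbook form (my formulation for illustration, not the specification any of the indexes above commit to), the curve says inflation runs below expectations when unemployment sits above its natural rate:

$$\pi_t = \pi_t^e - \beta\,(u_t - u^*) + \varepsilon_t, \qquad \beta > 0$$

where $\pi_t$ is inflation, $\pi_t^e$ is expected inflation, $u_t$ is the unemployment rate and $u^*$ is the rate consistent with full employment. The puzzle in what follows is that estimates of $\beta$ have drifted toward zero.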

Most recently, when the job market was particularly slack, Phillips curve devotees expected deflation (falling prices), or at least disinflation (an ever-slower rate of price growth). There was some of that, but not much at all. In fact, research shows that the correlation between prices, wages, and job market slack has collapsed in ways that economists are still trying to figure out.

And why did so few forecasters see the Great Recession coming? In no small part because they inadequately recognized the threat from a housing bubble inflated by sloppily underwritten leverage. The theory, and thus the models, generally viewed financial markets as an intermediary in the broader economy, distributing savings to their most productive uses — not something the models had to keep much of an eye on.

As far as bubbles were concerned, again, the theory, along with Alan Greenspan at the time, said “pshaw!” We don’t have to model bubbles and their subsequent implosions because rational market actors will self-regulate.

You won’t be surprised to learn that the few people who did see the Great Recession coming, like Dean Baker, rejected such rationality and paid a lot of attention to the bubble.

With the breakdown of historical relationships, the failure of theory, and the availability of lots of data, creative number crunchers are increasingly throwing everything into the statistical washing machine and seeing what comes out, without a lot of preset or structural assumptions about how things are supposed to work.

The results are less the conventional, all-else-equal kind of economic predictions that many of us are used to: “tweak X by this much and Y moves by that much.” Instead, they’re broad impressions quantified in underlying “latent factors”—statistical combinations of variables that best correlate with our outcome variables of interest—driving the economy at a point in time.

The Chicago Fed’s inflationary-pressure index mashes 85 variables together to find a “factor common to all of the various inflation indicators, and it is this common factor, or index, that is useful for predicting inflation.”
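As a toy illustration of that two-step logic (extract the common factor, then use it to predict inflation), here is a sketch on simulated data; the one-factor, fixed-horizon setup is my simplification, not the Chicago Fed's actual specification.

```python
# A toy version of the common-factor-to-inflation-forecast step. All data are
# simulated; the one-factor, three-month-ahead setup is an assumption for
# illustration, not the Chicago Fed's specification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_months, n_series, horizon = 240, 85, 3

# Latent "inflationary pressure" drives 85 noisy indicators and, with a lag,
# measured inflation itself.
pressure = rng.standard_normal(n_months).cumsum()
series = np.outer(pressure, rng.uniform(0.5, 1.5, n_series))
series += rng.standard_normal((n_months, n_series))
inflation = 2.0 + 0.3 * pressure + rng.standard_normal(n_months)

# Step 1: the index is the first principal component of the standardized series.
factor = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(series)).ravel()

# Step 2: regress inflation a few months out on today's factor reading.
X, y = factor[:-horizon].reshape(-1, 1), inflation[horizon:]
fit = LinearRegression().fit(X, y)
print(fit.score(X, y))  # in-sample R^2: how much the common factor explains
```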

The Stock/Watson model of the Great Recession cited above reduced 200 variables to six factors driving “the macro co-movements of all the variables,” factors associated with oil prices, monetary policy, uncertainty, financial risk, and more.

The various Fed labor market mashups all still show considerable slack in the job market, despite the fact that theory suggests the unemployment rate is less than a point away from full employment. My little wage version of this type of analysis shows no acceleration in wage growth that should spook a central banker worrying about wage-push inflationary pressures.

At first, I wasn’t sure what to make of all this work. Reading the Stock/Watson paper, my first reaction was: “Okay, everything affects everything else. What do we learn from that?” And what’s up with those 100,000 forecasts of monthly payroll growth? It’s been years since I sat in econometrics class, but I know we weren’t taught to randomly select the variables in the model.