The problem with averages, says Sam L. Savage, is that “plans based on average assumptions are wrong on average.”

This is the thesis of his new book, “The Flaw of Averages” (John Wiley & Sons Inc., $22.95). He argues that this flaw helped to mask the subprime-mortgage crisis and contributed to General Motors Corp.’s bankruptcy, among countless other disasters.

It is the flaw of averages that causes businessmen, engineers, generals and others to underestimate risk in the face of uncertainty.

To make his point immediately clear, Mr. Savage, whose father was a highly acclaimed statistician at the University of Chicago and author of “The Foundations of Statistics,” cites the apocryphal example of the statistician who drowned while wading across a river that was, on average, only three feet deep.

“In everyday life,” said Mr. Savage, “the flaw of averages ensures that plans based on average customer demand, average completion time, average interest rate and other uncertainties are below projection, behind schedule and beyond budget.”

When people use a single number, usually a historical average, to predict the future, they invite systematic errors and generate unintended consequences, mostly negative, argues Mr. Savage, who received his doctorate in 1973 in the application of computers to operations research.

The subject of his dissertation was “computational complexity.” Today he is a consulting professor of management science and engineering at Stanford University and a fellow of the Judge Business School at the University of Cambridge.

Virtually no institution, profession or household can avoid the consequences of the flaw of averages, asserts Mr. Savage, who has helped to devise a solution that has recently been made possible in part by the exponential increase in computer power.

Terri Dial quickly grasped the concept of the flaw of averages years ago.

“Consider a drunk staggering down the middle of a busy highway,” Mr. Savage told a group of Wells Fargo bank executives in 1995. “And assume that his average position is the center line. Then the state of the drunk at his average position is alive, but on average he’s dead.”
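The drunk's fate turns on a nonlinearity: being safe *at* the average position does not mean being safe *on* average. A minimal Monte Carlo sketch makes the point, with the position distribution and the width of the "safe" strip purely assumed for illustration:

```python
import random

random.seed(1)

# Assumed model: the drunk's position across the road is normally
# distributed around the center line (position 0).
def alive(position, safe_width=0.5):
    """He survives only within a narrow strip at the center line."""
    return abs(position) < safe_width / 2

positions = [random.gauss(0, 3) for _ in range(100_000)]

avg_pos = sum(positions) / len(positions)
print(alive(avg_pos))  # at his average position, he is alive

# But averaged over all positions, he is almost always in a lane:
survival = sum(alive(p) for p in positions) / len(positions)
print(f"survival probability: {survival:.1%}")
```

Evaluating the outcome at the average input gives the wrong answer precisely because the outcome is not a linear function of position.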

Ms. Dial, who began her Wells Fargo career as a teller in the 1970s, immediately exclaimed, “That’s the reason we always blow the budget on our incentive plan!”

Since the number of checking accounts sold by employees averaged 200 per year, the bank decided to give $1,000 bonuses to workers who performed above average, Ms. Dial explained by way of a simple example. The bonus of the average employee would be zero. So zero must be the average bonus, right? Wrong.

Roughly half the employees will exceed the average, so the average bonus will be about $500, much higher than Wells Fargo projected.

On a grander scale, Mr. Savage argues, the flaw of averages caused the subprime-mortgage debacle, which precipitated a worldwide crisis that threatened to bring down the global financial system.

The housing bubble relentlessly inflated during the first half of this decade, and subprime mortgages flourished across the country, especially in bubble areas. In those regions, such as California and Florida, middle-income families with decent credit histories could not qualify for prime mortgages given the prices of the houses they were buying and the down payments they were offering.

No problem. Regression lines, based on the trend of average home prices and extended into the future, projected uninterrupted increases in home values. Subprime adjustable-rate mortgages were especially popular because of their initially low “teaser” interest rates. Once home prices had soared further, providing the homeowners with substantial equity, they could easily refinance into a 30-year fixed-rate prime mortgage.

“Astonishingly,” Mr. Savage notes, the possibility that home prices could decline “was not even considered by some of the risk models monitoring the economy,” including, reportedly, a model used by Standard & Poor’s, which rated as AAA thousands of securities backed by subprime mortgages.

“This is like a model of coin flips that generates only heads,” he exclaimed. And it exquisitely illustrates the flaw of averages, he argues. The rest is history.

Now consider GM. The auto giant had spent years producing and selling gas-guzzling sport utility vehicles epitomized by the Hummer. SUVs were extremely profitable, and they sold briskly for years, even after gas prices began rising early this decade. GM ran the numbers based on past averages and continued to produce millions of gas-guzzling SUVs and pickup trucks, virtually ignoring the hybrid market.

GM did not appreciate how risky its business model had become.

“It’s very different when someone says, ‘There is a risk that gas prices will go up,’ compared to when someone says, ‘If gas prices go up by 50 percent, we have at least a 60 percent chance of bankruptcy,’” explains Mr. Savage.

Mr. Savage is at the forefront in developing a new methodology called probability management. He claims that probability management can avoid the systematic errors and unwelcome unintended consequences inherent in relying on single numbers and trends, usually historical averages.

Probability management is still in its infancy, Mr. Savage stresses, and much work needs to be done. He is now working with companies such as Merck & Co. Inc., the big pharmaceutical firm, and Royal Dutch Shell Plc, an oil giant, to put probability management into day-to-day use.

Mr. Savage has developed new computer simulation modeling software for Shell, Merck and others. The software can graphically depict the precise probability of success of any venture pursued with a particular strategy, based on thousands of potential outcomes.

“Eventually, it should be possible for firms to create consolidated, auditable risk statements, which may provide more warning about future financial upheavals of the kind we had in 2008,” said Mr. Savage, who will soon offer an inexpensive version of his software.

Probability management is a data-management system in which the entities being managed are not numbers but uncertainties. Mr. Savage represents each uncertainty as a probability distribution, which essentially describes a range of possible outcomes and how likely each is.

These distributions are used to test the stability of uncertain business plans, engineering designs or military campaigns.

It is essential to replace numbers, usually historical averages, with distributions to cure the flaw of averages, Mr. Savage said.

There are input probability distributions, which correspond to uncertain demand levels for a product, the magnitude of potential earthquakes or the sizes of enemy forces. And there are output probability distributions, which correspond to the profit of a business, the deflection of a bridge or the number of battle casualties.

The central database is a “scenario library” that contains thousands of potential future values of uncertain variables.

Probability management relies on interactive computer simulations capable of performing thousands of probability calculations and storing thousands of numbers in a single cell of a spreadsheet. These calculations and numbers represent the interaction of an enormous number of interrelated variables, and they gauge the probability of a seemingly limitless number of possible future outcomes.
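The approach can be sketched in miniature. Below, a "scenario library" is simply a set of aligned vectors of precomputed trials for each uncertain input; the business model is evaluated across every scenario at once to produce an output distribution. The demand and price figures are invented for illustration:

```python
import random

random.seed(2)

N = 10_000  # scenarios stored in the library

# "Scenario library": precomputed trials of each uncertain input,
# stored as aligned vectors. All numbers here are assumptions.
demand = [random.gauss(1000, 250) for _ in range(N)]  # units sold
price = [random.gauss(12, 2) for _ in range(N)]       # $ per unit
FIXED_COST, UNIT_COST = 8000, 4

# Output distribution: profit evaluated in every scenario — the
# "thousands of numbers in a single cell" of the spreadsheet.
profit = [d * (p - UNIT_COST) - FIXED_COST for d, p in zip(demand, price)]

# A plan run at the average inputs projects exactly break-even:
#   1000 * (12 - 4) - 8000 = 0
# The distribution reveals what that single number hides:
avg_profit = sum(profit) / N
loss_prob = sum(pr < 0 for pr in profit) / N
print(f"mean profit: ${avg_profit:.0f}, chance of a loss: {loss_prob:.0%}")
```

The average alone says "break even"; the output distribution shows the plan loses money in roughly half of all scenarios, which is the kind of risk statement the average cannot express.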

“Simulation does for uncertainty what the light bulb does for darkness,” Mr. Savage said. “It doesn’t eliminate uncertainty any more than light bulbs prevent the sun from setting. But simulation can illuminate the flaw of averages just as the bulb illuminating your basement staircase reduces the odds that you will break your neck.”

Building on his analogy, Mr. Savage continued: “Electricity is to light bulbs as input probability distributions are to simulation.”

Because most people who use light bulbs have no idea how to generate their own electricity, they consume electricity generated by experts and distributed through the power grid.

Most managers have no idea how to generate input distributions, so they rely on statisticians, engineers and econometricians, who generate and supply probability distributions. The scenario libraries, the repositories of corporate intelligence on uncertainty, perform the role that the power grid plays for the light bulb.

Complex business interrelationships that are invisible when historical averages are used suddenly become visible through the computer simulation of countless potential outcomes.

In the case of the housing bubble, input and output probability distributions would certainly have considered the prospect of plunging home values — unlike some risk models emphasizing historical averages.

In the case of GM, computer-simulated graphics would have demonstrated the riskiness of its business model and displayed the precise odds of its looming catastrophe. To wit: “If gas prices go up by 50 percent, we have at least a 60 percent chance of bankruptcy.”

As home prices and GM’s SUV output both soared, Ms. Dial left Wells Fargo and landed in Britain. Her 2005 arrival and brief tenure as chief executive of Lloyds TSB Retail Bank was heralded by the British press as “the human cyclone in London.” At her instigation, Mr. Savage and a colleague initiated a series of courses for Lloyds executives at Cambridge University.

By all accounts, Ms. Dial performed admirably at Lloyds — so much so that Citigroup, the beleaguered U.S. financial giant brought to its knees by the subprime-mortgage crisis, hired her away from Lloyds last year to run its flagging U.S. consumer business.

Time will tell whether Mr. Savage’s early recruit in his campaign to overwhelm the flaw of averages will succeed at Citigroup. In describing the challenges facing the former bank teller, Mr. Savage set the stakes for his protege: Human Cyclone Meets the Financial Tsunami.


Copyright © 2020 The Washington Times, LLC.